CN108230397B - Multi-view camera calibration and correction method and apparatus, device, program and medium - Google Patents


Info

Publication number: CN108230397B (granted publication of application CN108230397A)
Application number: CN201711298424.0A
Authority: CN (China)
Other languages: Chinese (zh)
Prior art keywords: camera, calibration, parameters, cameras, determining
Legal status: Active (granted)
Inventors: 周恩宇, 孙文秀
Original and current assignee: Shenzhen Sensetime Technology Co., Ltd.
Application filed by Shenzhen Sensetime Technology Co., Ltd.


Classifications

    • G: Physics
    • G06: Computing; Calculating or Counting
    • G06T: Image Data Processing or Generation, in General
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration


Abstract

Embodiments of the invention disclose a calibration and correction method and apparatus for a multi-view camera, as well as an electronic device, a computer program, and a storage medium. The method includes: calibrating camera parameters of each camera of the multi-view camera, where at least two of the cameras are each calibrated based on images captured at their own native resolutions; and correcting the multi-view camera according to the calibration data, where during correction the images captured by each of the at least two cameras are adjusted to images of the same resolution. Embodiments of the invention can improve the accuracy of multi-view camera calibration and correction.

Description

Multi-view camera calibration and correction method and apparatus, device, program and medium
Technical Field
The invention belongs to the technical field of computer vision, and particularly relates to a calibration and correction method and device for a multi-view camera, electronic equipment, a computer program and a storage medium.
Background
Camera calibration is mainly used to obtain camera parameters. Correction (rectification) of a multi-view camera, such as a binocular camera, mainly makes the same object appear with the same size and horizontally aligned in the two images.
The calibration and correction of a multi-view camera are the basis of many computer vision algorithms and strongly influence the accuracy and effectiveness of related algorithms in many fields.
Disclosure of Invention
The embodiment of the invention provides a technical scheme for calibrating and correcting a multi-view camera.
According to an aspect of the embodiments of the present invention, there is provided a calibration and correction method for a multi-view camera, including:
calibrating camera parameters of each camera of the multi-view camera, where at least two of the cameras are each calibrated based on images captured at their own native resolutions;
and correcting the multi-view camera according to the calibration data, where during correction the images captured by each of the at least two cameras are adjusted to images of the same resolution.
According to another aspect of the embodiments of the present invention, there is provided a calibration and correction device for a multi-view camera, including:
a calibration unit, configured to calibrate camera parameters of each camera of the multi-view camera, where at least two of the cameras are each calibrated based on images captured at their own native resolutions;
and a correction unit, configured to correct the multi-view camera according to the calibration data, where during correction the images captured by each of the at least two cameras are adjusted to images of the same resolution.
According to another aspect of the embodiments of the present invention, there is provided an electronic device including the apparatus according to any of the above embodiments.
According to still another aspect of an embodiment of the present invention, there is provided an electronic apparatus including:
a memory for storing executable instructions; and
a processor in communication with the memory for executing the executable instructions to perform the operations of the method of any of the above embodiments.
According to a further aspect of the embodiments of the present invention, there is provided a computer program including computer-readable code which, when run on a device, causes a processor in the device to execute instructions implementing the steps of the method according to any of the above embodiments.
According to yet another aspect of the embodiments of the present invention, a computer storage medium is provided for storing computer-readable instructions, which when executed perform the operations of the method according to any of the above embodiments.
Based on the calibration and correction method and apparatus, electronic device, computer program, and storage medium for a multi-view camera provided by the above embodiments of the invention, during calibration each camera's parameters are calibrated from images captured at that camera's own native resolution, and during correction according to the calibration data the captured images are adjusted to images of the same resolution. When the multi-view camera contains cameras with different resolutions, there is no need to resize images of different resolutions before calibration and compute parameters from the resized images, so errors in camera parameter calibration caused by image resizing can be reduced or even avoided. Furthermore, when correction is performed according to the calibration data, errors in the correction result caused by error propagation can be reduced or even avoided, thereby improving the accuracy of camera calibration and correction. Images captured by a multi-view camera calibrated and corrected with high accuracy can serve as input to related computer vision algorithms, for example binocular stereo matching, multi-camera three-dimensional reconstruction, and multi-camera denoising, thereby improving the accuracy of those algorithms.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
The invention will be more clearly understood from the following detailed description, taken with reference to the accompanying drawings, in which:
fig. 1 is a flowchart of an embodiment of a calibration and correction method for a multi-view camera according to the present invention.
Fig. 2 is a flowchart of an embodiment of the calibration and correction method for a multi-view camera according to the present invention in which the image resolution is adjusted during the correction process.
Fig. 3 is a flowchart of an embodiment of camera parameter calibration performed by the calibration and correction method of the multi-view camera according to the embodiment of the present invention.
Fig. 4 is a schematic diagram of an embodiment of a calibration board used in the calibration and correction method of the multi-view camera according to the embodiment of the invention.
Fig. 5 is a schematic diagram of an embodiment of a calibration board set used in the calibration and correction method of the multi-view camera according to the embodiment of the invention.
Fig. 6 is a flowchart of another embodiment of camera parameter calibration performed by the calibration and correction method of the multi-view camera according to the embodiment of the invention.
Fig. 7 is a schematic structural diagram of an embodiment of a calibration and correction apparatus of a multi-view camera according to an embodiment of the present invention.
Fig. 8 is a schematic structural diagram of an embodiment of a calibration unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention.
Fig. 9 is a schematic structural diagram of an embodiment of a calibration unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention.
Fig. 10 is a schematic structural diagram of another embodiment of the calibration unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention.
Fig. 11 is a schematic structural diagram of an embodiment of an electronic device according to the embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components, the numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
It should also be understood that, for convenience of description, the sizes of the portions shown in the drawings are not drawn to scale.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Embodiments of the invention are operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the computer system/server include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, hand-held or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, networked personal computers, minicomputer systems, mainframe computer systems, distributed cloud computing environments that include any of the above, and the like.
The computer system/server may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, etc. that perform particular tasks or implement particular abstract data types. The computer system/server may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
Fig. 1 is a flowchart of an embodiment of a calibration and correction method for a multi-view camera according to the present invention. As shown in fig. 1, the method of this embodiment includes:
and 102, calibrating camera parameters of the multi-view cameras respectively, wherein calibration of at least two view cameras in the multi-view cameras is carried out respectively based on images which are shot by the cameras and have self-resolution.
And 104, correcting the multi-view camera according to the calibrated data, and adjusting the images shot by each camera in at least two view cameras into images with the same resolution in the correction process.
The camera is a device having a camera shooting function, and may also be referred to as a camera, a camera module, a camera lens, or a lens.
Optionally, the camera parameters include one or more of the following: internal parameters, external parameters, and distortion parameters. Internal parameters include, for example, the focal length and the lens center; external parameters describe, for example, the position of a camera in the multi-view camera relative to the other cameras, such as rotation and translation parameters; distortion parameters include, for example, tangential and radial distortion.
The multi-view camera can be a binocular camera, a three-view camera, a four-view camera, or a camera with even more views.
Optionally, the multi-view camera includes cameras with at least two different resolutions. For a binocular camera, the two cameras have different resolutions; for a three-view camera, a four-view camera, or more, at least two of the cameras have different resolutions, and the resolutions of the remaining cameras may be the same or different.
Alternatively, in operation 102 the cameras with different resolutions in the multi-view camera are calibrated directly from the images each captures at its own resolution to obtain the camera parameters, and in operation 104, during correction according to the calibration data, the images captured at different resolutions are adjusted to images of the same resolution.
Alternatively, operation 104 may adjust the images captured by the cameras to the same resolution through a scaling operation during the correction process. The scaling can be done flexibly: for example, the images may be resized to a preset size, or to the resolution of one of the cameras. Scaling operations may include, but are not limited to, upsampling and/or downsampling.
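As an illustrative sketch (not the patent's implementation), the scaling step of operation 104 can be pictured as follows; the nearest-neighbour `resize` helper and the choice of downsampling the higher-resolution image are assumptions for the example:

```python
def resize(image, new_h, new_w):
    """Nearest-neighbour scaling of a 2D image (list of rows).

    A minimal stand-in for the up/down-sampling of operation 104;
    a real pipeline would use an interpolating resampler.
    """
    old_h, old_w = len(image), len(image[0])
    return [[image[i * old_h // new_h][j * old_w // new_w]
             for j in range(new_w)]
            for i in range(new_h)]

# Binocular pair with different native resolutions: adjust the larger
# image to the smaller camera's resolution so the two match.
high_res = [[i * 4 + j for j in range(4)] for i in range(4)]  # 4x4
low_res_shape = (2, 2)
adjusted = resize(high_res, *low_res_shape)
```

After this step both views have the same resolution and can be fed to, e.g., a stereo matcher.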
In practice, a multi-view camera may include cameras with different resolutions for cost or other reasons, for example a binocular camera in which one camera has high resolution and the other low resolution. If, during calibration, the differently sized images captured simultaneously by these cameras were first adjusted to the same size by scaling, interpolation, and similar processing so that calibration could proceed on same-sized images, the camera parameters would be computed from the adjusted images rather than the originally captured ones, and interpolation of the adjusted images would increase the error of camera parameter calibration.
Based on the calibration and correction method for a multi-view camera provided by the above embodiment of the invention, during calibration each camera's parameters are calibrated from images captured at that camera's own resolution, and during correction according to the calibration data the captured images are adjusted to images of the same resolution. When the multi-view camera contains cameras with different resolutions, there is no need to resize the differently sized images before calibration and compute parameters from the resized images, so errors in camera parameter calibration caused by image resizing can be reduced or even avoided. Furthermore, when correction is performed according to the calibration data, errors in the correction result caused by error propagation can be reduced or even avoided, improving the accuracy of camera calibration and correction. Images captured by a multi-view camera calibrated and corrected with high accuracy can serve as input to related computer vision algorithms, for example binocular stereo matching, multi-camera three-dimensional reconstruction, and multi-camera denoising, thereby improving the accuracy of those algorithms.
Fig. 2 is a flowchart of an embodiment of the method for calibrating and correcting a multi-view camera according to the present invention, wherein the method adjusts the resolution of an image during the correction process. As shown in fig. 2, the method of this embodiment includes:
202: Sample the rows and columns of the image captured by each camera to obtain a corresponding two-dimensional lattice map.
Optionally, operation 202 may sample the rows and columns of each captured image uniformly to obtain the corresponding two-dimensional lattice map.
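A minimal sketch of the uniform lattice extraction of operation 202, assuming a sampling `step` parameter that the text does not specify:

```python
def lattice_points(height, width, step):
    """Uniformly sample pixel coordinates every `step` rows and columns,
    giving the two-dimensional lattice of operation 202.

    `step` is an illustrative parameter, not fixed by the patent.
    """
    return [(r, c)
            for r in range(0, height, step)
            for c in range(0, width, step)]

pts = lattice_points(6, 8, 2)  # 3 sampled rows x 4 sampled columns
```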
204: Determine, according to the distortion parameters of each camera, the coordinates of the points of the corresponding two-dimensional lattice map after the undistortion transform.
Optionally, operation 204 may compute these undistorted coordinates from each camera's distortion parameters.
206: Obtain the bounding box of the lattice points of each two-dimensional lattice map after the undistortion transform.
Alternatively, operation 206 may take the minimum bounding rectangle of each undistorted two-dimensional lattice map as the bounding box.
208: Determine the ratio of the size of each bounding box to a preset size.
Alternatively, operation 208 may compute the ratio of the size of each minimum bounding rectangle to the preset size.
210: Scale the undistorted images captured by each camera according to these ratios to obtain images of the same resolution.
Optionally, the preset size may be the size of the final corrected output image and can be set manually as needed; corrected images of any resolution can be output by choosing different sizes. Since the final corrected images have the same resolution, they can be used for subsequent computation of image depth.
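Operations 206-210 can be sketched as follows; the `min_bounding_rect` and `scale_factors` helpers, and the convention that the ratio maps the bounding box onto the preset size, are assumptions for illustration:

```python
def min_bounding_rect(points):
    """Axis-aligned minimum bounding rectangle of the undistorted
    lattice points (operation 206); returns (width, height)."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs), max(ys) - min(ys))

def scale_factors(rect_size, preset_size):
    """Per-axis ratio between the preset output size and the bounding
    box (operation 208). Whether the ratio or its inverse is applied
    is a convention assumed here, not fixed by the text."""
    (w, h), (pw, ph) = rect_size, preset_size
    return (pw / w, ph / h)

# Four undistorted lattice corners of one camera's map.
lattice = [(10.0, 5.0), (110.0, 5.0), (10.0, 55.0), (110.0, 55.0)]
rect = min_bounding_rect(lattice)
sx, sy = scale_factors(rect, (200, 100))  # scale to a 200x100 output
```

The undistorted image is then resampled by (sx, sy) in operation 210 so every camera produces an output of the same preset resolution.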
Optionally, after the operation 210, the method may further include: and correcting the external parameters between any two cameras in the multi-view cameras according to the images with the same resolution.
Fig. 3 is a flowchart of an embodiment of camera parameter calibration performed by the calibration and correction method of the multi-view camera according to the embodiment of the present invention. As shown in fig. 3, the embodiment is to calibrate the camera parameters of one camera in the multi-view camera, and the method includes:
302: Photograph a calibration board set to obtain an image containing the set.
The calibration board set includes at least two calibration boards arranged non-coplanarly.
Alternatively, the calibration board in operation 302 may have any pattern, such as a checkerboard pattern or a circular-array pattern; this embodiment does not specifically limit the pattern. A calibration board with a checkerboard pattern, as shown in fig. 4, is applicable to the embodiment of the present invention.
The calibration board set includes a plurality of calibration boards that are not coplanar with one another, which provides more feature point information, reduces the number of times the boards must be photographed during calibration, and thus reduces the complexity of the calibration process. This embodiment does not specifically limit the number of boards in the set or the particular non-coplanar arrangement.
Alternatively, as shown in fig. 5, a calibration board set usable in an embodiment of the present invention includes 4 calibration boards placed at the upper-left, upper-right, lower-left, and lower-right positions. The board at the upper-left position lies in a reference plane; the boards at the upper-right, lower-left, and lower-right positions are rotated from the reference plane by 30 degrees about the y direction, 30 degrees about the x direction, and 30 degrees about both the x and y directions, respectively, forming a calibration board set that is not coplanar with the board at the upper-left position.
Optionally, the image captured in operation 302 should contain all the calibration boards in the set while including as little blank area as possible, so that the boards occupy a larger proportion of the image, which helps improve the accuracy of subsequent feature point detection.
304: Detect the feature points of the calibration board in the image to obtain the feature point positions.
Optionally, for a checkerboard calibration board, the feature points are the inner corner points of the checkerboard; for a circular-array calibration board, the feature points are the centers of the circles in the array.
Optionally, before operation 304, the method may further include: dividing the image into a plurality of image blocks such that each block contains at least one complete calibration board. Operation 304 then becomes: detecting the feature points of the calibration board in each image block separately to obtain the feature point positions.
In one specific example, operation 304 may include: binarizing the image; obtaining information about each square and its adjacent squares from the binarized image; obtaining the common points of adjacent squares from the adjacency information; and obtaining the positions of the corner points in the corresponding checkerboard from the common points of adjacent squares.
Binarizing the image to be detected makes the method robust to slight differences in illumination and camera viewing angle. The binarization sets all pixels whose brightness is above a preset threshold k1 to 255 (pure white) and all pixels whose brightness is below k1 to 0 (pure black), enhancing the contrast of the picture.
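A minimal sketch of the binarization step, with the image represented as a list of pixel rows and `k1` as the preset brightness threshold:

```python
def binarize(image, k1):
    """Threshold binarization described above: pixels brighter than the
    preset threshold k1 become 255 (pure white), the rest 0 (pure black)."""
    return [[255 if px > k1 else 0 for px in row] for row in image]

img = [[12, 200],
       [90, 130]]
bw = binarize(img, 128)
```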
Obtaining information about each square and its adjacent squares from the binarized image proceeds as follows: dilate the binarized image to separate the black squares from one another; extract the contour of each square; fit a polygon to each contour and discard interfering contours; then, for each square, search for its adjacent squares and record the adjacency information.
The common points of adjacent squares are obtained from the adjacency information. Because the dilation separates squares that were originally connected, a shared corner of two originally adjacent squares is split into two points; the original common point can therefore be computed back from the adjacency information.
The positions of the corner points in the corresponding checkerboard are obtained from the common points of adjacent squares as follows. First, using the known number of corner points of the checkerboard, determine whether each square belongs to the checkerboard. The squares belonging to the checkerboard are then ordered, i.e. assigned to their row and column in the checkerboard; during this step, missing squares can be added and redundant squares deleted. Finally, with the correct set of checkerboard squares established, the corner positions are obtained from the common points of adjacent squares and the corners are numbered.
The known number of corner points of the checkerboard is obtained from the calibration board information, which includes the length and width of the board, the size of an individual square, and so on.
Optionally, operation 304 may further include sub-pixel refinement of the corner positions in the checkerboard. With sub-pixel refinement, the error of the obtained corner positions can be kept below one pixel, so the corners are located accurately.
306: Determine the camera parameters according to the feature point positions.
Optionally, operation 306 may include: determining internal parameters of the camera according to the positions of the feature points; determining external parameters of the camera according to the corresponding relation information of the internal parameters and the characteristic points of the camera; and determining the distortion parameters of the camera according to the internal parameters of the camera, the corresponding relation information of the characteristic points and the external parameters of the camera.
Optionally, operation 306 may include: determining a homography matrix according to the coordinates of the feature points in a camera coordinate system corresponding to the image plane and a world coordinate system corresponding to the calibration plate plane respectively; determining internal parameters of the camera according to the homography matrix; or determining internal parameters of the camera according to the homography matrix, and determining external parameters of the camera according to the homography matrix and the internal parameters; or determining the internal parameters of the camera according to the homography matrix, determining the external parameters of the camera according to the homography matrix and the internal parameters, and determining the distortion parameters of the camera according to the coordinates, the internal parameters and the external parameters of the characteristic points in the camera coordinate system corresponding to the image plane and the world coordinate system corresponding to the calibration plate plane respectively.
Specifically, a pinhole imaging model is established:
s·[x, y, 1]^T = M·[r1 r2 t]·[X, Y, 1]^T

where (x, y) are the coordinates of a feature point in the camera (image) coordinate system, (X, Y, 0) are the coordinates of the corresponding feature point in the world coordinate system (the calibration board plane is taken as Z = 0), R = [r1 r2 r3] is the rotation matrix from the world coordinate system to the camera coordinate system, t is the translation vector from the world coordinate system to the camera coordinate system, M is the camera intrinsic parameter matrix, and s is a scale factor.
According to the pinhole imaging model, the homography matrix between the image plane and the calibration board plane is obtained:
H=[h1 h2 h3]=sM[r1 r2 t]
The homography matrix H is solved from the system of equations formed by the coordinates of the feature points in the camera coordinate system (image plane) and in the world coordinate system (calibration board plane).
Here the coordinates of the feature points in the camera coordinate system are obtained from the feature point detection of operation 304, and the coordinates in the world coordinate system are obtained from the calibration board information and the feature point numbering. For example, if the squares of the checkerboard calibration board are 5 cm long and a corner point is numbered row 2, column 3, its coordinates in the world coordinate system are (5 cm × 3, 5 cm × 2).
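The corner-number-to-world-coordinate mapping in the example above can be sketched as follows (working in centimetres; the helper name is hypothetical):

```python
def corner_world_coords(row, col, square_size):
    """World coordinates of a checkerboard corner from its (row, column)
    number and the known square size, matching the example in the text:
    a corner at row 2, column 3 with 5 cm squares maps to
    (5 cm * 3, 5 cm * 2)."""
    return (square_size * col, square_size * row)

xy = corner_world_coords(2, 3, 5)  # square size 5 (cm)
```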
From the homography matrix H, two constraint equations are obtained using the conditions that the rotation vectors r1 and r2 are mutually orthogonal and have equal norms:
h1^T·B·h2 = 0
h1^T·B·h1 = h2^T·B·h2
where the matrix B is defined as

B = M^{-T}·M^{-1}
The known homography matrix H is substituted into the two constraint equations, which are then solved for the matrix B.
According to:

M = [ fx  γ  cx
       0  fy  cy
       0   0   1 ]

substituting the matrix M gives the general closed-form solution of the matrix B:

B = M^{-T}·M^{-1} = [ B11  B12  B13
                      B12  B22  B23
                      B13  B23  B33 ]

with
B11 = 1/fx^2
B12 = -γ/(fx^2·fy)
B13 = (cy·γ - cx·fy)/(fx^2·fy)
B22 = γ^2/(fx^2·fy^2) + 1/fy^2
B23 = -γ·(cy·γ - cx·fy)/(fx^2·fy^2) - cy/fy^2
B33 = (cy·γ - cx·fy)^2/(fx^2·fy^2) + cy^2/fy^2 + 1

where γ is the skew parameter of the intrinsic matrix.
thus, an expression of the internal parameters of the camera is obtained:
cy = (B12·B13 - B11·B23)/(B11·B22 - B12^2)
λ = B33 - [B13^2 + cy·(B12·B13 - B11·B23)]/B11
fx = sqrt(λ/B11)
fy = sqrt(λ·B11/(B11·B22 - B12^2))
γ = -B12·fx^2·fy/λ
cx = γ·cy/fy - B13·fx^2/λ
where fx is the horizontal focal length of the camera, fy is the vertical focal length, cx is the abscissa of the lens center, and cy is the ordinate of the lens center. Substituting the known matrix B into the above relations solves for the internal parameters of the camera.
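The closed-form recovery above can be sanity-checked with a small round trip: build B = M^{-T}·M^{-1} from known zero-skew intrinsics, then recover them with the expressions above. This sketch is for illustration and is not part of the patent's text:

```python
import math

def intrinsics_from_B(B):
    """Closed-form recovery of the internal parameters from the symmetric
    3x3 matrix B = M^{-T} M^{-1} (0-based indices), following the
    expressions above."""
    B11, B12, B13 = B[0][0], B[0][1], B[0][2]
    B22, B23, B33 = B[1][1], B[1][2], B[2][2]
    cy = (B12 * B13 - B11 * B23) / (B11 * B22 - B12 ** 2)
    lam = B33 - (B13 ** 2 + cy * (B12 * B13 - B11 * B23)) / B11
    fx = math.sqrt(lam / B11)
    fy = math.sqrt(lam * B11 / (B11 * B22 - B12 ** 2))
    gamma = -B12 * fx ** 2 * fy / lam
    cx = gamma * cy / fy - B13 * fx ** 2 / lam
    return fx, fy, cx, cy

def B_from_intrinsics(fx, fy, cx, cy):
    """Build B = M^{-T} M^{-1} for a zero-skew intrinsic matrix M, to
    round-trip-check the closed form."""
    # M is upper triangular, so M^{-1} has this analytic form.
    Ainv = [[1 / fx, 0.0, -cx / fx],
            [0.0, 1 / fy, -cy / fy],
            [0.0, 0.0, 1.0]]
    # B = (M^{-1})^T (M^{-1})
    return [[sum(Ainv[k][i] * Ainv[k][j] for k in range(3))
             for j in range(3)] for i in range(3)]

fx, fy, cx, cy = intrinsics_from_B(B_from_intrinsics(800.0, 820.0, 320.0, 240.0))
```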
Three equations are derived from the homography matrix H:
h1 = s·M·r1
h2 = s·M·r2
h3 = s·M·t
thus, expressions for the external parameters of the camera are obtained:
r1 = (1/s)·M^-1·h1
r2 = (1/s)·M^-1·h2
r3 = r1 × r2
t = (1/s)·M^-1·h3
wherein the scale factor s = ||M^-1·h1||, which normalizes the rotation vectors to unit length.
and substituting the internal parameters of the camera into the relational expression according to the known homography matrix H, and solving the external parameters of the camera.
Based on the pinhole imaging model, the equation is obtained by considering the lens distortion:
xp = xd·(1 + k1·r^2 + k2·r^4 + k3·r^6) + 2·p1·xd·yd + p2·(r^2 + 2·xd^2)
yp = yd·(1 + k1·r^2 + k2·r^4 + k3·r^6) + p1·(r^2 + 2·yd^2) + 2·p2·xd·yd
wherein r^2 = xd^2 + yd^2
wherein, (xp, yp) are the coordinates of the distorted feature point in the camera coordinate system, (xd, yd) are the ideal, distortion-free coordinates of the corresponding feature point projected from the world coordinate system, k1, k2, k3 are the first-, second- and third-order radial distortion parameters respectively, and p1, p2 are the tangential distortion parameters in the two directions.
And substituting the coordinates of the characteristic points in a camera coordinate system and a world coordinate system, and the internal parameters and the external parameters of the camera to solve the distortion parameters of the camera.
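The distortion model above can be sketched as the following forward mapping (illustrative only; the parameter values in the check are hypothetical):

```python
import numpy as np

def distort(xd, yd, k1, k2, k3, p1, p2):
    """Apply radial (k1, k2, k3) and tangential (p1, p2) distortion to the
    ideal coordinates (xd, yd), returning the distorted (xp, yp)."""
    r2 = xd ** 2 + yd ** 2
    radial = 1 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xp = xd * radial + 2 * p1 * xd * yd + p2 * (r2 + 2 * xd ** 2)
    yp = yd * radial + p1 * (r2 + 2 * yd ** 2) + 2 * p2 * xd * yd
    return xp, yp
```

With all five parameters set to zero the mapping is the identity, which is the degenerate case exploited when the number of parameters to be determined is reduced.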
Optionally, before operation 306, the method may further include: estimating the distortion degree of the camera; and adjusting the number of the camera parameters to be determined according to the distortion degree of the camera.
Alternatively, the method of estimating the degree of distortion of the camera may include: and estimating the distortion degree of the camera according to the phase height table of the camera, or estimating the distortion degree of the camera according to the image quality shot by the camera.
The method of estimating the degree of distortion of the camera from its phase height table is to estimate the distortion condition of the camera, for example whether distortion exists and how severe it is, from a first distortion parameter provided by the camera manufacturer. The first distortion parameter provided by the manufacturer is usually measured by an optical method and is expressed in the form of a phase height table, in which the abscissa is the normalized value of the distance from the image center to a specific pixel position, and the ordinate is the actually measured value of the distance from the image center to that pixel position.
The degree of distortion of the camera is estimated from the quality of the image taken by the camera, for example: and observing whether the image edge has obvious distortion or not to estimate the distortion degree of the camera.
The order of radial distortion, and whether tangential distortion is used to model the camera, depend on the degree of distortion of the camera. In general, the larger the distortion, the higher the order of radial distortion needed to model the camera, and tangential distortion also needs to be modeled; otherwise, a lower order of radial distortion is sufficient and tangential distortion does not need to be modeled.
For example: if the estimated radial distortion of the camera is low, the parameter k3 (or both k2 and k3) in the equations for solving the camera distortion is set to 0, and the parameters p1 and p2 are also set to 0; this reduces the number of distortion parameters to be solved and therefore the error in the parameters that are solved. If the estimated radial distortion of the camera is high, the radial distortion parameters k2 and k3 are not set to 0, nor are p1 and p2.
In a specific implementation, the operations of estimating the distortion degree of the cameras and adjusting the number of camera parameters to be determined accordingly usually need to be performed only once for the same batch of cameras.
Fig. 6 is a flowchart of another embodiment of camera parameter calibration performed by the calibration and correction method of the multi-view camera according to the embodiment of the invention. As shown in fig. 6, the embodiment is to calibrate the camera parameters of one camera in the multi-view camera, and the method includes:
the set of calibration plates is photographed 602 and an image containing the set of calibration plates is obtained.
Wherein, the calibration board group includes at least two calibration boards arranged non-coplanarly.
604, feature points of the calibration plate in the image are detected to obtain positions of the feature points.
And 606, determining camera parameters according to the positions of the characteristic points.
And 608, determining a calibration error according to the distance from the characteristic point of the calibration plate to the epipolar line.
In operation 608, an offset error may be obtained by calculating the distance from each feature point to its epipolar line, and this offset error is used as the calibration error to measure the quality of the calibration result; in general, the larger the offset, the worse the calibration result. An average offset error and a maximum offset error may also be obtained from the distances from the feature points to the epipolar lines.
Optionally, before operation 608, the method may further include: an operation for determining epipolar lines between binocular cameras based on the determined camera parameters, comprising:
obtaining a basic matrix F between the binocular cameras according to the internal parameters, the external parameters and the distortion parameters of the cameras:
F = Mr^-T · [T]× · R · Ml^-1
wherein M is the camera internal parameter matrix, the subscripts r and l denote the right and left cameras respectively, R is the rotation matrix between the two cameras, Tx, Ty and Tz are the three components of the translation vector, and [T]× is the antisymmetric matrix of the translation vector:
[T]× = [  0   -Tz   Ty ]
       [  Tz   0   -Tx ]
       [ -Ty   Tx   0  ]
determining epipolar lines between the binocular cameras according to the basic matrix F between the binocular cameras;
the epipolar lines between the two cameras can be represented by Pr ═ F × Pl ', where Pr represents the coordinates of a certain feature point in the image of the right camera, and Pl' represents the various possible coordinates of the feature point in the image of the right camera in the image of the left camera. And obtaining the coordinate relation of the left image and the right image through the F matrix. However, since F is a rank-deficient matrix, each feature point in each of the left and right images corresponds to a line in the other image, which is called epipolar line.
Based on the calibration and correction method of the multi-view camera provided by the embodiment of the invention, the calibration error is measured by calculating the distance from the characteristic point of each calibration plate to the epipolar line, and the quality of the calibration result can be quantitatively measured in pixel precision. In specific application, the calibration error is determined according to the distance from the characteristic point of the calibration plate to the polar line, so that product quality inspection can be performed, and specifically, the quality of the camera is determined to be unqualified by judging that the error is greater than a preset threshold value, and the camera returns to a factory for processing again.
Optionally, after determining the calibration error according to the distance from the characteristic point of the calibration board to the epipolar line, the method may further include: and checking the calibration error.
The calibration error can be checked to be accurate by introducing the operation of checking the calibration error after the calibration process, wherein the operation of checking the calibration error is to calculate the offset error by using the newly detected characteristic point and check whether the calibration error is accurate by comparing whether the calculated result has a larger deviation with the result calculated in the calibration process. It may include: shooting a calibration plate group to obtain an image containing the calibration plate group; wherein the calibration plate group comprises at least two calibration plates; detecting the characteristic points of the calibration plate in the image to obtain the positions of the characteristic points; and determining the distance from the characteristic point of the corresponding calibration plate to the epipolar line according to the position of the characteristic point.
Wherein, for a multi-view camera, the checking process requires that the relative positions between the different cameras in the multi-view camera be kept constant. The calibration boards in the calibration board group used for checking may be arranged coplanarly; that is, another set of calibration boards can be used for photographing during the checking process. When determining the distance from the feature point of the corresponding calibration board to the epipolar line according to the detected position of the feature point, the fundamental matrix F from the calibration process should be kept unchanged.
Fig. 7 is a schematic structural diagram of an embodiment of a calibration and correction apparatus of a multi-view camera according to an embodiment of the present invention. As shown in fig. 7, the apparatus of this embodiment includes: a calibration unit and a correction unit. Wherein
And the calibration unit is used for respectively calibrating the camera parameters of the multi-view camera, wherein the calibration of at least two cameras in the multi-view camera is respectively performed based on images captured by each camera at its own resolution.
And the correcting unit is used for correcting the multi-view camera according to the calibrated data, and images shot by each camera in at least two view cameras are adjusted to images with the same resolution in the correcting process.
The camera is a device having a camera shooting function, and may also be referred to as a camera, a camera module, a camera lens, or a lens.
Optionally, the camera parameters include one or more of the following parameters of the camera: internal parameters, external parameters, distortion parameters. Internal parameters of the camera are, for example, the focal length and the lens center; external parameters of the camera are, for example, the relative position information of a camera in the multi-view camera with respect to the other cameras, such as rotation and displacement parameters; distortion parameters of the camera are, for example, tangential distortion and radial distortion.
The multi-view camera can be a binocular camera, a three-view camera, a four-view camera and even more cameras.
Optionally, the multi-view camera includes cameras with at least two different resolutions: for a binocular camera, the resolutions of the two cameras are different; for a trinocular camera, a quadruple camera or a camera with even more views, the resolutions of at least two cameras are different, and on this basis the resolutions of the other cameras may be the same or different.
Optionally, the calibration unit is configured to, when calibrating the camera parameters, directly calibrate the cameras with different resolutions in the multi-view camera with the images with the respective resolutions, to obtain the camera parameters, and the correction unit is configured to, during the process of performing correction according to the calibrated data, adjust the images with the respective resolutions, which are respectively captured by the cameras with different resolutions in the multi-view camera, to obtain the images with the same resolution.
Optionally, the correction unit is specifically configured to adjust the images captured by the cameras to images with the same resolution through a zoom operation during the correction process, and the manner of the zoom operation may be very flexible, for example: the images captured by the cameras may be adjusted to a certain preset size, or the images captured by the cameras may be adjusted to a certain resolution of one of the cameras. The scaling operations may include, but are not limited to, upsampling and/or downsampling, etc.
Based on the calibration and correction device for the multi-view camera provided by the above embodiment of the present invention, during calibration of the multi-view camera, the camera parameters are calibrated based on images captured by each camera at its own resolution, and during correction of the multi-view camera according to the calibrated data, the images captured by the cameras are adjusted to images with the same resolution. When the multi-view camera includes cameras with different resolutions, since it is not necessary to adjust the images with different resolutions before calibration and to perform parameter calculation based on the adjusted images, errors in the calibration of camera parameters due to image adjustment can be reduced or even avoided; furthermore, when correction is performed according to the calibrated data, errors in the correction result caused by error propagation can be reduced or even avoided, thereby improving the precision of the calibration and correction of the camera. Images captured by the multi-view camera after high-precision calibration and correction can serve as the basis for related algorithm processing in the computer vision field, for example binocular stereo matching algorithms, multi-camera three-dimensional reconstruction algorithms and multi-camera denoising algorithms, thereby improving the precision of the algorithm processing.
Fig. 8 is a schematic structural diagram of an embodiment of a correction unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention. As shown in fig. 8, the correction unit of this embodiment includes: a dot matrix extraction module, a distortion removal module, an outer frame acquisition module, a proportion determination module and a scaling module. Wherein
And the dot matrix extraction module is used for performing row-column extraction on the images shot by the cameras to obtain corresponding two-dimensional dot matrix maps.
Optionally, the dot matrix extraction module may uniformly extract each row and each column of the image captured by each camera to obtain the corresponding two-dimensional dot matrix map.
And the distortion removal module is used for determining coordinates of pixel points in the corresponding two-dimensional dot-matrix map after distortion removal transformation according to the distortion parameters of each camera.
Optionally, the distortion removal module may calculate coordinates of pixels in the corresponding two-dimensional lattice map after distortion removal transformation according to distortion parameters of each camera.
And the outer frame acquisition module is used for acquiring the outer frame of the lattice after the distortion removal and transformation of each two-dimensional lattice diagram.
Optionally, the outer frame obtaining module may obtain a minimum outer rectangle of the lattice after the distortion removal transformation of each two-dimensional lattice map as the outer frame.
And the proportion determining module is used for determining the proportion of the size of each external frame to the preset size.
Optionally, the ratio determining module may obtain a ratio of the size of each bounding box to a preset size by calculating a ratio of the size of each minimum bounding rectangle to the preset size.
And the scaling module is used for scaling the images which are shot by the cameras and subjected to distortion removal according to the proportion to obtain the images with the same resolution.
Optionally, the preset size may be the size of the finally obtained correction map and may be set manually as needed; correction maps with any resolution can be output according to different preset sizes. Since the finally obtained correction maps have the same resolution, these correction maps can be used for the subsequent calculation of image depth.
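The pipeline formed by these modules (lattice extraction, undistortion, bounding rectangle, scale ratio) can be illustrated with the following sketch; `rectification_scale` and the identity undistortion used in the demonstration are hypothetical stand-ins, not the patented implementation:

```python
import numpy as np

def rectification_scale(width, height, undistort_fn, preset_size, step=16):
    """Sample a 2D lattice over the image, push it through the undistortion
    transform, take the bounding rectangle of the result, and return the
    (sx, sy) scale ratio that maps it onto the preset output size."""
    xs, ys = np.meshgrid(np.arange(0, width, step), np.arange(0, height, step))
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    und = np.array([undistort_fn(p) for p in pts])  # undistorted lattice
    w_box = und[:, 0].max() - und[:, 0].min()       # bounding-rectangle size
    h_box = und[:, 1].max() - und[:, 1].min()
    return preset_size[0] / w_box, preset_size[1] / h_box
```

With an identity undistortion and a preset size equal to the lattice extent, the returned ratio is exactly (1, 1), which is a convenient sanity check on the module chain.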
Alternatively, the correction unit may be further configured to perform an operation of correcting the external parameter between any two cameras in the multi-view cameras according to the images with the same resolution.
Fig. 9 is a schematic structural diagram of an embodiment of a calibration unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention. As shown in fig. 9, the calibration unit of this embodiment includes: the device comprises a shooting module, a detection module and a calibration module. Wherein
And the shooting module is used for shooting the calibration plate group to obtain an image containing the calibration plate group.
Wherein, the calibration board group includes at least two calibration boards arranged non-coplanarly.
Alternatively, the calibration plate may adopt a calibration plate having any pattern, for example: a checkerboard calibration plate, a circular array calibration plate, etc., and the pattern of the calibration plate is not specifically limited in this embodiment.
The calibration plate group comprises a plurality of calibration plates, and the calibration plates in the calibration plate group are not coplanar, so that more characteristic point information can be provided, the times of shooting the calibration plates can be reduced in the calibration process, and the complexity of the calibration process can be reduced. In this embodiment, the number of the calibration plates in the calibration plate set and the way of non-coplanar arrangement of each calibration plate in the calibration plate set are not specifically limited.
Optionally, besides needing to include all the calibration boards in the calibration board group, the image captured by the capturing module should include as few blank regions as possible, so that the proportion of the calibration boards in the image is increased, which is beneficial to improving the accuracy of subsequent feature point detection.
And the detection module is used for detecting the characteristic points of the calibration plate in the image to obtain the positions of the characteristic points.
Optionally, for the checkerboard calibration plate, the characteristic points are inner corner points of the checkerboard; for a circular array calibration plate, the characteristic point is the center of the circular array.
Optionally, the calibration unit may further include a module configured to divide the image into image blocks before feature point detection; in this case, the detection module is specifically configured to detect the feature points of the calibration board in each image block respectively, so as to obtain the positions of the feature points.
In a specific example, the detection module is specifically configured to: carrying out binarization processing on the image; acquiring information of the square and the adjacent square according to the image after the binarization processing; obtaining common points of adjacent grids according to the information of the adjacent grids; and obtaining the positions of the corner points in the corresponding checkerboards according to the common points of the adjacent squares.
Optionally, the detection module is further configured to perform sub-pixel extraction on the positions of the corners in the checkerboard, and the error of the obtained positions of the corners in the checkerboard can be made to be within a range smaller than one pixel by performing sub-pixel extraction on the positions of the corners in the checkerboard, so that the positions of the corners are accurately located.
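One common way to realize sub-pixel extraction (offered only as an illustrative sketch, not the patented method) is parabolic interpolation of the corner-response peak from three neighboring samples along one axis:

```python
def subpixel_peak(fm1, f0, fp1):
    """Parabolic interpolation of the peak position from three response
    samples at x = -1, 0, +1; returns the fractional offset, which lies
    in (-0.5, 0.5) when f0 is the integer-pixel maximum."""
    denom = fm1 - 2.0 * f0 + fp1
    if denom == 0:
        return 0.0  # flat response: no refinement possible
    return 0.5 * (fm1 - fp1) / denom
```

Sampling the parabola f(x) = 1 - (x - 0.2)^2 at x = -1, 0, 1 and interpolating recovers the true peak offset 0.2 exactly, illustrating how corner positions can be localized to well under one pixel.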
And the calibration module is used for determining the camera parameters according to the positions of the characteristic points.
Optionally, the calibration module is specifically configured to: determining internal parameters of the camera according to the positions of the feature points; determining external parameters of the camera according to the corresponding relation information of the internal parameters and the characteristic points of the camera; and determining the distortion parameters of the camera according to the internal parameters of the camera, the corresponding relation information of the characteristic points and the external parameters of the camera.
Optionally, the calibration module is specifically configured to: determining a homography matrix according to the coordinates of the feature points in a camera coordinate system corresponding to the image plane and a world coordinate system corresponding to the calibration plate plane respectively; determining internal parameters of the camera according to the homography matrix; or determining internal parameters of the camera according to the homography matrix, and determining external parameters of the camera according to the homography matrix and the internal parameters; or determining the internal parameters of the camera according to the homography matrix, determining the external parameters of the camera according to the homography matrix and the internal parameters, and determining the distortion parameters of the camera according to the coordinates, the internal parameters and the external parameters of the characteristic points in the camera coordinate system corresponding to the image plane and the world coordinate system corresponding to the calibration plate plane respectively.
Optionally, the calibration unit may further include, before the calibration module, an adjusting module configured to estimate the distortion degree of the camera and adjust the number of camera parameters to be determined according to the distortion degree of the camera.
Optionally, the adjusting module is specifically configured to: and estimating the distortion degree of the camera according to the phase height table of the camera, or estimating the distortion degree of the camera according to the image quality shot by the camera.
Fig. 10 is a schematic structural diagram of another embodiment of the calibration unit of the calibration and correction device of the multi-view camera according to the embodiment of the invention. As shown in fig. 10, the calibration unit of this embodiment includes: the device comprises a shooting module, a detection module, a calibration module and an error determination module. Wherein
And the shooting module is used for shooting the calibration plate group to obtain an image containing the calibration plate group.
Wherein, the calibration board group includes at least two calibration boards arranged non-coplanarly.
And the detection module is used for detecting the characteristic points of the calibration plate in the image to obtain the positions of the characteristic points.
And the calibration module is used for determining the camera parameters according to the positions of the characteristic points.
And the error determining module is used for determining the calibration error according to the distance from the characteristic point of the calibration plate to the polar line.
The error determination module can obtain an offset error by calculating the distance from each feature point to its epipolar line, and this offset error is used as the calibration error to measure the quality of the calibration result; in general, the larger the offset, the worse the calibration result. An average offset error and a maximum offset error can also be obtained from the distances from the feature points to the epipolar lines.
Optionally, the error determination module is further configured to: and determining epipolar lines between the binocular cameras according to the determined camera parameters.
Based on the calibration and correction device of the multi-view camera provided by the embodiment of the invention, the calibration error is measured by calculating the distance from the characteristic point of each calibration plate to the polar line, and the quality of the calibration result can be quantitatively measured in pixel precision. In specific application, the calibration error is determined according to the distance from the characteristic point of the calibration plate to the polar line, so that product quality inspection can be performed, and specifically, the quality of the camera is determined to be unqualified by judging that the error is greater than a preset threshold value, and the camera returns to a factory for processing again.
Optionally, the calibration and correction device for a multi-view camera according to an embodiment of the present invention may further include: a checking unit configured to check the calibration error.
The calibration error can be checked to be accurate by introducing the operation of checking the calibration error after the calibration process, wherein the operation of checking the calibration error is to calculate the offset error by using the newly detected characteristic point and check whether the calibration error is accurate by comparing whether the calculated result has a larger deviation with the result calculated in the calibration process. The inspection unit is specifically configured to: shooting a calibration plate group to obtain an image containing the calibration plate group; wherein the calibration plate group comprises at least two calibration plates; detecting the characteristic points of the calibration plate in the image to obtain the positions of the characteristic points; and determining the distance from the characteristic point of the corresponding calibration plate to the epipolar line according to the position of the characteristic point.
Wherein, for a multi-view camera, the checking process requires that the relative positions between the different cameras in the multi-view camera be kept constant. The calibration boards in the calibration board group used for checking may be arranged coplanarly; that is, another set of calibration boards can be used for photographing during the checking process. When determining the distance from the feature point of the corresponding calibration board to the epipolar line according to the detected position of the feature point, the fundamental matrix F from the calibration process should be kept unchanged.
In addition, an embodiment of the present invention further provides an electronic device, which may be, for example, a mobile terminal, a Personal Computer (PC), a tablet computer, a server, and the like, and the electronic device is provided with the calibration and correction apparatus for a multi-view camera according to any of the above embodiments of the present invention.
Fig. 11 is a schematic structural diagram of an embodiment of an electronic device according to an embodiment of the present invention. As shown in fig. 11, the electronic device for implementing an embodiment of the present invention includes a Central Processing Unit (CPU) that can perform various appropriate actions and processes according to executable instructions stored in a Read-Only Memory (ROM) or loaded from a storage section into a Random Access Memory (RAM). The central processing unit may communicate with the read-only memory and/or the random access memory to execute the executable instructions, so as to perform operations corresponding to the calibration and correction method of the multi-view camera provided by the embodiment of the present invention, for example: calibrating camera parameters of the multi-view camera respectively, wherein the calibration of at least two cameras in the multi-view camera is respectively performed based on images captured by each camera at its own resolution; and correcting the multi-view camera according to the calibrated data, wherein the images shot by each of the at least two cameras are adjusted to images with the same resolution during the correction process.
In addition, in the RAM, various programs and data necessary for system operation may also be stored. The CPU, ROM, and RAM are connected to each other via a bus. An input/output (I/O) interface is also connected to the bus.
The following components are connected to the I/O interface: an input section including a keyboard, a mouse, and the like; an output section including a display such as a Cathode Ray Tube (CRT), a Liquid Crystal Display (LCD), and the like, and a speaker; a storage section including a hard disk and the like; and a communication section including a network interface card such as a LAN card, a modem, or the like. The communication section performs communication processing via a network such as the internet. The drive is also connected to the I/O interface as needed. A removable medium such as a magnetic disk, an optical disk, a magneto-optical disk, a semiconductor memory, or the like is mounted on the drive as necessary, so that a computer program read out therefrom is mounted into the storage section as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium; the computer program comprises program code for executing the method illustrated in the flowchart, and the program code may include instructions corresponding to the steps of any one of the calibration and correction methods of the multi-view camera provided by the embodiments of the present invention, for example: instructions for calibrating camera parameters of the multi-view camera respectively, wherein the calibration of at least two cameras in the multi-view camera is performed based on images captured by each camera at its own resolution; and instructions for correcting the multi-view camera according to the calibrated data, wherein the images shot by each of the at least two cameras are adjusted to images with the same resolution during the correction process. The computer program may be downloaded and installed from a network through the communication section, and/or installed from a removable medium. When executed by the Central Processing Unit (CPU), the computer program performs the above-described functions defined in the method of the present invention.
The embodiment of the present invention further provides a computer storage medium for storing computer-readable instructions which, when executed, perform the operations corresponding to the calibration and correction method of the multi-view camera according to any one of the above embodiments of the present invention. The instructions may include, for example: calibrating camera parameters of the multi-view camera respectively, wherein the calibration of at least two cameras in the multi-view camera is respectively performed based on images captured by each camera at its own resolution; and correcting the multi-view camera according to the calibrated data, wherein the images shot by each of the at least two cameras are adjusted to images with the same resolution during the correction process.
In addition, an embodiment of the present invention further provides an electronic device, including:
a memory storing executable instructions;
and the processor is communicated with the memory to execute the executable instructions so as to complete the operation corresponding to the calibration and correction method of the multi-view camera in any one of the above embodiments of the invention.
In the present specification, the embodiments are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same or similar parts in the embodiments are referred to each other. For the system embodiment, since it basically corresponds to the method embodiment, the description is relatively simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The methods, apparatuses, and devices of the present invention may be implemented in many ways, for example in software, hardware, firmware, or any combination thereof. The above-described order of the method steps is illustrative only; unless specifically indicated otherwise, the steps of the methods of the present invention are not limited to that order. Furthermore, in some embodiments the present invention may be embodied as a program recorded in a recording medium, the program comprising machine-readable instructions for implementing a method according to the present invention. Thus, the present invention also covers a recording medium storing a program for executing a method according to the present invention.
The description of the present invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or to limit the invention to the form disclosed. Many modifications and variations will be apparent to practitioners skilled in this art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, and to enable others of ordinary skill in the art to understand the invention in its various embodiments, with various modifications suited to the particular use contemplated.

Claims (41)

1. A calibration and correction method for a multi-view camera, characterized by comprising:
acquiring positions, in an image, of feature points of calibration plates in a calibration plate set, the calibration plate set comprising at least two calibration plates arranged non-coplanarly;
calibrating camera parameters of the multi-view cameras respectively based on the positions of the feature points, wherein at least two of the multi-view cameras are each calibrated based on images captured at that camera's own resolution;
correcting the multi-view cameras according to the calibrated data, and adjusting, during the correction process, the images captured by each of the at least two cameras to images of the same resolution;
wherein calibrating the camera parameters of the multi-view cameras based on the positions of the feature points comprises:
calibrating the camera parameters of the multi-view cameras respectively based on the feature point positions, in a single image, of the calibration plates in the calibration plate set.
2. The method of claim 1, wherein the at least two cameras comprise cameras of at least two different resolutions.
3. The method of claim 1, wherein the camera parameters include one or more of the following parameters of the camera: internal parameters, external parameters, distortion parameters.
4. The method of claim 1, wherein adjusting the images captured by each of the at least two cameras to images of the same resolution during the correction process comprises:
adjusting, during the correction process, the images captured by the cameras to images of the same resolution through a scaling operation.
5. The method of claim 1, wherein adjusting the images captured by each of the at least two cameras to images of the same resolution during the correction process comprises:
performing row-and-column extraction on the images captured by the cameras to obtain corresponding two-dimensional dot matrix maps;
determining, according to the distortion parameters of the cameras, the coordinates of the pixels in the corresponding two-dimensional dot matrix maps after undistortion transformation;
acquiring the bounding box of the dot matrix of each two-dimensional dot matrix map after the undistortion transformation;
determining the ratio of the size of each bounding box to a preset size;
and scaling, according to the ratio, the undistorted images captured by the cameras to obtain images of the same resolution.
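The five steps of claim 5 — sample a lattice, undistort it, take its bounding box, compare the box to a preset size, and derive the scale — can be sketched numerically. This editor's sketch assumes a two-coefficient radial distortion model and approximates the undistortion with a single division; the function name, sampling step, and target size are illustrative only:

```python
import numpy as np

def undistort_lattice_scale(w, h, K, k1, k2, target=(1280, 720), step=50):
    """Sketch of claim 5: extract a sparse 2-D lattice over an image of
    size (w, h), push it through an approximate radial undistortion,
    take the bounding box of the transformed lattice, and return the
    scale factors mapping that box to the preset target size."""
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    # Row-and-column extraction: a sparse lattice instead of every pixel.
    xs, ys = np.meshgrid(np.arange(0, w, step), np.arange(0, h, step))
    # Normalised coordinates, then a one-step approximate inverse of the
    # radial model (a real implementation would iterate to convergence).
    xn = (xs - cx) / fx
    yn = (ys - cy) / fy
    r2 = xn**2 + yn**2
    d = 1.0 + k1 * r2 + k2 * r2**2
    xu = cx + fx * xn / d
    yu = cy + fy * yn / d
    # Bounding box of the undistorted lattice.
    box_w = xu.max() - xu.min()
    box_h = yu.max() - yu.min()
    # Ratio of the box to the preset size: the zoom applied after undistortion.
    return target[0] / box_w, target[1] / box_h
```

With zero distortion the lattice is unchanged, so the returned factors reduce to the plain target-to-image size ratio, which matches the simpler scaling operation of claim 4.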
6. The method of claim 1, wherein correcting the multi-view cameras according to the calibrated data comprises:
correcting the external parameters between any two of the multi-view cameras according to the images of the same resolution.
7. The method of any one of claims 1 to 6, wherein acquiring the positions, in the image, of the feature points of the calibration plates in the calibration plate set comprises:
photographing the calibration plate set to obtain an image containing the calibration plate set, the calibration plate set comprising at least two calibration plates arranged non-coplanarly;
detecting the feature points of the calibration plates in the image to obtain the positions of the feature points;
and wherein calibrating the camera parameters of one of the multi-view cameras based on the positions of the feature points comprises: determining the camera parameters according to the positions of the feature points.
8. The method of claim 7, further comprising, before detecting the feature points of the calibration plates in the image:
dividing the image into a plurality of image blocks, each image block containing at least one complete calibration plate;
wherein detecting the feature points of the calibration plates in the image comprises:
detecting the feature points of the calibration plate in each image block respectively to obtain the positions of the feature points.
9. The method of claim 7, wherein:
the calibration plate comprises a checkerboard, and the feature points comprise inner corner points of the checkerboard; or
the calibration plate comprises a circle array, and the feature points comprise the centers of the circles in the array.
10. The method of claim 9, wherein detecting the feature points of the calibration plates in the image comprises:
performing binarization processing on the image;
acquiring information of each square and its adjacent squares from the binarized image;
obtaining the common points of adjacent squares according to the information of the adjacent squares;
and obtaining the positions of the corner points in the corresponding checkerboard according to the common points of the adjacent squares.
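The last two steps of claim 10 — deriving inner corner points as the points that adjacent checkerboard squares have in common — can be illustrated with a minimal editor's sketch, assuming each detected square is represented by its four corner coordinates (the representation and names are assumptions, not the patent's implementation):

```python
def shared_corners(square_a, square_b):
    """Sketch of the last steps of claim 10: given two detected squares,
    each represented as a list of four (x, y) corner coordinates, the
    points they have in common are inner corner candidates."""
    return set(square_a) & set(square_b)

# Two horizontally adjacent unit squares share an edge, i.e. two corners.
a = [(0, 0), (1, 0), (1, 1), (0, 1)]
b = [(1, 0), (2, 0), (2, 1), (1, 1)]
common = shared_corners(a, b)
# → {(1, 0), (1, 1)}: the two inner corner points on the shared edge
```

Squares that only touch diagonally share a single point, and non-adjacent squares share none, which is how adjacency information filters the corner candidates.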
11. The method of claim 10, wherein the detecting feature points of a calibration plate in the image further comprises:
and performing sub-pixel extraction on the positions of the corner points in the checkerboard.
12. The method of claim 7, wherein determining the camera parameters according to the positions of the feature points comprises:
determining the internal parameters of the camera according to the positions of the feature points;
determining the external parameters of the camera according to the internal parameters of the camera and the correspondence information of the feature points;
and determining the distortion parameters of the camera according to the internal parameters of the camera, the correspondence information of the feature points, and the external parameters of the camera.
13. The method of claim 12, wherein determining the camera parameters according to the positions of the feature points comprises:
determining a homography matrix according to the coordinates of the feature points in the camera coordinate system corresponding to the image plane and in the world coordinate system corresponding to the calibration plate plane;
and determining the internal parameters of the camera according to the homography matrix; or
determining the internal parameters of the camera according to the homography matrix, and determining the external parameters of the camera according to the homography matrix and the internal parameters; or
determining the internal parameters of the camera according to the homography matrix, determining the external parameters of the camera according to the homography matrix and the internal parameters, and determining the distortion parameters of the camera according to the coordinates of the feature points in the camera coordinate system corresponding to the image plane and in the world coordinate system corresponding to the calibration plate plane, the internal parameters, and the external parameters.
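The homography-based chain of claim 13 follows the standard planar-calibration scheme: a board-to-image homography is estimated from feature point correspondences, and since the board lies in the world plane Z=0, the homography factors as H = K[r1 r2 t] up to scale. A minimal numpy editor's sketch, assuming exact correspondences and a known intrinsic matrix for the extrinsic step (function names are illustrative):

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct Linear Transform: estimate the 3x3 homography mapping
    planar calibration-board points (world Z=0) to image points.
    src, dst: (N, 2) arrays of corresponding points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the right singular vector of the smallest
    # singular value of the stacked constraint matrix.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def extrinsics_from_homography(H, K):
    """Recover rotation columns r1, r2 and translation t from the
    board-to-image homography and the intrinsic matrix K, using the
    factorisation H = K [r1 r2 t] up to a scale lambda."""
    Kinv = np.linalg.inv(K)
    h1, h2, h3 = H[:, 0], H[:, 1], H[:, 2]
    lam = 1.0 / np.linalg.norm(Kinv @ h1)
    r1 = lam * (Kinv @ h1)
    r2 = lam * (Kinv @ h2)
    r3 = np.cross(r1, r2)          # completes the rotation matrix
    t = lam * (Kinv @ h3)
    return np.column_stack([r1, r2, r3]), t
```

The distortion parameters of the third alternative would then be fitted from the residual between the ideal projections K[R|t]X and the observed feature points; that least-squares step is omitted here.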
14. The method of claim 7, further comprising, before determining the camera parameters according to the positions of the feature points:
estimating the degree of distortion of the camera;
and adjusting the number of camera parameters to be determined according to the degree of distortion of the camera.
15. The method of claim 14, wherein estimating the degree of distortion of the camera comprises:
estimating the degree of distortion of the camera according to a phase-height table of the camera; or
estimating the degree of distortion of the camera according to the quality of the images captured by the camera.
16. The method of claim 7, further comprising, after determining the camera parameters according to the positions of the feature points:
determining a calibration error according to the distances from the feature points of the calibration plates to the epipolar lines.
17. The method of claim 16, further comprising, before determining the calibration error according to the distances from the feature points of the calibration plates to the epipolar lines:
determining the epipolar lines between the binocular cameras according to the determined camera parameters.
18. The method of claim 16, further comprising:
verifying the calibration error.
19. The method of claim 18, wherein verifying the calibration error comprises:
photographing the calibration plate set to obtain an image containing the calibration plate set, the calibration plate set comprising at least two calibration plates;
detecting the feature points of the calibration plates in the image to obtain the positions of the feature points;
and determining the distances from the feature points of the corresponding calibration plates to the epipolar lines according to the positions of the feature points.
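The point-to-epipolar-line distance used as the calibration error measure in claims 16 to 19 can be sketched directly. This editor's sketch assumes a fundamental matrix F relating the two views (the function name is illustrative):

```python
import numpy as np

def epipolar_distance(F, x, x_prime):
    """Distance from point x' in the second image to the epipolar line
    F @ x induced by point x in the first image. Small distances over
    the calibration-plate feature points indicate a good calibration.
    x and x_prime are (u, v) pixel coordinates."""
    p = np.array([x[0], x[1], 1.0])
    q = np.array([x_prime[0], x_prime[1], 1.0])
    line = F @ p                       # epipolar line (a, b, c) in image 2
    # Point-to-line distance: |a*u' + b*v' + c| / sqrt(a^2 + b^2)
    return abs(q @ line) / np.hypot(line[0], line[1])
```

For a perfectly rectified horizontal stereo pair, F has the skew form [[0,0,0],[0,0,-1],[0,1,0]], and the measure reduces to the vertical disparity between matched feature points, which is why it is a natural check after rectification.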
20. A calibration and correction apparatus for a multi-view camera, characterized by comprising:
an acquisition unit configured to acquire positions, in an image, of feature points of calibration plates in a calibration plate set, the calibration plate set comprising at least two calibration plates arranged non-coplanarly;
a calibration unit configured to calibrate camera parameters of the multi-view cameras respectively based on the positions of the feature points, wherein at least two of the multi-view cameras are each calibrated based on images captured at that camera's own resolution;
a correction unit configured to correct the multi-view cameras according to the calibrated data and to adjust, during the correction process, the images captured by each of the at least two cameras to images of the same resolution;
wherein the calibration unit is further configured to calibrate the camera parameters of the multi-view cameras based on the feature point positions, in a single image, of the calibration plates in the calibration plate set.
21. The apparatus of claim 20, wherein the at least two cameras comprise cameras of at least two different resolutions.
22. The apparatus of claim 20, wherein the camera parameters comprise one or more of the following parameters of the camera: internal parameters, external parameters, distortion parameters.
23. The apparatus of claim 20, wherein the correction unit is configured to adjust, during the correction process, the images captured by the cameras to images of the same resolution through a scaling operation.
24. The apparatus of claim 20, wherein the correction unit comprises:
a dot matrix extraction module configured to perform row-and-column extraction on the images captured by the cameras to obtain corresponding two-dimensional dot matrix maps;
an undistortion module configured to determine, according to the distortion parameters of the cameras, the coordinates of the pixels in the corresponding two-dimensional dot matrix maps after undistortion transformation;
a bounding box acquisition module configured to acquire the bounding box of the dot matrix of each two-dimensional dot matrix map after the undistortion transformation;
a ratio determination module configured to determine the ratio of the size of each bounding box to a preset size;
and a scaling module configured to scale, according to the ratio, the undistorted images captured by the cameras to obtain images of the same resolution.
25. The apparatus of claim 20, wherein the correction unit is further configured to correct the external parameters between any two of the multi-view cameras according to the images of the same resolution.
26. The apparatus of any one of claims 20 to 25, wherein the acquisition unit comprises:
a photographing module configured to photograph the calibration plate set to obtain an image containing the calibration plate set, the calibration plate set comprising at least two calibration plates arranged non-coplanarly;
a detection module configured to detect the feature points of the calibration plates in the image to obtain the positions of the feature points;
and wherein the calibration unit comprises:
a calibration module configured to determine the camera parameters according to the positions of the feature points.
27. The apparatus of claim 26, further comprising:
a segmentation module configured to divide the image into a plurality of image blocks such that each image block contains at least one complete calibration plate;
wherein the detection module is specifically configured to detect the feature points of the calibration plate in each image block respectively to obtain the positions of the feature points.
28. The apparatus of claim 26, wherein:
the calibration plate comprises a checkerboard, and the feature points comprise inner corner points of the checkerboard; or
the calibration plate comprises a circle array, and the feature points comprise the centers of the circles in the array.
29. The apparatus of claim 28, wherein the detection module is specifically configured to:
perform binarization processing on the image;
acquire information of each square and its adjacent squares from the binarized image;
obtain the common points of adjacent squares according to the information of the adjacent squares;
and obtain the positions of the corner points in the corresponding checkerboard according to the common points of the adjacent squares.
30. The apparatus according to claim 29, wherein the detection module is further configured to perform sub-pixel extraction on the positions of the corner points in the checkerboard.
31. The apparatus of claim 26, wherein the calibration module is specifically configured to:
determine the internal parameters of the camera according to the positions of the feature points;
determine the external parameters of the camera according to the internal parameters of the camera and the correspondence information of the feature points;
and determine the distortion parameters of the camera according to the internal parameters of the camera, the correspondence information of the feature points, and the external parameters of the camera.
32. The apparatus of claim 31, wherein the calibration module is specifically configured to:
determine a homography matrix according to the coordinates of the feature points in the camera coordinate system corresponding to the image plane and in the world coordinate system corresponding to the calibration plate plane;
and determine the internal parameters of the camera according to the homography matrix; or
determine the internal parameters of the camera according to the homography matrix, and determine the external parameters of the camera according to the homography matrix and the internal parameters; or
determine the internal parameters of the camera according to the homography matrix, determine the external parameters of the camera according to the homography matrix and the internal parameters, and determine the distortion parameters of the camera according to the coordinates of the feature points in the camera coordinate system corresponding to the image plane and in the world coordinate system corresponding to the calibration plate plane, the internal parameters, and the external parameters.
33. The apparatus of claim 26, further comprising:
an adjustment module configured to estimate the degree of distortion of the camera, and to adjust the number of camera parameters to be determined according to the degree of distortion of the camera.
34. The apparatus of claim 33, wherein the adjustment module is specifically configured to:
estimate the degree of distortion of the camera according to a phase-height table of the camera; or
estimate the degree of distortion of the camera according to the quality of the images captured by the camera.
35. The apparatus of claim 26, further comprising:
an error determination module configured to determine a calibration error according to the distances from the feature points of the calibration plates to the epipolar lines.
36. The apparatus of claim 35, wherein the error determination module is further configured to determine epipolar lines between the binocular cameras according to the determined camera parameters.
37. The apparatus of claim 35, further comprising:
a verification unit configured to verify the calibration error.
38. The apparatus of claim 37, wherein the verification unit is specifically configured to:
photograph the calibration plate set to obtain an image containing the calibration plate set, the calibration plate set comprising at least two calibration plates;
detect the feature points of the calibration plates in the image to obtain the positions of the feature points;
and determine the distances from the feature points of the corresponding calibration plates to the epipolar lines according to the positions of the feature points.
39. An electronic device, characterized in that it comprises the apparatus of any of claims 20 to 38.
40. An electronic device, comprising:
a memory for storing executable instructions; and
a processor in communication with the memory to execute the executable instructions to perform operations corresponding to the method of any one of claims 1 to 19.
41. A computer storage medium storing computer-readable instructions, wherein the instructions, when executed, perform operations corresponding to the method of any one of claims 1 to 19.
CN201711298424.0A 2017-12-08 2017-12-08 Multi-view camera calibration and correction method and apparatus, device, program and medium Active CN108230397B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711298424.0A CN108230397B (en) 2017-12-08 2017-12-08 Multi-view camera calibration and correction method and apparatus, device, program and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711298424.0A CN108230397B (en) 2017-12-08 2017-12-08 Multi-view camera calibration and correction method and apparatus, device, program and medium

Publications (2)

Publication Number Publication Date
CN108230397A CN108230397A (en) 2018-06-29
CN108230397B true CN108230397B (en) 2021-04-02

Family

ID=62653396

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711298424.0A Active CN108230397B (en) 2017-12-08 2017-12-08 Multi-view camera calibration and correction method and apparatus, device, program and medium

Country Status (1)

Country Link
CN (1) CN108230397B (en)

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109241875B (en) * 2018-08-20 2020-08-25 北京市商汤科技开发有限公司 Attitude detection method and apparatus, electronic device, and storage medium
KR102324001B1 (en) 2018-08-20 2021-11-09 베이징 센스타임 테크놀로지 디벨롭먼트 컴퍼니 리미티드 Position and posture detection method and device, electronic device and storage medium
CN109215087B (en) * 2018-08-28 2021-04-27 维沃移动通信有限公司 Calibration method and device of double-camera module and terminal
CN109472760B (en) * 2019-02-01 2019-05-21 深兰人工智能芯片研究院(江苏)有限公司 A kind of method, apparatus of correcting distorted image
CN110264524B (en) * 2019-05-24 2023-07-21 联想(上海)信息技术有限公司 Calibration method, device, system and storage medium
CN110443855A (en) * 2019-08-08 2019-11-12 Oppo广东移动通信有限公司 Multi-camera calibration, device, storage medium and electronic equipment
CN110458951B (en) * 2019-08-15 2023-06-30 广东电网有限责任公司 Modeling data acquisition method and related device for power grid pole tower
CN110580724B (en) * 2019-08-28 2022-02-25 贝壳技术有限公司 Method and device for calibrating binocular camera set and storage medium
WO2021072767A1 (en) * 2019-10-18 2021-04-22 深圳市大疆创新科技有限公司 Calibration method and system for camera device, and stereoscopic calibration device and storage medium
CN112785650B (en) * 2019-11-11 2024-01-12 北京京邦达贸易有限公司 Camera parameter calibration method and device
CN111127560B (en) * 2019-11-11 2022-05-03 江苏濠汉信息技术有限公司 Calibration method and system for three-dimensional reconstruction binocular vision system
CN111311693B (en) * 2020-03-16 2023-11-14 威海经济技术开发区天智创新技术研究院 Online calibration method and system for multi-camera
CN112465913A (en) * 2020-11-18 2021-03-09 广东博智林机器人有限公司 Binocular camera-based correction method and device
CN113327290B (en) * 2021-06-07 2022-11-11 深圳市商汤科技有限公司 Binocular module calibration method and device, storage medium and electronic equipment
CN113706632B (en) * 2021-08-31 2024-01-16 上海景吾智能科技有限公司 Calibration method and system based on three-dimensional vision calibration plate
TWI807449B (en) * 2021-10-15 2023-07-01 國立臺灣科技大學 Method and system for generating a multiview stereoscopic image
CN115942119B (en) * 2022-08-12 2023-11-21 北京小米移动软件有限公司 Linkage monitoring method and device, electronic equipment and readable storage medium
CN117029695A (en) * 2023-10-08 2023-11-10 钛玛科(北京)工业科技有限公司 Material width measuring method and device

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102364299B (en) * 2011-08-30 2015-01-14 西南科技大学 Calibration technology for multiple structured light projected three-dimensional profile measuring heads
US20130091026A1 (en) * 2011-10-10 2013-04-11 Arcsoft, Inc. Photo Sharing with Digital Album
CN103674063B (en) * 2013-12-05 2016-08-31 中国资源卫星应用中心 A kind of optical remote sensing camera geometric calibration method in-orbit
CN103679729A (en) * 2013-12-17 2014-03-26 中国人民解放军第二炮兵工程大学 Full-automatic camera parameter calibration method based on colored calibration board
CN104021548A (en) * 2014-05-16 2014-09-03 中国科学院西安光学精密机械研究所 Method for acquiring scene 4D information
WO2016086379A1 (en) * 2014-12-04 2016-06-09 SZ DJI Technology Co., Ltd. Imaging system and method
US9418396B2 (en) * 2015-01-15 2016-08-16 Gopro, Inc. Watermarking digital images to increase bit depth
CN106767682A (en) * 2016-12-01 2017-05-31 腾讯科技(深圳)有限公司 A kind of method and aircraft for obtaining flying height information

Also Published As

Publication number Publication date
CN108230397A (en) 2018-06-29

Similar Documents

Publication Publication Date Title
CN108230397B (en) Multi-view camera calibration and correction method and apparatus, device, program and medium
US11570423B2 (en) System and methods for calibration of an array camera
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN109737874B (en) Object size measuring method and device based on three-dimensional vision technology
CN111210468B (en) Image depth information acquisition method and device
CN110809786B (en) Calibration device, calibration chart, chart pattern generation device, and calibration method
CN112070845B (en) Calibration method and device of binocular camera and terminal equipment
US9094672B2 (en) Stereo picture generating device, and stereo picture generating method
CN108038886B (en) Binocular camera system calibration method and device and automobile
EP2194725A1 (en) Method and apparatus for correcting a depth image
US8538198B2 (en) Method and apparatus for determining misalignment
EP2707838A1 (en) Camera calibration using an easily produced 3d calibration pattern
KR20230137937A (en) Device and method for correspondence analysis in images
CN112985360B (en) Lane line-based binocular ranging correction method, device, equipment and storage medium
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN117557657A (en) Binocular fisheye camera calibration method and system based on Churco calibration plate
CN116051652A (en) Parameter calibration method, electronic equipment and storage medium
CN117392161B (en) Calibration plate corner point for long-distance large perspective distortion and corner point number determination method
CN115631245A (en) Correction method, terminal device and storage medium
CN112822481A (en) Detection method and detection system for correction quality of stereo camera
Zongqian Camera calibration based on liquid crystal display (lcd)
CN116309440B (en) Method and device for manufacturing template image for AOI detection
CN110728714B (en) Image processing method and device, storage medium and electronic equipment
CN117911634A (en) Visual imaging optimization method, system and equipment based on digital image reconstruction

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant