CN112470192A - Dual-camera calibration method, electronic device and computer-readable storage medium


Info

Publication number
CN112470192A
Authority
CN
China
Prior art keywords
camera
calibration
point set
distortion
fitting
Legal status
Pending
Application number
CN201880095497.6A
Other languages
Chinese (zh)
Inventor
张弓
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Publication of CN112470192A

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis


Abstract

A dual-camera calibration method includes: photographing calibration plates at different angles to obtain calibration images; fitting a block surface function according to the internal parameters and external parameters of each single camera to obtain the distortion coefficient of the single camera; determining initial external parameters of the dual-camera module according to the external parameters of the single cameras in the dual-camera module; and processing the initial external parameters of the dual-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain target external parameters of the dual-camera module.

Description

Dual-camera calibration method, electronic device and computer-readable storage medium

Technical Field
The present application relates to the field of imaging technologies, and in particular, to a dual-camera calibration method, an electronic device, and a computer-readable storage medium.
Background
With the development of electronic devices and imaging technologies, more and more users capture images with the cameras of their electronic devices. To capture images accurately, a camera needs to be calibrated before it leaves the factory. Traditional dual-camera calibration methods, however, have low precision.
Disclosure of Invention
The embodiment of the application provides a double-camera calibration method, electronic equipment and a computer readable storage medium, which can improve the precision of double-camera calibration.
A dual-camera calibration method comprises the following steps:
shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
detecting feature points in the calibration image;
adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
detecting homonymous feature points in the calibration image;
adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
A non-transitory computer readable storage medium having stored thereon a computer program that, when executed by a processor, performs operations comprising:
shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
detecting feature points in the calibration image;
adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
According to the double-camera calibration method, the electronic device and the computer readable storage medium in the embodiment of the application, the blocking surface function fitting is carried out according to the internal parameters and the external parameters of the first camera and the second camera and the characteristic points in the calibration image to obtain the distortion coefficient, and the optimization is carried out according to the internal parameters and the distortion coefficient of the single camera and the initial external parameters of the double-camera module, so that the calibration precision is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic application environment diagram of a dual-camera calibration method in an embodiment.
FIG. 2 is a flow chart of a dual-camera calibration method in one embodiment.
Fig. 3 is a flowchart of obtaining target point sets and distortion point sets of a single camera at different angles according to the internal reference and the external reference of the single camera in the dual-camera module and the feature points in the calibration image in one embodiment.
FIG. 4 is a diagram illustrating a target point set partitioning block in an embodiment.
FIG. 5 is a diagram illustrating the fitting result of the distortion surface in the x-direction in one embodiment.
FIG. 6 is a diagram illustrating the fitting of a distortion surface in the y-direction in one embodiment.
Fig. 7 is a flowchart of a dual-camera calibration method in yet another embodiment.
Fig. 8 is a block diagram of a dual-camera calibration apparatus in one embodiment.
Fig. 9 is a schematic diagram of an internal structure of an electronic device in one embodiment.
FIG. 10 is a schematic diagram of an image processing circuit in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
It will be understood that, as used herein, the terms "first," "second," and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, the first calibration image may be referred to as a second calibration image, and similarly, the second calibration image may be referred to as the first calibration image, without departing from the scope of the present application. Both the first calibration image and the second calibration image are calibration images, but they are not the same calibration image.
Fig. 1 is a schematic diagram of the application environment of a dual-camera calibration method in an embodiment. As shown in fig. 1, the application environment includes a dual-camera jig 110 and a calibration plate 120. The dual-camera jig 110 is used to hold a dual-camera module, or an electronic device equipped with a dual-camera module. The calibration plate 120 (chart) carries a chart pattern and can be rotated to hold poses at different angles. The dual-camera module (or the electronic device carrying it) on the jig 110 photographs the chart pattern on the calibration plate 120 at different distances and different angles, and images are usually captured at no fewer than 3 angles. For example, in fig. 1 the optical axes of the dual-camera module are perpendicular to the rotation axis of the calibration plate, and the calibration plate 120 is rotated about the Y axis to three angles: one angle is 0 degrees and the other two rotation angles are ±θ degrees, with θ greater than 15 degrees to ensure decoupling between the poses. The calibration plates at different angles are photographed by the dual-camera module to obtain calibration images at different angles; the distortion coefficient of each single camera is obtained by block surface function fitting according to the internal parameters and external parameters of the single camera in the dual-camera module and the feature points in the calibration images; the deviation of homonymous feature points in the calibration images captured by the first camera and the second camera is used as the objective function, the internal parameters and distortion coefficients of the single cameras are used as inputs of the objective function, the external parameters of the dual-camera module are used as initial parameters of the objective function, and the target external parameters of the dual-camera module are obtained by the processing. In this way the calibration precision of the dual-camera module can be improved.
FIG. 2 is a flow chart of a dual-camera calibration method in one embodiment. As shown in fig. 2, a method of dual camera calibration begins with operation 202.
And operation 202, shooting the calibration plates at different angles through the double-camera module to obtain calibration images at different angles.
The camera is used to capture images and needs to be calibrated before leaving the factory. Single-camera calibration refers to determining the values of the internal parameters and external parameters of a single camera. The internal parameters of a single camera may include f_x, f_y, c_x and c_y, where f_x denotes the focal length in units of pixel size along the x-axis of the image coordinate system, f_y denotes the focal length in units of pixel size along the y-axis of the image coordinate system, and (c_x, c_y) are the coordinates of the principal point of the image plane, the principal point being the intersection of the optical axis and the image plane. Here f_x = f / d_x and f_y = f / d_y, where f is the focal length of the single camera, d_x is the width of one pixel along the x-axis of the image coordinate system, and d_y is the width of one pixel along the y-axis of the image coordinate system. The image coordinate system is a coordinate system established on the two-dimensional image captured by the camera and is used to specify the position of an object in the captured image. The origin of the (x, y) image coordinate system is located at the intersection (c_x, c_y) of the camera optical axis and the imaging plane, and its unit is a length unit (metres); the origin of the (u, v) pixel coordinate system is at the upper-left corner of the image, and its unit is pixels. (x, y) describes the perspective projection of an object from the camera coordinate system onto the image coordinate system, and (u, v) are pixel coordinates. The conversion between (x, y) and (u, v) is given by equation (1):

u = x / d_x + c_x,    v = y / d_y + c_y    (1)

Perspective projection is a single-plane projection, relatively close to the visual effect, obtained by projecting a shape onto a projection plane by the central projection method.
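As a concrete illustration of equation (1), the following sketch converts a point from the image coordinate system to pixel coordinates. It is a minimal example; the pixel pitch and principal-point values are illustrative placeholders, not values taken from this application.

```python
import numpy as np

# Illustrative intrinsic values (placeholders, not from this application).
dx, dy = 1.4e-6, 1.4e-6      # physical width of one pixel along x / y (metres)
cx, cy = 2000.0, 1500.0      # principal point in pixel coordinates

# Equation (1) as a homogeneous matrix multiply:
# (x, y) in metres in the image coordinate system -> (u, v) in pixels.
K_px = np.array([[1.0 / dx, 0.0,      cx],
                 [0.0,      1.0 / dy, cy],
                 [0.0,      0.0,      1.0]])

x, y = 1.05e-4, -3.2e-5                      # a point in the image coordinate system
u, v, _ = K_px @ np.array([x, y, 1.0])       # u = x/dx + cx, v = y/dy + cy
print(u, v)
```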
The external parameters of a single camera comprise the rotation matrix and translation matrix that convert coordinates in the world coordinate system into coordinates in the camera coordinate system. The world coordinate system is related to the camera coordinate system by a rigid-body transformation, and the camera coordinate system is related to the image coordinate system by a perspective projection transformation. A rigid-body transformation is the rotation and translation of a geometric object in three-dimensional space without any deformation of the object. The rigid-body transformation is given by equation (2):

X_c = R X + T,    T = [t_x, t_y, t_z]^T    (2)

where X_c denotes coordinates in the camera coordinate system, X denotes coordinates in the world coordinate system, R is the rotation matrix from the world coordinate system to the camera coordinate system, and T is the translation matrix from the world coordinate system to the camera coordinate system. The distance between the origin of the world coordinate system and the origin of the camera coordinate system is governed by the components along the x, y and z axes and has three degrees of freedom, and R is the combined effect of rotations about the X, Y and Z axes, respectively. t_x denotes the translation along the x-axis, t_y the translation along the y-axis, and t_z the translation along the z-axis.
The world coordinate system is an absolute coordinate system of an objective three-dimensional space and can be established at any position. For example, for each calibration image, a world coordinate system may be established with the corner point at the upper left corner of the calibration plate as the origin, the plane of the calibration plate as the XY plane, and the Z-axis facing up perpendicular to the plane of the calibration plate. The camera coordinate system takes the optical center of the camera as the origin of the coordinate system, takes the optical axis of the camera as the Z axis, and the X axis and the Y axis are respectively parallel to the X axis and the Y axis of the image coordinate system. The principal point of the image coordinate system is the intersection of the optical axis and the image plane. The image coordinate system takes the principal point as an origin. The pixel coordinate system refers to the position where the origin is defined at the upper left corner of the image plane.
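The rigid-body transformation of equation (2) can be sketched as follows; the rotation angle and translation values are arbitrary example numbers, not parameters of the patent.

```python
import numpy as np

def world_to_camera(X_world, R, T):
    """Rigid-body transformation of equation (2): X_c = R X + T."""
    return R @ X_world + T

# Example extrinsics: a rotation about the Y axis plus a translation (illustrative only).
theta = np.deg2rad(15.0)
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
T = np.array([0.02, 0.0, 0.5])   # t_x, t_y, t_z in metres

X = np.array([0.1, 0.05, 0.0])   # a calibration-plate point in the world coordinate system
print(world_to_camera(X, R, T))  # the same point in the camera coordinate system
```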
The double-camera calibration refers to determining external parameters of the double-camera module. The external parameters of the double-camera module comprise a rotation matrix and a translation matrix between the two cameras.
The double-camera module comprises a first camera and a second camera. The first camera and the second camera can be both color cameras, or one is a black and white camera, and the other is a color camera, or two black and white cameras.
The calibration plate is a reference plate used for calibrating the camera. The calibration plate is provided with a pattern. The calibration plate can be a plane calibration plate or a three-dimensional calibration plate. The pattern on the plane calibration plate can be a checkerboard pattern, a circle array pattern or a ring array pattern or other two-dimensional code patterns.
The first camera and the second camera of the dual-camera module photograph calibration plates at different angles to obtain calibration images at different angles. The calibration plates at the different angles need to cover the entire field of view (FOV) of the camera. If the calibration plate is a planar calibration plate, it is rotated to different angles; for example, one angle is 0 degrees and the other two rotation angles are ±θ degrees, with θ greater than 15 degrees. If the calibration plate is a three-dimensional calibration plate, its three mutually perpendicular faces can be photographed directly to obtain calibration images at different angles.
In operation 204, feature points in the calibration image are detected.
Feature points corresponding to the feature points on the calibration plate are first detected in the calibration image captured by the first camera, the corresponding feature points are then detected in the calibration image captured by the second camera, and the corresponding homonymous feature points in the calibration image captured by the second camera are then searched for according to the feature points in the calibration image captured by the first camera. Homonymous feature points are defined as follows: for one and the same feature point on the calibration plate, the first feature point is its image in the calibration image captured by the first camera and the second feature point is its image in the calibration image captured by the second camera; the feature point on the calibration plate, the first feature point and the second feature point are homonymous feature points. Homonymous feature points are detected in the calibration images captured by the first camera and the second camera.
In one embodiment, if the pattern on the calibration plate is a checkerboard pattern, detecting the feature points in the calibration image may include: obtaining initial corner-point positions in the image with the Harris corner detection operator; detecting edge information in the calibration image and grouping the obtained corner points to obtain edge point sets; and performing curve fitting on the selected edge points, including fitting global curves and local curves, where the intersections of the global curves and the local curves give the refined corner points, i.e. the feature points in the calibration image.
In one embodiment, if the pattern of the calibration plate is an array of ellipses or circles, detecting the feature points in the calibration image includes: extracting the elliptical edge information with a Canny edge detector, and fitting the general equation of an ellipse by least squares to obtain the centre point of each ellipse; the position of each ellipse in the image is represented by the coordinates of its centre point, and the centre points of the ellipses can be sorted by these coordinates.
Feature points are extracted separately from the calibration images captured by the first camera and the second camera to obtain the feature points in each calibration image. For any feature point in the calibration image of the first camera at a given angle, the epipolar constraint can be used to search for and match the corresponding point in the calibration image of the second camera at the same angle. For example, a point p in three-dimensional space projected onto two different planes L1 and L2 gives the projected points p1 and p2 respectively, and p, p1 and p2 form a plane S in three-dimensional space. The intersection line n1 of S with the plane L1 passes through the point p1 and is called the epipolar line corresponding to p2. The epipolar constraint states that, for the mappings of the same point onto the two images, when the mapping point p1 is known the mapping point p2 lies on the epipolar line corresponding to p1.
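For a checkerboard calibration plate, the detection and matching of homonymous feature points can be sketched with OpenCV's standard checkerboard detector. This is only an illustrative alternative to the Harris/edge-grouping procedure described above; the file names and the 9 x 6 pattern size are assumptions, not values from the patent.

```python
import cv2

# Assumed inner-corner count of the checkerboard and placeholder file names.
pattern_size = (9, 6)
img_first = cv2.imread("cam1_angle0.png", cv2.IMREAD_GRAYSCALE)   # first camera
img_second = cv2.imread("cam2_angle0.png", cv2.IMREAD_GRAYSCALE)  # second camera

found_1, corners_1 = cv2.findChessboardCorners(img_first, pattern_size)
found_2, corners_2 = cv2.findChessboardCorners(img_second, pattern_size)

if found_1 and found_2:
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    corners_1 = cv2.cornerSubPix(img_first, corners_1, (5, 5), (-1, -1), criteria)
    corners_2 = cv2.cornerSubPix(img_second, corners_2, (5, 5), (-1, -1), criteria)
    # The detector returns the corners in a fixed board order, so the i-th corner in
    # each image corresponds to the same physical point on the calibration plate,
    # i.e. corners_1[i] and corners_2[i] are homonymous feature points.
```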
And operation 206, obtaining the distortion coefficient of the single camera by adopting a block curved function fitting according to the internal parameters and the external parameters of the single camera in the double-camera module and the corresponding characteristic points in the calibration image, wherein the fitting result of the block curved function is continuous.
The blocking surface function can be one of a B spline function, a free surface function and a Zernike polynomial function. The blocking surface function can realize continuous derivation between blocks, and the fitting result is continuous.
In operation 208, an initial external parameter of the dual-camera module is determined according to the external parameters of the single camera in the dual-camera module.
The external parameters of the double-camera module comprise a rotation matrix between the double cameras and a translation matrix between the double cameras. The rotation matrix and the translation matrix between the two cameras can be obtained by formula (3).
R' = R_r R_l^T,    T' = T_r - R' T_l    (3)

where R' is the rotation matrix between the two cameras and T' is the translation matrix between the two cameras. R_r is the rotation matrix of the first camera relative to the calibration object (i.e. the rotation matrix that converts the coordinates of the calibration object from the world coordinate system into the camera coordinate system of the first camera) and T_r is the translation matrix of the first camera relative to the calibration object (i.e. the translation matrix that converts the coordinates of the calibration object from the world coordinate system into the camera coordinate system of the first camera), both obtained by calibration. R_l is the rotation matrix of the second camera relative to the calibration object (i.e. the rotation matrix that converts the coordinates of the calibration object from the world coordinate system into the camera coordinate system of the second camera) and T_l is the translation matrix of the second camera relative to the calibration object (i.e. the translation matrix that converts the coordinates of the calibration object from the world coordinate system into the camera coordinate system of the second camera), both obtained by calibration.
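A minimal sketch of equation (3), composing the single-camera external parameters into the initial rotation and translation between the two cameras; the function and variable names are illustrative.

```python
import numpy as np

def initial_stereo_extrinsics(R_r, T_r, R_l, T_l):
    """Equation (3): initial extrinsics of the dual-camera module from the
    per-camera extrinsics (first camera: R_r, T_r; second camera: R_l, T_l)."""
    R_prime = R_r @ R_l.T            # rotation from the second camera frame to the first
    T_prime = T_r - R_prime @ T_l    # translation between the two camera frames
    return R_prime, T_prime
```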
And operation 210, processing according to the initial external parameters of the dual-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the dual-camera module.
According to the double-camera calibration method in the embodiment of the application, the blocking surface function fitting is carried out according to the internal parameter and the external parameter of the single camera and the characteristic points in the calibration image to obtain the distortion coefficient, the obtained distortion coefficient precision is higher, the target external parameter of the double-camera module can be obtained by processing according to the internal parameter and the distortion coefficient of the single camera and the initial external parameter of the double-camera module, and the calibration precision is improved.
In one embodiment, operation 210 includes: and taking the deviation of the homonymous feature points in the calibration images shot by the first camera and the second camera as an objective function, taking the internal parameter and the distortion coefficient of the single camera as the input of the objective function, taking the initial external parameter of the double-camera module as the initial parameter of the objective function, and processing to obtain the target external parameter of the double-camera module.
The deviation between the feature points of the calibration image captured by the first camera, mapped into the calibration image captured by the second camera, and the homonymous feature points of the calibration image captured by the second camera is used as the objective function for optimizing the external parameters of the dual cameras. The feature points in the calibration image captured by the first camera and the homonymous feature points in the calibration image captured by the second camera, the internal parameters, external parameters and distortion coefficients of the first camera, and the internal parameters, external parameters and distortion coefficients of the second camera are used as the inputs of this objective function; the initial external parameters of the dual-camera module are used as the initial parameters of the objective function; and the target external parameters of the dual-camera module are obtained by the processing.
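One way to set up this optimization is sketched below, assuming the homonymous feature points have already been undistorted and normalized with each camera's internal parameters and distortion coefficients. The epipolar (essential-matrix) residual is used here as the deviation measure and the rotation is parameterized as a Rodrigues vector; both are assumptions made for the sketch, since the patent does not prescribe the exact form of the deviation or the parameterization.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def skew(t):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def deviation_residuals(params, pts_1, pts_2):
    """Epipolar deviation of homonymous points. pts_1 / pts_2 are Nx2 arrays of
    undistorted, normalized points from the first / second camera; params packs a
    Rodrigues rotation vector (3) and a translation vector (3)."""
    R, _ = cv2.Rodrigues(params[:3].reshape(3, 1))
    T = params[3:6]
    E = skew(T) @ R                                   # essential matrix
    x1 = np.hstack([pts_1, np.ones((len(pts_1), 1))])
    x2 = np.hstack([pts_2, np.ones((len(pts_2), 1))])
    return np.einsum("ij,jk,ik->i", x1, E, x2)        # x1_i^T E x2_i for every pair

def refine_extrinsics(pts_1, pts_2, R0, T0):
    """Refine the initial dual-camera extrinsics (R0, T0) of equation (3)."""
    r0, _ = cv2.Rodrigues(R0)
    x0 = np.concatenate([r0.ravel(), np.asarray(T0).ravel()])
    result = least_squares(deviation_residuals, x0, args=(pts_1, pts_2))
    R_opt, _ = cv2.Rodrigues(result.x[:3].reshape(3, 1))
    return R_opt, result.x[3:6]
```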
In the dual-camera calibration method of this embodiment, a block surface function is fitted according to the internal parameters and external parameters of the first camera and the second camera and the feature points in the calibration images to obtain the distortion coefficients; the homonymous feature points in the calibration images captured by the first camera and the second camera are obtained; the deviation of these homonymous feature points is used as the objective function; the feature points of the calibration images and the internal parameters and distortion coefficients of the first camera and the second camera are used as the inputs of the objective function; and the initial external parameters of the dual-camera module are used as the initial parameters of the objective function and optimized to obtain the target external parameters of the dual-camera module. Because the distortion coefficients are obtained by block surface function fitting and the external parameters of the dual-camera module are optimized with the deviation of the homonymous feature points in the calibration images as the objective function, the calibration precision is improved.
In one embodiment, the obtaining of the distortion coefficient of the single camera by using a block curved function fitting according to the internal parameters and the external parameters of the single camera in the dual-camera module and the corresponding feature points in the calibration image comprises: obtaining a target point set and a distortion point set of the single camera at different angles according to the internal parameters and the external parameters of the single camera in the double-camera module and the characteristic points in the calibration image, fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient, and taking the fitting coefficient as the distortion coefficient of the corresponding camera.
A target point in the target point set refers to the coordinates obtained by projecting a feature point on the calibration plate at a given angle into the camera coordinate system and normalizing. A distortion point in the distortion point set refers to the coordinates obtained by converting a feature point detected on the calibration image at that angle into the camera coordinate system and normalizing.
In one embodiment, as shown in fig. 3, a set of target points and a set of distortion points for different angles of a single camera are obtained according to the internal reference and the external reference of the single camera in the dual-camera module and the feature points in the calibration image, beginning with operation 302.
And operation 302, calculating feature points on the calibration plates with different angles according to external parameters of a single camera in the double-camera module, projecting the feature points to a camera coordinate system, and normalizing to obtain coordinates of a target point.
And calculating the feature points on the calibration plate at different angles by using external parameters of a single camera, projecting the feature points to a camera coordinate system, and carrying out normalization processing to obtain the coordinates of the target point.
First, the complete geometric model of the camera is given by equation (4):

[x, y, z]^T = [R  T] [X, Y, Z, 1]^T = R [X, Y, 0]^T + T    (4)

Equation (4) is the geometric model obtained by constructing the world coordinate system on the plane Z = 0. X and Y are the world coordinates of a feature point on the planar calibration plate, [X, Y, Z, 1]^T (with Z = 0) are the homogeneous coordinates of that point in the world coordinate system, and x, y and z are the physical coordinates of the feature point in the camera coordinate system. R is the rotation matrix from the world coordinate system of the calibration plate to the camera coordinate system, and T is the translation matrix from the world coordinate system of the calibration plate to the camera coordinate system.

The physical coordinates [x, y, z] of the feature point in the camera coordinate system are normalized to obtain the target coordinate point (x', y') of equation (5):

x' = x / z,    y' = y / z    (5)
And operation 304, converting the detected feature points on the calibration images at different angles into camera coordinates according to the internal parameters of the single camera in the double-camera module, and normalizing to obtain distortion point coordinates.
The feature points detected on the calibration images at the different angles are converted into camera coordinates using the internal parameters of the single camera and normalized to obtain the distortion point coordinates (x'', y'') of equation (6):

x'' = (u - c_x) / f_x,    y'' = (v - c_y) / f_y    (6)
And operation 306, obtaining a target point set of the single camera according to the target point coordinates of different angles, and obtaining a distortion point set of the single camera according to the distortion point coordinates of different angles.
The target point coordinates from the different angles are combined into the target point set (x'_T, y'_T) of the single camera, and the distortion point coordinates from the different angles are combined into the distortion point set (x''_T, y''_T) of the single camera, as in equation (7):

(x'_T, y'_T) = {(x'_i, y'_i) : i = 1, 2, …, n},    (x''_T, y''_T) = {(x''_i, y''_i) : i = 1, 2, …, n}    (7)

In equation (7), i = 1, 2, 3, …, n indexes the different angles, and n is the number of angles.
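A minimal sketch of operations 302 to 306 for one camera and one angle, following equations (4) to (7); the variable names are illustrative.

```python
import numpy as np

def target_and_distortion_points(board_pts, img_pts, R, T, fx, fy, cx, cy):
    """board_pts: Nx2 world coordinates (X, Y) of the feature points on the Z = 0
    calibration-plate plane; img_pts: Nx2 detected pixel coordinates (u, v)."""
    # Target points, equations (4) and (5): project into the camera frame, then normalize.
    Xw = np.hstack([board_pts, np.zeros((len(board_pts), 1))])
    Xc = (R @ Xw.T).T + T
    target = Xc[:, :2] / Xc[:, 2:3]                   # (x', y') = (x/z, y/z)
    # Distortion points, equation (6): normalize the detected pixels with the intrinsics.
    dist = np.empty_like(target)
    dist[:, 0] = (img_pts[:, 0] - cx) / fx            # x'' = (u - c_x) / f_x
    dist[:, 1] = (img_pts[:, 1] - cy) / fy            # y'' = (v - c_y) / f_y
    return target, dist

# Point sets over all n angles, equation (7): stack the per-angle results.
# targets = np.vstack([per_angle_targets[i] for i in range(n)])
# dists   = np.vstack([per_angle_dists[i] for i in range(n)])
```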
In one embodiment, fitting the target point set and the distortion point set by using a block curved function to obtain a fitting coefficient includes: and dividing the target point set according to the block regions, fitting the divided target point set of each block and the corresponding distortion point set by adopting a block curved surface function to obtain a fitting coefficient of each block, and obtaining a distortion coefficient of the corresponding camera according to the fitting coefficient of each block.
Surface fitting of the deformation in the x direction and in the y direction is performed separately on the target point set and the distortion point set using the block surface function, giving the fitting coefficients D_x and D_y of equation (8):

x''_T = D_x(x'_T, y'_T),    y''_T = D_y(x'_T, y'_T)    (8)
The block surface function may be one of a B-spline function, a free-form surface function and a Zernike polynomial function. A Zernike polynomial function is formed from an infinite, complete set of polynomials in two variables, ρ and θ, that are continuously orthogonal inside the unit circle. Taking the B-spline function as an example, the target point set is divided into block intervals according to the radial distortion characteristic of the lens; the block intervals may be of the same or different sizes. The number of blocks is set according to the distortion complexity of the lens. For a lens with a smooth distortion curve, dividing the long-side direction of the image into 6 blocks and the short-side direction into 4 blocks is generally sufficient to describe the lens distortion; as shown in fig. 4, the target point set is divided into 6 × 4 blocks, with 7 division points along the long side and 5 along the short side. More complex lens distortion characteristics require more divisions. Each block is described by a bivariate polynomial of degree N, where N is greater than 2 and is set according to the actual distortion. Owing to the properties of the B-spline, the surface is continuous between blocks, and the fitted lens distortion is therefore also continuous.
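A minimal sketch of such a piecewise fit, using SciPy's least-squares tensor-product B-spline with interior knots placed at the block boundaries (6 blocks along the long side and 4 along the short side, as in the example above, with cubic degree so that N is greater than 2). This is one possible realization of a block surface function whose fitting result is continuous across blocks; the patent does not prescribe a particular library or knot placement.

```python
import numpy as np
from scipy.interpolate import LSQBivariateSpline

def fit_block_surface(target, dist, nx_blocks=6, ny_blocks=4, degree=3):
    """Fit the x- and y-direction distortion surfaces D_x, D_y of equation (8).
    target, dist: Nx2 target / distortion point sets of one camera."""
    x, y = target[:, 0], target[:, 1]
    # Interior knots at the block boundaries; the B-spline basis keeps the fitted
    # surface continuous (and smoothly differentiable) across blocks.
    tx = np.linspace(x.min(), x.max(), nx_blocks + 1)[1:-1]
    ty = np.linspace(y.min(), y.max(), ny_blocks + 1)[1:-1]
    Dx = LSQBivariateSpline(x, y, dist[:, 0], tx, ty, kx=degree, ky=degree)
    Dy = LSQBivariateSpline(x, y, dist[:, 1], tx, ty, kx=degree, ky=degree)
    return Dx, Dy   # the spline coefficients play the role of the distortion coefficient

# Evaluating the fitted distortion at arbitrary normalized target points:
# x_dist = Dx.ev(x_target, y_target); y_dist = Dy.ev(x_target, y_target)
```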
The B-spline curve was developed from the Bézier curve and is usually quadratic. For the parametric equation of a quadratic B-spline curve, given three discrete points P_0, P_1 and P_2 in the plane, a quadratic parabolic segment can be defined from these three points, with a parametric vector equation of the form:

P(t) = A_0 + A_1 t + A_2 t^2,    0 ≤ t ≤ 1    (9)

The matrix form of the quadratic B-spline curve parametric equation is:

P(t) = (1/2) [t^2  t  1] [[1, -2, 1], [-2, 2, 0], [1, 1, 0]] [P_0, P_1, P_2]^T    (10)

The quadratic B-spline curve starts at the midpoint of P_0 and P_1 and is tangent there to the segment P_0P_1; it ends at the midpoint of P_1 and P_2 and is tangent there to the segment P_1P_2. Apart from the start point and the end point, the middle control point pulls the curve towards itself. If the number of discrete points is greater than 3, piecewise fitting with quadratic B-spline curves is used.
FIG. 5 is a diagram illustrating the fitting result of the distortion surface in the x-direction in one embodiment, and FIG. 6 is a diagram illustrating the fitting result of the distortion surface in the y-direction in one embodiment. As can be seen from fig. 5 and fig. 6, the surfaces between the B-spline fitted blocks are continuous in both the x and y directions.
In one embodiment, the calibration image includes a first calibration image and a second calibration image at different angles;
obtaining a target point set and a distortion point set of a single camera at different angles according to internal parameters and external parameters of the single camera in the double-camera module and characteristic points in a calibration image, fitting the target point set and the distortion point set by adopting a block curved surface function to obtain a fitting coefficient, and taking the fitting coefficient as a distortion coefficient of the corresponding camera, wherein the method comprises the following steps:
obtaining a first target point set and a first distortion point set of different angles corresponding to the first camera according to the internal parameter and the external parameter of the first camera and the characteristic points of the first calibration image, fitting the first target point set and the first distortion point set by adopting a block curved surface function to obtain a first fitting coefficient, and taking the first fitting coefficient as the distortion coefficient of the first camera; and
and obtaining a second target point set and a second distortion point set of different angles corresponding to the second camera according to the internal parameter and the external parameter of the second camera and the characteristic points of the second calibration image, fitting the second target point set and the second distortion point set by adopting a blocking surface function to obtain a second fitting coefficient, and taking the second fitting coefficient as the distortion coefficient of the second camera.
In one embodiment, the calibration image is an image taken when the plane of the calibration plate is perpendicular to the optical axis of the dual-camera module, and the calibration plate covers the field angle of the dual-camera module.
In one embodiment, the dual-camera calibration method further includes: and calibrating the first camera and the second camera by using a single camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera.
Specifically, a calibration plate is obtained, calibration plates at different angles are shot through a single camera to obtain calibration images, feature points are extracted from the calibration images, 5 internal parameters and 2 external parameters of the single camera are calculated under the distortion-free condition, a distortion coefficient is calculated through a least square method, and then optimization is carried out through a maximum likelihood method to obtain the final internal parameters and the final external parameters of the single camera.
First, the camera model is established, giving equation (11):

s [u, v, 1]^T = A [R  T] [X, Y, Z, 1]^T    (11)

where [u, v, 1]^T are the homogeneous pixel coordinates of the image plane, [X, Y, Z, 1]^T are the homogeneous coordinates of a point in the world coordinate system, s is a scale factor, A is the internal parameter matrix, R is the rotation matrix from the world coordinate system to the camera coordinate system, and T is the translation matrix from the world coordinate system to the camera coordinate system. The internal parameter matrix is given by equation (12):

A = [[α, γ, u_0], [0, β, v_0], [0, 0, 1]]    (12)

where α = f / d_x and β = f / d_y, f is the focal length of the single camera, d_x is the width of one pixel along the x-axis of the image coordinate system, and d_y is the width of one pixel along the y-axis of the image coordinate system. γ represents the skew of the pixel between the x and y directions, and (u_0, v_0) are the coordinates of the principal point of the image plane, the principal point being the intersection of the optical axis and the image plane.
The world coordinate system is constructed on the plane Z = 0 and the homography is computed; setting Z = 0 converts the model above into equation (13):

s [u, v, 1]^T = A [r_1  r_2  t] [X, Y, 1]^T = H [X, Y, 1]^T    (13)

Homography, in computer vision, refers to a projective mapping from one plane to another. Let H = A [r_1  r_2  t]; H is the homography matrix. H is a 3 × 3 matrix with one element serving as the homogeneous scale, so H has 8 unknowns to solve. Writing the homography matrix as three column vectors, H = [h_1  h_2  h_3], gives equation (14):

[h_1  h_2  h_3] = λ A [r_1  r_2  t]    (14)

Two constraints are applied to equation (14). First, r_1 and r_2 are orthogonal, so r_1^T r_2 = 0 (r_1 and r_2 correspond to rotations about the x and y axes, respectively). Second, the rotation vectors have unit norm, i.e. |r_1| = |r_2| = 1. Substituting r_1 = λ A^{-1} h_1 and r_2 = λ A^{-1} h_2 into the two constraints yields equation (15):

h_1^T A^{-T} A^{-1} h_2 = 0,    h_1^T A^{-T} A^{-1} h_1 = h_2^T A^{-T} A^{-1} h_2    (15)
Let B = A^{-T} A^{-1}. B is a symmetric matrix, so it has 6 effective elements, and these 6 elements form the vector

b = [B_11, B_12, B_22, B_13, B_23, B_33]^T

With h_i = [h_i1, h_i2, h_i3]^T denoting the i-th column of H, the constraints can be written as h_i^T B h_j = v_ij^T b, where

v_ij = [h_i1 h_j1, h_i1 h_j2 + h_i2 h_j1, h_i2 h_j2, h_i3 h_j1 + h_i1 h_j3, h_i3 h_j2 + h_i2 h_j3, h_i3 h_j3]^T

Using the two constraint conditions, the system of equations (16) is obtained:

[ v_12^T ; (v_11 - v_22)^T ] b = 0    (16)

B is estimated by applying equation (16) to at least three images, and B is then decomposed to obtain the initial value of the internal parameter matrix A of the camera.
The external parameter matrix is then computed from the internal parameter matrix to obtain the initial value of the external parameter matrix:

r_1 = λ A^{-1} h_1,    r_2 = λ A^{-1} h_2,    r_3 = r_1 × r_2,    t = λ A^{-1} h_3

where λ = 1 / ||A^{-1} h_1|| = 1 / ||A^{-1} h_2||.
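A sketch of this decomposition; because of noise, the matrix assembled from r_1, r_2 and r_3 is only approximately a rotation matrix and is usually re-orthogonalized afterwards.

```python
import numpy as np

def extrinsics_from_homography(A, H):
    """Initial external parameters from a homography H and the internal matrix A:
    r1 = lam*A^-1*h1, r2 = lam*A^-1*h2, r3 = r1 x r2, t = lam*A^-1*h3,
    with lam = 1 / ||A^-1 h1||."""
    A_inv = np.linalg.inv(A)
    lam = 1.0 / np.linalg.norm(A_inv @ H[:, 0])
    r1 = lam * A_inv @ H[:, 0]
    r2 = lam * A_inv @ H[:, 1]
    r3 = np.cross(r1, r2)
    t = lam * A_inv @ H[:, 2]
    return np.column_stack([r1, r2, r3]), t
```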
The complete geometric model of the camera is again given by equation (4):

[x, y, z]^T = R [X, Y, 0]^T + T    (4)

Equation (4) is the geometric model obtained by constructing the world coordinate system on the plane Z = 0; X and Y are the world coordinates of a feature point on the planar calibration plate, and x, y and z are the physical coordinates of the feature point in the camera coordinate system. R is the rotation matrix from the world coordinate system of the calibration plate to the camera coordinate system, and T is the translation matrix from the world coordinate system of the calibration plate to the camera coordinate system.

The physical coordinates [x, y, z] of the feature point in the camera coordinate system are normalized to obtain the target coordinate point (x', y'):

x' = x / z,    y' = y / z
The image points (x', y') in the camera coordinate system are then subjected to distortion deformation using the distortion model, giving the distorted coordinates (x'', y'').
The physical coordinates are then converted into image coordinates using the internal parameter matrix:

[u, v, 1]^T = A [x'', y'', 1]^T
The initial values of the internal parameter matrix and the external parameter matrix are then substituted into the maximum likelihood formula, and the final internal parameter matrix and external parameter matrix are obtained by computing the minimum of

Σ_{i=1}^{n} Σ_{j=1}^{m} || m_ij - m̂(A, k, R_i, T_i, M_j) ||^2

where m_ij is the j-th detected feature point in the i-th calibration image, m̂(A, k, R_i, T_i, M_j) is the projection of the calibration-plate point M_j using the internal parameters A, the distortion coefficients k and the external parameters R_i, T_i of the i-th image, n is the number of calibration images, and m is the number of feature points per image.
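A compact sketch of this refinement using scipy.optimize.least_squares. A simple two-term radial model stands in for the distortion so that the example is self-contained; in the method described above the block-surface distortion would take its place. The parameter packing is an assumption made for the sketch.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def reprojection_residuals(p, board_pts, img_pts_per_view):
    """For every view i and board point j, the difference between the detected pixel
    m_ij and the projection m_hat(A, k, R_i, T_i, M_j).
    p packs [fx, fy, cx, cy, k1, k2] followed by a Rodrigues vector (3) and a
    translation vector (3) for each view."""
    fx, fy, cx, cy, k1, k2 = p[:6]
    M = np.hstack([board_pts, np.zeros((len(board_pts), 1))])   # board points, Z = 0
    out = []
    for i, img_pts in enumerate(img_pts_per_view):
        rvec = p[6 + 6 * i: 9 + 6 * i].reshape(3, 1)
        tvec = p[9 + 6 * i: 12 + 6 * i]
        R, _ = cv2.Rodrigues(rvec)
        Xc = (R @ M.T).T + tvec                        # world -> camera, equation (4)
        x, y = Xc[:, 0] / Xc[:, 2], Xc[:, 1] / Xc[:, 2]
        r2 = x * x + y * y
        d = 1.0 + k1 * r2 + k2 * r2 * r2               # stand-in radial distortion
        u, v = fx * d * x + cx, fy * d * y + cy        # back to pixel coordinates
        out.append(np.column_stack([u, v]) - img_pts)
    return np.concatenate(out).ravel()

# x0 holds the initial internal parameters, distortion coefficients and per-view
# extrinsics; least_squares refines them all jointly:
# result = least_squares(reprojection_residuals, x0, args=(board_pts, img_pts_per_view))
```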
In one embodiment, the single-camera calibration of the first camera and the second camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera includes: and carrying out single-camera calibration on the first camera and the second camera by adopting a blocking curved surface function to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
The method comprises the steps of firstly obtaining a calibration plate, shooting the calibration plate at different angles through a single camera to obtain a calibration image, extracting feature points from the calibration image, calculating an internal parameter initial value and an external parameter initial value of the single camera under the distortion-free condition, obtaining a distortion coefficient by applying block curved surface function fitting, and then optimizing the internal parameter initial value, the external parameter initial value and the distortion coefficient as input through a maximum likelihood method to obtain the final internal parameter and the final external parameter of the single camera.
FIG. 7 is a flow chart of a method for dual camera calibration in yet another embodiment. As shown in fig. 7, a method of dual camera calibration begins with operation 702.
In operation 702, a first camera captures calibration plate patterns at a plurality of angles to obtain first calibration images at the plurality of angles.
In operation 704, homonymous feature points in the first calibration image are detected.
Operation 706, perform single-camera calibration on the first camera to obtain the internal reference and the external reference of the first camera.
In operation 708, a distortion coefficient of the first camera is obtained by using a block curved function fitting according to the internal reference and the external reference of the first camera and the corresponding feature points in the first calibration image.
In operation 712, the second camera captures calibration plate patterns at a plurality of angles to obtain a second calibration image at the plurality of angles.
In operation 714, the feature points with the same name in the second calibration image are detected.
And operation 716, performing single-camera calibration on the second camera to obtain the internal parameters and the external parameters of the second camera.
In operation 718, a distortion coefficient of the second camera is obtained by using a block curved function fitting according to the internal reference and the external reference of the second camera and the corresponding feature points in the second calibration image.
In operation 720, an initial external parameter of the dual-camera module is calculated according to the first camera external parameter and the second camera external parameter.
In operation 722, taking the deviation of the homonymous feature points in the calibration images captured by the first camera and the second camera as an objective function, taking the homonymous feature points in the calibration images captured by the first camera and the second camera, the internal reference and distortion coefficient of the first camera, and the internal reference and distortion coefficient of the second camera as inputs of the objective function, taking the initial external reference of the dual-camera module as an initial parameter of the objective function, and processing to obtain the target external reference of the dual-camera module.
In the dual-camera calibration method of this embodiment, a block surface function is fitted according to the internal parameters and external parameters of the first camera and the second camera and the feature points in the calibration images to obtain the distortion coefficients; the homonymous feature points in the calibration images captured by the first camera and the second camera are obtained; the deviation of these homonymous feature points is used as the objective function; the feature points of the calibration images and the internal parameters and distortion coefficients of the first camera and the second camera are used as the inputs of the objective function; and the initial external parameters of the dual-camera module are used as the initial parameters of the objective function and optimized to obtain the target external parameters of the dual-camera module. Because the distortion coefficients are obtained by block surface function fitting and the external parameters of the dual-camera module are optimized with the deviation of the homonymous feature points in the calibration images as the objective function, the calibration precision is improved.
It should be understood that, although the steps in the flowcharts of fig. 2, 4 and 7 are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated otherwise herein, there is no strict order restriction on the execution of these steps, and they may be performed in other orders. Moreover, at least some of the steps in fig. 2, 4 and 7 may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments, and the order of performing these sub-steps or stages is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 8 is a block diagram of a dual-camera calibration apparatus in one embodiment. As shown in fig. 8, a dual-camera calibration apparatus includes an acquisition unit 810, a detection unit 820, a fitting unit 830, a determination unit 840, and an adjustment unit 850.
The collecting unit 810 is used for shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles. This two camera modules include first camera and second camera.
The detection unit 820 is used for detecting the feature points in the calibration image.
The fitting unit 830 is configured to obtain the distortion coefficient of the single camera by block surface function fitting according to the internal parameters and external parameters of the single camera in the dual-camera module and the corresponding feature points in the calibration image, where the fitting result of the block surface function is continuous: the block surface function is continuously differentiable from block to block, so the fitted result is continuous.
The determining unit 840 is configured to determine an initial external parameter of the dual-camera module according to the external parameter of the single camera in the dual-camera module.
The adjusting unit 850 is used for processing the initial external parameters of the dual-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the dual-camera module.
The adjusting unit 850 is further configured to use the deviation of the homonymous feature points in the calibration images captured by the first camera and the second camera in the dual-camera module as an objective function, use the homonymous feature points in the calibration images captured by the first camera and the second camera, the internal parameters and the distortion coefficients of the single camera as inputs of the objective function, use the initial external parameters of the dual-camera module as initial parameters of the objective function, and perform processing to obtain the target external parameters of the dual-camera module.
In an embodiment, the fitting unit 830 is further configured to obtain a target point set and a distortion point set of a single camera at different angles according to the internal reference and the external reference of the single camera in the dual-camera module and the feature points in the calibration image, fit the target point set and the distortion point set by using a blocking surface function to obtain a fitting coefficient, and use the fitting coefficient as a distortion coefficient of the corresponding camera.
In one embodiment, the fitting unit 830 is further configured to calculate feature points on the calibration plates with different angles according to external parameters of a single camera in the dual-camera module, project the feature points onto a camera coordinate system, and normalize the feature points to obtain coordinates of a target point; converting the detected characteristic points on the calibration images with different angles into camera coordinates according to the internal parameters of a single camera in the double-camera module and normalizing to obtain distortion point coordinates; and obtaining a target point set of the single camera according to the target point coordinates of different angles, and obtaining a distortion point set of the single camera according to the distortion point coordinates of different angles.
In one embodiment, the calibration image includes a first calibration image and a second calibration image at different angles;
the fitting unit 830 is further configured to obtain a first target point set and a first distortion point set corresponding to the first camera at different angles according to the internal reference and the external reference of the first camera and the feature points of the first calibration image, fit the first target point set and the first distortion point set by using a blocking surface function to obtain a first fitting coefficient, and use the first fitting coefficient as a distortion coefficient of the first camera; and obtaining a second target point set and a second distortion point set of different angles corresponding to the second camera according to the internal parameter and the external parameter of the second camera and the characteristic points of the second calibration image, fitting the second target point set and the second distortion point set by adopting a blocking surface function to obtain a second fitting coefficient, and taking the second fitting coefficient as the distortion coefficient of the second camera.
In an embodiment, the fitting unit 830 is further configured to divide the target point set according to block intervals, fit the divided target point set and the corresponding distortion point set of each block by using a block curved function to obtain a fitting coefficient of each block, and obtain a distortion coefficient of the corresponding camera according to the fitting coefficient of each block.
In one embodiment, the block intervals are the same or different in size.
In one embodiment, the calibration image is an image taken when the plane of the calibration plate is perpendicular to the optical axis of the dual-camera module, and the calibration plate covers the field angle of the dual-camera module.
In one embodiment, the dual-camera calibration apparatus further comprises a calibration unit. The calibration unit is used for carrying out single-camera calibration on the first camera and the second camera to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
In an embodiment, the calibration unit is further configured to perform single-camera calibration on the first camera and the second camera by using a blocking surface function, so as to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
In one embodiment, the blocking surface function is one of a B-spline function, a free-form surface function, and a zernike polynomial function.
In one embodiment, the calibration plate is a planar calibration plate or a three-dimensional calibration plate, and the pattern of the planar calibration plate is a checkerboard, a circular array or a circular ring array.
The embodiment of the application also provides the electronic equipment. The electronic device comprises a memory and a processor, wherein the memory stores a computer program, and the computer program causes the processor to execute the operation of the dual-camera calibration method when being executed by the processor.
The embodiment of the application further provides a non-volatile computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements the operations of the dual-camera calibration method described above.
Fig. 9 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in fig. 9, the electronic device includes a processor, a memory and a network interface connected by a system bus. The processor provides computing and control capability and supports the operation of the whole electronic device. The memory is used for storing data, programs and the like, and stores at least one computer program which can be executed by the processor to implement the dual-camera calibration method provided in the embodiments of the present application. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the dual-camera calibration method provided in the embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The network interface may be an Ethernet card or a wireless network card, etc., for communicating with an external electronic device. The electronic device may be a mobile phone, a tablet computer, a personal digital assistant, a wearable device, or the like.
Each module in the dual-camera calibration apparatus provided in the embodiments of the present application may be implemented in the form of a computer program. The computer program may run on a terminal or a server, and the program modules constituted by the computer program may be stored in the memory of the terminal or the server. When the computer program is executed by a processor, the steps of the method described in the embodiments of the present application are performed.
The embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the dual-camera calibration method.
The embodiment of the application also provides the electronic equipment. The electronic device includes therein an Image Processing circuit, which may be implemented using hardware and/or software components, and may include various Processing units defining an ISP (Image Signal Processing) pipeline. FIG. 10 is a schematic diagram of an image processing circuit in one embodiment. As shown in fig. 10, for convenience of explanation, only aspects of the image processing technology related to the embodiments of the present application are shown.
As shown in fig. 10, the image processing circuit includes a first ISP processor 1030, a second ISP processor 1040, and a control logic 1050. The first camera 1010 includes one or more first lenses 1012 and a first image sensor 1014. First image sensor 1014 may include a color filter array (e.g., a Bayer filter), and first image sensor 1014 may acquire light intensity and wavelength information captured with each imaging pixel of first image sensor 1014 and provide a set of image data that may be processed by first ISP processor 1030. The second camera 1020 includes one or more second lenses 1022 and a second image sensor 1024. The second image sensor 1024 may include a color filter array (e.g., a Bayer filter), and the second image sensor 1024 may acquire light intensity and wavelength information captured with each imaging pixel of the second image sensor 1024 and provide a set of image data that may be processed by the second ISP processor 1040.
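To make the color filter array concrete, the sketch below splits a raw Bayer mosaic into its four color planes; the RGGB layout and the helper name are assumptions, and this is not the actual processing performed by the ISP processors described here.

import numpy as np

def split_bayer_rggb(raw):
    # raw: (H, W) array from the sensor, assumed to use an RGGB Bayer layout.
    r  = raw[0::2, 0::2]
    gr = raw[0::2, 1::2]
    gb = raw[1::2, 0::2]
    b  = raw[1::2, 1::2]
    return r, gr, gb, b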
The first image acquired by the first camera 1010 is transmitted to the first ISP processor 1030 for processing. After the first ISP processor 1030 processes the first image, statistical data of the first image (such as image brightness, image contrast, image color, and the like) may be sent to the control logic 1050, and the control logic 1050 may determine control parameters of the first camera 1010 according to the statistical data, so that the first camera 1010 can perform operations such as auto-focus and auto-exposure according to the control parameters. The first image may be stored in the image memory 1060 after being processed by the first ISP processor 1030, and the first ISP processor 1030 may also read an image stored in the image memory 1060 for processing. In addition, the first image may be sent directly to the display 1070 after being processed by the first ISP processor 1030, and the display 1070 may also read and display images from the image memory 1060.
The first ISP processor 1030 processes the image data pixel by pixel in a plurality of formats. For example, each image pixel may have a bit depth of 8, 10, 12, or 14 bits. The first ISP processor 1030 may perform one or more image processing operations on the image data and collect statistics about the image data, and the image processing operations may be performed with the same or different bit-depth precision.
The image memory 1060 may be a portion of a memory device, a storage device, or a separate dedicated memory within the electronic device, and may include a DMA (Direct Memory Access) feature.
Upon receiving image data from the first image sensor 1014 interface, the first ISP processor 1030 may perform one or more image processing operations, such as temporal filtering. The processed image data may be sent to the image memory 1060 for additional processing before being displayed. The first ISP processor 1030 receives the processed data from the image memory 1060 and performs image data processing on it in the RGB and YCbCr color spaces. The image data processed by the first ISP processor 1030 may be output to the display 1070 for viewing by a user and/or further processed by a graphics processing unit (GPU). Further, the output of the first ISP processor 1030 may also be sent to the image memory 1060, and the display 1070 may read image data from the image memory 1060. In one embodiment, the image memory 1060 may be configured to implement one or more frame buffers.
The statistics determined by the first ISP processor 1030 may be sent to the control logic 1050. For example, the statistical data may include first image sensor 1014 statistics such as auto-exposure, auto-white balance, auto-focus, flicker detection, black level compensation, first lens 1012 shading correction, and the like. Control logic 1050 may include a processor and/or microcontroller that executes one or more routines (e.g., firmware) that may determine control parameters for first camera 1010 and control parameters for first ISP processor 1030 based on the received statistical data. For example, the control parameters of the first camera 1010 may include gain, integration time of exposure control, anti-shake parameters, flash control parameters, first lens 1012 control parameters (e.g., focal length for focusing or zooming), or a combination of these parameters, and the like. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), as well as first lens 1012 shading correction parameters.
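A toy sketch of the statistics-to-control-parameter feedback described above: the mean luminance reported by the ISP drives a new exposure setting for the camera. The proportional update rule, the 18% target, and all names are illustrative assumptions, not the firmware of the control logic 1050.

def update_exposure(mean_luma, current_exposure, target_luma=0.18,
                    min_exposure=1e-4, max_exposure=1e-1):
    # Scale exposure so the next frame's mean luminance moves toward the target,
    # clamped to the camera's supported exposure range.
    if mean_luma <= 0:
        return max_exposure
    new_exposure = current_exposure * (target_luma / mean_luma)
    return min(max(new_exposure, min_exposure), max_exposure)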
Similarly, the second image acquired by the second camera 1020 is transmitted to the second ISP processor 1040 for processing. After the second ISP processor 1040 processes the second image, statistical data of the second image (such as image brightness, image contrast, image color, and the like) may be sent to the control logic 1050, and the control logic 1050 may determine control parameters of the second camera 1020 according to the statistical data, so that the second camera 1020 can perform operations such as auto-focus and auto-exposure according to the control parameters. The second image may be stored in the image memory 1060 after being processed by the second ISP processor 1040, and the second ISP processor 1040 may also read an image stored in the image memory 1060 for processing. In addition, the second image may be sent directly to the display 1070 after being processed by the second ISP processor 1040, or the display 1070 may read and display images from the image memory 1060. The second camera 1020 and the second ISP processor 1040 may also implement the processes described for the first camera 1010 and the first ISP processor 1030.
The dual-camera calibration method in this application may be implemented by using the image processing technique shown in FIG. 10.
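As one step of that method, deriving the dual-camera module's initial external parameters from the two cameras' external parameters for the same calibration-plate view might look like the following numpy sketch; the relative-pose composition shown is the standard formula, and treating it as this application's exact implementation is an assumption.

import numpy as np

def initial_module_extrinsics(R1, t1, R2, t2):
    # R1, t1: rotation (3x3) and translation (3,) of the first camera w.r.t. the plate;
    # R2, t2: the same for the second camera. Returns (R, t) mapping points from the
    # first camera's coordinate frame to the second camera's coordinate frame.
    R = R2 @ R1.T
    t = t2 - R @ t1
    return R, t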
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Suitable non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), SyncLink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The above-described embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the present application. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, and these fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (34)

  1. A calibration method for two cameras, characterized by comprising the following steps:
    shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
    detecting feature points in the calibration image;
    adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
    determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
    and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
  2. The method of claim 1, wherein obtaining the distortion coefficient of the single camera by using a block surface function fitting according to the internal reference and the external reference of the single camera in the dual-camera module and the feature points in the corresponding calibration image comprises:
    and obtaining a target point set and a distortion point set of the single camera at different angles according to the internal parameters and the external parameters of the single camera in the double-camera module and the characteristic points in the calibration image, fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient, and taking the fitting coefficient as the distortion coefficient of the corresponding camera.
  3. The method according to claim 2, wherein obtaining the target point set and the distortion point set of the single camera at different angles according to the internal reference and the external reference of the single camera in the dual-camera module and the feature points in the calibration image comprises:
    calculating characteristic points on the calibration plates with different angles according to external parameters of a single camera in the double-camera module, projecting the characteristic points to a camera coordinate system and normalizing to obtain a target point coordinate;
    converting the detected characteristic points on the calibration images with different angles into camera coordinates according to the internal parameters of a single camera in the double-camera module and normalizing to obtain distortion point coordinates; and
    and obtaining a target point set of the single camera according to the target point coordinates of different angles, and obtaining a distortion point set of the single camera according to the distortion point coordinates of different angles.
  4. The method of claim 2, wherein the dual-camera module comprises a first camera and a second camera, and the calibration images comprise a first calibration image and a second calibration image at different angles;
    obtaining a target point set and a distortion point set of a single camera at different angles according to internal parameters and external parameters of the single camera in the double-camera module and characteristic points in a calibration image, fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient, and taking the fitting coefficient as a distortion coefficient of the corresponding camera, wherein the method comprises the following steps:
    obtaining a first target point set and a first distortion point set of different angles corresponding to the first camera according to the internal parameter and the external parameter of the first camera and the characteristic points of the first calibration image, fitting the first target point set and the first distortion point set by adopting a block curved surface function to obtain a first fitting coefficient, and taking the first fitting coefficient as the distortion coefficient of the first camera; and
    and obtaining a second target point set and a second distortion point set of different angles corresponding to the second camera according to the internal parameter and the external parameter of the second camera and the characteristic points of the second calibration image, fitting the second target point set and the second distortion point set by adopting a blocking surface function to obtain a second fitting coefficient, and taking the second fitting coefficient as the distortion coefficient of the second camera.
  5. The method of claim 2, wherein fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient comprises:
    dividing the target point set according to block intervals, fitting the divided target point set of each block and the corresponding distortion point set by adopting a block curved surface function to obtain a fitting coefficient of each block, and obtaining a distortion coefficient of the corresponding camera according to the fitting coefficient of each block.
  6. The method of claim 5, wherein the block intervals are the same or different in size.
  7. The method of claim 1, wherein obtaining the target external parameters of the dual-camera module by processing according to the initial external parameters of the dual-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera comprises:
    taking the deviation of homonymous feature points in calibration images shot by a first camera and a second camera in the double-camera module as an objective function, taking the homonymous feature points in the calibration images shot by the first camera and the second camera, and the internal parameter and distortion coefficient of the single camera, as the input of the objective function, taking the initial external parameter of the double-camera module as the initial parameter of the objective function, and processing to obtain the target external parameter of the double-camera module.
  8. The method of claim 1, wherein the calibration image is an image taken when a plane of a calibration plate is perpendicular to an optical axis of a dual-camera module, the calibration plate covering a field angle of the dual-camera module.
  9. The method of claim 1, further comprising:
    performing single-camera calibration on the first camera and the second camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera.
  10. The method according to claim 9, wherein the performing single-camera calibration on the first camera and the second camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera comprises:
    and carrying out single-camera calibration on the first camera and the second camera by adopting a blocking curved surface function to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
  11. The method of claim 1, wherein the block curved surface function is one of a B-spline function, a free-form surface function, and a Zernike polynomial function.
  12. The method of claim 1, wherein the calibration plate is a planar calibration plate or a solid calibration plate, and the pattern of the planar calibration plate is a checkerboard, a circular array or a circular ring array.
  13. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of:
    shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
    detecting feature points in the calibration image;
    adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
    determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
    and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
  14. The electronic device of claim 13, wherein the processor is further configured to perform: and obtaining a target point set and a distortion point set of the single camera at different angles according to the internal parameters and the external parameters of the single camera in the double-camera module and the characteristic points in the calibration image, fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient, and taking the fitting coefficient as the distortion coefficient of the corresponding camera.
  15. The electronic device of claim 14, wherein the processor is further configured to perform: calculating characteristic points on the calibration plates with different angles according to external parameters of a single camera in the double-camera module, projecting the characteristic points to a camera coordinate system and normalizing to obtain a target point coordinate;
    converting the detected characteristic points on the calibration images with different angles into camera coordinates according to the internal parameters of a single camera in the double-camera module and normalizing to obtain distortion point coordinates; and
    and obtaining a target point set of the single camera according to the target point coordinates of different angles, and obtaining a distortion point set of the single camera according to the distortion point coordinates of different angles.
  16. The electronic device of claim 14, wherein the dual-camera module comprises a first camera and a second camera, and the calibration images comprise a first calibration image and a second calibration image of different angles; the processor is further configured to perform: obtaining a first target point set and a first distortion point set of different angles corresponding to the first camera according to the internal parameter and the external parameter of the first camera and the characteristic points of the first calibration image, fitting the first target point set and the first distortion point set by adopting a block curved surface function to obtain a first fitting coefficient, and taking the first fitting coefficient as the distortion coefficient of the first camera; and
    and obtaining a second target point set and a second distortion point set of different angles corresponding to the second camera according to the internal parameter and the external parameter of the second camera and the characteristic points of the second calibration image, fitting the second target point set and the second distortion point set by adopting a blocking surface function to obtain a second fitting coefficient, and taking the second fitting coefficient as the distortion coefficient of the second camera.
  17. The electronic device of claim 14, wherein the processor is further configured to perform: dividing the target point set according to block intervals, fitting the divided target point set of each block and the corresponding distortion point set by adopting a block curved surface function to obtain a fitting coefficient of each block, and obtaining a distortion coefficient of the corresponding camera according to the fitting coefficient of each block.
  18. The electronic device of claim 17, wherein the block intervals are the same or different in size.
  19. The electronic device according to claim 13, wherein the processor is further configured to take a deviation of a homonymous feature point in a calibration image captured by a first camera and a second camera in the dual-camera module as an objective function, take the homonymous feature point in the calibration image captured by the first camera and the second camera, an internal parameter and a distortion coefficient of the single camera as inputs of the objective function, take an initial external parameter of the dual-camera module as an initial parameter of the objective function, and process the initial external parameter to obtain an objective external parameter of the dual-camera module.
  20. The electronic device of claim 13, wherein the calibration image is an image captured when a plane of a calibration plate is perpendicular to an optical axis of a dual-camera module, and the calibration plate covers a field angle of the dual-camera module.
  21. The electronic device of claim 13, wherein the processor is further configured to perform single-camera calibration on the first camera and the second camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera.
  22. The electronic device of claim 21, wherein the processor is further configured to perform: and carrying out single-camera calibration on the first camera and the second camera by adopting a blocking curved surface function to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
  23. The electronic device of claim 13, wherein the block curved surface function is one of a B-spline function, a free-form surface function, and a Zernike polynomial function.
  24. A non-transitory computer readable storage medium having a computer program stored thereon, wherein the computer program when executed by a processor performs the operations of:
    shooting calibration plates at different angles through the double-camera module to obtain calibration images at different angles;
    detecting feature points in the calibration image;
    adopting a block curved surface function to perform fitting according to the internal parameters and the external parameters of a single camera in the double-camera module and the corresponding characteristic points in the calibration image to obtain the distortion coefficient of the single camera, wherein the fitting result of the block curved surface function is continuous;
    determining initial external parameters of the double-camera module according to the external parameters of a single camera in the double-camera module; and
    and processing according to the initial external parameters of the double-camera module, the distortion coefficient of the single camera and the internal parameters of the single camera to obtain the target external parameters of the double-camera module.
  25. The non-transitory computer-readable storage medium of claim 24, wherein the computer program, when executed by the processor, further performs the following:
    and obtaining a target point set and a distortion point set of the single camera at different angles according to the internal parameters and the external parameters of the single camera in the double-camera module and the characteristic points in the calibration image, fitting the target point set and the distortion point set by adopting a blocking surface function to obtain a fitting coefficient, and taking the fitting coefficient as the distortion coefficient of the corresponding camera.
  26. The non-transitory computer-readable storage medium of claim 25, wherein the computer program, when executed by the processor, further performs the following:
    calculating characteristic points on the calibration plates with different angles according to external parameters of a single camera in the double-camera module, projecting the characteristic points to a camera coordinate system and normalizing to obtain a target point coordinate;
    converting the detected characteristic points on the calibration images with different angles into camera coordinates according to the internal parameters of a single camera in the double-camera module and normalizing to obtain distortion point coordinates; and
    and obtaining a target point set of the single camera according to the target point coordinates of different angles, and obtaining a distortion point set of the single camera according to the distortion point coordinates of different angles.
  27. The non-transitory computer readable storage medium of claim 25, wherein the dual camera module comprises a first camera and a second camera, and the calibration images comprise a first calibration image and a second calibration image at different angles;
    the computer program when executed by a processor further performs the following: obtaining a first target point set and a first distortion point set of different angles corresponding to the first camera according to the internal parameter and the external parameter of the first camera and the characteristic points of the first calibration image, fitting the first target point set and the first distortion point set by adopting a block curved surface function to obtain a first fitting coefficient, and taking the first fitting coefficient as the distortion coefficient of the first camera; and
    and obtaining a second target point set and a second distortion point set of different angles corresponding to the second camera according to the internal parameter and the external parameter of the second camera and the characteristic points of the second calibration image, fitting the second target point set and the second distortion point set by adopting a blocking surface function to obtain a second fitting coefficient, and taking the second fitting coefficient as the distortion coefficient of the second camera.
  28. The non-transitory computer-readable storage medium of claim 25, wherein the computer program, when executed by the processor, further performs the following: dividing the target point set according to block intervals, fitting the divided target point set of each block and the corresponding distortion point set by adopting a block curved surface function to obtain a fitting coefficient of each block, and obtaining a distortion coefficient of the corresponding camera according to the fitting coefficient of each block.
  29. The non-transitory computer-readable storage medium of claim 28, wherein the block intervals are the same or different in size.
  30. The non-transitory computer-readable storage medium of claim 24, wherein the computer program, when executed by the processor, further performs the following:
    and taking the deviation of the homonymous feature points in the calibration images shot by the first camera and the second camera in the double-camera module as a target function, taking the homonymous feature points in the calibration images shot by the first camera and the second camera, the internal parameters and the distortion coefficients of the single camera as the input of the target function, taking the initial external parameters of the double-camera module as the initial parameters of the target function, and processing to obtain the target external parameters of the double-camera module.
  31. The non-transitory computer readable storage medium of claim 24, wherein the calibration image is an image taken when a plane of a calibration plate is perpendicular to an optical axis of a dual camera module, the calibration plate covering a field angle of the dual camera module.
  32. The non-transitory computer-readable storage medium of claim 24, wherein the computer program, when executed by the processor, further performs the following:
    performing single-camera calibration on the first camera and the second camera to obtain the internal reference and the external reference of the first camera and the internal reference and the external reference of the second camera.
  33. The non-transitory computer-readable storage medium of claim 32, wherein the computer program, when executed by the processor, further performs the following: and carrying out single-camera calibration on the first camera and the second camera by adopting a blocking curved surface function to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera.
  34. The non-transitory computer readable storage medium of claim 24, wherein the block curved surface function is one of a B-spline function, a free-form surface function, and a Zernike polynomial function.
CN201880095497.6A 2018-06-08 2018-06-08 Dual-camera calibration method, electronic device and computer-readable storage medium Pending CN112470192A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/090464 WO2019232793A1 (en) 2018-06-08 2018-06-08 Two-camera calibration method, electronic device and computer-readable storage medium

Publications (1)

Publication Number Publication Date
CN112470192A true CN112470192A (en) 2021-03-09

Family

ID=68769248

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880095497.6A Pending CN112470192A (en) 2018-06-08 2018-06-08 Dual-camera calibration method, electronic device and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN112470192A (en)
WO (1) WO2019232793A1 (en)

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100573586C (en) * 2008-02-21 2009-12-23 南京航空航天大学 A kind of scaling method of binocular three-dimensional measuring system
CN103530852A (en) * 2013-10-15 2014-01-22 南京芒冠光电科技股份有限公司 Method for correcting distortion of lens

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101021947A (en) * 2006-09-22 2007-08-22 东南大学 Double-camera calibrating method in three-dimensional scanning system
CN101261738A (en) * 2008-03-28 2008-09-10 北京航空航天大学 A camera marking method based on double 1-dimension drone
US20110010122A1 (en) * 2009-07-07 2011-01-13 Delta Design, Inc. Calibrating separately located cameras with a double sided visible calibration target for ic device testing handlers
CN103323209A (en) * 2013-07-02 2013-09-25 清华大学 Structural modal parameter identification system based on binocular stereo vision
CN107081755A (en) * 2017-01-25 2017-08-22 上海电气集团股份有限公司 A kind of robot monocular vision guides the automatic calibration device of system
CN108122259A (en) * 2017-12-20 2018-06-05 厦门美图之家科技有限公司 Binocular camera scaling method, device, electronic equipment and readable storage medium storing program for executing

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112991465A (en) * 2021-03-26 2021-06-18 禾多科技(北京)有限公司 Camera calibration method and device, electronic equipment and computer readable medium
CN113067984A (en) * 2021-03-30 2021-07-02 Oppo广东移动通信有限公司 Binocular shooting correction method, binocular shooting correction device and electronic equipment
CN113067984B (en) * 2021-03-30 2023-01-17 Oppo广东移动通信有限公司 Binocular shooting correction method, binocular shooting correction device and electronic equipment
CN113706499A (en) * 2021-08-25 2021-11-26 北京市商汤科技开发有限公司 Error detection method and related product
CN113763545A (en) * 2021-09-22 2021-12-07 拉扎斯网络科技(上海)有限公司 Image determination method, image determination device, electronic equipment and computer-readable storage medium
CN114383564A (en) * 2022-01-11 2022-04-22 平安普惠企业管理有限公司 Depth measurement method, device and equipment based on binocular camera and storage medium
CN117830391A (en) * 2023-12-29 2024-04-05 广东美的白色家电技术创新中心有限公司 Method, apparatus, device and storage medium for coordinate conversion
CN117934556A (en) * 2024-03-25 2024-04-26 杭州海康威视数字技术股份有限公司 High-altitude parabolic detection method and device, storage medium and electronic equipment
CN117934556B (en) * 2024-03-25 2024-07-23 杭州海康威视数字技术股份有限公司 High-altitude parabolic detection method and device, storage medium and electronic equipment

Also Published As

Publication number Publication date
WO2019232793A1 (en) 2019-12-12

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination