CN115578466A - Camera calibration method and device, computer readable storage medium and electronic equipment - Google Patents

Camera calibration method and device, computer readable storage medium and electronic equipment

Info

Publication number
CN115578466A
CN115578466A (application CN202110755541.5A)
Authority
CN
China
Prior art keywords
camera
new
parameters
virtual
image
Prior art date
Legal status
Pending
Application number
CN202110755541.5A
Other languages
Chinese (zh)
Inventor
胡锦丽
刘阳兴
孟俊彪
Current Assignee
Wuhan TCL Group Industrial Research Institute Co Ltd
Original Assignee
Wuhan TCL Group Industrial Research Institute Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan TCL Group Industrial Research Institute Co Ltd filed Critical Wuhan TCL Group Industrial Research Institute Co Ltd
Priority to CN202110755541.5A
Publication of CN115578466A
Legal status: Pending (current)

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The application discloses a camera calibration method and device, a computer readable storage medium, and an electronic device. The method is applied to an electronic device comprising at least a first camera and a second camera, and comprises the following steps: acquiring the respective internal parameters and external parameters of the first camera and the second camera; constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera; calibrating the first camera according to the virtual new camera and the internal parameters and external parameters of the first camera, and calibrating the second camera according to the virtual new camera and the internal parameters and external parameters of the second camera. According to the scheme, the average re-projection error is reduced when the cameras are calibrated, and the calibration precision of the cameras is improved.

Description

Camera calibration method and device, computer readable storage medium and electronic equipment
Technical Field
The present application relates to the field of machine vision technologies, and in particular, to a method and an apparatus for calibrating a camera, a computer-readable storage medium, and an electronic device.
Background
In image measurement and machine vision applications, it is necessary to determine the relationship between the three-dimensional position of an object in space and its projection in a two-dimensional image. The relationship between pixel positions in an image captured by a camera and the geometric positions of objects in space can be determined by calibrating the camera, and calibration of binocular cameras with different resolutions has become a common approach. Generally, to calibrate such a binocular camera, the images captured by the two cameras with different resolutions are down-sampled, and the lower-resolution images are then scaled to the same resolution before the binocular calibration is performed. Scaling the low-resolution images reduces the accuracy of the camera calibration.
Disclosure of Invention
The embodiment of the application provides a camera calibration method and device, a computer readable storage medium and an electronic device, which can improve the calibration precision of a camera.
In a first aspect, an embodiment of the present application provides a camera calibration method, which is applied to an electronic device including at least a first camera and a second camera, and the method includes:
acquiring respective internal parameters and external parameters of a first camera and a second camera;
constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera;
and calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the internal parameters and the external parameters of the virtual new camera and the second camera.
In a second aspect, an embodiment of the present application provides a camera calibration apparatus, which is applied to an electronic device including at least a first camera and a second camera, and includes:
the acquisition module is used for acquiring internal parameters and external parameters of the first camera and the second camera respectively;
the construction module is used for constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera;
and the calibration module is used for calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the virtual new camera and the internal parameters and the external parameters of the second camera.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program is executed by a processor to implement the steps in the method as provided in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application further provides an electronic device, where the electronic device includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor executes the computer program to implement the steps in the method as provided in the first aspect of the embodiment of the present application.
In the embodiment of the application, the internal parameters and the external parameters of the first camera and the second camera are first obtained, a virtual new camera is then constructed according to the respective internal parameters of the first camera and the second camera, and finally the first camera is calibrated according to the virtual new camera and the internal parameters and external parameters of the first camera, and the second camera is calibrated according to the virtual new camera and the internal parameters and external parameters of the second camera. According to the scheme, when the cameras are calibrated, the virtual new camera is used as the reference for calibrating the first camera and the second camera, so that the images respectively captured by the first camera and the second camera do not need to be scaled to the same resolution; the average re-projection error is thereby reduced, and the calibration precision of the cameras is improved.
Drawings
The technical solutions and advantages of the present application will be apparent from the following detailed description of specific embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a camera calibration method provided in an embodiment of the present application.
Fig. 2 is a schematic flowchart of another camera calibration method according to an embodiment of the present application.
Fig. 3 to fig. 7 are scene schematic diagrams of a camera calibration method provided in an embodiment of the present application.
Fig. 8 is a schematic structural diagram of a camera calibration device provided in an embodiment of the present application.
Fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of another electronic device provided in an embodiment of the present application.
Detailed Description
Reference is made to the drawings, wherein like reference numerals refer to like elements. The principles of the present application are described as being implemented in a suitable computing environment. The following description is based on the illustrated embodiments of the application and should not be taken as limiting the application with respect to other embodiments that are not detailed herein.
It is understood that the execution subject of the embodiment of the present application may be an electronic device such as a smart phone or a tablet computer.
Referring to fig. 1, fig. 1 is a schematic flowchart of a camera calibration method provided in an embodiment of the present application, where the method may be applied to an electronic device including at least a first camera and a second camera, and the process may include:
101. respective intrinsic parameters and extrinsic parameters of the first camera and the second camera are acquired.
Taking a picture converts an object in space into a two-dimensional image through a camera; however, if spatial information such as distance and size is to be obtained from the picture, the three-dimensional position information needs to be recovered from the geometric information in the two-dimensional image. If a mathematical model of the camera is found, the original three-dimensional information can be inferred from the two-dimensional image through that model. Camera calibration is the process of solving this mathematical model: the parameters of the camera's mathematical model are solved through the coordinates of known points, so that the three-dimensional coordinates of spatial points can be recovered and three-dimensional reconstruction completed.
In this embodiment, for example, the electronic device may perform monocular calibration on the first camera and the second camera respectively to obtain the internal parameters and the external parameters of the first camera and the internal parameters and the external parameters of the second camera. The internal parameters reflect the camera's own properties, such as the focal length and the principal point coordinates. The external parameters reflect the relative positional relationship between the first camera and the second camera.
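As an illustration of this step, the following sketch uses OpenCV-style checkerboard detection, per-camera calibration, and a stereo calibration for the relative pose; the board size, file names, and helper function are hypothetical and not part of the patent, and the stereo step assumes every checkerboard view was detected by both cameras.

```python
import cv2
import numpy as np

BOARD = (9, 6)  # hypothetical number of inner corners per checkerboard row and column
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)  # world coordinates, W = 0

def detect_corners(image_files):
    """Detect checkerboard corners in each image; return object/image points and image size."""
    obj_pts, img_pts, size = [], [], None
    for f in image_files:
        gray = cv2.cvtColor(cv2.imread(f), cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, BOARD, None)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
            size = gray.shape[::-1]
    return obj_pts, img_pts, size

obj1, img1, size1 = detect_corners(["cam1_01.png", "cam1_02.png", "cam1_03.png"])
obj2, img2, size2 = detect_corners(["cam2_01.png", "cam2_02.png", "cam2_03.png"])

# Monocular calibration of each camera: intrinsic matrix and distortion coefficients.
_, K1, d1, _, _ = cv2.calibrateCamera(obj1, img1, size1, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj2, img2, size2, None, None)

# Stereo calibration with fixed intrinsics gives the extrinsic parameters (R, T):
# the relative rotation and translation of the second camera with respect to the first.
# This assumes the same checkerboard views were detected by both cameras.
_, _, _, _, _, R, T, _, _ = cv2.stereoCalibrate(
    obj1, img1, img2, K1, d1, K2, d2, size1, flags=cv2.CALIB_FIX_INTRINSIC)
```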
102. And constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera.
For example, the electronic device may construct a virtual new camera from the internal parameters of the first camera and the second camera. For example, the focal length $f_1$ of the first camera may be taken as the focal length $f_{new}$ of the virtual new camera, and the midpoint of the image principal points of the first camera and the second camera may be taken as the image principal point of the virtual new camera, so as to construct a virtual new camera related to the first camera and the second camera.
103. And calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the internal parameters and the external parameters of the virtual new camera and the second camera.
Stereo disparity is simplest to compute when the two image planes are exactly coplanar and row-aligned. In a practical binocular stereo vision system, however, no two camera image planes are perfectly coplanar and row-aligned, so stereo rectification of the binocular cameras is required. The purpose of stereo rectification is to correct two images that are not actually coplanar and row-aligned into images that are. If the relative position relationship between the two cameras in the binocular vision system is known, the electronic device can calibrate the two cameras at the same time: taking one of the cameras as the reference, it obtains the rotation matrix and translation matrix of the other camera relative to that reference coordinate system, and the two are brought onto the same plane by rotation and translation. Obtaining this rotation matrix and translation matrix is the process of stereo calibration. Generally, to calibrate a binocular camera, the images taken by the two cameras with different resolutions are down-sampled, and since the two cameras have different resolutions, the low-resolution images need to be scaled to the same resolution before the binocular calibration. Scaling the low-resolution images affects the accuracy of the camera calibration.
In this embodiment, the electronic device may calibrate the first camera according to the internal parameters and the external parameters of the virtual new camera and the first camera, and calibrate the second camera according to the internal parameters and the external parameters of the virtual new camera and the second camera. Taking the virtual new camera as the reference, the rotation matrix and translation matrix of the first camera and the second camera relative to the virtual new camera are obtained, and the first camera and the second camera are transformed to the plane where the virtual new camera is located.
It can be understood that, in the embodiment of the present application, the internal parameters and the external parameters of the first camera and the second camera are first obtained, a virtual new camera is then constructed according to the respective internal parameters of the first camera and the second camera, and finally the first camera is calibrated according to the virtual new camera and the internal parameters and external parameters of the first camera, and the second camera is calibrated according to the virtual new camera and the internal parameters and external parameters of the second camera. In this scheme, when the cameras are calibrated, the virtual new camera is used as the reference for calibrating the first camera and the second camera, so that the images respectively captured by the first camera and the second camera do not need to be scaled to the same resolution; the average re-projection error is thereby reduced, and the calibration precision of the cameras is improved.
Referring to fig. 2, fig. 2 is another schematic flow chart of a camera calibration method according to an embodiment of the present disclosure, where the method may be applied to an electronic device including at least a first camera and a second camera, and the flow chart may include:
201. images taken by the first camera and the second camera are acquired, respectively.
For example, the electronic device may control the first camera and the second camera to capture images, and then acquire the original images captured by the first camera and by the second camera. The original images include a plurality of checkerboard images, and each checkerboard image contains a plurality of corner points; for example, the original images include at least 3 checkerboard images, and the number of corner points in each checkerboard image is greater than 4.
202. According to the shot images, the internal parameters and the external parameters of the first camera and the second camera are obtained respectively.
For example, after acquiring an original image captured by a first camera and an original image captured by a second camera, the electronic device may acquire internal parameters and external parameters of the first camera and the second camera, respectively, according to the images.
The specific steps for obtaining the internal parameters and the external parameters are as follows: acquiring corner coordinate information of an original image shot by a first camera and an original image shot by a second camera, and converting corner coordinates in the corner coordinate information into world coordinates; calculating according to the corner coordinates and the world coordinates to obtain a homography matrix, wherein the homography matrix represents the corresponding relation between points in the world coordinates and points in the pixel coordinates; obtaining internal parameters of the first camera and internal parameters of the second camera according to the homography matrix; and constructing a row alignment matrix of the first camera according to the homography matrix to obtain external parameters of the first camera, and constructing a row alignment matrix of the second camera according to the homography matrix to obtain external parameters of the second camera.
The world coordinate system is a coordinate system of the three-dimensional position of the object, and the position of the calibration object can be determined. The image coordinate system is a coordinate system established with reference to a two-dimensional photograph taken by a camera, and is used for specifying the position of an object in the photograph.
In the embodiment of the present application, the user may prepare a checkerboard as the calibration object in advance. Several photographs of the calibration object are taken from different directions by adjusting the orientation of the calibration object or of the camera. In this way, the world coordinate system can be fixed on the checkerboard calibration plane, so that the third coordinate W = 0 for any point on the checkerboard, and the camera imaging model can be expressed as the following formula:
$Z\begin{bmatrix}u\\ v\\ 1\end{bmatrix}=A\begin{bmatrix}R_1 & R_2 & T\end{bmatrix}\begin{bmatrix}U\\ V\\ 1\end{bmatrix}$ (1)
wherein $(U, V, 1)$ are the coordinates of a point in the world coordinate system, $(u, v)$ are the coordinates of the point in the pixel coordinate system, $Z$ is a scale factor, $A$ is the camera internal reference (intrinsic) matrix, $d_x, d_y$ are respectively the physical length of a pixel in the X and Y directions on the camera photosensitive plate, $u_0, v_0$ are respectively the coordinates of the center of the photosensitive plate in the pixel coordinate system, $\theta$ represents the angle between the lateral and longitudinal edges of the photosensitive plate, $R_1, R_2$ are the first two columns of the rotation matrix of the second camera to the first camera, and $T$ is the translation matrix of the second camera to the first camera.
The camera internal reference matrix in formula (1) can be written as
$A=\begin{bmatrix}\dfrac{f}{d_x} & -\dfrac{f}{d_x\tan\theta} & u_0\\ 0 & \dfrac{f}{d_y\sin\theta} & v_0\\ 0 & 0 & 1\end{bmatrix}$ (2)
wherein $f$ is the focal length of the camera.
The homography matrix is set as
$H=\begin{bmatrix}H_1 & H_2 & H_3\end{bmatrix}=\begin{bmatrix}H_{11} & H_{12} & H_{13}\\ H_{21} & H_{22} & H_{23}\\ H_{31} & H_{32} & H_{33}\end{bmatrix}$ (3)
wherein $H_1$, $H_2$, $H_3$ are the three columns of H.
Homography is defined in computer vision as the projection mapping from one plane to another. The homography matrix H of the camera calibration can be understood as the homography from the camera plane to the checkerboard calibration object plane; it represents the imaging correspondence of points in world coordinates to points in pixel coordinates and is a joint product of the internal parameters and the external parameters:
$H=\dfrac{1}{Z}A\begin{bmatrix}R_1 & R_2 & T\end{bmatrix}$ (4)
In the embodiment of the present application, the electronic device may extract the image coordinates of all inner corner points on the images of the checkerboard calibration object. Since $(u, v)$ are the coordinates of a corner point of the checkerboard calibration object in the pixel coordinate system, they can be obtained directly from the camera. $(U, V)$ are the coordinates of the corner point of the checkerboard calibration object in the world coordinate system; since the size of each cell of the checkerboard can be chosen by the designer, it is a known quantity. The homography matrix H is a 3 x 3 homogeneous matrix, so it has 8 independent unknown parameters. Thus at least eight equations are needed to solve for H, and each checkerboard corner point provides two constraint equations, so at least four corresponding points are needed.
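For illustration only, the sketch below estimates such a homography from corner correspondences by the direct linear transform, solved in the least-squares sense with an SVD; the function name and the sample points are placeholders rather than material from the patent.

```python
import numpy as np

def estimate_homography(world_pts, pixel_pts):
    """Estimate H (up to scale) from at least 4 correspondences (U, V) -> (u, v)."""
    rows = []
    for (U, V), (u, v) in zip(world_pts, pixel_pts):
        # Each corner contributes two linear constraints on the 9 entries of H
        # (8 independent unknowns, since H is defined up to scale).
        rows.append([U, V, 1, 0, 0, 0, -u * U, -u * V, -u])
        rows.append([0, 0, 0, U, V, 1, -v * U, -v * V, -v])
    M = np.asarray(rows, dtype=float)
    # Least-squares solution: right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(M)
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Hypothetical example: four checkerboard corners and their pixel observations.
world = [(0, 0), (1, 0), (1, 1), (0, 1)]
pixel = [(100.0, 120.0), (180.0, 121.0), (181.0, 199.0), (99.0, 198.0)]
H = estimate_homography(world, pixel)
```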
In the embodiment of the application, photographs at three different positions can be obtained by changing the relative position between the camera and the checkerboard calibration object. Because the number of corner points of each checkerboard is greater than 4, the optimal homography matrix H can be obtained by least-squares regression. $R_1$ and $R_2$ are the first two columns of the rotation matrix R; they are unit vectors and orthogonal to each other, namely
$R_1^TR_2=0$ (5)
$R_1^TR_1=R_2^TR_2=1$ (6)
Then, from the relationship (4) between H and $R_1$, $R_2$, it can be seen that
$R_1=ZA^{-1}H_1$ (7)
$R_2=ZA^{-1}H_2$ (8)
Substituting (7) and (8) into (5) and (6) gives
$H_1^TZ^2A^{-T}A^{-1}H_2=0$ (9)
$H_1^TZ^2A^{-T}A^{-1}H_1=H_2^TZ^2A^{-T}A^{-1}H_2=1$ (10)
Let
$Z^2A^{-T}A^{-1}=B$ (11)
Writing the internal parameter matrix in the form
$A=\begin{bmatrix}\alpha & \gamma & u_0\\ 0 & \beta & v_0\\ 0 & 0 & 1\end{bmatrix}$ (12)
then
$A^{-1}=\begin{bmatrix}\frac{1}{\alpha} & -\frac{\gamma}{\alpha\beta} & \frac{\gamma v_0-\beta u_0}{\alpha\beta}\\ 0 & \frac{1}{\beta} & -\frac{v_0}{\beta}\\ 0 & 0 & 1\end{bmatrix}$ (13)
Expressing matrix B in terms of matrix A gives
$B=Z^2A^{-T}A^{-1}=Z^2\begin{bmatrix}\frac{1}{\alpha^2} & -\frac{\gamma}{\alpha^2\beta} & \frac{\gamma v_0-\beta u_0}{\alpha^2\beta}\\ -\frac{\gamma}{\alpha^2\beta} & \frac{\gamma^2}{\alpha^2\beta^2}+\frac{1}{\beta^2} & -\frac{\gamma(\gamma v_0-\beta u_0)}{\alpha^2\beta^2}-\frac{v_0}{\beta^2}\\ \frac{\gamma v_0-\beta u_0}{\alpha^2\beta} & -\frac{\gamma(\gamma v_0-\beta u_0)}{\alpha^2\beta^2}-\frac{v_0}{\beta^2} & \frac{(\gamma v_0-\beta u_0)^2}{\alpha^2\beta^2}+\frac{v_0^2}{\beta^2}+1\end{bmatrix}$ (14)
Since B is a symmetric matrix, let
$B=Z^2A^{-T}A^{-1}$ (15)
Then
$H_1^TBH_2=0$ (16)
$H_1^TBH_1=H_2^TBH_2=1$ (17)
Collecting the six independent entries of B into the vector
$b=\begin{bmatrix}B_{11} & B_{12} & B_{22} & B_{13} & B_{23} & B_{33}\end{bmatrix}^T$ (18)
and letting
$v_{ij}=\begin{bmatrix}H_{1i}H_{1j} & H_{1i}H_{2j}+H_{2i}H_{1j} & H_{2i}H_{2j} & H_{1i}H_{3j}+H_{3i}H_{1j} & H_{2i}H_{3j}+H_{3i}H_{2j} & H_{3i}H_{3j}\end{bmatrix}^T$ (19)
where $H_{ki}$ denotes the k-th element of the i-th column $H_i$ of H, the above can be formulated as
$v_{12}^Tb=0$ (20)
$v_{11}^Tb=v_{22}^Tb=1$ (21)
namely
$\begin{bmatrix}v_{12}^T\\ (v_{11}-v_{22})^T\end{bmatrix}b=0$ (22)
wherein
$V=\begin{bmatrix}v_{12}^T\\ (v_{11}-v_{22})^T\end{bmatrix}$ (23)
is the constraint matrix contributed by one calibration image; stacking these rows over all calibration images yields the system $Vb=0$.
Since the vector b has 6 unknowns, and each checkerboard calibration image provides one $Vb=0$ constraint containing two constraint equations, at least 6 equations, i.e., 3 checkerboard calibration images, are required to solve for the vector b. When the vector b is obtained, the matrix B can be assembled as
$B=\begin{bmatrix}B_{11} & B_{12} & B_{13}\\ B_{12} & B_{22} & B_{23}\\ B_{13} & B_{23} & B_{33}\end{bmatrix}$ (24)
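A small sketch of this step follows. It uses the common homogeneous variant of Zhang's constraints, stacking $v_{12}^T$ and $(v_{11}-v_{22})^T$ per image and solving $Vb=0$ by SVD (the overall scale is absorbed by the factor $\lambda$ recovered in the next step); all names are illustrative.

```python
import numpy as np

def v_ij(H, i, j):
    """Constraint vector v_ij built from columns i and j of homography H (1-based), eq. (19)."""
    hi, hj = H[:, i - 1], H[:, j - 1]
    return np.array([hi[0] * hj[0],
                     hi[0] * hj[1] + hi[1] * hj[0],
                     hi[1] * hj[1],
                     hi[0] * hj[2] + hi[2] * hj[0],
                     hi[1] * hj[2] + hi[2] * hj[1],
                     hi[2] * hj[2]])

def solve_B(homographies):
    """Stack two constraints per calibration image, solve V b = 0 for b, and assemble B."""
    rows = []
    for H in homographies:
        rows.append(v_ij(H, 1, 2))                  # v12^T b = 0
        rows.append(v_ij(H, 1, 1) - v_ij(H, 2, 2))  # (v11 - v22)^T b = 0
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    B11, B12, B22, B13, B23, B33 = Vt[-1]           # b, determined up to scale
    return np.array([[B11, B12, B13],
                     [B12, B22, B23],
                     [B13, B23, B33]])
```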
From the entries of B, the internal parameters can then be recovered in closed form, with $\lambda$ denoting the scale factor of B:
$v_0=(B_{12}B_{13}-B_{11}B_{23})/(B_{11}B_{22}-B_{12}^2)$ (25)
$\lambda=B_{33}-[B_{13}^2+v_0(B_{12}B_{13}-B_{11}B_{23})]/B_{11}$ (26)
$\alpha=\sqrt{\lambda/B_{11}}$ (27)
$\beta=\sqrt{\lambda B_{11}/(B_{11}B_{22}-B_{12}^2)}$ (28)
$\gamma=-B_{12}\alpha^2\beta/\lambda$ (29)
$u_0=\gamma v_0/\beta-B_{13}\alpha^2/\lambda$ (30)
B can be estimated by applying the above formulas to at least three checkerboard calibration images. After B is obtained, the internal parameters of the first camera can be assembled as
$K_1=\begin{bmatrix}\alpha_1 & \gamma_1 & u_{01}\\ 0 & \beta_1 & v_{01}\\ 0 & 0 & 1\end{bmatrix}$ (31)
and, by repeating the same procedure on its own images, the internal parameters of the second camera as
$K_2=\begin{bmatrix}\alpha_2 & \gamma_2 & u_{02}\\ 0 & \beta_2 & v_{02}\\ 0 & 0 & 1\end{bmatrix}$ (32)
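The closed-form recovery above translates directly into code; the sketch below is illustrative and assumes the matrix B produced by the previous sketch.

```python
import numpy as np

def intrinsics_from_B(B):
    """Recover the intrinsic matrix from B via the closed-form expressions (25)-(30)."""
    B11, B12, B13 = B[0, 0], B[0, 1], B[0, 2]
    B22, B23, B33 = B[1, 1], B[1, 2], B[2, 2]
    v0 = (B12 * B13 - B11 * B23) / (B11 * B22 - B12 ** 2)
    lam = B33 - (B13 ** 2 + v0 * (B12 * B13 - B11 * B23)) / B11  # scale factor of B
    alpha = np.sqrt(lam / B11)
    beta = np.sqrt(lam * B11 / (B11 * B22 - B12 ** 2))
    gamma = -B12 * alpha ** 2 * beta / lam
    u0 = gamma * v0 / beta - B13 * alpha ** 2 / lam
    return np.array([[alpha, gamma, u0],
                     [0.0,   beta,  v0],
                     [0.0,   0.0,  1.0]])
```

Running this recovery on the B estimated from each camera's own calibration images yields $K_1$ and $K_2$ respectively.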
The world coordinate system is used as the common reference frame of the binocular vision system; the relationship of each of the two cameras to the world coordinate system is known from the calibration, so the relative position relationship between the cameras can be solved.
In this embodiment, the electronic device may construct a row alignment matrix of the first camera according to the homography matrix to obtain the external parameters of the first camera, and construct a row alignment matrix of the second camera according to the homography matrix to obtain the external parameters of the second camera. For example, the electronic device may split the rotation matrix R between the two cameras into two half-rotations $r_1$ and $r_2$ by rotating through half of the rotation angle using the Rodrigues transformation. The translation vector T of the second camera is then rotated:
$t=r_2T$ (33)
Rotation vectors $e_1$ and $e_2$ are constructed from the translation vector t, and
$e_3=e_1\times e_2$ (34)
Then the initial rotation matrix is
$R_{rect}=\begin{bmatrix}e_1^T\\ e_2^T\\ e_3^T\end{bmatrix}$ (35)
and the row alignment matrices of the first camera and the second camera are
$R_{rect1}=R_{rect}r_1$ (36)
$R_{rect2}=R_{rect}r_2$ (37)
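The sketch below illustrates one way to realize steps (33) to (37). The particular construction of $e_1$ and $e_2$ (the rotated baseline direction and its in-plane perpendicular) and the sign convention of the half-rotations follow common Bouguet-style rectification practice and are assumptions, since the patent does not spell them out.

```python
import numpy as np
import cv2

def row_alignment_matrices(R, T):
    """Build the row alignment matrices R_rect1, R_rect2 from the relative pose (R, T)."""
    rvec, _ = cv2.Rodrigues(R)
    r1, _ = cv2.Rodrigues(rvec * 0.5)    # half of the rotation, applied to the first camera
    r2, _ = cv2.Rodrigues(-rvec * 0.5)   # opposite half, applied to the second camera
    t = (r2 @ np.asarray(T).reshape(3, 1)).ravel()               # rotated translation, eq. (33)

    e1 = t / np.linalg.norm(t)                                   # along the baseline
    e2 = np.array([-t[1], t[0], 0.0]) / np.hypot(t[0], t[1])     # in-plane perpendicular
    e3 = np.cross(e1, e2)                                        # eq. (34)
    R_rect = np.vstack([e1, e2, e3])                             # eq. (35)

    return R_rect @ r1, R_rect @ r2                              # eqs. (36) and (37)
```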
203. And taking the camera focal length with the smaller focal length of the focal lengths of the first camera and the second camera as the focal length of the virtual new camera.
For example, if the focal length of the first camera is $f_1$ and the focal length of the second camera is $f_2$, where $f_1 < f_2$, then the electronic device may set the focal length $f_{new}$ of the virtual new camera to the smaller of the two, i.e., take $f_1$ as the focal length of the virtual new camera. For example, if the focal length of the first camera is $f_1$ = 40mm and the focal length of the second camera is $f_2$ = 50mm, the electronic device may set the focal length of the first camera as the focal length of the virtual new camera, i.e., $f_{new}$ = 40mm.
As another example, if the focal length of the first camera is $f_1$ and the focal length of the second camera is $f_2$, where $f_1 > f_2$, then the electronic device may set the focal length $f_{new}$ of the virtual new camera to the smaller of the two, i.e., take $f_2$ as the focal length of the virtual new camera. For example, if the focal length of the first camera is $f_1$ = 60mm and the focal length of the second camera is $f_2$ = 50mm, the electronic device may set the focal length of the second camera as the focal length of the virtual new camera, i.e., $f_{new}$ = 50mm.
204. And taking a half of the sum of the longitudinal coordinate values of the image principal points of the first camera and the second camera as the longitudinal coordinate value of the image principal point of the virtual new camera, and taking the transverse coordinate value of the image principal point of the first camera as the transverse coordinate value of the image principal point of the virtual new camera to obtain the coordinates of the new image principal point.
For example, if the image principal point coordinates of the first camera are $(cx_1, cy_1)$ and the image principal point coordinates of the second camera are $(cx_2, cy_2)$, then the electronic device may set the ordinate of the new image principal point to
$cy_{new}=(cy_1+cy_2)/2$ (38)
and the abscissa of the new image principal point to
$cx_{new}=cx_1$ (39)
Thus the new image principal point coordinates $(cx_{new}, cy_{new})$ are obtained and taken as the image principal point coordinates of the virtual new camera.
For example, if the image principal point coordinates of the first camera are (10, 20) and the image principal point coordinates of the second camera are (15, 10), the electronic device calculates the ordinate of the image principal point of the virtual new camera as $cy_{new}$ = 15 and the abscissa as $cx_{new}$ = 10, and the image principal point coordinates of the virtual new camera are then (10, 15).
205. And constructing a virtual new camera according to the focal length of the virtual new camera and the new image principal point coordinate.
For example, after obtaining the focal length $f_{new}$ of the virtual new camera and the image principal point coordinates $(cx_{new}, cy_{new})$, the electronic device may determine the internal parameters $K_{new}$ of the virtual new camera, whose internal parameter matrix is
$K_{new}=\begin{bmatrix}f_{new} & 0 & cx_{new}\\ 0 & f_{new} & cy_{new}\\ 0 & 0 & 1\end{bmatrix}$ (40)
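A minimal sketch of steps 203 to 205 is given below, assuming the intrinsic matrices follow the usual layout with the focal length on the diagonal and the principal point in the last column; the variable names and the example values are illustrative.

```python
import numpy as np

def build_virtual_camera(K1, K2):
    """Construct the intrinsic matrix of the virtual new camera from K1 and K2."""
    f_new = min(K1[0, 0], K2[0, 0])          # smaller of the two focal lengths
    cx_new = K1[0, 2]                        # abscissa taken from the first camera
    cy_new = (K1[1, 2] + K2[1, 2]) / 2.0     # ordinate: average of the two principal points
    return np.array([[f_new, 0.0, cx_new],
                     [0.0, f_new, cy_new],
                     [0.0, 0.0, 1.0]])

# Hypothetical intrinsics matching the example in the text.
K1 = np.array([[40.0, 0.0, 10.0], [0.0, 40.0, 20.0], [0.0, 0.0, 1.0]])
K2 = np.array([[50.0, 0.0, 15.0], [0.0, 50.0, 10.0], [0.0, 0.0, 1.0]])
K_new = build_virtual_camera(K1, K2)         # focal length 40, image principal point (10, 15)
```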
In one embodiment, the electronic device may establish a pixel coordinate system based on the virtual new camera according to the focal length and the new image principal point coordinates of the virtual new camera, acquire a re-projection matrix for converting the pixel coordinate system into the world coordinate system according to the internal parameters of the virtual new camera, and transform the first camera and the second camera to the plane where the virtual new camera is located in this pixel coordinate system, thereby calibrating the first camera and the second camera.
For example, after the pixel coordinate system based on the virtual new camera is established, the electronic device may obtain the re-projection matrix Q of formula (41) between the world coordinate system and the pixel coordinate system according to the internal parameters of the virtual new camera; Q maps a rectified pixel coordinate and its disparity back to a three-dimensional point and is assembled from the focal length and image principal point of the virtual new camera together with the baseline between the rectified cameras.
According to the re-projection matrix, the planar two-dimensional coordinates are restored to the three-dimensional coordinates of the space points to complete the three-dimensional reconstruction. For example, the electronic device may convert the coordinates (u, v) of a point in the pixel coordinate system, together with its disparity, into the coordinates (U, V, W) of the point in the world coordinate system based on the re-projection matrix.
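For illustration, the sketch below assumes the conventional 4 x 4 stereo re-projection matrix layout (as produced, for example, by stereo rectification in OpenCV); the baseline value, the sign convention, and the sample pixel are assumptions rather than values from the patent.

```python
import numpy as np

# Assumed layout of the re-projection matrix Q for rectified cameras sharing the
# virtual new camera's intrinsics; Tx is a hypothetical baseline, and sign
# conventions vary between implementations.
f_new, cx_new, cy_new, Tx = 40.0, 10.0, 15.0, 0.06
Q = np.array([[1.0, 0.0, 0.0, -cx_new],
              [0.0, 1.0, 0.0, -cy_new],
              [0.0, 0.0, 0.0,  f_new],
              [0.0, 0.0, 1.0 / Tx, 0.0]])

def reproject(u, v, disparity):
    """Map a rectified pixel (u, v) with disparity to 3D coordinates."""
    X, Y, Z, W = Q @ np.array([u, v, disparity, 1.0])
    return np.array([X, Y, Z]) / W

point_3d = reproject(12.0, 17.0, 3.5)
# For a dense disparity map, cv2.reprojectImageTo3D(disparity, Q) applies the same
# mapping to every pixel.
```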
206. And determining a correction homography matrix of the first camera according to the internal parameters of the virtual new camera, the internal parameters of the first camera and the row alignment matrix so as to finish the calibration of the first camera.
For example, after obtaining the internal parameters of the virtual new camera, the electronic device may determine the correction homography matrix of the first camera according to the internal parameters $K_{new}$ of the virtual new camera, the internal parameters $K_1$ of the first camera, and the row alignment matrix $R_{rect1}$ of the first camera:
$H_1=K_{new}R_{rect1}K_1$ (42)
207. And determining a correction homography matrix of the second camera according to the internal parameters of the virtual new camera, the internal parameters of the second camera and the row alignment matrix so as to finish the calibration of the second camera.
After obtaining the internal parameters of the virtual new camera, the electronic device may determine the correction homography matrix of the second camera according to the internal parameters $K_{new}$ of the virtual new camera, the internal parameters $K_2$ of the second camera, and the row alignment matrix $R_{rect2}$ of the second camera:
$H_2=K_{new}R_{rect2}K_2$ (43)
In one embodiment, the electronic device may further determine, according to the correction homography matrix of the first camera, a corrected image corresponding to an image captured by the first camera; and determining a corrected image corresponding to the image shot by the second camera according to the correction homography matrix of the second camera.
For example, after obtaining the correction homography matrix of the first camera, the electronic device may correct an original image captured by the calibrated first camera. After the correction homography matrix of the second camera is obtained, the electronic device can correct the original image shot by the calibrated second camera.
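As an illustration of applying the calibration result, the sketch below uses OpenCV's rectification-map API, which takes each camera's original intrinsic matrix, its row alignment matrix, and the intrinsic matrix of the virtual new camera; the distortion coefficients are assumed to be available from the monocular calibration sketch earlier.

```python
import cv2

# K1, d1, K2, d2 come from the monocular calibration sketch; R_rect1, R_rect2 from
# the row alignment sketch; K_new from build_virtual_camera.
def rectify(image, K, dist, R_rect, K_new):
    """Warp an original image into the plane of the virtual new camera."""
    h, w = image.shape[:2]
    map1, map2 = cv2.initUndistortRectifyMap(K, dist, R_rect, K_new, (w, h), cv2.CV_32FC1)
    return cv2.remap(image, map1, map2, cv2.INTER_LINEAR)

img1_rect = rectify(cv2.imread("cam1_01.png"), K1, d1, R_rect1, K_new)
img2_rect = rectify(cv2.imread("cam2_01.png"), K2, d2, R_rect2, K_new)
```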
It can be understood that, in the scheme, when the camera is calibrated, the virtual new camera is used as a calibration reference, so that images respectively shot by the first camera and the second camera do not need to be scaled to the same resolution, the average re-projection error is reduced, and the calibration accuracy of the camera is improved.
Referring to fig. 3 to 6, fig. 3 to 6 are schematic views of a scene of a camera calibration method according to an embodiment of the present disclosure.
For example, the images captured by the first camera and the second camera include a plurality of checkerboard images, and each checkerboard image contains a plurality of corner points. The electronic device controls the first camera and the second camera to take pictures and then acquires the original image captured by the first camera, as shown in fig. 3, and the original image captured by the second camera, as shown in fig. 4. Illustratively, the original images include at least 3 checkerboard images, wherein the number of corner points of each checkerboard image is greater than 4.
After acquiring the original image captured by the first camera and the original image captured by the second camera, the electronic device may acquire the internal parameters and the external parameters of the first camera and the second camera, respectively, from the images. The specific steps for obtaining the internal parameters and the external parameters are as follows: respectively acquiring images shot by a first camera and a second camera, acquiring corner coordinate information of the images, and converting corner coordinates in the corner coordinate information into world coordinates; determining a homography matrix according to the corner point coordinates and the world coordinates, wherein the homography matrix represents the corresponding relation between the corner points in the world coordinates and the corner points in the pixel coordinates; determining respective internal parameters of the first camera and the second camera according to the homography matrix; and constructing a row alignment matrix of the second camera according to the homography matrix to obtain external parameters of the second camera.
For example, from the captured images the electronic device may determine the internal parameters $K_1$ of the first camera and the internal parameters $K_2$ of the second camera, as well as the row alignment matrix $R_{rect1}$ of the first camera and the row alignment matrix $R_{rect2}$ of the second camera.
According to the internal parameters of the first camera and the internal parameters of the second camera, the focal length of the first camera is $f_1$ and the focal length of the second camera is $f_2$. If $f_1 < f_2$, the electronic device may set the focal length $f_{new}$ of the virtual new camera to the smaller of the two, i.e., take $f_1$ as the focal length of the virtual new camera. For example, if the focal length of the first camera is $f_1$ = 40mm and the focal length of the second camera is $f_2$ = 50mm, the electronic device may set the focal length of the first camera as the focal length of the virtual new camera, i.e., $f_{new}$ = 40mm.
According to the internal parameters of the first camera and the internal parameters of the second camera, the image principal point coordinates of the first camera are $(cx_1, cy_1)$ and those of the second camera are $(cx_2, cy_2)$. The electronic device may set the ordinate of the new image principal point to $cy_{new}=(cy_1+cy_2)/2$ and the abscissa to $cx_{new}=cx_1$, obtaining the new image principal point coordinates $(cx_{new}, cy_{new})$, which are taken as the image principal point coordinates of the virtual new camera. For example, if the image principal point of the first camera is (10, 20) and that of the second camera is (15, 10), the electronic device calculates the ordinate of the image principal point of the virtual new camera as $cy_{new}$ = 15 and the abscissa as $cx_{new}$ = 10, and the image principal point coordinates of the virtual new camera are then (10, 15).
After obtaining the focal length $f_{new}$ of the virtual new camera and the image principal point coordinates $(cx_{new}, cy_{new})$, the electronic device can obtain the internal parameters $K_{new}$ of the virtual new camera. The electronic device may then obtain the correction homography matrix of the first camera, $H_1=K_{new}R_{rect1}K_1$, according to the internal parameters $K_{new}$ of the virtual new camera and the row alignment matrix $R_{rect1}$ of the first camera, and the correction homography matrix of the second camera, $H_2=K_{new}R_{rect2}K_2$, according to the internal parameters $K_{new}$ of the virtual new camera and the row alignment matrix $R_{rect2}$ of the second camera.
According to the correction homography matrix $H_1$ of the first camera, the image captured by the first camera may be rectified to obtain the corrected image corresponding to the image captured by the first camera, as shown in fig. 5. According to the correction homography matrix $H_2$ of the second camera, the corrected image corresponding to the image captured by the second camera may be obtained, as shown in fig. 6.
It can be understood that, in the present solution, when the camera is calibrated, the virtual new camera is used as a calibration reference, so that it is not necessary to scale the images respectively captured by the first camera and the second camera to the same resolution, thereby reducing the average re-projection error and improving the calibration accuracy of the camera.
For example, based on the focal length and the image principal point coordinates of the virtual new camera, the electronic device may establish a pixel coordinate system with the virtual new camera as the reference, and transform the first camera and the second camera to the plane where the virtual new camera is located in this pixel coordinate system. After the pixel coordinate system based on the virtual new camera is established, the electronic device can acquire a re-projection matrix between the world coordinate system and the pixel coordinate system according to the internal parameters of the virtual new camera, and can restore the planar two-dimensional coordinates to the three-dimensional coordinates of the space points according to the re-projection matrix to complete three-dimensional reconstruction. For example, the electronic device may convert the coordinates (u, v) of a point in the pixel coordinate system, together with its disparity, into the coordinates (U, V, W) of the point in the world coordinate system based on the re-projection matrix.
For example, after projection is performed using the re-projection matrix, the average pixel error (Mean Error in Pixels) of the average re-projection error per image (Mean Reprojection Error per Image) is 0.9 pixels, and the effective corrected area is 2628 x 2007, as shown in fig. 7, in which Camera 1 represents the first camera, Camera 2 represents the second camera, and the overall mean error represents the average pixel error of the re-projection.
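For reference, the mean re-projection error reported above is typically computed as in the short sketch below; the per-image corner arrays are assumed to come from the calibration step, and the names are illustrative.

```python
import numpy as np

def mean_reprojection_error(detected_corners, reprojected_corners):
    """Per-image and overall average pixel distance between detected and re-projected corners."""
    per_image = []
    for detected, reprojected in zip(detected_corners, reprojected_corners):
        d = np.linalg.norm(np.asarray(detected) - np.asarray(reprojected), axis=-1)
        per_image.append(float(d.mean()))
    return per_image, float(np.mean(per_image))
```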
Referring to fig. 8, fig. 8 is a schematic structural diagram of a camera calibration device according to an embodiment of the present disclosure. The camera calibration apparatus 300 may include an acquisition module 301, a construction module 302, and a calibration module 303, wherein:
an obtaining module 301, configured to obtain internal parameters and external parameters of a first camera and a second camera, respectively;
a construction module 302, configured to construct a virtual new camera according to respective internal parameters of the first camera and the second camera;
the calibration module 303 is configured to calibrate the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrate the second camera according to the virtual new camera and the internal parameters and the external parameters of the second camera.
In one embodiment, the building module 302 may be specifically configured to:
taking the camera focal length with the smaller focal length of the focal lengths of the first camera and the second camera as the focal length of the virtual new camera;
taking a half of the sum of the longitudinal coordinate values of the image main points of the first camera and the second camera as the longitudinal coordinate value of the image main point of the virtual new camera, and taking the transverse coordinate value of the image main point of the first camera as the transverse coordinate value of the image main point of the virtual new camera to obtain the coordinate of the new image main point;
and constructing a virtual new camera according to the focal length of the virtual new camera and the new image principal point coordinates.
In an embodiment, the obtaining module 301 may be specifically configured to:
respectively acquiring images shot by a first camera and a second camera;
acquiring corner coordinate information of the image, and converting the corner coordinates in the corner coordinate information into world coordinates;
determining a homography matrix according to the corner coordinates and the world coordinates, wherein the homography matrix represents the corresponding relation between the corner points in the world coordinates and the corner points in the pixel coordinates;
determining respective internal parameters of the first camera and the second camera according to the homography matrix;
and constructing a row alignment matrix of a second camera according to the homography matrix to obtain external parameters of the second camera.
In one embodiment, the calibration module 303 may be specifically configured to:
determining a correction homography matrix of the first camera according to the internal parameters of the virtual new camera, the internal parameters of the first camera and the row alignment matrix so as to finish the calibration of the first camera;
and determining a correction homography matrix of the second camera according to the internal parameters of the virtual new camera, the internal parameters of the second camera and the row alignment matrix so as to finish the calibration of the second camera.
In one embodiment, the obtaining module 301 may further be configured to:
determining a corrected image corresponding to an image shot by a first camera according to the correction homography matrix of the first camera;
and determining a corrected image corresponding to the image shot by the second camera according to the correction homography matrix of the second camera.
In one embodiment, the images captured by the first camera and the second camera include a plurality of checkerboard images, and the number of corner points in each checkerboard image is multiple.
An embodiment of the present application provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements a flow in a camera calibration method as provided in this embodiment.
The embodiment of the present application further provides an electronic device, where the electronic device includes a memory, a processor, and a computer program stored in the memory and running on the processor, and the processor is configured to execute the steps in the camera calibration method provided in this embodiment by calling the computer program stored in the memory.
For example, the electronic device may be a mobile terminal such as a tablet computer or a smart phone. Referring to fig. 9, fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure.
The electronic device 400 may include a first camera 401, a second camera 402, a memory 403, a processor 404, and the like. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 9 does not constitute a limitation of the electronic device and may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The first camera 401 and the second camera 402 may convert an optical image signal into an electrical signal and record two-dimensional image information of an object.
Memory 403 may be used to store applications and data. The memory 403 stores applications containing executable code. The application programs may constitute various functional modules. The processor 404 executes various functional applications and data processing by running an application program stored in the memory 403.
The processor 404 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 403 and calling data stored in the memory 403, thereby integrally monitoring the electronic device.
In this embodiment, the processor 404 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 403 according to the following instructions, and the processor 404 runs the application programs stored in the memory 403, so as to execute:
acquiring respective internal parameters and external parameters of a first camera and a second camera;
constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera;
calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the virtual new camera and the internal parameters and the external parameters of the second camera.
Referring to fig. 10, the electronic device 400 may include a first camera 401, a second camera 402, a memory 403, a processor 404, a microphone 405, a display 406, a battery 407, and so on.
The first camera 401 and the second camera 402 may convert an optical image signal into an electrical signal to record two-dimensional image information of an object.
Memory 403 may be used to store applications and data. The memory 403 stores an application program containing executable code. The application programs may constitute various functional modules. The processor 404 executes various functional applications and data processing by running an application program stored in the memory 403.
The processor 404 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing an application program stored in the memory 403 and calling data stored in the memory 403, thereby integrally monitoring the electronic device.
The microphone 405 may be used to collect sound signals in the surrounding environment.
The display 406 may be used to display information such as text, images, and the like, and may also be used to receive a touch operation by a user.
The battery 407 may be used to provide power support for the normal operation of the various components of the electronic device.
In this embodiment, the processor 404 in the electronic device loads the executable code corresponding to the processes of one or more application programs into the memory 403 according to the following instructions, and the processor 404 runs the application programs stored in the memory 403, so as to execute:
acquiring respective internal parameters and external parameters of a first camera and a second camera;
constructing a virtual new camera according to respective internal parameters of the first camera and the second camera;
calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the virtual new camera and the internal parameters and the external parameters of the second camera.
In one embodiment, the processor 404, when executing the constructing of the virtual new camera according to the internal parameters of the first camera and the second camera, may execute:
taking the focal length of the camera with the smaller focal length of the focal lengths of the first camera and the second camera as the focal length of the virtual new camera;
taking a half of the sum of the vertical coordinate values of the image principal points of the first camera and the second camera as the vertical coordinate value of the image principal point of the virtual new camera, and taking the horizontal coordinate value of the image principal point of the first camera as the horizontal coordinate value of the image principal point of the virtual new camera to obtain the coordinate of the new image principal point;
and constructing a virtual new camera according to the focal length of the virtual new camera and the new image principal point coordinates.
In one embodiment, the processor 404, in performing calibration of the first camera according to the virtual new camera and the intrinsic parameters and extrinsic parameters of the first camera, and calibration of the second camera according to the virtual new camera and the intrinsic parameters and extrinsic parameters of the second camera, may perform:
determining a correction homography matrix of the first camera according to the internal parameters of the virtual new camera, the internal parameters of the first camera and the row alignment matrix so as to finish calibration of the first camera;
and determining a correction homography matrix of the second camera according to the internal parameters of the virtual new camera, the internal parameters of the second camera and the row alignment matrix so as to finish the calibration of the second camera.
In one embodiment, after the processor 404 performs calibration of the first camera and the second camera according to the virtual new camera and the extrinsic parameters of the first camera and the extrinsic parameters of the second camera, it may further perform:
determining a corrected image corresponding to an image shot by a first camera according to the correction homography matrix of the first camera;
and determining a corrected image corresponding to the image shot by the second camera according to the correction homography matrix of the second camera.
In one embodiment, the processor 404, in performing the acquiring of the intrinsic parameters and extrinsic parameters of the first camera and the second camera, may perform:
respectively acquiring images shot by a first camera and a second camera;
acquiring corner coordinate information of the image, and converting the corner coordinates in the corner coordinate information into world coordinates;
determining a homography matrix according to the corner coordinates and the world coordinates, wherein the homography matrix represents the corresponding relation between the corners in the world coordinates and the corners in the pixel coordinates;
determining respective internal parameters of the first camera and the second camera according to the homography matrix;
and constructing a row alignment matrix of a second camera according to the homography matrix to obtain external parameters of the second camera.
In one embodiment, when the processor 404 acquires the images captured by the first camera and the second camera respectively, the images captured by the first camera and the second camera respectively include a plurality of checkerboard images, and the number of corner points in each checkerboard image is multiple.
In the above embodiments, the descriptions of the embodiments are focused on, and parts that are not described in detail in a certain embodiment may refer to the above detailed description of the camera calibration method, which is not described herein again.
The camera calibration device provided in the embodiment of the present application and the camera calibration method in the above embodiments belong to the same concept, and any method provided in the camera calibration method embodiment may be run on the camera calibration device, and the specific implementation process thereof is described in detail in the camera calibration method embodiment, and is not described herein again.
It should be noted that, for the camera calibration method in the embodiment of the present application, it can be understood by those skilled in the art that all or part of the process for implementing the camera calibration method in the embodiment of the present application may be completed by controlling the relevant hardware through a computer program, where the computer program may be stored in a computer-readable storage medium, such as a memory, and executed by at least one processor, and during the execution, the process may include, for example, the process of the embodiment of the camera calibration method. The storage medium may be a magnetic disk, an optical disk, a Read Only Memory (ROM), a Random Access Memory (RAM), or the like.
For the camera calibration device in the embodiment of the present application, each functional module may be integrated in one processing chip, or each module may exist alone physically, or two or more modules are integrated in one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a separate product, may also be stored in a computer-readable storage medium, such as a read-only memory, a magnetic disk or an optical disk.
The above detailed description is given to a camera calibration method, a camera calibration device, a storage medium, and an electronic device provided in the embodiments of the present application, and a specific example is applied in the present application to explain the principle and the implementation of the present application, and the description of the above embodiments is only used to help understanding the method and the core idea of the present application; meanwhile, for those skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A camera calibration method applied to an electronic device at least comprising a first camera and a second camera, the method comprising:
acquiring internal parameters and external parameters of the first camera and the second camera respectively;
constructing a virtual new camera according to the respective internal parameters of the first camera and the second camera;
calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the internal parameters and the external parameters of the virtual new camera and the second camera.
2. The method of claim 1, wherein the intrinsic parameters include a focal length and image principal point coordinates, and wherein constructing a virtual new camera from the respective intrinsic parameters of the first camera and the second camera comprises:
taking the camera focal length with the smaller focal length of the focal lengths of the first camera and the second camera as the focal length of the virtual new camera;
taking a half of the sum of the vertical coordinate values of the image principal points of the first camera and the second camera as the vertical coordinate value of the image principal point of the virtual new camera, and taking the horizontal coordinate value of the image principal point of the first camera as the horizontal coordinate value of the image principal point of the virtual new camera, to obtain new image principal point coordinates;
and constructing the virtual new camera according to the focal length of the virtual new camera and the new image principal point coordinate.
3. The method of claim 2, wherein the obtaining respective intrinsic parameters and extrinsic parameters of the first camera and the second camera comprises:
respectively acquiring images shot by the first camera and the second camera;
acquiring corner coordinate information of the image, and converting corner coordinates in the corner coordinate information into world coordinates;
determining a homography matrix according to the corner point coordinates and the world coordinates, wherein the homography matrix represents the corresponding relation between the corner points in the world coordinates and the corner points in the pixel coordinates;
determining respective internal parameters of the first camera and the second camera according to the homography matrix;
and constructing a row alignment matrix of the second camera according to the homography matrix to obtain external parameters of the second camera.
4. The method of claim 3, wherein calibrating the first camera based on the intrinsic parameters and extrinsic parameters of the virtual new camera and the first camera, and calibrating the second camera based on the intrinsic parameters and extrinsic parameters of the virtual new camera and the second camera comprises:
determining a correction homography matrix of the first camera according to the internal parameters of the new virtual camera, the internal parameters of the first camera and the row alignment matrix so as to finish calibration of the first camera;
and determining a correction homography matrix of the second camera according to the internal parameters of the new virtual camera, the internal parameters of the second camera and the row alignment matrix so as to finish the calibration of the second camera.
5. The method of claim 4, further comprising:
determining a corrected image corresponding to the image shot by the first camera according to the correction homography matrix of the first camera;
and determining a corrected image corresponding to the image shot by the second camera according to the correction homography matrix of the second camera.
6. The method of claim 3, wherein the images taken by the first camera and the second camera comprise a plurality of checkerboard images, each checkerboard image having a plurality of corner points.
7. A camera calibration device applied to an electronic device at least comprising a first camera and a second camera, the device comprising:
an acquisition module for acquiring internal parameters and external parameters of the first camera and the second camera respectively;
a construction module for constructing a virtual new camera according to respective internal parameters of the first camera and the second camera;
and the calibration module is used for calibrating the first camera according to the virtual new camera and the internal parameters and the external parameters of the first camera, and calibrating the second camera according to the internal parameters and the external parameters of the virtual new camera and the second camera.
8. The apparatus of claim 7, wherein the build module is configured to:
taking the focal length of the camera with the smaller focal length in the focal lengths of the first camera and the second camera as the focal length of the virtual new camera;
taking a half of the sum of the longitudinal coordinate values of the image principal point of the first camera and the longitudinal coordinate value of the image principal point of the second camera as the longitudinal coordinate value of the image principal point of the virtual new camera, and taking the transverse coordinate value of the image principal point of the first camera as the transverse coordinate value of the image principal point of the virtual new camera to obtain a new image principal point coordinate;
and constructing the virtual new camera according to the focal length of the virtual new camera and the new image principal point coordinate.
9. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, is adapted to carry out the method of any one of claims 1 to 6.
10. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor when executing the computer program implementing the method according to any of claims 1 to 6.
CN202110755541.5A 2021-07-05 2021-07-05 Camera calibration method and device, computer readable storage medium and electronic equipment Pending CN115578466A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110755541.5A CN115578466A (en) 2021-07-05 2021-07-05 Camera calibration method and device, computer readable storage medium and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110755541.5A CN115578466A (en) 2021-07-05 2021-07-05 Camera calibration method and device, computer readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN115578466A true CN115578466A (en) 2023-01-06

Family

ID=84579924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110755541.5A Pending CN115578466A (en) 2021-07-05 2021-07-05 Camera calibration method and device, computer readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN115578466A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117649454A (en) * 2024-01-29 2024-03-05 北京友友天宇系统技术有限公司 Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium
CN117649454B (en) * 2024-01-29 2024-05-31 北京友友天宇系统技术有限公司 Binocular camera external parameter automatic correction method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
US10594941B2 (en) Method and device of image processing and camera
CN109559349B (en) Method and device for calibration
CN107705252B (en) Method and system suitable for splicing, unfolding and correcting binocular fisheye image
CN111340737B (en) Image correction method, device and electronic system
CN113808220A (en) Calibration method and system of binocular camera, electronic equipment and storage medium
CN106570907B (en) Camera calibration method and device
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN109741241A (en) Processing method, device, equipment and the storage medium of fish eye images
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN112470192A (en) Dual-camera calibration method, electronic device and computer-readable storage medium
CN111461963A (en) Fisheye image splicing method and device
CN114549666B (en) AGV-based panoramic image splicing calibration method
CN103106641A (en) Method and device of projection transformation applying to panoramic imaging system
CN108269234A (en) A kind of lens of panoramic camera Attitude estimation method and panorama camera
CN111951193A (en) Method and apparatus for correcting horizontal distortion of image
CN115830135A (en) Image processing method and device and electronic equipment
CN110136205B (en) Parallax calibration method, device and system of multi-view camera
CN115578466A (en) Camera calibration method and device, computer readable storage medium and electronic equipment
CN111429529B (en) Coordinate conversion calibration method, electronic equipment and computer storage medium
CN111432117B (en) Image rectification method, device and electronic system
US20230379422A1 (en) Method for vehicle hinge point calibration and corresponding calibration apparatus, computer device, and storage medium
CN113379845A (en) Camera calibration method and device, electronic equipment and storage medium
CN112598751A (en) Calibration method and device, terminal and storage medium
CN111696141A (en) Three-dimensional panoramic scanning acquisition method and device and storage device
CN113870364B (en) Self-adaptive binocular camera calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination