CN109523597B - Method and device for calibrating external parameters of camera - Google Patents


Info

Publication number
CN109523597B
Authority
CN
China
Prior art keywords
pair
cameras
external parameters
images
pairs
Prior art date
Legal status
Active
Application number
CN201710842468.9A
Other languages
Chinese (zh)
Other versions
CN109523597A (en)
Inventor
周珣
谢远帆
王亮
Current Assignee
Baidu Online Network Technology Beijing Co Ltd
Original Assignee
Baidu Online Network Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Baidu Online Network Technology Beijing Co Ltd
Priority to CN201710842468.9A
Publication of CN109523597A
Application granted
Publication of CN109523597B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85: Stereo camera calibration
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30204: Marker
    • G06T 2207/30208: Marker matrix

Abstract

The application discloses a method and an apparatus for calibrating the external parameters of cameras. One implementation of the method comprises: acquiring an image pair of the same scene captured by two cameras to be calibrated; extracting a plurality of matched feature point pairs from the two images of the image pair, where the two feature points of each matched pair are the images of the same three-dimensional space point in the two corresponding images; back-projecting each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projecting the reconstructed points into the two images of the image pair, comparing the positions of the resulting projection points with the corresponding original feature points, and correcting the external parameters according to the comparison results to obtain a corrected result of the external parameters. This implementation improves calibration efficiency.

Description

Method and device for calibrating external parameters of camera
Technical Field
The application relates to the field of image processing, in particular to the field of computer vision, and especially to a method and an apparatus for calibrating the external parameters of cameras.
Background
In the field of computer vision, in order to obtain an ideal three-dimensional effect or image information of a large scene from multiple viewing angles, the relative positions and attitudes of the multiple cameras used for image acquisition need to be calibrated. Existing methods for calibrating the relative pose parameters between cameras usually require a calibration object: in a custom-built calibration space or a specific calibration scene, the calibration object or feature points of known positions are extracted and matched, and the cameras are then calibrated according to the corresponding three-dimensional calibration principle. However, such calibration methods involve complicated steps, require manual assistance to change the attitude of the calibration object, and impose many restrictions on the calibration conditions, so their calibration efficiency needs to be improved.
Disclosure of Invention
In order to solve one or more technical problems, embodiments of the present application provide a calibration method and apparatus for external parameters of a camera.
In a first aspect, an embodiment of the present application provides a method for calibrating the external parameters of cameras, comprising: acquiring an image pair of the same scene captured by two cameras to be calibrated; extracting a plurality of matched feature point pairs from the two images of the image pair, where the two feature points of each matched pair are the images of the same three-dimensional space point in the two corresponding images; back-projecting each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projecting the reconstructed points into the two images of the image pair, comparing the positions of the resulting projection points with the corresponding original feature points, and correcting the external parameters according to the comparison results to obtain a corrected result of the external parameters; and calibrating the relative external parameters of the two cameras according to the corrected result.
In some embodiments, back-projecting each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projecting into the two images of the image pair, comparing the positions of the resulting projection points with the corresponding original feature points, and correcting the external parameters according to the comparison results includes: iteratively adjusting the external parameters between the two cameras until the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state; and correcting the acquired external parameters between the two cameras with the adjustment amount of the iterative adjustment to obtain the corrected result of the external parameters.
In some embodiments, iteratively adjusting the external parameters between the two cameras until the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state includes: determining the acquired external parameters between the two cameras as the initial current external parameters, and executing an error estimation step. The error estimation step comprises: reconstructing the three-dimensional space point matching each feature point pair based on the position coordinates of the feature point pair and the current external parameters; back-projecting the reconstructed three-dimensional space points into the two images of the image pair based on the current external parameters; determining the position coordinates of the projection point pair corresponding to each feature point pair in the image pair; calculating the error between the position coordinates of each projection point pair and those of the corresponding feature point pair; constructing an energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence; and judging whether the energy function satisfies a preset convergence condition. In response to determining that the energy function does not satisfy the preset convergence condition, the current external parameters are adjusted along their gradient direction based on the energy function, and the error estimation step is executed again; in response to determining that the energy function satisfies the preset convergence condition, the positions of the projection point pairs and the corresponding feature point pairs are determined to have reached the preset matching state.
In some embodiments, constructing the energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence includes: accumulating the errors corresponding to all the feature point pairs in the feature point pair sequence to obtain the energy function.
In some embodiments, before back-projecting each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projecting into the two images of the image pair, comparing the positions of the resulting projection points with the corresponding original feature points, and correcting the external parameters according to the comparison results to obtain the corrected result, the method further includes acquiring the external parameters of the two cameras, which includes: acquiring the external parameters between the two cameras obtained by measurement; or projecting the image pair onto the same projection plane and estimating the external parameters between the two cameras based on the image pair generated by the projection.
In a second aspect, an embodiment of the present application provides an apparatus for calibrating the external parameters of cameras, including: a first acquisition unit, configured to acquire an image pair of the same scene captured by two cameras to be calibrated; an extraction unit, configured to extract a plurality of matched feature point pairs from the two images of the image pair, where the two feature points of each matched pair are the images of the same three-dimensional space point in the two corresponding images; and a correction unit, configured to back-project each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-project the reconstructed points into the two images of the image pair, compare the positions of the resulting projection points with the corresponding original feature points, and correct the external parameters according to the comparison results to obtain a corrected result of the external parameters.
In some embodiments, the correction unit is further configured to correct the external parameters to obtain the corrected result as follows: iteratively adjusting the external parameters between the two cameras until the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state; and correcting the acquired external parameters between the two cameras with the adjustment amount of the iterative adjustment to obtain the corrected result of the external parameters.
In some embodiments, the correction unit is further configured to iteratively adjust the external parameters between the two cameras in the following manner, so that the positions of the projection point pairs based on the adjusted external parameters and the corresponding feature point pairs reach the preset matching state: determining the acquired external parameters between the two cameras as the initial current external parameters, and executing an error estimation step. The error estimation step comprises: reconstructing the three-dimensional space point matching each feature point pair based on the position coordinates of the feature point pair and the current external parameters; projecting the reconstructed three-dimensional space points into the image pair based on the current external parameters; determining the position coordinates of the projection point pair corresponding to each feature point pair in the image pair; calculating the error between the position coordinates of each projection point pair and those of the corresponding feature point pair; constructing an energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence; and judging whether the energy function satisfies a preset convergence condition. In response to determining that the energy function does not satisfy the preset convergence condition, the current external parameters are adjusted along their gradient direction based on the energy function, and the error estimation step is executed again; in response to determining that the energy function satisfies the preset convergence condition, the positions of the projection point pairs and the corresponding feature point pairs are determined to have reached the preset matching state.
In some embodiments, the correction unit is further configured to construct the energy function as follows: accumulating the errors corresponding to all the feature point pairs in the feature point pair sequence to obtain the energy function.
In some embodiments, the apparatus further includes a second acquisition unit, configured to acquire the external parameters of the two cameras before each feature point pair is back-projected into three-dimensional space using the acquired external parameters between the two cameras, re-projected into the two images of the image pair, the positions of the resulting projection points are compared with the corresponding original feature points, and the external parameters are corrected according to the comparison results to obtain the corrected result. The second acquisition unit is specifically configured to: acquire the external parameters between the two cameras obtained by measurement; or project the image pair onto the same projection plane and estimate the external parameters between the two cameras based on the image pair generated by the projection.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method for calibrating camera external parameters described above.
With the method and apparatus for calibrating camera external parameters provided by the application, an image pair of the same scene captured by two cameras to be calibrated is acquired; a plurality of matched feature point pairs are extracted from the two images of the image pair, the matched feature points being the images of the same three-dimensional space point in the two corresponding images; each feature point pair is back-projected into three-dimensional space using the acquired external parameters between the two cameras and re-projected into the two images of the image pair; the positions of the resulting projection points are compared with the corresponding original feature points; and finally the external parameters are corrected according to the comparison results. The external parameters between the cameras can thus be calibrated based on a natural scene, without restrictions on calibration objects or the calibration environment, which improves the calibration efficiency.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which the present application may be applied;
FIG. 2 is a flow chart of one embodiment of the method for calibrating camera external parameters according to the present application;
FIG. 3 is a flow chart of another embodiment of the method for calibrating camera external parameters according to the present application;
FIG. 4 is a schematic diagram of the positional relationship between the image planes of an image pair acquired by the two cameras to be calibrated and a three-dimensional space point;
FIG. 5 is a schematic structural diagram of an embodiment of the apparatus for calibrating camera external parameters of the present application;
fig. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server of the embodiments of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
Fig. 1 illustrates an exemplary system architecture 100 to which embodiments of the method or apparatus for calibrating camera external parameters of the present application may be applied.
As shown in fig. 1, the system architecture 100 may include an unmanned vehicle 101, an onboard camera 102 and onboard control unit 103 disposed on the unmanned vehicle 101, and a server 104. The onboard camera 102 and onboard control unit 103 on the unmanned vehicle 101 may be connected to the server 104 via a network, and the onboard camera 102 may also be connected to the onboard control unit 103 via a network, where the network may include various types of connections, such as wired, wireless communication links, or fiber optic cables, among others.
A plurality of onboard cameras 102 may be mounted on the unmanned vehicle 101; for example, fig. 1 shows two onboard cameras A and B mounted on the front and the side of the vehicle body, respectively. The plurality of vehicle-mounted cameras 102 may collect image information of the surrounding environment while the unmanned vehicle 101 is driving and may send the collected image information to the vehicle-mounted control unit 103 or the server 104, so that the vehicle-mounted control unit 103 or the server 104 can store, analyze, and otherwise process it. The positions and attitudes of the onboard cameras A and B may differ, and their imaging ranges may overlap.
The vehicle-mounted Control Unit 103 may be a "vehicle-mounted brain," such as an ECU (Electronic Control Unit), which controls the operation of various components of the unmanned vehicle 101. The vehicle-mounted control unit 103 may process the image information collected by the plurality of cameras, for example stitching the images to generate a road panoramic image.
The server 104 may be a server for storing, analyzing, and responding to the data collected by the components of the unmanned vehicle 101, for example a data processing server that performs registration, feature extraction, and target recognition on the image information collected by the plurality of vehicle-mounted cameras 102 and feeds the processing results back to the vehicle-mounted control unit of the unmanned vehicle 101. Alternatively, the server 104 may be a storage server that stores the image information collected by the onboard cameras, for example for failure analysis and accident-cause analysis.
It should be noted that the calibration method for the camera external parameter provided in the embodiment of the present application may be executed by the vehicle-mounted control unit 103 or the server 104, and accordingly, a calibration device for the camera external parameter may be disposed in the vehicle-mounted control unit 103 or the server 104.
It should be understood that the numbers of unmanned vehicles, onboard cameras, onboard control units, and servers in fig. 1 are merely illustrative. There may be any number of unmanned vehicles, onboard cameras, onboard control units, and servers, as required by the implementation.
In addition, fig. 1 only shows an exemplary system architecture to which the camera external parameter calibration method or the camera external parameter calibration apparatus of the present application can be applied, and the camera external parameter calibration method or the camera external parameter calibration apparatus of the present application is not limited to be applied to the system architecture shown in fig. 1, and may also be applied to other system architectures, such as external parameter calibration of a stereo camera. This is not a limitation of the present application.
With continued reference to FIG. 2, a flow 200 of one embodiment of the method for calibrating camera external parameters according to the present application is shown. The method comprises the following steps:
step 201, acquiring an image pair of the same scene acquired by two cameras to be calibrated.
In this embodiment, the imaging ranges of the two cameras to be calibrated may overlap. Two images of the same scene acquired by the two cameras to be calibrated can be obtained to form an image pair of the scene.
Because the positions and attitudes of the two cameras to be calibrated differ, their optical axes do not coincide and the positions of their centers of mass do not coincide, so the images of the same scene acquired by the two cameras differ. The image pair contains the two-dimensional image information within the overlapping three-dimensional imaging range of the two cameras.
An electronic device (such as a vehicle-mounted control unit or a server shown in fig. 1) on which the calibration method of the external parameter of the camera operates may be connected with the two cameras in a wired or wireless manner, so as to acquire an image pair of the same scene acquired by the two cameras. Here, the two cameras may transmit the image pair of the same scene to the electronic device in real time after acquiring the image pair, or may store the image pair in a storage area of the cameras, and transmit the image pair to the electronic device after receiving an acquisition request of the electronic device.
In some optional implementations of this embodiment, the two cameras may acquire image sequences at a set time period; the image pairs may then be a sequence of multiple image pairs, the two images in each pair being acquired at the same time. The method for calibrating camera external parameters according to the embodiment of the present application may be performed on each image pair of the sequence to obtain the final calibration result of the camera external parameters.
Here, the scene may be a natural scene, i.e., a scene that is not specially arranged and may be selected at random, for example a scene containing roads, pedestrians, automobiles, buildings, and the like. The two cameras to be calibrated can acquire image pairs in their ordinary working environment and transmit them to the electronic device.
Step 202, extracting a plurality of matched feature point pairs from the two images of the image pair.
The two feature points of each matched feature point pair are the images of the same three-dimensional space point in the two corresponding images. That is, each feature point pair comprises one feature point from each image of the image pair, and the two feature points in each pair correspond to the same three-dimensional space point.
In this embodiment, the electronic device may extract feature points from two images of the image pair respectively by using an image feature point detection algorithm and perform matching on the feature points. The feature point may be a point in the image, containing its position information and luminance information in the image coordinate system.
Specifically, feature point detection algorithms such as SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), and BRIEF (Binary Robust Independent Elementary Features) may be used to extract the feature points, extract their information from the image, and describe them with descriptors such as vectors or matrices. The SIFT algorithm, specifically, determines the positions and scales of key points by searching for extrema in scale space and fitting a fine model, then assigns each key point an orientation parameter based on the gradient-direction distribution of its neighborhood pixels, and finally generates a descriptor for each key point; these key points are the extracted feature points.
After feature points have been extracted from the two images of the image pair, the feature points of the two images can be matched. Specifically, similarity may be computed between the descriptors of the feature points, and feature points with high similarity are determined to be matched. Here, matched feature points correspond to the same point in three-dimensional space. After the matching feature points in the two images are found, feature point pairs can be generated. In this embodiment, the feature points in the two images of each image pair may be matched pairwise in turn to obtain a plurality of feature point pairs and generate a feature point pair set; after the pairs in the set are numbered, a feature point pair sequence is generated.
For example, suppose the image pair acquired in step 201 includes a first image IMA1 captured by the first camera and a second image IMA2 captured by the second camera. Feature point extraction may be performed on the first image IMA1 to obtain a first feature point set {A1, A2, A3, …, Am}, and on the second image IMA2 to obtain a second feature point set {B1, B2, B3, …, Bn}. The feature points A1, A2, A3, …, Am of the first set are then matched in turn against the feature points B1, B2, B3, …, Bn of the second set, and the feature points of the two sets that correspond to the same spatial point are extracted to generate feature point pairs. For instance, if feature point A1 matches feature point B3 and both correspond to the same three-dimensional space point, A1 and B3 form a feature point pair.
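As an illustration of this step, the following is a minimal sketch using OpenCV's SIFT implementation and a brute-force matcher with Lowe's ratio test; the function name extract_matched_pairs, the ratio threshold of 0.75, and the cap on the number of pairs are illustrative choices, not part of the patent:

```python
import cv2
import numpy as np

def extract_matched_pairs(img1, img2, max_pairs=200):
    """Detect SIFT key points in both grayscale images and return the
    pixel coordinates of the matched feature point pairs."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match descriptors by similarity; Lowe's ratio test discards
    # ambiguous matches so each kept pair corresponds to one 3D point.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [p[0] for p in knn
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    good = sorted(good, key=lambda m: m.distance)[:max_pairs]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    return pts1, pts2  # one row per feature point pair
```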
In some optional implementations of this embodiment, the extracted feature point pairs may be filtered according to a preset filtering rule before generating the feature point pair sequence. The preset filtering rule may include, but is not limited to: the total number of feature point pairs does not exceed a set threshold; the uniformity index of the distribution of the feature point pairs over the image pair reaches a preset index; and so on, where the uniformity index may be computed from the distances between feature points.
Step 203, back-projecting the feature point pairs into three-dimensional space using the acquired external parameters between the two cameras, re-projecting them into the two images of the image pair, comparing the positions of the resulting projection point pairs with the corresponding original feature points, and correcting the external parameters according to the comparison results to obtain a corrected result of the external parameters.
In this embodiment, the external parameters between the two cameras may be acquired in advance. The external parameters may be parameters characterizing the relative pose relationship of the two cameras, and may include translation parameters, rotation parameters, and scale transformation parameters. Specifically, the value range of the external parameters between the two cameras can be estimated empirically, and a set of values randomly selected within that range used as the acquired external parameters; alternatively, the external parameters between the two cameras may be preset, and the preset external parameters acquired before step 203 is performed.
In some optional implementations of this embodiment, if the image pairs acquired in step 201 form a sequence of multiple pairs, the corrected external parameters obtained while calibrating with the previous image pair may be acquired in advance and used as the acquired external parameters between the two cameras, i.e., as the initial external parameters for calibrating based on the current image pair.
And then, the feature point pairs can be back-projected to the three-dimensional space by using the acquired external parameters between the two cameras, so as to obtain the coordinates of the three-dimensional space points.
Assume that the two cameras are camera C1 and camera C2 and that the internal parameters of both cameras are known. According to the camera projection model, taking the three-dimensional space coordinate system as the world coordinate system, the coordinate X_w of a point P0 in the world coordinate system can be expressed as

X_w = (x_w, y_w, z_w)^T,    (1)

and the coordinate of the image point P0' of P0 on the image captured by a camera satisfies

s \cdot (u, v, 1)^T = K \cdot (R \mid t) \cdot (x_w, y_w, z_w, 1)^T,    (2)

where u and v are the components of P0' along the two coordinate axes of the camera's image coordinate system, (u, v, 1)^T is the homogeneous coordinate of the vector (u, v)^T, s is a projection scale factor, and K is the internal parameter matrix of the camera, a 3×3 matrix containing the focal length of the camera, the size of each sensor cell, and the location of the origin of the camera coordinate system in the image coordinate system. (R \mid t) represents the pose of the camera in the world coordinate system, where R is a rotation matrix and t is a translation vector; (R \mid t) is a 3×4 matrix, which can be expressed as

(R \mid t) = \begin{pmatrix} r_{11} & r_{12} & r_{13} & t_1 \\ r_{21} & r_{22} & r_{23} & t_2 \\ r_{31} & r_{32} & r_{33} & t_3 \end{pmatrix},    (3)

where r_{11}, r_{12}, r_{13}, r_{21}, r_{22}, r_{23}, r_{31}, r_{32}, r_{33} are parameters related to the rotation angles of the camera coordinate system relative to the three coordinate axes of the world coordinate system, and t_1, t_2, t_3 are the translation parameters of the camera coordinate system relative to the three coordinate axes of the world coordinate system.
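As a concrete illustration of equation (2), the following sketch projects a world point into an image; it assumes K, R, and t are given as NumPy arrays, and the helper name project is illustrative:

```python
import numpy as np

def project(K, R, t, X_w):
    """Project a 3D point X_w = (x_w, y_w, z_w) into the image:
    s * (u, v, 1)^T = K @ (R|t) @ (x_w, y_w, z_w, 1)^T  -- equation (2)."""
    Rt = np.hstack([R, t.reshape(3, 1)])   # 3x4 pose matrix (R|t)
    x = K @ Rt @ np.append(X_w, 1.0)       # homogeneous image point s*(u, v, 1)
    return x[:2] / x[2]                    # divide out the scale factor s
```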
For the case where the two cameras image the same scene, take the coordinate system of camera C1 as the reference coordinate system, i.e., as the world coordinate system. The pose of camera C1 in the world coordinate system can then be represented as (I \mid O), where

(I \mid O) = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{pmatrix},    (4)

and the pose of camera C2 in the world coordinate system is represented by (R \mid t), which is the external parameter matrix of camera C2 relative to camera C1.
For a point P0 (coordinate X_w) in the world coordinate system, the coordinate X_{p1} of its projection point P10 in the image of camera C1 is given by

s_1 \cdot (u_1, v_1, 1)^T = P_1 \cdot (x_w, y_w, z_w, 1)^T,    (5)

and the coordinate X_{p2} of its projection point P20 in the image of camera C2 is given by

s_2 \cdot (u_2, v_2, 1)^T = P_2 \cdot (x_w, y_w, z_w, 1)^T,    (6)

where

P_1 = K_1 \cdot (I \mid O),    (7)

P_2 = K_2 \cdot (R \mid t),    (8)

(u_1, v_1, 1)^T is the homogeneous coordinate of the coordinate (u_1, v_1)^T of point P10 in the coordinate system of camera C1, (u_2, v_2, 1)^T is the homogeneous coordinate of the coordinate (u_2, v_2)^T of point P20 in the coordinate system of camera C2, and K_1 and K_2 are the internal parameter matrices of camera C1 and camera C2, respectively.
Transforming equation (5) to eliminate the scale factor yields the equation set (9):

(u_1 \cdot P_1^3 - P_1^1) \cdot (x_w, y_w, z_w, 1)^T = 0,
(v_1 \cdot P_1^3 - P_1^2) \cdot (x_w, y_w, z_w, 1)^T = 0,    (9)

and transforming equation (6) yields the equation set (10):

(u_2 \cdot P_2^3 - P_2^1) \cdot (x_w, y_w, z_w, 1)^T = 0,
(v_2 \cdot P_2^3 - P_2^2) \cdot (x_w, y_w, z_w, 1)^T = 0,    (10)

where P_i^n denotes the row vector formed by the n-th row of the matrix P_i, i = 1, 2; n = 1, 2, 3.
For a pair of feature points P10 and P20, their coordinates (u_1, v_1) and (u_2, v_2) can be obtained, and (R \mid t) is obtained from the acquired external parameters. Assuming the internal parameter matrices K_1 and K_2 of the two cameras are known, P_1 and P_2 can be obtained from equations (7) and (8); solving the simultaneous equation sets (9) and (10) then yields the unknowns x_w, y_w, z_w. The equation system may be solved by SVD (Singular Value Decomposition).
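A minimal sketch of this triangulation step follows, assuming NumPy; it stacks the four homogeneous equations of (9) and (10) and takes the SVD null-space solution (the function name triangulate is illustrative):

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Solve equation sets (9) and (10) for (x_w, y_w, z_w) by SVD.
    P1, P2 are the 3x4 projection matrices K1(I|O) and K2(R|t);
    uv1, uv2 are the pixel coordinates of one feature point pair."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],   # row u1*P1^3 - P1^1 of equation set (9)
        v1 * P1[2] - P1[1],   # row v1*P1^3 - P1^2
        u2 * P2[2] - P2[0],   # rows of equation set (10)
        v2 * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector belonging
    # to the smallest singular value of A.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]       # dehomogenize to (x_w, y_w, z_w)
```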
The feature point pairs extracted in step 202 may be projected into three-dimensional space according to the above projection principle and the acquired external parameters between the cameras, yielding the coordinates (x_w, y_w, z_w) of the corresponding three-dimensional space points. The three-dimensional space points can then be re-projected into the image pair using the acquired external parameters and the acquired internal parameters of the two cameras, to obtain the projection point pairs corresponding to the feature point pairs; specifically, the coordinates of the two corresponding projection points in the coordinate systems of the two cameras may be calculated from the known x_w, y_w, z_w using equations (1) and (2).
Then, the positions of the feature point pairs and the projection point pairs generated by the re-projection can be compared. Specifically, the distance between each feature point of a pair and its corresponding projection point can be calculated, and the distances for the two feature points of the pair accumulated, giving the position comparison result between the projection points and the corresponding original feature points. Projecting the feature points into three-dimensional space and then re-projecting them into the images allows the error of the external parameters to be estimated more accurately, so that the external parameters can be corrected better.
The external parameters are then corrected according to the comparison result, for example according to the calculated distances between the projection point pairs and the corresponding feature point pairs. Specifically, a correspondence list between these distances and correction amounts may be preset, based on experience or on results obtained by machine-learning training; after the distances between the feature point pairs and the corresponding projection point pairs are obtained by comparison, the list can be looked up to obtain the corresponding correction amount as the corrected result of the external parameters.
The corrected external parameters are the calibration result of the camera external parameters, namely the external parameters of one camera coordinate system relative to the other camera coordinate system.
In some optional implementations of this embodiment, the method may further include a step of acquiring the external parameters of the two cameras before each feature point pair is back-projected into three-dimensional space using the acquired external parameters and the resulting projection points are compared with the corresponding original feature points. Specifically, the external parameters between the two cameras obtained by measurement may be acquired, for example by manually measuring the relative translation distance and the relative rotation angle of the two cameras. Alternatively, the image pairs acquired by the two cameras may be projected onto the same projection plane, and the external parameters between the two cameras estimated based on the image pair generated by the projection; here, after the image pair is projected onto the same projection plane, the external parameters can be estimated from the relative positions of the matching feature points in the projected image pair. Obtaining an initial estimate of the external parameters by measurement or from the matched feature points of the projected image pair narrows the numerical search range of the external parameters, improves the search efficiency, and thus further improves the calibration efficiency.
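The patent initializes the external parameters by measurement or by the planar-projection estimate described above. As a hedged alternative sketch, a coarse initial (R, t) can also be obtained by decomposing the essential matrix fitted to the matched feature points; this essential-matrix initialization is not the patent's method, it assumes both cameras share approximately the intrinsic matrix K1, and t is recovered only up to scale:

```python
import cv2
import numpy as np

def initial_extrinsics(pts1, pts2, K1):
    """Estimate a coarse initial (R, t) between the two cameras by
    fitting and decomposing the essential matrix of the matched points."""
    E, mask = cv2.findEssentialMat(pts1, pts2, K1,
                                   method=cv2.RANSAC, threshold=1.0)
    # recoverPose chooses the (R, t) decomposition that places the
    # triangulated points in front of both cameras.
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K1, mask=mask)
    return R, t
```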
With the method for calibrating camera external parameters of the above embodiment of the present application, an image pair of the same scene captured by two cameras to be calibrated is acquired; a plurality of matched feature point pairs are extracted from the two images of the image pair, the two feature points of each matched pair being the images of the same three-dimensional space point in the two corresponding images; each feature point pair is back-projected into three-dimensional space using the acquired external parameters between the two cameras and then re-projected into the two images of the image pair; the resulting projection points are compared with the corresponding feature points; and the external parameters are corrected according to the comparison results to obtain the corrected result. The external parameters between the cameras are thus calibrated based on a natural scene, without restrictions on calibration objects or the calibration environment, and the calibration efficiency is improved.
In some embodiments, step 203, back-projecting each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projecting into the two images of the image pair, comparing the positions of the resulting projection points with the corresponding original feature points, and correcting the external parameters according to the comparison results to obtain the corrected result, may include: iteratively adjusting the external parameters between the two cameras until the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state; and correcting the acquired external parameters between the two cameras with the adjustment amount of the iterative adjustment to obtain the corrected result of the external parameters.
Specifically, the external parameters may be adjusted iteratively: each iteration produces a new set of external parameters, and the above procedure (projecting the feature point pairs into three-dimensional space with the current external parameters, re-projecting them into the image pair, comparing the resulting projection point pairs with the corresponding feature point pairs, and correcting the external parameters according to the comparison result) is repeated until the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state, at which point the iterative adjustment ends. Here, the iteration stop condition is that the positions of the projection point pairs and the corresponding feature point pairs reach the preset matching state, i.e., the errors between the position coordinates of the projection points and those of the corresponding feature point pairs have decreased to a sufficient degree; the currently adjusted external parameters may then be output.
With continued reference to FIG. 3, a flow chart of another embodiment of the method for calibrating camera external parameters according to the present application is shown. As shown in fig. 3, the flow 300 of the method comprises the following steps:
step 301, acquiring an image pair of the same scene acquired by two cameras to be calibrated.
In this embodiment, an electronic device (such as the vehicle-mounted control unit or the server shown in fig. 1) on which the calibration method of the camera external parameter operates may be connected with the two cameras in a wired or wireless manner, so as to acquire an image pair of the same scene acquired by the two cameras. The two cameras may acquire a sequence of images at a set time period, the image pairs may be a plurality of image pairs in a sequence, and the two images in each image pair are acquired at the same time.
Step 302, extracting mutually matched feature point pairs from the two images of the image pair respectively.
Two feature points of the feature point pairs which are matched with each other are images of the same three-dimensional space point in the two corresponding images respectively. That is, each pair of feature points includes one feature point of each image in the pair of images, and both feature points in each pair of feature points are matched with the same three-dimensional space point. In this embodiment, the electronic device may extract feature points from two images of the image pair by using an image feature point detection algorithm and match the feature points.
The external parameters between the two cameras may then be iteratively adjusted so that the positions of the projection point pairs obtained with the adjusted external parameters and the corresponding feature point pairs reach a preset matching state; steps 303 to 306 are a specific implementation of this iterative adjustment in this embodiment.
Step 303, determining the acquired external parameters between the two cameras as initial current external parameters.
In this embodiment, the external parameters between the two cameras, obtained by measurement, or by projecting the image pair onto the same projection plane, matching feature points, and then estimating the translation distance, rotation angle, and so on, may be acquired as the initial current external parameters.
Step 304, an error estimation step is performed.
In this embodiment, the error between the current external parameter and the actual external parameter may be estimated. Specifically, step 304 may include steps 3041, 3042, 3043, and 3044 as follows.
In step 3041, a three-dimensional space point matching each feature point pair is reconstructed based on the position coordinates of each feature point pair and the current external parameters.
In this embodiment, the feature point pairs P10 and P20 may be projected into three-dimensional space with reference to equations (3) to (10) of the above embodiment, yielding the coordinates (x_w, y_w, z_w) of the corresponding three-dimensional space points.
In step 3042, the reconstructed three-dimensional space points are projected onto two images of the image pair based on the current external parameters, and the position coordinates of each pair of projection points corresponding to each pair of feature points in the image pair are determined.
After the coordinates (x_w, y_w, z_w) of a three-dimensional space point are obtained, the point can be re-projected into the image pair collected by the two cameras using the current external parameters. Specifically, the three-dimensional space point may be projected using equations (1) and (2), giving the projection points in the two images of the image pair, which form a projection point pair.
As shown in fig. 4, a schematic diagram of the positional relationship of the image plane and the three-dimensional spatial point of the image pair acquired by the two cameras to be calibrated is shown. Wherein C1 and C2 are positions of two cameras, IMA1 and IMA2 are an image pair, P0 is a three-dimensional space point reconstructed based on the feature point pairs P10 and P20, and P10 'and P20' are projection points obtained by re-projecting the three-dimensional space point P0 onto the image IMA1 and the image IMA2, respectively.
Step 3043, calculate the error between each projection point pair and the position coordinate of each corresponding feature point pair, and construct an energy function based on the errors corresponding to all feature point pairs in the feature point pair sequence.
In this embodiment, the error between the position coordinates of each projection point and those of the corresponding feature point may be calculated. Specifically, the projection error between a feature point and the coordinates of the projection point re-projected onto the same image may be computed, for example as the distance between the feature point and the projection point; the projection errors of the two images of the image pair are then added to give the estimation error corresponding to the feature point pair. An energy function may then be constructed based on the estimation errors corresponding to the feature point pairs in the feature point pair sequence. The energy function characterizes the total estimation error between all feature point pairs of the image pair and the corresponding projection point pairs.
In some embodiments, the step of constructing the energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence may include: accumulating the errors corresponding to all the feature point pairs in the sequence to obtain the energy function. That is, the sum of the errors corresponding to the feature point pairs can be used as the energy function.
Specifically, the energy function cost may be constructed based on the following equation (11):
cost = \sum_{j=1}^{n} \left[ d\left(X_{1j},\, K_1 (I \mid O) X_{wj}\right) + d\left(X_{2j},\, K_2 (R \mid t) X_{wj}\right) \right],    (11)

where d(·,·) denotes the distance between two image points after normalizing the homogeneous coordinates, X_{wj} is the coordinate of the three-dimensional space point reconstructed from the j-th feature point pair in the feature point pair sequence, X_{1j} is the homogeneous coordinate of the feature point of the j-th pair extracted from the first image of the image pair, X_{2j} is the homogeneous coordinate of the feature point of the j-th pair extracted from the second image, n is the number of feature point pairs in the sequence, K_1 and K_2 are the internal parameter matrices of the two cameras, and (R \mid t) is the relative external parameter matrix between the two cameras.
In other embodiments of the present application, the energy function may also be constructed based on the sum of squares, exponentials, or the like of the errors corresponding to the feature point pairs, which is not specially limited in the present application.
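Putting the pieces together, a minimal sketch of the energy function of equation (11) follows; it reuses the illustrative triangulate and project helpers sketched above and accumulates squared pixel errors (squaring is one of the variants just mentioned, not necessarily the patent's exact form):

```python
import numpy as np

def energy(R, t, K1, K2, pts1, pts2):
    """Energy of equation (11): triangulate each feature point pair with
    the current extrinsics, re-project the reconstructed 3D point into
    both images, and accumulate the reprojection errors."""
    P1 = K1 @ np.hstack([np.eye(3), np.zeros((3, 1))])   # P1 = K1 (I|O)
    P2 = K2 @ np.hstack([R, t.reshape(3, 1)])            # P2 = K2 (R|t)
    cost = 0.0
    for uv1, uv2 in zip(pts1, pts2):
        X = triangulate(P1, P2, uv1, uv2)                # reconstruct X_wj
        e1 = project(K1, np.eye(3), np.zeros(3), X) - uv1
        e2 = project(K2, R, t, X) - uv2
        cost += e1 @ e1 + e2 @ e2                        # accumulate errors
    return cost
```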
Step 3044, determine whether the energy function satisfies the predetermined convergence condition.
After the energy function is obtained through calculation, whether the energy function meets a preset convergence condition or not can be judged, and whether iteration is continued or not is determined according to a judgment result.
Step 305, in response to determining that the energy function does not satisfy the preset convergence condition, adjusting the current external parameters along their gradient direction based on the energy function, and returning to step 304 to execute the error estimation step.
When the result of the judgment in step 3044 is "no", the current external parameters may be adjusted according to the energy function. Specifically, each parameter value of the current external parameters (the translation parameters, rotation angle parameters, and so on) may be adjusted by a preset step size along the descending or ascending direction of its gradient, in the manner of gradient descent or gradient ascent, to obtain the adjusted current external parameters. The adjusted current external parameters are then used as the acquired current external parameters, and the error estimation step 304 is repeated.
In this embodiment, while the energy function does not satisfy the preset convergence condition, the error estimation step 304 is executed in a loop; once the energy function satisfies the preset convergence condition, the loop is exited and the iterative adjustment of the external parameters stops. Here, the preset convergence condition may be established based on the current energy function and the energy functions of the last several error estimation steps; for example, the change of the current energy function relative to the energy functions obtained in the last several error estimation steps may be required to be smaller than a set value.
Step 306, in response to determining that the energy function satisfies the predetermined convergence condition, determining that the positions of each projection point pair and each corresponding feature point pair reach a predetermined matching state.
When the energy function satisfies the preset convergence condition, the process of iteratively adjusting the current external parameters can be stopped; at this point the error can be considered small enough, that is, the position coordinates of the projection point pairs and the corresponding feature point pairs are determined to have reached the preset matching state.
Step 307, correcting the acquired external parameters between the two cameras with the adjustment amount of the iterative adjustment to obtain the corrected result of the external parameters.
In this embodiment, the initial current external parameters estimated by measurement or the like may be corrected with the adjustment amount applied to the current external parameters in the error estimation steps, giving the corrected result of the external parameters, i.e., the calibration result of the external parameters between the cameras.
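A hedged sketch of steps 303 to 307 follows. It parameterizes the rotation as a Rodrigues vector so that all six extrinsic parameters can be adjusted freely, approximates the gradient of the energy by finite differences, and stops when the change in energy falls below a threshold; the step size lr, the perturbation eps, and the tolerance tol are illustrative values, not from the patent:

```python
import cv2
import numpy as np

def refine_extrinsics(R0, t0, K1, K2, pts1, pts2,
                      lr=1e-4, eps=1e-6, tol=1e-3, max_iter=500):
    """Start from the acquired extrinsics (R0, t0) and descend the
    numerical gradient of the energy function until convergence."""
    rvec, _ = cv2.Rodrigues(R0)
    theta = np.concatenate([rvec.ravel(), t0.ravel()])  # current extrinsics

    def cost(p):
        R, _ = cv2.Rodrigues(p[:3].reshape(3, 1))
        return energy(R, p[3:], K1, K2, pts1, pts2)     # sketched above

    prev = cost(theta)
    for _ in range(max_iter):
        # Finite-difference gradient of the energy w.r.t. each parameter.
        grad = np.zeros(6)
        for i in range(6):
            d = np.zeros(6)
            d[i] = eps
            grad[i] = (cost(theta + d) - cost(theta - d)) / (2 * eps)
        theta -= lr * grad               # adjust along the gradient direction
        cur = cost(theta)
        if abs(prev - cur) < tol:        # preset convergence condition
            break
        prev = cur
    R, _ = cv2.Rodrigues(theta[:3].reshape(3, 1))
    return R, theta[3:]                  # corrected external parameters
```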
Steps 301 and 302 of the above flow are the same as steps 201 and 202 of the foregoing embodiment, respectively; the description of steps 201 and 202 above also applies to steps 301 and 302 of this implementation and is not repeated here.
As can be seen from fig. 3, compared with the embodiment corresponding to fig. 2, the flow 300 of the method for calibrating camera external parameters in this embodiment highlights the step of gradually correcting the current external parameters by iterative adjustment. The scheme described in this embodiment can therefore continually adjust the external parameters according to the energy function so that their estimated values gradually approach the true values, further improving the estimation accuracy of the external parameters.
With further reference to fig. 5, as an implementation of the method shown in the above figures, the present application provides an embodiment of an apparatus for calibrating camera external parameters. This apparatus embodiment corresponds to the method embodiment shown in fig. 2, and the apparatus may be applied to various electronic devices.
As shown in fig. 5, the apparatus 500 for calibrating camera external parameters of this embodiment includes: a first acquisition unit 501, an extraction unit 502, and a correction unit 503. The first acquisition unit 501 is configured to acquire an image pair of the same scene collected by two cameras to be calibrated; the extraction unit 502 is configured to extract a plurality of matched feature point pairs from the two images of the image pair, where the two feature points of each matched pair are the images of the same three-dimensional space point in the two corresponding images; and the correction unit 503 is configured to back-project each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-project the reconstructed points into the two images of the image pair, compare the positions of the resulting projection points with the corresponding original feature points, and correct the external parameters according to the comparison results to obtain a corrected result of the external parameters.
In this embodiment, the first acquisition unit 501 may receive the image pair of the same captured scene transmitted by the two cameras in real time, or extract the image pair of the same captured scene at the same time from the storage areas of the two cameras. The image pairs may be a plurality of image pairs in a sequence, each image pair may be images acquired by both cameras at the same time instant that contain the same natural scene.
The extraction unit 502 may extract a plurality of feature point pairs from the image pair acquired by the first acquisition unit 501 using a feature point detection and matching algorithm. Each feature point pair comprises one feature point from each image of the image pair, and the two feature points of each pair correspond to the same three-dimensional space point. Specifically, algorithms such as SIFT or SURF may be used to extract feature points; the extracted feature points are then matched, and successfully matched feature points are combined into feature point pairs.
The correction unit 503 may project the feature point pairs extracted by the extraction unit 502 into three-dimensional space using the acquired external parameters between the two cameras to obtain the coordinates of the three-dimensional space points, and then re-project the three-dimensional space points into the two images of the image pair according to those coordinates and the acquired current external parameters, obtaining the corresponding projection point pairs. The correction unit 503 may then compare the feature point pairs with the projection point pairs and compute the errors; specifically, the relative distance between a feature point pair and the corresponding projection point pair may be calculated as the projection error. The external parameters between the two cameras are then corrected according to the projection errors. Specifically, a correspondence list between the distances and correction amounts may be preset, based on experience or on results obtained by machine-learning training; after the distances between the feature point pairs and the corresponding projection point pairs are obtained by comparison, the list can be looked up to obtain the corresponding correction amount as the corrected result of the external parameters.
In the apparatus 500 for calibrating camera external parameters of this embodiment, the first acquisition unit acquires an image pair of the same scene collected by two cameras to be calibrated; the extraction unit extracts feature point pairs from the two images of the image pair, the two feature points of each matched pair being the images of the same three-dimensional space point in the two corresponding images; the correction unit back-projects each feature point pair into three-dimensional space using the acquired external parameters between the two cameras, re-projects the reconstructed points into the two images of the image pair, compares the resulting projection points with the corresponding original feature points, and corrects the external parameters according to the comparison results to obtain the corrected result. The external parameters between the cameras are thus calibrated based on a natural scene, without restrictions on calibration objects or the calibration environment, and the calibration efficiency is improved.
In some embodiments, the correction unit 503 may be further configured to correct the external parameters and obtain the correction result as follows: iteratively adjusting the external parameters between the two cameras until the positions of the projection point pairs under the adjusted external parameters and the corresponding feature point pairs reach a preset matching state; and correcting the acquired external parameters between the two cameras by the adjustment amount of the iterative adjustment to obtain the correction result of the external parameters.
In a further embodiment, the correction unit 503 may be further configured to iteratively adjust the external parameters between the two cameras as follows: determining the acquired external parameters between the two cameras as the initial current external parameters, and performing an error estimation step comprising: reconstructing the three-dimensional space point matched with each feature point pair based on the position coordinates of the feature point pair and the current external parameters; re-projecting the reconstructed three-dimensional space points into the image pair based on the current external parameters; determining the position coordinates of the projection point pair corresponding to each feature point pair in the image pair; calculating the error between the position coordinates of each projection point pair and those of the corresponding feature point pair; constructing an energy function based on the errors of all feature point pairs in the feature point pair sequence; and judging whether the energy function satisfies a preset convergence condition. In response to judging that the energy function does not satisfy the preset convergence condition, the current external parameters are adjusted along their gradient direction based on the energy function, and the error estimation step is performed again; in response to judging that the energy function satisfies the preset convergence condition, the positions of the projection point pairs and the corresponding feature point pairs are determined to have reached the preset matching state.
Further, the correction unit 503 may be configured to construct the energy function by accumulating the errors of all feature point pairs in the feature point pair sequence.
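Put together, the iterative adjustment and the accumulated-error energy function could be sketched as plain gradient descent over a six-vector of extrinsic parameters (an axis-angle rotation plus a translation), with the gradient estimated numerically. The sketch reuses reprojection_error from the illustration above; the learning rate, finite-difference step, iteration cap, and convergence tolerance are all assumptions of this sketch, not values from the patent.

```python
# Sketch of the iterative correction loop: gradient descent on the
# accumulated re-projection error. Hyperparameters are illustrative.
import numpy as np

def rodrigues(r):
    """Rotation matrix from an axis-angle 3-vector (Rodrigues' formula)."""
    theta = np.linalg.norm(r)
    if theta < 1e-12:
        return np.eye(3)
    k = r / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def energy(params, K1, K2, pairs):
    """Energy function: accumulated errors of all feature point pairs."""
    R, t = rodrigues(params[:3]), params[3:]
    return sum(reprojection_error(K1, K2, R, t, x1, x2) for x1, x2 in pairs)

def refine_extrinsics(params0, K1, K2, pairs, lr=1e-4, eps=1e-6, tol=1e-3):
    params = np.array(params0, dtype=float)
    prev = energy(params, K1, K2, pairs)
    for _ in range(500):                         # iteration cap
        grad = np.zeros(6)
        for i in range(6):                       # central-difference gradient
            d = np.zeros(6); d[i] = eps
            grad[i] = (energy(params + d, K1, K2, pairs)
                       - energy(params - d, K1, K2, pairs)) / (2 * eps)
        params -= lr * grad                      # descend against the gradient
        cur = energy(params, K1, K2, pairs)
        if abs(prev - cur) < tol:                # preset convergence condition
            break
        prev = cur
    return params  # corrected extrinsics: axis-angle rotation + translation
```

Here the difference between the returned vector and params0 plays the role of the adjustment amount that is applied to the initially acquired external parameters.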
In some embodiments, the calibration apparatus for external parameters between cameras may further include a second acquisition unit configured to acquire the external parameters of the two cameras before the feature point pairs are back-projected into three-dimensional space using the acquired external parameters between the two cameras, re-projected into the two images of the image pair, the obtained projection points are compared in position with the corresponding feature points, and the external parameters are corrected according to the comparison result to obtain the correction result of the external parameters.
The second acquisition unit may specifically be configured to: acquire external parameters between the two cameras obtained through measurement; or project the image pair onto the same projection plane and estimate the external parameters between the two cameras based on the image pair generated by the projection. In particular, the second acquisition unit may estimate the external parameters between the two cameras based on the relative positions of matching feature points in the projection-generated image pair. In this way, before the correction unit compares the feature point pairs with the corresponding projection point pairs, the external parameters acquired by the second acquisition unit can be used for three-dimensional point reconstruction and re-projection, and the correction is then performed on the basis of these acquired parameters; this narrows the numerical search range of the external parameters and improves the accuracy of their estimation.
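As one hedged illustration of the estimation branch of the second acquisition unit, the sketch below derives an initial extrinsic estimate directly from matched image points via essential-matrix decomposition in OpenCV — a standard alternative to the plane-projection scheme described above, not the scheme itself. It assumes both cameras share a single intrinsic matrix K, and the recovered translation is determined only up to scale, so an absolute baseline would still come from measurement.

```python
# Sketch of an initial extrinsic estimate from matched points, assuming
# a shared intrinsic matrix K for both cameras (a simplification).
import cv2
import numpy as np

def initial_extrinsics(pts1, pts2, K):
    """Estimate relative pose (R, t) from matched pixel coordinates."""
    E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
    return R, t  # t is unit-norm: the metric scale is not recoverable here
```

With the output of the extract_feature_pairs sketch above, this could be called as initial_extrinsics(pairs[:, 0], pairs[:, 1], K).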
It should be understood that the units recited in the apparatus 500 may correspond to the respective steps of the methods described with reference to FIGS. 2 and 3. Thus, the operations and features described above for the methods are equally applicable to the apparatus 500 and the units included therein, and are not described again here.
Referring now to FIG. 6, a block diagram of a computer system 600 suitable for implementing a terminal device or server according to embodiments of the present application is shown. The terminal device/server shown in FIG. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in FIG. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. The RAM 603 also stores the various programs and data necessary for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), and a speaker; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card or a modem. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as needed, so that a computer program read therefrom can be installed into the storage section 608 as necessary.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The computer program performs the above-described functions defined in the method of the present application when executed by a Central Processing Unit (CPU) 601. It should be noted that the computer readable medium described herein can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor comprising a first acquisition unit, an extraction unit, and a correction unit. The names of these units do not in some cases limit the units themselves; for example, the first acquisition unit may also be described as "a unit that acquires an image pair of the same scene collected by two cameras to be calibrated".
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: acquire an image pair of the same scene collected by two cameras to be calibrated; extract a plurality of mutually matched feature point pairs from the two images of the image pair, the two feature points of each matched pair being the images of the same three-dimensional space point in the two corresponding images; and, after back-projecting the feature point pairs into three-dimensional space using the acquired external parameters between the two cameras, re-project them into the two images of the image pair, compare the positions of the obtained projection point pairs with the corresponding original feature points, and correct the external parameters according to the comparison result to obtain the correction result of the external parameters.
The above description is only a preferred embodiment of the application and is illustrative of the principles of the technology employed. It will be appreciated by those skilled in the art that the scope of the invention herein disclosed is not limited to the particular combination of features described above, but also encompasses other arrangements formed by any combination of the above features or their equivalents without departing from the spirit of the invention. For example, the above features may be replaced with (but not limited to) features having similar functions disclosed in the present application.

Claims (8)

1. A calibration method of external parameters of a camera is characterized by comprising the following steps:
acquiring an image pair of the same scene acquired by two cameras to be calibrated, wherein the two images of the same scene acquired by the two cameras differ from each other;
extracting a plurality of mutually matched feature point pairs from the two images of the image pair respectively, wherein the two feature points of each matched feature point pair are respectively the images of the same three-dimensional space point in the two corresponding images;
determining the acquired external parameters between the two cameras as initial current external parameters, and performing an error estimation step, wherein the error estimation step comprises: reconstructing a three-dimensional space point matched with each feature point pair based on the position coordinates of each feature point pair and the current external parameters, re-projecting the reconstructed three-dimensional space points to the two images of the image pair based on the current external parameters, determining the position coordinates of each projection point pair corresponding to each feature point pair in the image pair, calculating the error between the position coordinates of each projection point pair and those of each corresponding feature point pair, constructing an energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence, and judging whether the energy function satisfies a preset convergence condition;
in response to determining that the energy function does not satisfy the preset convergence condition, adjusting the current external parameters along the gradient direction of the current external parameters based on the energy function, and performing the error estimation step again;
in response to determining that the energy function satisfies the preset convergence condition, determining that the positions of the projection point pairs and the corresponding feature point pairs have reached a preset matching state;
and correcting the acquired external parameters between the two cameras by using the adjustment amount of the iterative adjustment, so as to obtain a correction result of the external parameters.
2. The method according to claim 1, wherein constructing an energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence comprises:
accumulating the errors corresponding to all the feature point pairs in the feature point pair sequence to obtain the energy function.
3. The method according to claim 1, wherein before back-projecting each pair of feature points to a three-dimensional space by using the acquired external parameters between the two cameras, re-projecting the feature point pairs to the two images of the image pair, comparing the positions of the obtained projection points with the corresponding original feature points, and correcting the external parameters according to the comparison result to obtain the correction result of the external parameters, the method further comprises:
acquiring extrinsic parameters of the two cameras, including:
acquiring external parameters between the two cameras obtained through measurement; or
Projecting the image pair onto the same projection plane, and estimating an extrinsic parameter between the two cameras based on the image pair generated by the projection.
4. A calibration device for external parameters of a camera, the device comprising:
the first acquisition unit is used for acquiring an image pair of the same scene acquired by two cameras to be calibrated, wherein the two images of the same scene acquired by the two cameras differ from each other;
an extraction unit, configured to extract a plurality of mutually matched feature point pairs from the two images of the image pair, wherein the two feature points of each matched feature point pair are respectively the images of the same three-dimensional space point in the two corresponding images;
a correction unit for determining the acquired external parameters between the two cameras as initial current external parameters, and performing an error estimation step, wherein the error estimation step comprises: reconstructing a three-dimensional space point matched with each feature point pair based on the position coordinates of each feature point pair and the current external parameters, re-projecting the reconstructed three-dimensional space points to the two images of the image pair based on the current external parameters, determining the position coordinates of each projection point pair corresponding to each feature point pair in the image pair, calculating the error between the position coordinates of each projection point pair and those of each corresponding feature point pair, constructing an energy function based on the errors corresponding to all the feature point pairs in the feature point pair sequence, and judging whether the energy function satisfies the preset convergence condition;
in response to determining that the energy function does not satisfy the preset convergence condition, adjusting the current external parameters along the gradient direction of the current external parameters based on the energy function, and performing the error estimation step again;
in response to determining that the energy function satisfies the preset convergence condition, determining that the positions of the projection point pairs and the corresponding feature point pairs have reached a preset matching state;
and correcting the acquired external parameters between the two cameras by using the adjustment amount of the iterative adjustment, so as to obtain a correction result of the external parameters.
5. The apparatus of claim 4, wherein the correction unit is further configured to construct the energy function as follows:
accumulating the errors corresponding to all the feature point pairs in the feature point pair sequence to obtain the energy function.
6. The apparatus of claim 4, further comprising a second acquisition unit configured to: acquire the external parameters of the two cameras before each pair of feature points is back-projected to a three-dimensional space by using the acquired external parameters between the two cameras, re-projected to the two images of the image pair, the obtained projection points are compared in position with the corresponding original feature points, and the external parameters are corrected according to the comparison result to obtain the correction result of the external parameters;
wherein the second acquisition unit is specifically configured to:
acquiring external parameters between the two cameras obtained through measurement; or
Projecting the image pair onto the same projection plane, and estimating an extrinsic parameter between the two cameras based on the image pair generated by the projection.
7. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-3.
8. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1-3.
CN201710842468.9A 2017-09-18 2017-09-18 Method and device for calibrating external parameters of camera Active CN109523597B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710842468.9A CN109523597B (en) 2017-09-18 2017-09-18 Method and device for calibrating external parameters of camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710842468.9A CN109523597B (en) 2017-09-18 2017-09-18 Method and device for calibrating external parameters of camera

Publications (2)

Publication Number Publication Date
CN109523597A CN109523597A (en) 2019-03-26
CN109523597B (en) 2022-06-03

Family

ID=65769538

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710842468.9A Active CN109523597B (en) 2017-09-18 2017-09-18 Method and device for calibrating external parameters of camera

Country Status (1)

Country Link
CN (1) CN109523597B (en)

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110148185B (en) * 2019-05-22 2022-04-15 北京百度网讯科技有限公司 Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
CN112132902B (en) * 2019-06-24 2024-01-16 上海安亭地平线智能交通技术有限公司 Vehicle-mounted camera external parameter adjusting method and device, electronic equipment and medium
CN114663528A (en) 2019-10-09 2022-06-24 阿波罗智能技术(北京)有限公司 Multi-phase external parameter combined calibration method, device, equipment and medium
CN110675635B (en) * 2019-10-09 2021-08-03 北京百度网讯科技有限公司 Method and device for acquiring external parameters of camera, electronic equipment and storage medium
CN110766760B (en) * 2019-10-21 2022-08-02 北京百度网讯科技有限公司 Method, device, equipment and storage medium for camera calibration
CN110728721B (en) * 2019-10-21 2022-11-01 北京百度网讯科技有限公司 Method, device and equipment for acquiring external parameters
CN110866956A (en) * 2019-10-28 2020-03-06 中国科学院深圳先进技术研究院 Robot calibration method and terminal
CN112837227B (en) * 2019-11-22 2023-07-04 杭州海康威视数字技术股份有限公司 Parameter correction method, device and system, electronic equipment and storage medium
CN111325803B (en) * 2020-02-12 2023-05-12 清华大学深圳国际研究生院 Calibration method for evaluating internal and external participation time synchronization of binocular camera
CN111179359B (en) * 2020-04-10 2023-03-14 浙江欣奕华智能科技有限公司 Method and device for determining external parameters of photographing system
CN111429532B (en) * 2020-04-30 2023-03-31 南京大学 Method for improving camera calibration accuracy by utilizing multi-plane calibration plate
CN111780673B (en) * 2020-06-17 2022-05-31 杭州海康威视数字技术股份有限公司 Distance measurement method, device and equipment
CN114170302A (en) * 2020-08-20 2022-03-11 北京达佳互联信息技术有限公司 Camera external parameter calibration method and device, electronic equipment and storage medium
CN111986086B (en) * 2020-08-27 2021-11-09 贝壳找房(北京)科技有限公司 Three-dimensional image optimization generation method and system
CN112102417B (en) * 2020-09-15 2024-04-19 阿波罗智联(北京)科技有限公司 Method and device for determining world coordinates
CN113628283A (en) * 2021-08-10 2021-11-09 地平线征程(杭州)人工智能科技有限公司 Parameter calibration method and device for camera device, medium and electronic equipment
CN114092559A (en) * 2021-11-30 2022-02-25 中德(珠海)人工智能研究院有限公司 Training method and device for panoramic image feature point descriptor generation network
CN114022570B (en) * 2022-01-05 2022-06-17 荣耀终端有限公司 Method for calibrating external parameters between cameras and electronic equipment
WO2023240401A1 (en) * 2022-06-13 2023-12-21 北京小米移动软件有限公司 Camera calibration method and apparatus, and readable storage medium
CN115512242B (en) * 2022-07-22 2023-05-30 北京微视威信息科技有限公司 Scene change detection method and flight device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101419705B (en) * 2007-10-24 2011-01-05 华为终端有限公司 Video camera demarcating method and device
CN103150724B (en) * 2013-02-06 2015-06-03 长春师范大学 Segmented model-based camera calibration method
CN103458181B (en) * 2013-06-29 2016-12-28 华为技术有限公司 Lens distortion parameter adjusting method, device and picture pick-up device
CN104077585B (en) * 2014-05-30 2017-09-22 小米科技有限责任公司 Method for correcting image, device and terminal
CN104236478B (en) * 2014-09-19 2017-01-18 山东交通学院 Automatic vehicle overall size measuring system and method based on vision
CN104713885B (en) * 2015-03-04 2017-06-30 中国人民解放军国防科学技术大学 A kind of structure light for pcb board on-line checking aids in binocular measuring method
CN105118055B (en) * 2015-08-11 2017-12-15 北京电影学院 Camera position amendment scaling method and system
CN105807741B (en) * 2016-03-09 2018-08-07 北京科技大学 A kind of industrial process stream prediction technique
CN106408614B (en) * 2016-09-27 2019-03-15 中国船舶工业系统工程研究院 Camera intrinsic parameter Calibration Method and system suitable for field application

Also Published As

Publication number Publication date
CN109523597A (en) 2019-03-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant