CN111311693B - Online calibration method and system for multi-camera - Google Patents
- Publication number: CN111311693B (application CN202010181386.6A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
Abstract
The invention discloses an online calibration method and system for multiple cameras. The method comprises the following steps: calibrating the parameters of each camera to obtain its internal parameters; while the vehicle moves straight at a slow speed, performing ground calibration on each parameter-calibrated camera mounted on the vehicle to obtain a ground calibration result; when the vehicle stays on ground with rich texture, simultaneously collecting pictures shot by the plurality of cameras; calculating the top view of each camera based on its internal parameters and the ground calibration result; and calculating matched feature point pairs on the overlapping areas of the top views of adjacent cameras, then calculating a strictly aligned calibration result from the multiple groups of matched feature point pairs. The invention needs no auxiliary calibration tool or preprinted site: calibration can be achieved on any flat ground with some texture, which greatly avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
Description
Technical Field
The invention relates to the technical field of image processing, and in particular to an online calibration method and system for multiple cameras.
Background
Currently, with the continuous development of electronic technology, multi-camera look-around (surround-view) stitching is widely applied in the automotive field. It is particularly important for large transport and engineering vehicles, whose very large blind zones often cause serious traffic accidents, making look-around products especially valuable. Installing multiple cameras on a vehicle or mobile device, and calibrating them individually and jointly using image processing technology, is the basis of the look-around function.
Conventional look-around calibration methods rely on calibration aids: a calibration cloth or calibration plate with a special pattern must be laid around the vehicle, and the distances between the patterns must be measured. After the cameras acquire images, the relation between each camera and the ground is calibrated by identifying feature points in the images; combining this with the measured distance data then calibrates the relative relation between the cameras, completing the final look-around calibration. Alternatively, large grids are printed on the ground, the vehicle drives into the grid, and look-around calibration is achieved by identifying the grid crossing points together with the preset grid-size data.
The traditional look-around calibration method using calibration aids therefore needs special auxiliary tools, which increases production and transportation cost; the distances between patterns must be accurately measured, so uneven placement or inaccurate measurement easily causes errors; and the calibration process is complex, requires specially trained personnel, and is costly and inefficient. The method of printing calibration patterns on the ground requires a large dedicated area, so calibration can only be carried out at a fixed site; over time the printed patterns become blurred and damaged from being repeatedly run over by vehicles and need maintenance; and different manufacturers have different pattern requirements and sizes, so patterns and sites are not compatible.
Therefore, how to calibrate multiple cameras effectively is a problem to be solved urgently.
Disclosure of Invention
In view of the above, the invention provides an online calibration method for multiple cameras that needs no auxiliary calibration tool or preprinted site: calibration can be achieved on any flat ground with some texture, which greatly avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
The invention provides an online calibration method of a multi-camera, which comprises the following steps:
calibrating parameters of each camera to obtain the internal parameters of each camera;
in the process of the vehicle moving straight at a slow speed, respectively carrying out ground calibration on cameras which are arranged on the vehicle and are subjected to parameter calibration, so as to obtain a ground calibration result;
when the vehicle stays on the ground with rich textures, the pictures shot by a plurality of cameras are collected at the same time;
calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result;
calculating matched characteristic point pairs on the top view overlapping areas of the adjacent cameras;
and calculating based on the plurality of groups of matched characteristic point pairs to obtain a strictly aligned calibration result.
Preferably, the calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result includes:
and removing distortion of pictures shot by the cameras and carrying out homography transformation on the pictures based on the internal parameters of each camera and the ground calibration result, so as to obtain a top view of each camera.
Preferably, the calculating based on the plurality of sets of matched feature point pairs to obtain a calibration result of strict alignment includes:
performing nonlinear optimization calculation on a plurality of groups of matched characteristic point pairs to obtain homography transformation matrices capable of aligning the top views of adjacent cameras;
and mapping the homography transformation matrix back to the relation between the camera and the ground to obtain a calibration result.
Preferably, the calculating the matched feature point pairs on the overlapping area of the top views of the adjacent cameras includes:
and eliminating abnormal matching point pairs on the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
Preferably, the removing the abnormal matching point pairs from the overlapping area of the top views of the adjacent cameras to obtain the matched feature point pairs includes:
and removing the matched point pairs with outliers of each pair of points being larger than a preset threshold value from the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
An online calibration system for a multi-camera, comprising:
the parameter calibration module is used for respectively calibrating parameters of each camera to obtain the internal parameters of each camera;
the ground calibration module is used for respectively performing ground calibration on cameras which are installed on the vehicle and subjected to parameter calibration in the process of the vehicle moving straight at a slow speed to obtain a ground calibration result;
the acquisition module is used for simultaneously acquiring pictures shot by a plurality of cameras when the vehicle stays on the ground with rich textures;
the first calculation module is used for calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result;
the second calculation module is used for calculating matched characteristic point pairs on the top view overlapping area of the adjacent cameras;
and the third calculation module is used for calculating and obtaining a strictly aligned calibration result based on the plurality of groups of matched characteristic point pairs.
Preferably, the first computing module is specifically configured to:
and removing distortion of pictures shot by the cameras and carrying out homography transformation on the pictures based on the internal parameters of each camera and the ground calibration result, so as to obtain a top view of each camera.
Preferably, the third computing module is specifically configured to:
performing nonlinear optimization calculation on a plurality of groups of matched characteristic point pairs to obtain homography transformation matrices capable of aligning the top views of adjacent cameras;
and mapping the homography transformation matrix back to the relation between the camera and the ground to obtain a calibration result.
Preferably, the second computing module is specifically configured to:
and eliminating abnormal matching point pairs on the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
Preferably, when eliminating the abnormal matching point pairs on the top-view overlapping area of the adjacent cameras to obtain the matched feature point pairs, the second calculation module is specifically configured to:
and removing the matched point pairs with outliers of each pair of points being larger than a preset threshold value from the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
In summary, the invention discloses an online calibration method for multiple cameras. When multiple cameras need to be calibrated online, parameter calibration is first performed on each camera to obtain its internal parameters; then, while the vehicle moves straight at a slow speed, ground calibration is performed on the parameter-calibrated cameras mounted on the vehicle to obtain a ground calibration result; when the vehicle stays on ground with rich texture, pictures shot by the plurality of cameras are collected simultaneously; the top view of each camera is calculated based on its internal parameters and the ground calibration result; matched feature point pairs are calculated on the overlapping areas of the top views of adjacent cameras; and a strictly aligned calibration result is calculated from the multiple groups of matched feature point pairs. The invention needs no auxiliary calibration tool or preprinted site: calibration can be achieved on any flat ground with some texture, which greatly avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of embodiment 1 of an online calibration method for a multi-camera according to the present invention;
FIG. 2 is a flow chart of embodiment 2 of an online calibration method for a multi-camera according to the present invention;
FIG. 3 is a schematic structural diagram of embodiment 1 of an online calibration system for a multi-camera according to the present invention;
FIG. 4 is a schematic structural diagram of embodiment 2 of an online calibration system for a multi-camera according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
As shown in FIG. 1, a flowchart of embodiment 1 of an online calibration method for a multi-camera according to the present invention may include the following steps:
S101, respectively calibrating parameters of each camera to obtain the internal parameters of each camera;
When multiple cameras on a vehicle need to be calibrated online, parameter calibration is first performed on each camera to be installed on the vehicle, obtaining intrinsic parameters that depend only on the camera itself, such as the focal length, principal point, and lens distortion coefficients.
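To make the role of the intrinsic parameters concrete, the following sketch shows a standard pinhole projection with radial distortion. It is an illustrative model only; the numeric values and the two-coefficient distortion form are assumptions, not taken from the patent.

```python
import numpy as np

def project(point_cam, fx, fy, cx, cy, k1=0.0, k2=0.0):
    """Project a 3-D point (camera frame) to pixel coordinates using
    focal lengths (fx, fy), principal point (cx, cy) and radial
    distortion coefficients (k1, k2)."""
    X, Y, Z = point_cam
    x, y = X / Z, Y / Z                   # normalised image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2      # radial distortion factor
    u = fx * d * x + cx                   # pixel column
    v = fy * d * y + cy                   # pixel row
    return u, v

# Hypothetical intrinsics for a 1280x720 camera, no distortion:
u, v = project((0.5, 0.25, 2.0), fx=800.0, fy=800.0, cx=640.0, cy=360.0)
```

Ground calibration in the next step estimates the extrinsic relation (camera to ground); these intrinsics are what make the later de-distortion and top-view mapping possible.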
S102, respectively performing ground calibration on cameras which are arranged on the vehicle and subjected to parameter calibration in the process of the slow and straight running of the vehicle, so as to obtain a ground calibration result;
After the internal parameters of each camera have been obtained, the calibrated cameras are installed at suitable positions on the vehicle body, the vehicle is started, and the automatic calibration program is entered. The vehicle is then kept moving straight at a low speed, and during this slow straight movement each camera installed on the vehicle is ground-calibrated in turn: its ground normal vector, height, and direction relative to the vehicle are calibrated in sequence.
S103, when the vehicle stays on the ground with rich textures, simultaneously acquiring pictures shot by a plurality of cameras;
After the ground calibration of the cameras is completed, the vehicle keeps moving straight or stops on ground with rich texture, and pictures shot by the multiple cameras installed on the vehicle are collected simultaneously.
S104, calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result;
Then, according to the obtained internal parameters of each camera and the ground calibration result, the picture shot by each camera is processed to calculate its top view.
S105, calculating matched characteristic point pairs on the top view overlapping areas of the adjacent cameras;
after the top view of each camera is calculated, matched feature point pairs are further calculated on the top view overlapping area of the adjacent cameras.
S106, calculating based on a plurality of groups of matched characteristic point pairs to obtain a strictly aligned calibration result.
Finally, a strictly aligned calibration result is calculated from the multiple groups of matched feature point pairs of the cameras.
In summary, in the above embodiment, when multiple cameras need to be calibrated online, parameter calibration is first performed on each camera to obtain its internal parameters; then, while the vehicle moves straight at a slow speed, ground calibration is performed on the parameter-calibrated cameras mounted on the vehicle to obtain a ground calibration result; when the vehicle stays on ground with rich texture, pictures shot by the plurality of cameras are collected simultaneously; the top view of each camera is calculated based on its internal parameters and the ground calibration result; matched feature point pairs are calculated on the overlapping areas of the top views of adjacent cameras; and a strictly aligned calibration result is calculated from the multiple groups of matched feature point pairs. The invention needs no auxiliary calibration tool or preprinted site: calibration can be achieved on any flat ground with some texture, which greatly avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
As shown in FIG. 2, a flowchart of embodiment 2 of an online calibration method for a multi-camera according to the present invention may include the following steps:
S201, respectively calibrating parameters of each camera to obtain the internal parameters of each camera;
When multiple cameras on a vehicle need to be calibrated online, parameter calibration is first performed on each camera to be installed on the vehicle, obtaining intrinsic parameters that depend only on the camera itself, such as the focal length, principal point, and lens distortion coefficients.
S202, respectively performing ground calibration on cameras which are arranged on the vehicle and subjected to parameter calibration in the process of the slow and straight running of the vehicle to obtain a ground calibration result;
After the internal parameters of each camera have been obtained, the calibrated cameras are installed at suitable positions on the vehicle body, the vehicle is started, and the automatic calibration program is entered. The vehicle is then kept moving straight at a low speed, and during this slow straight movement each camera installed on the vehicle is ground-calibrated in turn: its ground normal vector, height, and direction relative to the vehicle are calibrated in sequence.
S203, when the vehicle stays on the ground with rich textures, simultaneously acquiring pictures shot by a plurality of cameras;
After the ground calibration of the cameras is completed, the vehicle keeps moving straight or stops on ground with rich texture, and pictures shot by the multiple cameras installed on the vehicle are collected simultaneously.
S204, based on the internal parameters of each camera and the ground calibration result, de-distorting the picture shot by the camera and performing homography transformation to obtain a top view of each camera;
After ground calibration, the relationship between the camera coordinate system c_i of the i-th camera and the ground coordinate system g_i can be described by a homography matrix H_gci. From the included angle θ_i between the camera ground coordinate system and the vehicle ground coordinate system, a rotation matrix R_i is obtained:
based on the translational relationship (d) between the approximate mounting position of the camera and the origin of the vehicle coordinate system xi ,d yi ) Obtaining a translation matrix:
Next, according to R_i and T_i, the de-distorted camera picture is mapped into a top view. The correspondence between a top-view pixel (u, v) and a camera-picture pixel (x, y) is as follows:
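The correspondence equation itself is in an unreproduced figure; in general it is a point mapping through a 3x3 homography. The sketch below shows that mapping for a single pixel. The specific matrix here is a hypothetical pure translation, standing in for the combination of R_i, T_i, and H_gci that the method would use.

```python
import numpy as np

def warp_point(H, u, v):
    """Map a top-view pixel (u, v) through a 3x3 homography H to a
    camera-picture pixel (x, y), de-homogenising the result."""
    p = H @ np.array([u, v, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Hypothetical homography: shifts every pixel by (10, 20).
H = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 20.0],
              [0.0, 0.0,  1.0]])
x, y = warp_point(H, 5.0, 7.0)
```

Applying `warp_point` to every (u, v) of the top-view grid, with bilinear sampling of the camera picture, produces the top view used in the following steps.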
S205, eliminating abnormal matching point pairs on the top view overlapping area of the adjacent cameras to obtain matched characteristic point pairs;
The top views of the multiple cameras are divided into N groups, where each group consists of cameras with adjacent mounting positions or a sufficiently large common-view area. The pictures of each group of cameras are mapped into top views, and then M groups of matching point pairs p_ij and q_ij are obtained on the N groups of top views.
Abnormal matching points are then removed from the matching point pairs. Abnormal matches are usually produced by mismatching or by objects higher than the ground, and are eliminated by evaluating the outlier measure of each pair's distance:
where the norm() operation computes the distance between two points. After this calculation, the feature point pairs with L_i > th, for a selected threshold th, are eliminated, yielding the matched feature point pairs.
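The distance-threshold rejection described above can be sketched as follows. This is a minimal illustration of the stated rule (L_i = norm(p − q), discard pairs above th); the point values and threshold are hypothetical.

```python
import numpy as np

def filter_pairs(p, q, th):
    """p, q: (M, 2) arrays of matched top-view points p_ij, q_ij.
    Keeps only pairs whose distance L_i is within the threshold th."""
    L = np.linalg.norm(p - q, axis=1)   # outlier measure of each pair
    keep = L <= th
    return p[keep], q[keep]

p = np.array([[0.0, 0.0], [10.0, 10.0], [5.0, 5.0]])
q = np.array([[0.5, 0.0], [30.0, 10.0], [5.0, 5.5]])  # middle pair is a gross outlier
p_in, q_in = filter_pairs(p, q, th=2.0)               # outlier pair removed
```

In practice the threshold reflects how far two already roughly aligned top views can disagree on a true ground point; points on objects above the ground plane violate the planar homography and land far apart.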
S206, performing nonlinear optimization calculation on the plurality of groups of matched characteristic point pairs to obtain homography transformation matrices capable of aligning the top views of adjacent cameras;
The screened matching point pairs are then used for nonlinear optimization. The loss function of the nonlinear optimization is defined as follows:
wherein:
When the number of matching point pairs in one group of pictures is insufficient, shooting must be repeated several times at different positions, and the matching point pairs of the multiple groups of pictures are computed together. When shooting P times, the loss function of the nonlinear optimization is:
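The exact loss in the unreproduced figures is not available here; one form consistent with the text is the squared distance between matched top-view points after a correction homography is applied to one camera's points, summed over all pairs and all P shots. The sketch below computes that assumed loss (the optimization itself would minimize it over the homography parameters).

```python
import numpy as np

def apply_h(H, pts):
    """Apply a 3x3 homography H to an (M, 2) array of points."""
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

def loss(H, shots):
    """shots: list of (p, q) matched-point arrays, one entry per shot.
    Summed squared alignment error after correcting p by H."""
    return sum(np.sum((apply_h(H, p) - q) ** 2) for p, q in shots)

p = np.array([[0.0, 0.0], [1.0, 0.0]])
q = p + np.array([2.0, 0.0])            # q is p shifted by 2 in x
H_shift = np.array([[1.0, 0.0, 2.0],
                    [0.0, 1.0, 0.0],
                    [0.0, 0.0, 1.0]])
residual = loss(H_shift, [(p, q)])      # perfect alignment -> zero loss
```

A nonlinear least-squares solver would search for the H minimizing this residual over all grouped shots.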
S207, mapping the homography transformation matrix back to the relation between the camera and the ground to obtain a calibration result.
After H_i is obtained, it is superimposed on the original homography matrix H_gci to obtain the final calibration result:
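The superposition of the two homographies amounts to a matrix product; the composition order shown below (correction applied after the original camera-to-ground mapping) and the numeric matrices are assumptions for illustration.

```python
import numpy as np

H_gci = np.array([[1.0, 0.0, 3.0],
                  [0.0, 1.0, 4.0],
                  [0.0, 0.0, 1.0]])    # hypothetical original camera-to-ground homography
H_i = np.array([[1.0, 0.0,  1.0],
                [0.0, 1.0, -2.0],
                [0.0, 0.0,  1.0]])     # hypothetical optimised correction
H_final = H_i @ H_gci                  # composed final calibration result
```

For these pure translations the composed offset is simply the sum of the two, (3, 4) + (1, −2) = (4, 2), which makes the composition easy to check by hand.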
In summary, the invention needs no calibration aids and no specific patterns printed on the ground in advance. It directly uses the existing textures of the ground, such as cracks, stains, lane marks or other salient points, and calculates the relative relationship between the cameras through the overlapping areas of their fields of view, greatly avoiding the cost of calibration tools or environments and reducing the possibility of errors caused by manual operation and measurement.
As shown in FIG. 3, a schematic structural diagram of embodiment 1 of an online calibration system for a multi-camera according to the present invention may include:
the parameter calibration module 301 is configured to perform parameter calibration on each camera to obtain an internal parameter of each camera;
When multiple cameras on a vehicle need to be calibrated online, parameter calibration is first performed on each camera to be installed on the vehicle, obtaining intrinsic parameters that depend only on the camera itself, such as the focal length, principal point, and lens distortion coefficients.
The ground calibration module 302 is configured to perform ground calibration on the cameras that are mounted on the vehicle and calibrated by parameters, respectively, in a process of the vehicle moving straight at a slow speed, so as to obtain a ground calibration result;
After the internal parameters of each camera have been obtained, the calibrated cameras are installed at suitable positions on the vehicle body, the vehicle is started, and the automatic calibration program is entered. The vehicle is then kept moving straight at a low speed, and during this slow straight movement each camera installed on the vehicle is ground-calibrated in turn: its ground normal vector, height, and direction relative to the vehicle are calibrated in sequence.
The acquisition module 303 is configured to acquire pictures captured by multiple cameras at the same time when the vehicle stays on the ground with rich textures;
After the ground calibration of the cameras is completed, the vehicle keeps moving straight or stops on ground with rich texture, and pictures shot by the multiple cameras installed on the vehicle are collected simultaneously.
The first calculation module 304 is configured to calculate a top view of each camera based on the internal parameter of each camera and the ground calibration result;
Then, according to the obtained internal parameters of each camera and the ground calibration result, the picture shot by each camera is processed to calculate its top view.
A second calculation module 305, configured to calculate matched pairs of feature points on overlapping areas of top views of adjacent cameras;
after the top view of each camera is calculated, matched feature point pairs are further calculated on the top view overlapping area of the adjacent cameras.
And a third calculation module 306, configured to calculate a calibration result that is strictly aligned based on the multiple sets of matched pairs of feature points.
Finally, a strictly aligned calibration result is calculated from the multiple groups of matched feature point pairs of the cameras.
In summary, in the above embodiment, when multiple cameras need to be calibrated online, parameter calibration is first performed on each camera to obtain its internal parameters; then, while the vehicle moves straight at a slow speed, ground calibration is performed on the parameter-calibrated cameras mounted on the vehicle to obtain a ground calibration result; when the vehicle stays on ground with rich texture, pictures shot by the plurality of cameras are collected simultaneously; the top view of each camera is calculated based on its internal parameters and the ground calibration result; matched feature point pairs are calculated on the overlapping areas of the top views of adjacent cameras; and a strictly aligned calibration result is calculated from the multiple groups of matched feature point pairs. The invention needs no auxiliary calibration tool or preprinted site: calibration can be achieved on any flat ground with some texture, which greatly avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
As shown in FIG. 4, a schematic structural diagram of embodiment 2 of an online calibration system for a multi-camera according to the present invention may include:
the parameter calibration module 401 is configured to perform parameter calibration on each camera to obtain an internal parameter of each camera;
When multiple cameras on a vehicle need to be calibrated online, parameter calibration is first performed on each camera to be installed on the vehicle, obtaining intrinsic parameters that depend only on the camera itself, such as the focal length, principal point, and lens distortion coefficients.
The ground calibration module 402 is configured to perform ground calibration on the cameras that are mounted on the vehicle and calibrated by parameters, respectively, in a process that the vehicle moves straight at a slow speed, so as to obtain a ground calibration result;
After the internal parameters of each camera have been obtained, the calibrated cameras are installed at suitable positions on the vehicle body, the vehicle is started, and the automatic calibration program is entered. The vehicle is then kept moving straight at a low speed, and during this slow straight movement each camera installed on the vehicle is ground-calibrated in turn: its ground normal vector, height, and direction relative to the vehicle are calibrated in sequence.
The acquisition module 403 is configured to acquire images captured by multiple cameras simultaneously when the vehicle stays on the ground with rich textures;
After the ground calibration of the cameras is completed, the vehicle keeps moving straight or stops on ground with rich texture, and pictures shot by the multiple cameras installed on the vehicle are collected simultaneously.
The first calculation module 404 is configured to de-distort and homography transform a picture captured by each camera based on an internal parameter of each camera and a ground calibration result, so as to obtain a top view of each camera;
After ground calibration, the relationship between the camera coordinate system c_i of the i-th camera and the ground coordinate system g_i can be described by a homography matrix H_gci. From the included angle θ_i between the camera ground coordinate system and the vehicle ground coordinate system, a rotation matrix R_i is obtained:
Based on the translational relationship (d_xi, d_yi) between the approximate mounting position of the camera and the origin of the vehicle coordinate system, a translation matrix T_i is obtained:
Next, according to R_i and T_i, the de-distorted camera picture is mapped into a top view. The correspondence between a top-view pixel (u, v) and a camera-picture pixel (x, y) is as follows:
a second calculation module 405, configured to reject abnormal matching point pairs in the overlapping area of the top views of the adjacent cameras, so as to obtain matched feature point pairs;
The top views of the multiple cameras are divided into N groups, where cameras with adjacent mounting positions or sufficiently large common viewing areas form one group. The pictures of each group of cameras are mapped into top views, and M groups of matching point pairs p_ij and q_ij are then obtained on the N groups of top views.
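The patent does not specify how the matching point pairs p_ij and q_ij are found; one minimal stand-in, shown here with hypothetical names, is mutual nearest-neighbour matching of feature descriptors between two overlapping top views:

```python
import numpy as np

def mutual_nn_matches(desc_a, desc_b):
    """Match two descriptor sets by mutual nearest neighbour (L2 distance).

    Returns index pairs (i, j) such that desc_b[j] is the nearest neighbour
    of desc_a[i] and vice versa -- a simple stand-in for whatever feature
    matcher produces the point pairs p_ij, q_ij.
    """
    # pairwise squared distances, shape (len(desc_a), len(desc_b))
    d2 = ((desc_a[:, None, :] - desc_b[None, :, :]) ** 2).sum(axis=2)
    ab = d2.argmin(axis=1)   # best b for each a
    ba = d2.argmin(axis=0)   # best a for each b
    return [(i, j) for i, j in enumerate(ab) if ba[j] == i]

# toy 2-D "descriptors": each point in a has one close partner in b
a = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])
b = np.array([[1.1, 0.9], [4.9, 5.2], [0.1, -0.1]])
matches = mutual_nn_matches(a, b)
```

Real systems would extract descriptors (corner or blob features) from the ground texture first; the mutual-NN step above only illustrates how raw matches turn into indexed point pairs.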
Abnormal matching points are then removed from the matching point pairs. Abnormal matches are usually caused by mismatching or by objects above the ground plane, and they are eliminated by evaluating, for each pair of points, an outlier score L_i based on the distance between the two points:
Here the norm() operation computes the distance between two points. After the scores are computed, feature point pairs with L_i greater than a selected threshold th are removed, yielding the matched feature point pairs.
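Since the score formula itself is not reproduced here, the following sketch assumes one plausible reading, namely that L_i is the deviation of each pair's distance from the median pair distance; the names and threshold are illustrative:

```python
import numpy as np

def reject_outlier_pairs(p, q, th):
    """Drop matched pairs whose distance deviates from the median by more than th.

    p, q : (M, 2) arrays of matched points in two overlapping top views.
    The patent only states that norm() is a point-to-point distance and that
    pairs with score L_i > th are removed; the deviation-from-median score
    used here is one plausible concrete choice, not the patent's formula.
    """
    d = np.linalg.norm(p - q, axis=1)   # distance of each matched pair
    L = np.abs(d - np.median(d))        # outlier score per pair
    keep = L <= th
    return p[keep], q[keep]

# three pairs share a consistent 0.5-pixel offset; one is a gross mismatch
p = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
q = p + np.array([0.5, 0.0])
q[3] = [13.0, 0.0]
p2, q2 = reject_outlier_pairs(p, q, th=1.0)
```

A robust estimator such as RANSAC would serve the same purpose; the median-deviation score is simply the smallest self-contained example of thresholded outlier rejection.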
A third calculation module 406, configured to perform nonlinear optimization calculation on a plurality of sets of matched feature point pairs, so as to obtain homography transformation matrices capable of aligning top views of adjacent cameras;
Nonlinear optimization is then carried out on the screened matching point pairs. The loss function of the nonlinear optimization is defined as follows:
wherein:
When the number of matching point pairs in one group of pictures is insufficient, shooting must be performed several times at different positions, and the matching point pairs of the multiple groups of pictures are computed jointly. When shooting is performed P times, the loss function of the nonlinear optimization is:
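As a rough, simplified stand-in for this optimization (the patent's exact loss is not reproduced here), one can minimise the summed squared distance between transformed points p_ij and their matches q_ij. The sketch below solves a linear affine version in closed form with numpy, under purely illustrative names:

```python
import numpy as np

def fit_affine(p, q):
    """Least-squares affine transform minimising sum ||A @ [p; 1] - q||^2.

    A linear stand-in for the patent's nonlinear optimisation: the loss is
    assumed to be the summed squared distance between transformed points
    p_ij and their matches q_ij, accumulated over all shots when several
    picture groups are used.
    """
    ones = np.ones((len(p), 1))
    P = np.hstack([p, ones])                   # (M, 3) homogeneous points
    A, *_ = np.linalg.lstsq(P, q, rcond=None)  # (3, 2), minimises the loss
    return A.T                                 # (2, 3) affine matrix

def residual(A, p, q):
    """Total alignment error of the fitted transform."""
    ones = np.ones((len(p), 1))
    return np.linalg.norm((np.hstack([p, ones]) @ A.T) - q)

# synthetic example: matched points offset by a constant (0.5, -0.2)
p = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
q = p + np.array([0.5, -0.2])
A = fit_affine(p, q)
```

A full implementation would optimise a projective H_i (8 parameters) with an iterative solver, but the least-squares structure of the loss is the same.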
the third calculation module 406 is further configured to map the homography transformation matrix back to a relationship between the camera and the ground, so as to obtain a calibration result.
After H_i is obtained, it is composed with the original homography matrix H_gci, yielding the final calibration result.
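Reading "superimposed" as matrix composition, the final map would be the product of the correction and the original ground homography. The matrices below are hypothetical values, only to show the bookkeeping:

```python
import numpy as np

# Hypothetical 3x3 homographies: H_gci from the initial ground calibration,
# H_i the small alignment correction found by the nonlinear optimisation.
H_gci = np.array([[1.0, 0.0, 10.0],
                  [0.0, 1.0,  5.0],
                  [0.0, 0.0,  1.0]])
H_i = np.array([[np.cos(0.01), -np.sin(0.01),  0.2],
                [np.sin(0.01),  np.cos(0.01), -0.1],
                [0.0,           0.0,           1.0]])

# apply the ground calibration first, then the refinement:
# the composed map is the matrix product (assumed left-multiplication)
H_final = H_i @ H_gci
```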
In summary, the invention requires neither calibration aids nor specific patterns printed on the ground in advance. It directly uses existing ground textures, such as cracks, stains, lane markings, or other salient points, and computes the relative relationship between cameras from the overlapping areas of their fields of view. This largely avoids the cost of calibration tools or environments and reduces the possibility of errors caused by manual operation and measurement.
In this specification, the embodiments are described in a progressive manner, with each embodiment focusing on its differences from the others; identical and similar parts of the embodiments can be understood by cross-reference. Since the device disclosed in the embodiments corresponds to the method disclosed in the embodiments, its description is relatively brief, and the relevant points can be found in the description of the method.
Those skilled in the art will further appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative elements and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
Claims (10)
1. An online calibration method for a multi-camera is characterized by comprising the following steps:
calibrating parameters of each camera to obtain the internal parameters of each camera;
in the process of the vehicle moving straight at a slow speed, respectively carrying out ground calibration on cameras which are arranged on the vehicle and are subjected to parameter calibration, so as to obtain a ground calibration result;
when the vehicle stays on the ground with rich textures, the pictures shot by a plurality of cameras are collected at the same time;
calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result;
calculating matched characteristic point pairs on the top view overlapping areas of the adjacent cameras;
and calculating based on the plurality of groups of matched characteristic point pairs to obtain a strictly aligned calibration result.
2. The method of claim 1, wherein calculating a top view of each camera based on the internal parameters of each camera and the ground calibration results comprises:
and removing distortion of pictures shot by the cameras and carrying out homography transformation on the pictures based on the internal parameters of each camera and the ground calibration result, so as to obtain a top view of each camera.
3. The method according to claim 2, wherein the calculating, based on the plurality of groups of matched feature point pairs, to obtain a strictly aligned calibration result comprises:
performing nonlinear optimization calculation on a plurality of groups of matched characteristic point pairs to obtain homography transformation matrixes capable of aligning top views of adjacent cameras;
and mapping the homography transformation matrix back to the relation between the camera and the ground to obtain a calibration result.
4. A method according to claim 3, wherein said computing matched pairs of feature points over overlapping areas of top views of adjacent cameras comprises:
and eliminating abnormal matching point pairs on the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
5. The method of claim 4, wherein the removing the abnormal matching point pairs from the overlapping area of the top views of the adjacent cameras to obtain the matched feature point pairs comprises:
and removing the matched point pairs with outliers of each pair of points being larger than a preset threshold value from the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
6. An online calibration system for a multi-camera, comprising:
the parameter calibration module is used for respectively calibrating parameters of each camera to obtain the internal parameters of each camera;
the ground calibration module is used for respectively performing ground calibration on cameras which are installed on the vehicle and subjected to parameter calibration in the process of the vehicle moving straight at a slow speed to obtain a ground calibration result;
the acquisition module is used for simultaneously acquiring pictures shot by a plurality of cameras when the vehicle stays on the ground with rich textures;
the first calculation module is used for calculating the top view of each camera based on the internal parameters of each camera and the ground calibration result;
the second calculation module is used for calculating matched characteristic point pairs on the top view overlapping area of the adjacent cameras;
and the third calculation module is used for calculating and obtaining a strictly aligned calibration result based on the plurality of groups of matched characteristic point pairs.
7. The system of claim 6, wherein the first computing module is specifically configured to:
and removing distortion of pictures shot by the cameras and carrying out homography transformation on the pictures based on the internal parameters of each camera and the ground calibration result, so as to obtain a top view of each camera.
8. The system of claim 7, wherein the third computing module is specifically configured to:
performing nonlinear optimization calculation on a plurality of groups of matched characteristic point pairs to obtain homography transformation matrixes capable of aligning top views of adjacent cameras;
and mapping the homography transformation matrix back to the relation between the camera and the ground to obtain a calibration result.
9. The system of claim 8, wherein the second computing module is specifically configured to:
and eliminating abnormal matching point pairs on the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
10. The system according to claim 9, wherein, in eliminating the abnormal matching point pairs on the top-view overlapping area of the adjacent cameras to obtain the matched feature point pairs, the second computing module is specifically configured to:
and removing the matched point pairs with outliers of each pair of points being larger than a preset threshold value from the overlapping area of the top views of the adjacent cameras to obtain matched characteristic point pairs.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010181386.6A CN111311693B (en) | 2020-03-16 | 2020-03-16 | Online calibration method and system for multi-camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111311693A CN111311693A (en) | 2020-06-19 |
CN111311693B true CN111311693B (en) | 2023-11-14 |
Family
ID=71162061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010181386.6A Active CN111311693B (en) | 2020-03-16 | 2020-03-16 | Online calibration method and system for multi-camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111311693B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101876532A (en) * | 2010-05-25 | 2010-11-03 | 大连理工大学 | Camera on-field calibration method in measuring system |
CN103824278A (en) * | 2013-12-10 | 2014-05-28 | 清华大学 | Monitoring camera calibration method and system |
WO2015043507A1 (en) * | 2013-09-27 | 2015-04-02 | 比亚迪股份有限公司 | Image processing method and apparatus for cars, method for generating car surround view image, and car surround view system |
CN108230397A (en) * | 2017-12-08 | 2018-06-29 | 深圳市商汤科技有限公司 | Multi-lens camera is demarcated and bearing calibration and device, equipment, program and medium |
CN108564617A (en) * | 2018-03-22 | 2018-09-21 | 深圳岚锋创视网络科技有限公司 | Three-dimensional rebuilding method, device, VR cameras and the panorama camera of more mesh cameras |
CN108596982A (en) * | 2018-04-24 | 2018-09-28 | 深圳市航盛电子股份有限公司 | A kind of easy vehicle-mounted multi-view camera viewing system scaling method and device |
CN109278640A (en) * | 2018-10-12 | 2019-01-29 | 北京双髻鲨科技有限公司 | A kind of blind area detection system and method |
CN109345591A (en) * | 2018-10-12 | 2019-02-15 | 北京双髻鲨科技有限公司 | A kind of vehicle itself attitude detecting method and device |
WO2019105044A1 (en) * | 2017-11-28 | 2019-06-06 | 东莞市普灵思智能电子有限公司 | Method and system for lens distortion correction and feature extraction |
CN110288713A (en) * | 2019-07-03 | 2019-09-27 | 北京机械设备研究所 | A kind of quick three-dimensional model reconstruction method and system based on multi-vision visual |
WO2019192358A1 (en) * | 2018-04-02 | 2019-10-10 | 杭州海康威视数字技术股份有限公司 | Method and apparatus for synthesizing panoramic video, and electronic device |
CN110473262A (en) * | 2019-08-22 | 2019-11-19 | 北京双髻鲨科技有限公司 | Outer ginseng scaling method, device, storage medium and the electronic equipment of more mesh cameras |
CN110503694A (en) * | 2019-08-08 | 2019-11-26 | Oppo广东移动通信有限公司 | Multi-camera calibration, device, storage medium and electronic equipment |
Non-Patent Citations (2)
Title |
---|
冯聪. Research on image stitching technology for a 360° vehicle-mounted surround-view system. China Master's Theses Full-text Database, Engineering Science and Technology II, 2019, No. 01, pp. 1-59. *
周宝通. Research and application of key technologies of a vehicle-mounted 3D panoramic driving-assistance system. China Master's Theses Full-text Database, Information Science and Technology, 2017, No. 08, pp. 1-53. *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||