CN112053404A - Stereo correction method and system for binocular camera after loading - Google Patents


Publication number
CN112053404A
Authority
CN
China
Prior art keywords
point
effective data
fitting
line
coordinate difference
Prior art date
Legal status
Granted
Application number
CN202010786460.7A
Other languages
Chinese (zh)
Other versions
CN112053404B (en)
Inventor
李建
孙钊
崔峰
朱海涛
刘永才
万振
Current Assignee
Beijing Smarter Eye Technology Co Ltd
Original Assignee
Beijing Smarter Eye Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Smarter Eye Technology Co Ltd filed Critical Beijing Smarter Eye Technology Co Ltd
Priority to CN202010786460.7A
Publication of CN112053404A
Application granted
Publication of CN112053404B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4023 Scaling of whole images or parts thereof, e.g. expanding or contracting, based on decimating pixels or lines of pixels; based on inserting pixels or lines of pixels
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10004 Still image; Photographic image
    • G06T 2207/10012 Stereo images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30248 Vehicle exterior or interior
    • G06T 2207/30252 Vehicle exterior; Vicinity of vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiment of the application discloses a stereo correction method and system for a binocular camera after vehicle mounting, wherein the method comprises the following steps: after the binocular camera is mounted, simultaneously acquiring left and right camera images and evaluating the stereo correction effect of the images; determining the stereo correction effect of the image area covered by the calibration board by interpolation; performing a row-coordinate-difference fitting operation, by the least-squares method, on the image edge areas not covered by the calibration board, to obtain the row coordinate difference of each pixel after mounting; and performing a stereo correction difference compensation operation on the left and right images. A secondary stereo correction is thus applied to the mounted binocular camera in an indirect, external manner, so that the precision requirement is met.

Description

Stereo correction method and system for binocular camera after loading
Technical Field
The embodiment of the application relates to the technical field of intelligent transportation, and in particular to a stereo correction method and system for a binocular camera after vehicle mounting.
Background
Depth information obtained by real-time stereo matching of the left and right images of a binocular camera is widely used in the field of automatic driving, for example in ranging and navigation. One of the most effective ways to reduce the computational complexity of stereo matching is to perform stereo correction on the binocular camera.
After the binocular stereo camera is mounted on a vehicle, the stereo correction effect can be seriously degraded by, among other factors, the distortion of the vehicle windshield. As a precondition of stereo matching, a secondary stereo correction of the mounted binocular camera, performed in an indirect, external manner, is particularly necessary.
Disclosure of Invention
Therefore, the embodiment of the application provides a stereo correction method and system for a binocular camera after vehicle mounting, in which a secondary stereo correction is applied to the installed binocular camera in an indirect, external manner, so that the precision requirement is met.
In order to achieve the above object, the embodiments of the present application provide the following technical solutions:
according to a first aspect of the embodiments of the present application, there is provided a stereo correction method for a binocular camera after vehicle mounting, the method including:
after the binocular camera is mounted, simultaneously acquiring left and right camera images and evaluating the stereo correction effect of the images;
determining the stereo correction effect of the image area covered by the calibration board by interpolation;
performing a row-coordinate-difference fitting operation, by the least-squares method, on the image edge areas not covered by the calibration board, to obtain the row coordinate difference of each pixel after mounting; and
performing a stereo correction difference compensation operation on the left and right images.
Optionally, the determining the stereo correction effect of the image area covered by the calibration board by interpolation includes:
judging whether the point to be interpolated lies inside the quadrilateral formed by its four neighbouring corner points (upper, lower, left and right); if so, performing the interpolation, otherwise skipping the point; the criterion is that if the areas of the four triangles formed by the point P and each pair of adjacent corner points sum to the area of the quadrilateral, the point is inside the quadrilateral, whereas if the sum of the triangle areas is greater than the area of the quadrilateral, the point is outside it; and
after looping over all points, obtaining the row coordinate difference of every point of the area covered by the calibration board, i.e. the stereo correction effect of the central region of the image.
Optionally, the performing, by the least-squares method, the row-coordinate-difference fitting operation on the image edge areas not covered by the calibration board includes:
for the upper edge region: if the point to be fitted lies above the valid data and the column contains more than N1 valid samples, fitting the row coordinate difference at the point by least squares; if the number of valid samples in the column is between N1 and N2, using all of them for the upper-edge fitting of that column; if it exceeds N2, using only the first N2 samples; at most N3 points are fitted upward;
for the lower edge region: if the point to be fitted lies below the valid data and the column contains more than N1 valid samples, fitting the row coordinate difference at the point by least squares; if the number of valid samples in the column is between N1 and N2, using all of them for the lower-edge fitting of that column; if it exceeds N2, using only the last N2 samples; at most N3 points are fitted downward;
for the left edge region: if the point to be fitted lies to the left of the valid data and the row contains more than N1 valid samples, fitting the row coordinate difference at the point by least squares; if the number of valid samples in the row is between N1 and N2, using all of them for the left-edge fitting of that row; if it exceeds N2, using only the first N2 samples; at most N3 points are fitted to the left;
for the right edge region: if the point to be fitted lies to the right of the valid data and the row contains more than N1 valid samples, fitting the row coordinate difference at the point by least squares; if the number of valid samples in the row is between N1 and N2, using all of them for the right-edge fitting of that row; if it exceeds N2, using only the last N2 samples; at most N3 points are fitted to the right.
Optionally, the performing the stereo correction difference compensation operation on the left and right images includes:
obtaining the intrinsic and extrinsic calibration parameters of the binocular camera from the stereo calibration performed before mounting, and computing the distortion correction and stereo rectification mapping tables mapLx, mapLy, mapRx and mapRy;
compensating mapRy with the row coordinate difference of each pixel in mapDeltaY to obtain the updated newmapRy; and
performing image rectification again with mapLx, mapLy, mapRx and newmapRy, so that the same feature point has the same row coordinate in the left and right pixel coordinate systems.
According to a second aspect of the embodiments of the present application, there is provided a stereo correction system for a binocular camera after vehicle mounting, the system including:
a stereo evaluation module, configured to simultaneously acquire left and right camera images after the binocular camera is mounted, and to evaluate the stereo correction effect of the images;
a corner coverage area correction module, configured to determine the stereo correction effect of the image area covered by the calibration board by interpolation;
an image edge area correction module, configured to perform, by the least-squares method, a row-coordinate-difference fitting operation on the image edge areas not covered by the calibration board, to obtain the row coordinate difference of each pixel after mounting; and
a compensation module, configured to perform a stereo correction difference compensation operation on the left and right images.
Optionally, the corner coverage area correction module is specifically configured to:
judge whether the point to be interpolated lies inside the quadrilateral formed by its four neighbouring corner points (upper, lower, left and right); if so, perform the interpolation, otherwise skip the point; the criterion is that if the areas of the four triangles formed by the point P and each pair of adjacent corner points sum to the area of the quadrilateral, the point is inside the quadrilateral, whereas if the sum of the triangle areas is greater than the area of the quadrilateral, the point is outside it; and
after looping over all points, obtain the row coordinate difference of every point of the area covered by the calibration board, i.e. the stereo correction effect of the central region of the image.
Optionally, the image edge area correction module is specifically configured to:
for the upper edge region: if the point to be fitted lies above the valid data and the column contains more than N1 valid samples, fit the row coordinate difference at the point by least squares; if the number of valid samples in the column is between N1 and N2, use all of them for the upper-edge fitting of that column; if it exceeds N2, use only the first N2 samples; at most N3 points are fitted upward;
for the lower edge region: if the point to be fitted lies below the valid data and the column contains more than N1 valid samples, fit the row coordinate difference at the point by least squares; if the number of valid samples in the column is between N1 and N2, use all of them for the lower-edge fitting of that column; if it exceeds N2, use only the last N2 samples; at most N3 points are fitted downward;
for the left edge region: if the point to be fitted lies to the left of the valid data and the row contains more than N1 valid samples, fit the row coordinate difference at the point by least squares; if the number of valid samples in the row is between N1 and N2, use all of them for the left-edge fitting of that row; if it exceeds N2, use only the first N2 samples; at most N3 points are fitted to the left;
for the right edge region: if the point to be fitted lies to the right of the valid data and the row contains more than N1 valid samples, fit the row coordinate difference at the point by least squares; if the number of valid samples in the row is between N1 and N2, use all of them for the right-edge fitting of that row; if it exceeds N2, use only the last N2 samples; at most N3 points are fitted to the right.
Optionally, the compensation module is specifically configured to:
obtain the intrinsic and extrinsic calibration parameters of the binocular camera from the stereo calibration performed before mounting, and compute the distortion correction and stereo rectification mapping tables mapLx, mapLy, mapRx and mapRy;
compensate mapRy with the row coordinate difference of each pixel in mapDeltaY to obtain the updated newmapRy; and
perform image rectification again with mapLx, mapLy, mapRx and newmapRy, so that the same feature point has the same row coordinate in the left and right pixel coordinate systems.
According to a third aspect of the embodiments of the present application, there is provided an apparatus including a data acquisition device, a processor and a memory, wherein the data acquisition device is configured to acquire data, the memory is configured to store one or more program instructions, and the processor is configured to execute the one or more program instructions to perform the method of any implementation of the first aspect.
According to a fourth aspect of the embodiments of the present application, there is provided a computer-readable storage medium containing one or more program instructions for performing the method of any implementation of the first aspect.
In summary, the embodiment of the application provides a stereo correction method and system for a binocular camera after vehicle mounting, in which, after the binocular camera is mounted, left and right camera images are acquired simultaneously and the stereo correction effect of the images is evaluated; the stereo correction effect of the image area covered by the calibration board is determined by interpolation; a row-coordinate-difference fitting operation is performed, by the least-squares method, on the image edge areas not covered by the calibration board, to obtain the row coordinate difference of each pixel after mounting; and a stereo correction difference compensation operation is performed on the left and right images. A secondary stereo correction is thus applied to the mounted binocular camera in an indirect, external manner, so that the precision requirement is met.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It should be apparent that the drawings in the following description are merely exemplary, and that other embodiments can be derived from the drawings provided by those of ordinary skill in the art without inventive effort.
The structures, proportions and sizes shown in this specification are used only to illustrate the disclosed content so that those skilled in the art can understand and read it; they do not limit the conditions under which the invention can be implemented and have no technical significance in themselves. Any structural modification, change of proportion or adjustment of size that does not affect the functions and purposes of the invention shall still fall within the scope of the invention.
Fig. 1 is a schematic flow chart of the stereo correction method for a binocular camera after vehicle mounting provided in an embodiment of the present application;
fig. 2 is a schematic view of the placement of the checkerboard target and the binocular camera provided in an embodiment of the present application;
FIG. 3a is a first schematic diagram of the relationship between a point and a quadrilateral according to an embodiment of the present application;
FIG. 3b is a second schematic diagram of the relationship between a point and a quadrilateral according to an embodiment of the present application;
FIG. 4 is a schematic illustration of the interpolation provided by an embodiment of the present application;
fig. 5 is a block diagram of the stereo correction system for a binocular camera after vehicle mounting provided in an embodiment of the present application.
Detailed Description
The present invention is described in terms of particular embodiments, other advantages and features of the invention will become apparent to those skilled in the art from the following disclosure, and it is to be understood that the described embodiments are merely exemplary of the invention and that it is not intended to limit the invention to the particular embodiments disclosed. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Stereo correction rectifies image pairs whose rows are not coplanar-aligned into row-aligned coplanar images: when the imaging planes of the two cameras are mapped onto the same ideal imaging plane, a point in the world coordinate system projects onto the left and right ideal imaging planes with the same row coordinate. Once the two image planes are fully row-aligned, stereo matching is reduced from a two-dimensional search to a one-dimensional search, and points that cannot be matched are filtered out directly, which greatly improves the efficiency and quality of stereo matching. After the binocular stereo camera is mounted on a vehicle, however, the stereo correction effect can be seriously degraded by, among other factors, the distortion of the vehicle windshield. As a precondition of stereo matching, a secondary stereo correction of the mounted binocular camera, performed in an indirect, external manner, is particularly necessary.
Fig. 1 is a schematic flow chart of the stereo correction method for a binocular camera after vehicle mounting provided in an embodiment of the present application, the method including the following steps:
Step 101: after the binocular camera is mounted, simultaneously acquire left and right camera images and evaluate the stereo correction effect of the images.
Step 102: determine the stereo correction effect of the image area covered by the calibration board by interpolation.
Step 103: perform, by the least-squares method, a row-coordinate-difference fitting operation on the image edge areas not covered by the calibration board, to obtain the row coordinate difference of each pixel after mounting.
Step 104: perform a stereo correction difference compensation operation on the left and right images.
In a possible implementation, in step 102, the determining the stereo correction effect of the image area covered by the calibration board by interpolation includes:
judging whether the point to be interpolated lies inside the quadrilateral formed by its four neighbouring corner points (upper, lower, left and right); if so, performing the interpolation, otherwise skipping the point; the criterion is that if the areas of the four triangles formed by the point P and each pair of adjacent corner points sum to the area of the quadrilateral, the point is inside the quadrilateral, whereas if the sum of the triangle areas is greater than the area of the quadrilateral, the point is outside it; after looping over all points, the row coordinate difference of every point of the area covered by the calibration board is obtained, i.e. the stereo correction effect of the central region of the image.
In a possible implementation, in step 103, the performing, by the least-squares method, the row-coordinate-difference fitting operation on the image edge areas not covered by the calibration board includes:
for the upper edge region: if the point to be fitted lies above the valid data and the column contains more than N1 valid samples, the row coordinate difference at the point is fitted by least squares; if the number of valid samples in the column is between N1 and N2, all of them are used for the upper-edge fitting of that column; if it exceeds N2, only the first N2 samples are used; at most N3 points are fitted upward.
For the lower edge region: if the point to be fitted lies below the valid data and the column contains more than N1 valid samples, the row coordinate difference at the point is fitted by least squares; if the number of valid samples in the column is between N1 and N2, all of them are used for the lower-edge fitting of that column; if it exceeds N2, only the last N2 samples are used; at most N3 points are fitted downward.
For the left edge region: if the point to be fitted lies to the left of the valid data and the row contains more than N1 valid samples, the row coordinate difference at the point is fitted by least squares; if the number of valid samples in the row is between N1 and N2, all of them are used for the left-edge fitting of that row; if it exceeds N2, only the first N2 samples are used; at most N3 points are fitted to the left.
For the right edge region: if the point to be fitted lies to the right of the valid data and the row contains more than N1 valid samples, the row coordinate difference at the point is fitted by least squares; if the number of valid samples in the row is between N1 and N2, all of them are used for the right-edge fitting of that row; if it exceeds N2, only the last N2 samples are used; at most N3 points are fitted to the right.
In a possible implementation, in step 104, the performing the stereo correction difference compensation operation on the left and right images includes:
obtaining the intrinsic and extrinsic calibration parameters of the binocular camera from the stereo calibration performed before mounting, and computing the distortion correction and stereo rectification mapping tables mapLx, mapLy, mapRx and mapRy; compensating mapRy with the row coordinate difference of each pixel in mapDeltaY to obtain the updated newmapRy; and performing image rectification again with mapLx, mapLy, mapRx and newmapRy, so that the same feature point has the same row coordinate in the left and right pixel coordinate systems.
To make the method provided by the embodiments of the present application clearer, it is now described in further detail.
The first step: evaluate the stereo correction effect of the binocular camera after vehicle mounting.
After the binocular camera is installed behind the front windshield of a vehicle, a checkerboard calibration board is first placed at a suitable distance in front of the camera, and the height of the target is adjusted so that its centre is aligned with the centre of the camera, as shown in fig. 2. Left and right camera images are then acquired simultaneously, and sub-pixel checkerboard corner extraction is performed on them. The row coordinate of each corner extracted in the left and right images is recorded, and the difference of the row coordinates of each pair of matched corners is computed one by one and stored. This data characterizes the stereo correction effect after the binocular camera has been mounted on the vehicle.
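As an illustration, this evaluation step can be sketched in a few lines of Python. The corner coordinates below are hypothetical, and `row_coordinate_differences` is our own helper name; in practice the matched sub-pixel corners would come from checkerboard corner extraction (e.g. OpenCV's `findChessboardCorners` followed by `cornerSubPix`):

```python
def row_coordinate_differences(left_corners, right_corners):
    """Per-corner row coordinate differences between index-matched
    sub-pixel corners of the left and right images.

    left_corners / right_corners: lists of (x, y) coordinates."""
    return [yl - yr for (_, yl), (_, yr) in zip(left_corners, right_corners)]

# Hypothetical matched corners (x, y); only the y (row) coordinate matters here.
left  = [(100.2, 50.1), (150.4, 50.3), (100.5, 90.2)]
right = [( 60.1, 49.8), (110.2, 50.5), ( 60.3, 89.6)]
diffs = row_coordinate_differences(left, right)
```

For an ideally rectified pair every entry of `diffs` would be close to zero; after mounting, the residual values quantify the per-corner loss of row alignment.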
The second step: determine, by interpolation, the stereo correction effect of the image area covered by the calibration board.
The corner extraction gives the stereo correction effect only at the discrete corner positions of the calibration board; for non-corner areas it is still unknown. The interior of the corner grid is therefore interpolated from the information of the four surrounding corner points, which yields the stereo correction effect over the whole corner-covered area. The specific process is as follows:
and judging whether the point to be interpolated is in a quadrangle formed by four adjacent angular points, namely an upper corner, a lower corner, a left corner and a right corner, if so, carrying out interpolation operation, and otherwise, carrying out no calculation processing. The criterion for determining whether a point is inside the quadrangle is that the area of four triangular areas formed by the point P and any two adjacent corner points among the four corner points is equal to the area of the quadrangle, the point is inside the quadrangle, and the area of the triangular areas is larger than the area of the quadrangle, the point is outside the quadrangle, as shown in FIG. 3, wherein S in FIG. 3(a)ΔAPB+SΔBPC+SΔCPD+SΔDPA=SABCDS in FIG. 3(b)ΔAPB+SΔBPC+SΔCPD+SΔDPA>SABCD
For example, the sub-pixel coordinates (x1, y1), (x2, y2), (x3, y3), (x4, y4) of the corner points P1, P2, P3, P4 in fig. 3 and the row coordinate differences v1, v2, v3, v4 of the corresponding matched points in the left and right images are available. Let the coordinates of the point P be (x, y). If P lies inside the quadrilateral formed by P1, P2, P3 and P4, as shown in fig. 4, the interpolation is performed according to the distance weights between P and the four corner points, with the following formulas:
dist1 = sqrt((x1-x)^2 + (y1-y)^2)
dist2 = sqrt((x2-x)^2 + (y2-y)^2)
dist3 = sqrt((x3-x)^2 + (y3-y)^2)
dist4 = sqrt((x4-x)^2 + (y4-y)^2)
w1 = 1/dist1
w2 = 1/dist2
w3 = 1/dist3
w4 = 1/dist4
weight = 1/(w1 + w2 + w3 + w4)
v = weight*(w1*v1 + w2*v2 + w3*v3 + w4*v4)
The above formulas are applied in a loop to obtain the row coordinate difference of every point of the area covered by the calibration board, i.e. the stereo correction effect of the central region of the image.
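Under an inverse-distance-weight reading of the distance formulas and the final expression v = weight·(w1·v1 + … + w4·v4), the interpolation can be sketched as below. The weight definitions w_i = 1/dist_i and weight = 1/Σw_i are a reconstruction, since the published formula images are unreadable:

```python
import math

def interpolate_row_diff(p, corners, diffs):
    """Inverse-distance-weighted row coordinate difference at point p.

    corners: the four surrounding corner points P1..P4 as (x, y);
    diffs:   their row coordinate differences v1..v4.
    The weights w_i = 1/dist_i and the normaliser 1/sum(w) are an
    assumption consistent with v = weight * sum(w_i * v_i)."""
    x, y = p
    dists = [math.hypot(xi - x, yi - y) for xi, yi in corners]
    if 0.0 in dists:                 # p coincides with a corner point
        return diffs[dists.index(0.0)]
    w = [1.0 / d for d in dists]
    weight = 1.0 / sum(w)
    return weight * sum(wi * vi for wi, vi in zip(w, diffs))
```

At the centre of a symmetric quadrilateral all four weights are equal, so the result reduces to the plain average of v1..v4, which is a quick sanity check on the weighting.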
The third step: perform least-squares fitting of the stereo correction effect in the image edge areas.
For the image edge areas not covered by the calibration board, the row-coordinate-difference fitting operation is performed by the least-squares method. The calculation process is as follows:
(1) Upper edge area: if the point to be fitted lies above the valid data and its column contains more than N1 valid data, the row coordinate difference at the point is fitted by least squares. If the number of valid data in the column is between N1 and N2 (N2 > N1), all of them are used for the upper-edge row coordinate difference fitting of that column; if the number exceeds N2, the first N2 valid data are used. At most N3 points are fitted upward (N3 > N2).
(2) Lower edge area: if the point to be fitted lies below the valid data and its column contains more than N1 valid data, the row coordinate difference at the point is fitted by least squares. If the number of valid data in the column is between N1 and N2 (N2 > N1), all of them are used for the lower-edge row coordinate difference fitting of that column; if the number exceeds N2, the last N2 valid data are used. At most N3 points are fitted downward (N3 > N2).
(3) Left edge area: if the point to be fitted lies to the left of the valid data and its row contains more than N1 valid data, the row coordinate difference at the point is fitted by least squares. If the number of valid data in the row is between N1 and N2 (N2 > N1), all of them are used for the left-edge row coordinate difference fitting of that row; if the number exceeds N2, the first N2 valid data are used. At most N3 points are fitted to the left (N3 > N2).
(4) Right edge area: if the point to be fitted lies to the right of the valid data and its row contains more than N1 valid data, the row coordinate difference at the point is fitted by least squares. If the number of valid data in the row is between N1 and N2 (N2 > N1), all of them are used for the right-edge row coordinate difference fitting of that row; if the number exceeds N2, the last N2 valid data are used. At most N3 points are fitted to the right (N3 > N2).
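The upper-edge case of the fitting rules can be sketched as follows. The function name and the concrete values of N1, N2 and N3 are illustrative assumptions, and a first-degree least-squares polynomial is assumed as the fitting model (the patent does not state the polynomial degree); the other three edges are symmetric.

```python
import numpy as np

def fit_upper_edge(rows, values, n1=5, n2=20, n3=50):
    """Extrapolate row coordinate differences above the valid data in one
    image column by least-squares line fitting.

    rows:   ascending row indices of the valid data in this column
    values: row coordinate differences at those rows
    n1 < n2 < n3 are the thresholds N1, N2, N3 (defaults are assumed).
    Returns (fitted_rows, fitted_values) for at most n3 rows above the data.
    """
    if len(rows) <= n1:
        return [], []                      # too little valid data: no fitting
    # Between N1 and N2 valid points: use all of them; above N2: use the
    # first N2 (those nearest the upper edge).
    use = slice(0, min(len(rows), n2))
    coef = np.polyfit(rows[use], values[use], 1)   # least-squares line
    top = rows[0]
    fitted_rows = list(range(max(0, top - n3), top))
    fitted_values = list(np.polyval(coef, fitted_rows))
    return fitted_rows, fitted_values
```

The lower edge would use the last N2 points and extrapolate downward, and the left and right edges apply the same scheme along image rows instead of columns.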
The fourth step: performing a left and right image stereo correction difference compensation operation.
Through the above three steps, the row coordinate difference information mapDeltaY at each pixel point after loading of the binocular camera is obtained. The internal and external calibration parameters of the binocular camera are obtained by stereo calibration before loading, and the distortion correction and stereo correction mapping transformation tables mapLx, mapLy, mapRx and mapRy are computed. mapRy is then compensated with the row coordinate difference of each pixel point in mapDeltaY to obtain an updated newmapRy, and image correction is performed again with mapLx, mapLy, mapRx and newmapRy. This completes the stereo correction of the binocular camera after loading, so that the row coordinates of the same feature point are aligned in the left and right pixel coordinate systems.
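The compensation step can be sketched as follows. The additive sign convention (adding mapDeltaY to mapRy) and the function name are assumptions, since the patent only states that mapRy is compensated with the per-pixel row coordinate differences.

```python
import numpy as np

def compensate_right_map(mapRy, mapDeltaY):
    """Compensate the right image's row remap table with the measured
    per-pixel row coordinate differences (sketch of the fourth step).

    mapRy:     H x W float32 row-coordinate remap table (right image)
    mapDeltaY: H x W row coordinate difference measured after loading
    Returns newmapRy, to be used in place of mapRy when remapping.
    """
    # Shift each sampled row coordinate by the measured residual
    # misalignment so the re-rectified rows line up with the left image.
    return (mapRy + mapDeltaY).astype(np.float32)
```

In an OpenCV-based pipeline, the updated table would then be passed to the remap step in place of mapRy, e.g. cv2.remap(right_image, mapRx, newmapRy, cv2.INTER_LINEAR).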
In summary, the embodiment of the present application provides a stereo correction method for a binocular camera after loading: after the binocular camera is loaded, the left and right camera images are acquired simultaneously and a stereo correction effect evaluation is performed on them; the stereo correction effect of the image area covered by the calibration plate is determined by interpolation; a row coordinate difference fitting operation is performed on the image edge areas not covered by the calibration plate using the least squares method, yielding the row coordinate difference information at each pixel point after loading; and a stereo correction difference compensation operation is performed on the left and right images. The binocular camera thus undergoes an indirect secondary stereo correction after installation, so that the accuracy requirement is met.
Based on the same technical concept, the embodiment of the present application further provides a stereo correction system for a binocular camera after loading. As shown in fig. 5, the system includes:
the stereo evaluation module 501, configured to simultaneously acquire left and right camera images after the binocular camera is loaded, and to evaluate the stereo correction effect of the images;
the corner coverage area correction module 502, configured to determine the stereo correction effect of the image area covered by the calibration plate by interpolation;
the image edge area correction module 503, configured to perform a row coordinate difference fitting operation on the image edge areas not covered by the calibration plate using the least squares method, to obtain the row coordinate difference information at each pixel point after the binocular camera is loaded;
and the compensation module 504, configured to perform a stereo correction difference compensation operation on the left and right images.
In a possible implementation, the corner coverage area correction module 502 is specifically configured to:
judging whether a point to be interpolated lies inside the quadrilateral formed by its four adjacent corner points (upper, lower, left and right), performing the interpolation operation if so, and performing no calculation otherwise; the criterion is that the point P is inside the quadrilateral if the total area of the four triangles formed by P and each pair of adjacent corner points among the four corner points equals the area of the quadrilateral, and outside it if the total triangle area is greater than the area of the quadrilateral; after the loop calculation, the row coordinate difference value at each point of the area covered by the calibration plate in the image is obtained, i.e. the stereo correction effect of the relatively central area of the image.
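The area criterion can be sketched as follows; the function names and the tolerance are illustrative, and the corner points are assumed to be given in order around the quadrilateral.

```python
def tri_area(a, b, c):
    """Area of triangle abc via the cross product of two edge vectors."""
    return abs((b[0] - a[0]) * (c[1] - a[1])
               - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def point_in_quad(p, quad):
    """Area test: p is inside the quadrilateral iff the four triangles
    formed by p and each pair of adjacent corners sum to the quad's area;
    for an outside point the triangle areas sum to something larger.

    quad: four corners in order (clockwise or counter-clockwise)."""
    # Quadrilateral area as the sum of two triangles of its diagonal split
    quad_area = (tri_area(quad[0], quad[1], quad[2])
                 + tri_area(quad[0], quad[2], quad[3]))
    # Sum of the four triangles (p, corner_i, corner_{i+1})
    tri_sum = sum(tri_area(p, quad[i], quad[(i + 1) % 4]) for i in range(4))
    # Equal (within floating-point tolerance) means inside
    return abs(tri_sum - quad_area) < 1e-9 * max(quad_area, 1.0)
```

Only points passing this test are interpolated; all others are skipped, matching the "no calculation" branch above.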
In a possible implementation manner, the image edge area correction module 503 is specifically configured to:
for the upper edge region: if the point to be fitted lies above the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the upper-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted upward.
For the lower edge region: if the point to be fitted lies below the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the lower-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted downward.
For the left edge region: if the point to be fitted lies to the left of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the left-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted to the left.
For the right edge region: if the point to be fitted lies to the right of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the right-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted to the right.
In a possible implementation, the compensation module 504 is specifically configured to: obtain the internal and external calibration parameters of the binocular camera by stereo calibration before loading, and compute the distortion correction and stereo correction mapping transformation tables mapLx, mapLy, mapRx and mapRy; compensate mapRy with the row coordinate difference of each pixel point in mapDeltaY to obtain an updated newmapRy; and perform image correction again with mapLx, mapLy, mapRx and newmapRy, so that the row coordinates of the same feature point are aligned in the left and right pixel coordinate systems.
Based on the same technical concept, an embodiment of the present application further provides an apparatus, comprising: a data acquisition device, a processor and a memory; the data acquisition device is configured to acquire data; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform any of the methods described above.
Based on the same technical concept, an embodiment of the present application further provides a computer-readable storage medium containing one or more program instructions for executing any of the methods described above.
In the present specification, the method embodiments are described in a progressive manner; the same or similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the other embodiments. For related details of the system embodiments, reference is made to the description of the method embodiments.
It is noted that while the operations of the methods of the present invention are depicted in the drawings in a particular order, this is not a requirement or suggestion that the operations must be performed in this particular order or that all of the illustrated operations must be performed to achieve desirable results. Additionally or alternatively, certain steps may be omitted, multiple steps combined into one step execution, and/or one step broken down into multiple step executions.
Although the present application provides method steps as in embodiments or flowcharts, additional or fewer steps may be included based on conventional or non-inventive approaches. The order of steps recited in the embodiments is merely one manner of performing the steps in a multitude of orders and does not represent the only order of execution. When an apparatus or client product in practice executes, it may execute sequentially or in parallel (e.g., in a parallel processor or multithreaded processing environment, or even in a distributed data processing environment) according to the embodiments or methods shown in the figures. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of additional identical or equivalent elements in a process, method, article, or apparatus that comprises the recited elements is not excluded.
The units, devices, modules, etc. set forth in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, in implementing the present application, the functions of each module may be implemented in one or more software and/or hardware, or a module implementing the same function may be implemented by a combination of a plurality of sub-modules or sub-units, and the like. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer readable program code, the same functionality can be implemented by logically programming method steps such that the controller is in the form of logic gates, switches, application specific integrated circuits, programmable logic controllers, embedded microcontrollers and the like. Such a controller may therefore be considered as a hardware component, and the means included therein for performing the various functions may also be considered as a structure within the hardware component. Or even means for performing the functions may be regarded as being both a software module for performing the method and a structure within a hardware component.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, classes, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like, and includes several instructions for enabling a computer device (which may be a personal computer, a mobile terminal, a server, or a network device) to execute the method according to the embodiments or some parts of the embodiments of the present application.
The embodiments in the present specification are described in a progressive manner, and the same or similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. The application is operational with numerous general purpose or special purpose computing system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet-type devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable electronic devices, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
The above-mentioned embodiments are further described in detail for the purpose of illustrating the invention, and it should be understood that the above-mentioned embodiments are only illustrative of the present invention and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements, etc. made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A stereo correction method for a binocular camera after loading, characterized by comprising the following steps:
after the binocular camera is loaded, simultaneously acquiring left and right camera images, and performing a stereo correction effect evaluation on the images;
determining the stereo correction effect of the image area covered by the calibration plate by interpolation;
performing a row coordinate difference fitting operation on the image edge areas not covered by the calibration plate using the least squares method to obtain the row coordinate difference information at each pixel point after the binocular camera is loaded;
and performing a stereo correction difference compensation operation on the left and right images.
2. The method of claim 1, wherein determining the stereo correction effect of the image area covered by the calibration plate by interpolation comprises:
judging whether a point to be interpolated lies inside the quadrilateral formed by its four adjacent corner points (upper, lower, left and right), performing the interpolation operation if so, and performing no calculation otherwise; wherein the criterion is that the point P is inside the quadrilateral if the total area of the four triangles formed by P and each pair of adjacent corner points among the four corner points equals the area of the quadrilateral, and outside it if the total triangle area is greater than the area of the quadrilateral;
after the loop calculation, the row coordinate difference value at each point of the area covered by the calibration plate in the image is obtained, i.e. the stereo correction effect of the relatively central area of the image.
3. The method of claim 1, wherein performing a row coordinate difference fitting operation on the image edge areas not covered by the calibration plate using the least squares method comprises:
for the upper edge region: if the point to be fitted lies above the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the upper-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted upward;
for the lower edge region: if the point to be fitted lies below the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the lower-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted downward;
for the left edge region: if the point to be fitted lies to the left of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the left-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted to the left;
for the right edge region: if the point to be fitted lies to the right of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the right-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted to the right.
4. The method of claim 1, wherein performing a stereo correction difference compensation operation on the left and right images comprises:
obtaining the internal and external calibration parameters of the binocular camera by stereo calibration before loading, and computing the distortion correction and stereo correction mapping transformation tables mapLx, mapLy, mapRx and mapRy;
compensating mapRy with the row coordinate difference of each pixel point in mapDeltaY to obtain an updated newmapRy;
and performing image correction again with mapLx, mapLy, mapRx and newmapRy, so that the row coordinates of the same feature point are aligned in the left and right pixel coordinate systems.
5. A stereo correction system for a binocular camera after loading, characterized in that the system comprises:
a stereo evaluation module, configured to simultaneously acquire left and right camera images after the binocular camera is loaded, and to evaluate the stereo correction effect of the images;
a corner coverage area correction module, configured to determine the stereo correction effect of the image area covered by the calibration plate by interpolation;
an image edge area correction module, configured to perform a row coordinate difference fitting operation on the image edge areas not covered by the calibration plate using the least squares method, to obtain the row coordinate difference information at each pixel point after the binocular camera is loaded;
and a compensation module, configured to perform a stereo correction difference compensation operation on the left and right images.
6. The system of claim 5, wherein the corner coverage correction module is specifically configured to:
judging whether a point to be interpolated lies inside the quadrilateral formed by its four adjacent corner points (upper, lower, left and right), performing the interpolation operation if so, and performing no calculation otherwise; the criterion is that the point P is inside the quadrilateral if the total area of the four triangles formed by P and each pair of adjacent corner points among the four corner points equals the area of the quadrilateral, and outside it if the total triangle area is greater than the area of the quadrilateral;
after the loop calculation, the row coordinate difference value at each point of the area covered by the calibration plate in the image is obtained, i.e. the stereo correction effect of the relatively central area of the image.
7. The system of claim 5, wherein the image edge region correction module is specifically configured to:
for the upper edge region: if the point to be fitted lies above the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the upper-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted upward;
for the lower edge region: if the point to be fitted lies below the valid data and its column contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the column is between N1 and N2, performing the lower-edge row coordinate difference fitting of that column using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted downward;
for the left edge region: if the point to be fitted lies to the left of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the left-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the first N2 valid data, with at most N3 points fitted to the left;
for the right edge region: if the point to be fitted lies to the right of the valid data and its row contains more than N1 valid data, fitting the row coordinate difference at the point by least squares; if the number of valid data in the row is between N1 and N2, performing the right-edge row coordinate difference fitting of that row using all the valid data; if the number of valid data exceeds N2, fitting using the last N2 valid data, with at most N3 points fitted to the right.
8. The system of claim 5, wherein the compensation module is specifically configured to:
obtaining the internal and external calibration parameters of the binocular camera by stereo calibration before loading, and computing the distortion correction and stereo correction mapping transformation tables mapLx, mapLy, mapRx and mapRy;
compensating mapRy with the row coordinate difference of each pixel point in mapDeltaY to obtain an updated newmapRy;
and performing image correction again with mapLx, mapLy, mapRx and newmapRy, so that the row coordinates of the same feature point are aligned in the left and right pixel coordinate systems.
9. An apparatus, characterized in that the apparatus comprises: a data acquisition device, a processor and a memory;
the data acquisition device is configured to acquire data; the memory is configured to store one or more program instructions; and the processor is configured to execute the one or more program instructions to perform the method of any one of claims 1-4.
10. A computer-readable storage medium having one or more program instructions embodied therein for performing the method of any of claims 1-4.
CN202010786460.7A 2020-08-07 2020-08-07 Stereoscopic correction method and system for binocular camera after loading Active CN112053404B (en)

Publications (2)

Publication Number Publication Date
CN112053404A (en) 2020-12-08
CN112053404B (en) 2024-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant