CN115936995A - Panoramic splicing method for four-way fisheye cameras of vehicle - Google Patents


Info

Publication number
CN115936995A
Authority
CN
China
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310002280.9A
Other languages
Chinese (zh)
Inventor
颜琳崧
姜立标
李长玉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China University of Technology SCUT
Original Assignee
South China University of Technology SCUT
Application filed by South China University of Technology SCUT filed Critical South China University of Technology SCUT
Priority to CN202310002280.9A priority Critical patent/CN115936995A/en
Publication of CN115936995A publication Critical patent/CN115936995A/en
Pending legal-status Critical Current

Classifications

    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02TCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00Road transport of goods or passengers
    • Y02T10/10Internal combustion engine [ICE] based vehicles
    • Y02T10/40Engine management systems

Landscapes

  • Closed-Circuit Television Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a panoramic stitching method for four fisheye cameras of a vehicle, which comprises the following steps: 1) Acquiring images from the fisheye cameras arranged around the vehicle together with the driving-gear signal; 2) Performing calibration and image distortion correction with the MSER (maximally stable extremal regions) algorithm; 3) Judging the driving state of the vehicle according to the driving gear; 4) If the vehicle is in a forward gear, adopting a cylindrical projection model to obtain a wider field of view, and if the vehicle is in reverse gear, adopting a perspective transformation to obtain a vehicle bird's-eye view; 5) Stitching with a mask-cutting method; 6) Obtaining a panoramic stitched image around the vehicle body in the forward state, and a bird's-eye panoramic image around the vehicle body in the reverse state; 7) Optimizing the image stitching seams with a fusion algorithm; 8) Generating the panoramic stitched image. The method can switch between stitching modes based on the vehicle's driving state, helps the driver obtain a better field of view, and is fast, accurate, effective and easy to implement.

Description

Panoramic stitching method for four-way fisheye cameras of vehicle
Technical Field
The invention relates to the field of vehicle-mounted image processing, in particular to a panoramic stitching method for four fisheye cameras of a vehicle.
Background
Existing vehicle-mounted 360-degree surround-view systems only provide a panoramic bird's-eye view of the vehicle, which most drivers use only when parking. To guarantee the distortion-correction effect, the bird's-eye view usually covers only a small area outside the vehicle, which greatly limits the use of the panoramic system for perceiving information around the vehicle.
When the vehicle drives forward, a conventional panoramic system cannot provide the driver with a wide field of view, and in particular cannot directly show the blind zones of the vehicle, especially the A-pillar blind zone, so the driver cannot make judgments quickly.
Existing fisheye-camera calibration and correction methods mostly rely on iterative solutions, because the intrinsic parameters, extrinsic parameters and distortion parameters of the camera are mutually coupled and must be decoupled iteratively; if the decoupling is inaccurate, the result is imprecise and the subsequent stitching quality suffers.
Existing image-stitching methods fall into region-based and feature-based approaches. Region-based stitching is slow because of its large computational load. Feature-based stitching matches feature points instead of whole regions, which reduces the workload to some extent, but adds steps for detecting and screening feature points, so its speed still needs improvement.
Disclosure of Invention
In order to solve the problems in the prior art, the invention provides a panoramic stitching method for a four-way fisheye camera of a vehicle.
In order to achieve the purpose of the invention, the panoramic stitching method of the four-way fisheye camera of the vehicle, provided by the invention, comprises the following steps:
1) Acquiring driving gear signals and fisheye images of fisheye cameras arranged on the periphery of a vehicle;
2) Performing calibration and image distortion correction on the fisheye image by adopting the MSER (maximally stable extremal regions) algorithm;
3) Judging the running state of the vehicle according to the running gear;
4) If the vehicle is in the forward gear, projecting the corrected image into a cylindrical panoramic image by adopting a cylindrical projection model so as to obtain a wider field of view, and if the vehicle is in the reverse gear, obtaining a vehicle bird's-eye view by adopting a perspective transformation model;
5) Splicing by adopting a mask cutting method, obtaining a panoramic spliced image around the automobile body in a forward state, and obtaining a bird's-eye view panoramic image around the automobile body in a reverse state;
6) Optimizing the image splicing seams by adopting a fusion algorithm to obtain an optimized spliced image;
7) And generating and displaying the panoramic mosaic image.
Further, in step 1), the four fisheye cameras are respectively installed at the outer ends of the lower sides of the left and right rearview mirrors of the vehicle, at the center between the front-cover opening and the front license plate, and at the center of the upper frame of the rear license plate. After the installation positions are fixed, the four fisheye cameras are calibrated separately, several checkerboard pictures are taken, and the driving-gear signal is obtained from the vehicle head unit.
Further, step 2) comprises the following substeps:
establishing a univariate fisheye camera radial distortion model;
intercepting more than two fisheye checkerboard images;
extracting the maximally stable extremal regions (MSER), performing feature matching on the descriptor vectors by a nearest-neighbour method, storing the automatically acquired matched feature-point coordinates, and obtaining the coordinate relationship between the distorted image and the undistorted image;
and solving the pixel coordinate relation before and after correction according to the distortion model, carrying out distortion correction, and solving to obtain a homography matrix.
Furthermore, the univariate radial distortion model of the fisheye camera is

x_u = e + (x_d − e) / (1 + k·r_d²)

where x_u is a vector representing the undistorted image pixel coordinates; the vector e represents the distortion-center coordinates; x_d is a vector representing the distorted image pixel coordinates; k is the univariate parameter; and r_d is the distance from the distorted image pixel to the distortion center.
Further, the process of obtaining the coordinate relationship between the distorted image and the undistorted image includes:
for an undistorted image, the equation is:

x'_u^T · G · x_u = 0

where x_u is a vector representing undistorted image pixel coordinates, x'_u is the pixel coordinate vector of the corresponding point in another undistorted image, x'_u^T is its transpose, and G is the fundamental matrix;
substituting the univariate radial distortion model of the fisheye camera and rearranging gives the equation for the distorted images:

(x'_d + k·z'_d)^T · G · (x_d + k·z_d) = 0, with z_d = (0, 0, r_d²)^T in homogeneous coordinates

where x_d is a vector representing the distorted image pixel coordinates and x'_d is the pixel coordinate vector of the corresponding point in the other distorted image;
vectorizing the fundamental matrix G into g by the matrix direct (Kronecker) product, the above equation becomes:
M(X′,X,k)g=0
where the matrix M is built from a pairs of matched feature-point coordinates X and X′ together with the univariate parameter k; X denotes the stacked coordinates x_u and x_d; X′ denotes the stacked coordinates x'_u and x'_d; and g is the vectorized fundamental matrix;
using N groups (N ≥ 2) gives a system of N nonlinear equations; solving for the optimal real root that simultaneously satisfies all the selected nonlinear equations yields the univariate parameter k, and substituting it back into the univariate radial distortion model relates r_d and r_u, giving the coordinate relationship between the distorted image and the undistorted image:

r_u = r_d / (1 + k·r_d²)

where r_u is the distance from the corrected image pixel to the distortion center.
This is the coordinate relationship between the distorted and undistorted images, from which the distortion-corrected image is obtained from the fisheye image. The method needs no complex decoupling process, and its result is more accurate.
Further, according to this relationship, c (c ≥ 4) pairs of corresponding points can be selected to solve a homography matrix, which is used to generate the bird's-eye view in step 4).
Further, in step 3), the vehicle gear information is provided to the panoramic image system by the vehicle head unit, and the panoramic image system adopts different models according to the gear information.
Further, in the step 4), different models are adopted for image splicing according to the driving state, a cylindrical projection model is adopted for the forward gear, a perspective transformation model is adopted for the reverse gear, and perspective transformation is carried out according to a homography matrix to obtain a four-way image aerial view.
Further, in step 4), the step of obtaining the four-way image bird's-eye view includes:
placing a black and white checkerboard in the front, rear, left and right directions of the vehicle respectively, and performing calibration and correction on the fisheye camera and generation of a bird's eye view;
a black-and-white checkerboard is also placed at each of the four corner positions of the vehicle, to serve as coordinate-selection references;
according to the relationship between world coordinates and pixel coordinates, a mapping between the world coordinate system and the image coordinate system is established by measuring the world coordinates of the black-and-white checkerboards placed in the front, rear, left and right directions of the vehicle and combining them with the corresponding image coordinates;
according to the mapping relation of the coordinate points in each direction, homography matrixes in the front direction, the rear direction, the left direction and the right direction are obtained;
images in the front direction, the rear direction, the left direction and the right direction are respectively subjected to inverse perspective transformation through homography matrixes of the four fisheye cameras, and are converted into bird-eye views in the front direction, the rear direction, the left direction and the right direction of the automobile body.
Further, in step 4), the cylindrical projection model is:
x_1 = f·arctan((x − W/2)/f) + f·arctan(W/(2f))
y_1 = f·(y − H/2) / √((x − W/2)² + f²) + H/2

where (x, y) and (x_1, y_1) are the point coordinates in the original image and the projected image respectively, (W/2, H/2) is the center of the original image (W and H being its width and height), and f is the given focal length.
In the reverse-gear state, the fisheye distortion model of step 2) is used to obtain the homography matrices, and the images in the front, rear, left and right directions of the vehicle are inverse-perspective-transformed through the homography matrices of the four fisheye cameras to obtain bird's-eye views of the front, rear, left and right of the vehicle body.
Further, in step 5), a mask splicing method is adopted to perform cylindrical surface or aerial view splicing.
Further, in step 6), the image stitching seams are fused with a distance-based trigonometric weighted-average fusion algorithm: the trigonometric function value of the distance from an image pixel to the boundary of the image overlap region is used as the weight, and the overlapping parts of the images are weighted-averaged to achieve image fusion. The formula is:

I(i, j) = I_1(i, j),                        (i, j) ∈ I_1
I(i, j) = θ·I_1(i, j) + (1 − θ)·I_2(i, j),  (i, j) ∈ overlap region
I(i, j) = I_2(i, j),                        (i, j) ∈ I_2

where θ is the trigonometric function value of the distance from the pixel to the boundary of the overlap region, d_i is the distance from the pixel to the boundary of region I_1, d is the width of the overlap region, I_1(i, j) is the gray value of a pixel in the non-overlapping region I_1, I_2(i, j) is the gray value of a pixel in the non-overlapping region I_2, and I(i, j) is the gray value of the fused pixel.
Then, the bird's-eye-view coordinate system is determined, with the upper-left corner as the origin. Masks are set for the four fisheye images (front, rear, left and right), the images are cut, and the four cut bird's-eye views are coordinate-transformed to generate the stitched image.
Compared with the existing vehicle-mounted panoramic splicing method, the method has the beneficial effects that at least:
(1) The invention can switch different modes according to the running state of the vehicle so as to meet the requirements of different working conditions on the panoramic information of the vehicle; the splicing effect is good, and no obvious splicing seam exists; the requirement on computing power is low; can be deployed in most embedded devices; the requirements of the field of automatic driving of vehicles on panoramic image information are met.
(2) Based on the MSER algorithm, the fisheye images are processed by cylindrical projection or perspective transformation depending on the driving state. This reduces the computation needed for solving the feature points, gives a better correction effect on the fisheye images (especially near the edges), and leaves no obvious gap at the seam between two images, with a good fusion effect. The cylindrical panoramic image and the panoramic bird's-eye view can be switched according to the driving state of the vehicle, providing the driver with vehicle-surrounding information that better matches the driving demand.
Drawings
Fig. 1 is a flowchart of a panoramic stitching method for a four-way fisheye camera of a vehicle according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a coordinate system provided by an embodiment of the present invention;
FIG. 3 is a schematic view illustrating a cutting process of the bird's-eye view mask according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating trimming of a cylindrical mask according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of image fusion provided by an embodiment of the present invention;
fig. 6 is a schematic view of a cylindrical panoramic effect in an advance state according to an embodiment of the present invention;
fig. 7 is a schematic view of a reversing panoramic aerial view effect according to an embodiment of the present invention.
Detailed Description
The invention is further described with reference to the following figures and specific embodiments.
As shown in fig. 1, the panoramic stitching method for the four-way fisheye cameras of the vehicle provided by the invention comprises the following steps:
step 1: and acquiring fish-eye camera images and driving gear signals which are arranged on the periphery of the vehicle.
In some embodiments of the invention, image data is acquired by four-way fisheye cameras mounted in front of, behind, to the left of, and to the right of the automobile body; the four fisheye cameras are respectively arranged at the outer ends of the lower sides of the left and right rearview mirrors of the automobile, the center between the opening of the front cover of the automobile and the front license plate and the center of the upper frame of the rear license plate, and are installed and fixed.
In some embodiments of the present invention, a four-way image capturing module and a decoding module are further provided, the fisheye camera is connected to the image capturing module, the capturing module is connected to the decoding module, and the decoding module is connected to a processor, wherein the processor generally employs a vehicle-mounted chip, such as a TDA4 chip, and a computer loaded with a Ubuntu system may also be employed.
Step 2: and (3) calibrating the fisheye camera image obtained in the step (1) and correcting image distortion.
In some embodiments of the present invention, MSER algorithm calibration and image distortion correction are performed on the obtained four fish-eye images to obtain fish-eye images with distortion correction, specifically, checkerboard images shot by a plurality of fish-eye cameras are used for distortion correction, and the distortion correction includes:
step 2.1: establishing a univariate fisheye camera radial distortion model which is
Figure BDA0004035594140000051
Wherein x is u Is a vector, representing an undistorted graphImage pixel coordinates; vector e represents the distortion center coordinates; x is the number of d Is a vector, representing distorted image pixel coordinates; k is a univariate parameter; r is d The distance between the corrected image pixel point and the distortion center is obtained.
Step 2.2: intercepting more than two fisheye checkerboard images to ensure that all angular points of the checkerboard are in the visual field range of the fisheye camera;
step 2.3: and extracting a most stable limit region (MSER), performing feature matching on the vector by using a K-NN nearest neighbor algorithm, and storing the automatically acquired matched feature point coordinates.
Randomly draw a matched feature points from all feature points to form a group (in some embodiments of the invention, 9 matched feature points), and for each group of feature matches establish and solve a polynomial equation of order b (b ≥ 6). The construction is as follows:
for an undistorted image, the equation is:
x'_u^T · G · x_u = 0

where x_u is a vector representing undistorted image pixel coordinates, x'_u is the pixel coordinate vector of the corresponding point in the other undistorted image, x'_u^T is its transpose, and G is the fundamental matrix resulting from listing the equations.
Then, substituting the univariate radial distortion model of the fisheye camera and rearranging gives the equation for the distorted images:

(x'_d + k·z'_d)^T · G · (x_d + k·z_d) = 0, with z_d = (0, 0, r_d²)^T in homogeneous coordinates

where x_d is a vector representing the distorted image pixel coordinates and x'_d is the pixel coordinate vector of the corresponding point in the other distorted image.
Vectorize the fundamental matrix G into g by the matrix direct (Kronecker) product; the above equation becomes:
M(X′,X,k)g=0
the matrix M is a matrix formed by combining a pair of feature matching point coordinates X and X' and a univariate parameter k. X represents X u And x d Coordinates of the stacked representations; x 'represents X' u And x' d The coordinates of the representations are stacked, and g is the transformed basis matrix.
Using N groups (N ≥ 2), a system of N nonlinear equations is obtained. Solving for the optimal real root that simultaneously satisfies all the selected nonlinear equations gives the univariate parameter k. Substituting it back into the univariate radial distortion model relates r_d and r_u (the distance of the corrected image pixel from the distortion center), yielding the coordinate relationship between the distorted image and the undistorted image:

r_u = r_d / (1 + k·r_d²)
step 2.4: and carrying out distortion correction according to the coordinate relation between the distorted image and the undistorted image.
According to the coordinate relationship between the distorted image and the undistorted image, c (c ≥ 4) pairs of corresponding points can be selected to solve a homography matrix, which is used to generate the bird's-eye view in step 4.
And step 3: and judging the driving state of the vehicle according to the driving gear.
Different models are adopted for image stitching according to the driving state. The driving-gear signal is received and relayed by the processor, and a binary (0/1) check is performed before the panoramic system runs: a non-reverse gear gives the signal 0, and reverse gear gives 1.
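The 0/1 gear check amounts to a small dispatcher between the two stitching pipelines. A minimal sketch (the function and mode names are illustrative, not taken from the patent):

```python
REVERSE = 1      # reverse-gear signal relayed by the processor
NON_REVERSE = 0  # any forward/neutral gear

def select_mode(gear_signal):
    """Map the driving-gear signal to the stitching mode used in step 4."""
    if gear_signal == REVERSE:
        return "birds_eye"    # perspective-transform (bird's-eye) pipeline
    return "cylindrical"      # cylindrical-projection pipeline

print(select_mode(1))  # birds_eye
print(select_mode(0))  # cylindrical
```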
And 4, step 4: and if the vehicle is in the forward gear, projecting the projection image into a cylindrical panoramic image by using a cylindrical projection model to obtain a wider visual field range, and if the vehicle is in the reverse gear, obtaining a vehicle aerial view by using a perspective transformation model established based on the result of the step 2), and performing perspective transformation according to the obtained homography matrix to obtain a four-way image aerial view.
If the vehicle is in reverse gear, inverse perspective transformation is applied to the four distortion-corrected fisheye images to obtain four bird's-eye views of the front, rear, left and right of the vehicle body. In some embodiments of the invention, referring to fig. 2, black-and-white checkerboards L1, L2, L3 and L4 are placed in the front, rear, left and right directions of the vehicle for fisheye-camera calibration, correction and bird's-eye-view generation, and black-and-white checkerboards L5, L6, L7 and L8 are placed at the four corner positions of the vehicle as coordinate-selection references. The world coordinates are related to the pixel coordinates as follows:
Z_c·[u, v, 1]^T = [[f/dx, 0, u_0], [0, f/dy, v_0], [0, 0, 1]] · [R | t] · [X_w, Y_w, Z_w, 1]^T

where (u, v) are point coordinates in the pixel coordinate system, (u_0, v_0) are the coordinates of the origin of the image coordinate system in the pixel coordinate system, dx and dy are the physical sizes of a pixel in the x and y directions of the image plane, f is the given focal length, R and t are the rotation and translation matrices respectively, Z_c is the depth of the point in the camera coordinate system, and (X_w, Y_w, Z_w) are the point coordinates in the world coordinate system.
By measuring the world-coordinate-system (vehicle-coordinate-system) coordinates of the checkerboards L1, L2, L3 and L4, whose image-coordinate-system coordinates are known, the mapping between the two can be established. L5, L6, L7 and L8 are placed so that the point coordinates can be measured more accurately.
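Since the checkerboards lie on the ground plane (Z_w = 0), the world-to-pixel mapping reduces to a 3×3 homography, which c ≥ 4 measured point pairs determine. A minimal direct-linear-transform (DLT) sketch in NumPy; the function names and coordinate values are illustrative assumptions:

```python
import numpy as np

def solve_homography(src, dst):
    """Estimate the 3x3 homography H (scaled so h33 = 1) mapping src -> dst
    from >= 4 point correspondences via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # Smallest right singular vector of A holds the stacked entries of H.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, p):
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]

# Ground-plane corners (world, metres) to bird's-eye pixels: here a pure
# scale plus translation, which the DLT should recover exactly.
world = [(0, 0), (1, 0), (1, 1), (0, 1)]
pixel = [(200, 400), (300, 400), (300, 300), (200, 300)]
H = solve_homography(world, pixel)
print(np.round(apply_h(H, (0.5, 0.5))))  # ~ [250. 350.]
```

One such homography per camera direction (H-front, H-back, H-left, H-right) then drives the inverse perspective transform described next.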
For the inverse perspective transformation model, homography matrixes in the front direction, the rear direction, the left direction and the right direction are respectively H-front, H-back, H-left and H-right according to the mapping relation of coordinate points in each direction, and images in the front direction, the rear direction, the left direction and the right direction are respectively subjected to inverse perspective transformation through the homography matrixes of the four fisheye cameras to form bird's-eye views in the front direction, the rear direction, the left direction and the right direction of the automobile body.
For the cylindrical projection model, no separate correction based on cylindrical projection is needed: the corrected picture is treated as the original image of the cylindrical projection, and the cylindrical projection image is obtained through the cylindrical projection model:

x_1 = f·arctan((x − W/2)/f) + f·arctan(W/(2f))
y_1 = f·(y − H/2) / √((x − W/2)² + f²) + H/2

where (x, y) and (x_1, y_1) are the point coordinates in the original image and the projected image respectively, (W/2, H/2) is the center of the original image (W and H being its width and height), and f is the given focal length.
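The cylindrical mapping can be sketched per point. A minimal Python version, assuming the reconstructed form given above with (W/2, H/2) as the image centre (the image size and focal length below are illustrative):

```python
import math

def cylindrical_map(x, y, W, H, f):
    """Forward cylindrical projection of an image point (x, y): the
    horizontal offset is wrapped onto a cylinder of radius f, and the
    vertical offset is foreshortened by the ray length."""
    dx = x - W / 2.0
    x1 = f * math.atan(dx / f) + f * math.atan(W / (2.0 * f))
    y1 = f * (y - H / 2.0) / math.sqrt(dx * dx + f * f) + H / 2.0
    return x1, y1

W, H, f = 640, 480, 320
x1, y1 = cylindrical_map(W / 2, H / 2, W, H, f)
print(round(y1, 6))  # 240.0  (the centre row stays on the centre row)
```

A full implementation would evaluate this mapping (typically inverted, for backward warping) over the whole pixel grid and resample.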
And 5: adopting a mask cutting method to splice the cylindrical surface or the aerial view: the method comprises the steps of respectively setting masks of a front fish eye image, a rear fish eye image, a left fish eye image and a right fish eye image, cutting the images, determining a coordinate system of a cylindrical surface panoramic image in an advancing state, determining a coordinate system of a panoramic aerial view image in a backing state, and carrying out coordinate transformation to respectively obtain a panoramic spliced image around an automobile body and the panoramic aerial view image around the automobile body.
In some embodiments of the present invention, in the reverse state, masks are set for the four directions (front, rear, left and right) of the vehicle body. As shown in fig. 3, the four vertices of the vehicle region (whose positions depend on the specific vehicle) are connected with the four vertices of the panoramic image (whose positions depend on the set size of the display region), giving four trapezoidal mask regions in front of, behind, to the left of and to the right of the vehicle, with which the four bird's-eye views obtained by inverse perspective transformation are clipped. Since the relative positions of the four fisheye cameras are fixed, the overlap regions between images collected by adjacent cameras are also fixed, so the bird's-eye views generated by adjacent cameras can be stitched pairwise. For the forward state, masks are set as shown in fig. 4: the left and right mask regions are the regions where the vehicle region is extended to the image boundary in the left and right directions, with the vehicle removed; the front and rear mask regions are the rectangular regions between the image boundary, the vehicle region and the left/right mask regions. All four regions are rectangular, and the images obtained by cylindrical projection are cut with them and then stitched.
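For the forward state, the four rectangular mask regions can be expressed directly as array slices. A minimal NumPy sketch (the canvas size and vehicle rectangle below are illustrative values, not from the patent):

```python
import numpy as np

def forward_masks(W, H, veh):
    """Build the four forward-state masks: left/right strips beside the
    vehicle rectangle, and front/rear full-width bands above and below it."""
    x0, y0, x1, y1 = veh  # vehicle rectangle on the canvas
    m = {k: np.zeros((H, W), dtype=np.uint8)
         for k in ("front", "rear", "left", "right")}
    m["front"][:y0, :] = 1        # band between top border and vehicle
    m["rear"][y1:, :] = 1         # band between vehicle and bottom border
    m["left"][y0:y1, :x0] = 1     # strip from left border to vehicle
    m["right"][y0:y1, x1:] = 1    # strip from vehicle to right border
    return m

W, H, veh = 400, 300, (150, 100, 250, 200)
masks = forward_masks(W, H, veh)
total = sum(masks.values())
# Every pixel outside the vehicle rectangle belongs to exactly one mask.
print(total.max(), total[:100, :].min())  # 1 1
```

Each camera's projected image would be multiplied by its mask and the four masked images summed onto the canvas; the reverse-state trapezoidal masks would be rasterized analogously with polygon fills.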
Step 6: and optimizing the image splicing seams by adopting a fusion algorithm to obtain an optimized spliced image.
In some embodiments of the present invention, as shown in fig. 5, near the stitching seam between pictures I1 and I2 there is an overlap region in which the two pictures contain the same elements; this region is fused by image fusion to obtain the stitched image. The weighted-average fusion formula is:

I(i, j) = I_1(i, j),                        (i, j) ∈ I_1
I(i, j) = θ·I_1(i, j) + (1 − θ)·I_2(i, j),  (i, j) ∈ overlap region
I(i, j) = I_2(i, j),                        (i, j) ∈ I_2

where θ is the trigonometric function value of the distance from the pixel to the boundary of the overlap region, d_i is the distance from the pixel to the boundary of region I_1, d is the width of the overlap region, I_1(i, j) and I_2(i, j) are the gray values of pixels in the non-overlapping regions I_1 and I_2 respectively, and I(i, j) is the gray value of the fused pixel.
As the formula shows, the larger d is (that is, the wider the overlap transition region), the better the fusion effect. However, the amount of computation increases and the speed decreases. Therefore, the size of d needs to be chosen reasonably, according to the stitching seam and the boundary line, so as to eliminate the seam and achieve the best fusion effect.
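The fusion can be sketched on a one-dimensional overlap strip. The exact trigonometric weight is not spelled out in the text, so a raised-cosine choice θ = cos²(π·d_i/(2d)) is assumed here, since it falls smoothly from 1 at the I1 side of the overlap to near 0 at the I2 side:

```python
import numpy as np

def fuse_overlap(strip1, strip2):
    """Blend two equally sized overlap strips column by column, weighting
    by a trigonometric function of the distance d_i to the I1 boundary."""
    d = strip1.shape[1]                            # overlap width
    d_i = np.arange(d, dtype=float)                # distance to the I1 boundary
    theta = np.cos(np.pi * d_i / (2.0 * d)) ** 2   # 1 at I1 side, ~0 at I2 side
    return theta * strip1 + (1.0 - theta) * strip2

# Constant gray strips: the fused strip ramps from I1's value toward I2's.
s1 = np.full((1, 5), 100.0)
s2 = np.full((1, 5), 200.0)
fused = fuse_overlap(s1, s2)
print(fused[0, 0])  # 100.0
```

Widening the strips (larger d) makes the ramp gentler, matching the trade-off between fusion quality and computation discussed above.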
And 7: and generating and displaying the panoramic spliced image.
In some embodiments of the present invention, the display module is used for displaying, and the display module is a car screen or a computer screen.
In some embodiments of the present invention, in the forward state, the cylindrical projection displays the images in a sequence of "one-half back-left-front-right-one-half back", as shown schematically in fig. 6. When the vehicle is in a reverse state, a panoramic aerial view is provided, and the display schematic is shown in fig. 7.
The above-mentioned embodiments are merely preferred embodiments of the present invention, and the scope of the present invention is not limited thereto, so that the changes in the shape and principle of the present invention should be covered by the protection scope of the present invention.

Claims (10)

1. A panoramic splicing method for four fisheye cameras of a vehicle is characterized by comprising the following steps:
1) Acquiring driving gear signals and fisheye images of fisheye cameras arranged on the periphery of a vehicle;
2) Carrying out calibration and image distortion correction on the fisheye image by adopting an MSER algorithm;
3) Judging the driving state of the vehicle according to the driving gear;
4) If the vehicle is in the forward gear, projecting the corrected image into a cylindrical panoramic image by adopting a cylindrical projection model so as to obtain a wider field of view, and if the vehicle is in the reverse gear, obtaining a vehicle bird's-eye view by adopting a perspective transformation model;
5) Splicing by adopting a mask cutting method, obtaining a panoramic spliced image around the automobile body in a forward state, and obtaining a bird's-eye view panoramic image around the automobile body in a reverse state;
6) Optimizing the image splicing seams by adopting a fusion algorithm to obtain an optimized spliced image;
7) And generating and displaying the panoramic spliced image.
2. The method for panoramic stitching of the four fisheye cameras of the vehicle as claimed in claim 1, wherein in step 1), the four fisheye cameras are respectively installed at the outer ends of the lower sides of the left and right rearview mirrors of the vehicle, at the center between the front-cover opening and the front license plate, and at the center of the upper frame of the rear license plate; after the installation positions are fixed, the four fisheye cameras are calibrated separately, several checkerboard pictures are taken, and the driving-gear signal is obtained from the vehicle head unit.
3. The vehicle four-way fisheye camera panorama stitching method according to claim 1, characterized in that step 2) comprises the following substeps:
establishing a univariate fisheye camera radial distortion model;
intercepting more than two fisheye checkerboard images;
extracting the maximally stable extremal regions (MSER), performing feature matching on the descriptor vectors by a nearest-neighbour method, storing the automatically acquired matched feature-point coordinates, and obtaining the coordinate relationship between the distorted image and the undistorted image;
and solving the pixel coordinate relation before and after correction according to the distortion model, carrying out distortion correction, and solving to obtain a homography matrix.
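A minimal, numpy-only sketch of the nearest-neighbour descriptor matching sub-step; the Lowe-style 0.8 ratio threshold is an assumed, typical value and is not specified in the claim:

```python
import numpy as np

def nearest_neighbor_match(desc_a, desc_b, ratio=0.8):
    """Match descriptor rows of desc_a to their nearest neighbour in desc_b,
    keeping a match only when the best distance is clearly smaller than the
    second best (ratio test). Returns (index_in_a, index_in_b) pairs."""
    desc_a = np.asarray(desc_a, dtype=float)
    desc_b = np.asarray(desc_b, dtype=float)
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)   # distance to every candidate
        order = np.argsort(dists)
        best, second = order[0], order[1]
        if dists[best] < ratio * dists[second]:      # unambiguous match only
            matches.append((i, int(best)))
    return matches
```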
4. The vehicle four-way fisheye camera panorama stitching method of claim 3, wherein the univariate fisheye camera radial distortion model is

x_u = e + (x_d − e) / (1 + k·r_d²)

wherein x_u is a vector representing the undistorted image pixel coordinates; the vector e represents the distortion center coordinates; x_d is a vector representing the distorted image pixel coordinates; k is the univariate parameter; and r_d is the distance between the distorted image pixel point and the distortion center.
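A point-wise sketch assuming the standard single-parameter division model x_u = e + (x_d − e)/(1 + k·r_d²), which is consistent with the variables defined in claim 4 (the patent's equation image is not reproduced in this text, so the exact form is an assumption):

```python
import numpy as np

def undistort_point(x_d, e, k):
    """Map a distorted pixel coordinate x_d to its undistorted coordinate,
    given distortion center e and univariate parameter k."""
    x_d = np.asarray(x_d, dtype=float)
    e = np.asarray(e, dtype=float)
    r_d2 = np.sum((x_d - e) ** 2)          # squared distance to distortion center
    return e + (x_d - e) / (1.0 + k * r_d2)
```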
5. The vehicle four-way fisheye camera panorama stitching method of claim 3, wherein the process of obtaining the coordinate relationship between the distorted image and the undistorted image comprises:
for an undistorted image, the epipolar equation is:

x′_u^T · G · x_u = 0

wherein x_u is a vector representing undistorted image pixel coordinates, x′_u is the pixel coordinates of the corresponding point in another undistorted image, x′_u^T is its transpose, and G is the fundamental matrix;
substituting the univariate fisheye camera radial distortion model and rearranging gives the equation for the distorted images:

(e + (x′_d − e) / (1 + k·r′_d²))^T · G · (e + (x_d − e) / (1 + k·r_d²)) = 0

wherein x_d is a vector representing distorted image pixel coordinates and x′_d is the pixel coordinates of the corresponding point in the other distorted image;
straightening the matrix G into a vector g by the matrix direct product (Kronecker product) method, the above equation can be written as:

M(X′, X, k)·g = 0

wherein the matrix M combines the coordinates of pairs of feature matching points X and X′ with the univariate parameter k; X represents the stacked coordinates of x_u and x_d; X′ represents the stacked coordinates of x′_u and x′_d; and g is the vectorized matrix G;
using N groups of matching points, where N ≥ 2, a system of N nonlinear equations is obtained; solving for the optimal real root that simultaneously satisfies all selected nonlinear equations yields the univariate parameter k; substituting k back into the univariate fisheye camera radial distortion model gives the relation between r_d and r_u, i.e. the coordinate relation between the distorted image and the undistorted image:

r_d = (1 − √(1 − 4k·r_u²)) / (2k·r_u)

wherein r_u is the distance between the corrected (undistorted) image pixel point and the distortion center.
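Inverting the division model r_u = r_d / (1 + k·r_d²) for r_d gives a quadratic in r_d; the sketch below takes the root that reduces to r_d = r_u as k → 0 (the choice of branch is an assumption, since the patent's equation image is not reproduced in this text):

```python
import math

def rd_from_ru(r_u, k):
    """Recover the distorted radius r_d from the undistorted radius r_u by
    solving k*r_u*r_d^2 - r_d + r_u = 0 for the physically meaningful root."""
    if r_u == 0 or k == 0:
        return r_u                       # no distortion, or the center pixel
    disc = 1.0 - 4.0 * k * r_u * r_u     # discriminant of the quadratic
    return (1.0 - math.sqrt(disc)) / (2.0 * k * r_u)
```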
6. The vehicle four-way fisheye camera panorama stitching method according to claim 5, wherein c groups of corresponding points are selected based on the coordinate relation r_d between the distorted image and the undistorted image, and the homography matrix is obtained by solving.
7. The method for stitching the panoramic images of the four fisheye cameras of the vehicle as claimed in claim 1, wherein in step 4) different models are used for image stitching according to the driving state: a cylindrical projection model for the forward gear and a perspective transformation model for the reverse gear, the perspective transformation being performed according to the homography matrices to obtain the four-way bird's-eye-view images.
8. The vehicle four-way fisheye camera panorama stitching method according to claim 1, wherein in the step 4), the step of obtaining the four-way image aerial view comprises:
placing a black-and-white checkerboard in the front, rear, left and right directions of the vehicle respectively, and performing calibration and correction on the fisheye camera and generation of a bird's-eye view;
a black-and-white checkerboard is also placed at each of the four corner positions of the vehicle for selecting reference coordinates;
according to the relation between world coordinates and pixel coordinates, the mapping between the world coordinate system and the image coordinate system is established by measuring the world coordinates of the black-and-white checkerboards in the front, rear, left and right directions of the vehicle and combining them with the corresponding image coordinates;
according to the mapping relation of the coordinate points in each direction, homography matrixes in the front direction, the rear direction, the left direction and the right direction are obtained;
images in the front direction, the rear direction, the left direction and the right direction are respectively subjected to inverse perspective transformation through homography matrixes of the four fisheye cameras, and are converted into bird-eye views in the front direction, the rear direction, the left direction and the right direction of the automobile body.
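The per-direction homography of claim 8 can be estimated from four (or more) checkerboard correspondences with a direct linear transform; this numpy-only sketch (function names are illustrative, not from the patent) fixes the bottom-right entry of H to 1:

```python
import numpy as np

def homography_from_points(src, dst):
    """DLT: solve for the 3x3 homography H (with H[2,2]=1) that maps each
    src[i] = (x, y) to dst[i] = (u, v), from at least 4 correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)[0]
    return np.append(h, 1.0).reshape(3, 3)

def warp_point(H, p):
    """Apply homography H to a 2-D point p (homogeneous normalization)."""
    q = H @ np.array([p[0], p[1], 1.0])
    return q[:2] / q[2]
```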
9. The vehicle four-way fisheye camera panorama stitching method according to claim 1, wherein in step 4), the cylindrical projection model is:
x_1 = f·arctan((x − W/2) / f) + W/2
y_1 = f·(y − H/2) / √((x − W/2)² + f²) + H/2

wherein (x, y) and (x_1, y_1) are the point coordinates of the original image and the projected image respectively, (W/2, H/2) are the coordinates of the original image center, and f is the given focal length.
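A sketch of the forward cylindrical mapping, assuming the standard centre-referenced form consistent with the variables of claim 9 (the patent's equation image is not reproduced in this text, so the exact form is an assumption):

```python
import math

def cylindrical_project(x, y, W, H, f):
    """Project original-image point (x, y) onto a cylinder of focal length f,
    with the projection referenced to the image centre (W/2, H/2)."""
    theta = math.atan((x - W / 2) / f)                       # horizontal angle
    x1 = f * theta + W / 2
    y1 = f * (y - H / 2) / math.hypot(x - W / 2, f) + H / 2  # vertical scaling
    return x1, y1
```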
10. The vehicle four-way fisheye camera panorama stitching method according to any one of claims 1 to 9, wherein in step 6), the image stitching seams are fused by using a distance trigonometric function weighted average fusion algorithm, and specifically, the method comprises the step of taking a trigonometric function value of a distance from an image pixel point to an image overlapping region boundary as a weight value, and carrying out weighted average on an image overlapping portion to achieve an image fusion effect.
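A one-pixel sketch of the distance-trigonometric-function weighted average of claim 10; the sin² weight of the normalized distance to the overlap boundary is an assumed concrete choice, since the claim only specifies "a trigonometric function value of the distance":

```python
import math

def fuse_overlap(a, b, pos, width):
    """Blend pixel values a and b at position pos (0..width) across an overlap
    strip: weight rises smoothly from 0 at a's boundary to 1 at b's boundary."""
    t = pos / width                        # normalized distance to a's boundary
    w = math.sin(0.5 * math.pi * t) ** 2   # smooth trigonometric weight in [0, 1]
    return (1 - w) * a + w * b
```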
CN202310002280.9A 2023-01-03 2023-01-03 Panoramic splicing method for four-way fisheye cameras of vehicle Pending CN115936995A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310002280.9A CN115936995A (en) 2023-01-03 2023-01-03 Panoramic splicing method for four-way fisheye cameras of vehicle


Publications (1)

Publication Number Publication Date
CN115936995A true CN115936995A (en) 2023-04-07

Family

ID=86557785

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310002280.9A Pending CN115936995A (en) 2023-01-03 2023-01-03 Panoramic splicing method for four-way fisheye cameras of vehicle

Country Status (1)

Country Link
CN (1) CN115936995A (en)


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117011446A (en) * 2023-08-23 2023-11-07 苏州深捷信息科技有限公司 Real-time rendering method for dynamic environment illumination
CN117011446B (en) * 2023-08-23 2024-03-08 苏州深捷信息科技有限公司 Real-time rendering method for dynamic environment illumination
CN117541517A (en) * 2024-01-05 2024-02-09 深圳市欧冶半导体有限公司 Dual-curvature imaging method, device, computer equipment and storage medium of CMS
CN117541517B (en) * 2024-01-05 2024-03-08 深圳市欧冶半导体有限公司 Dual-curvature imaging method, device, computer equipment and storage medium of CMS

Similar Documents

Publication Publication Date Title
CN111369439B (en) Panoramic all-around image real-time splicing method for automatic parking space identification based on all-around
CN109741455B (en) Vehicle-mounted stereoscopic panoramic display method, computer readable storage medium and system
US10434877B2 (en) Driver-assistance method and a driver-assistance apparatus
CN115936995A (en) Panoramic splicing method for four-way fisheye cameras of vehicle
CN106952311B (en) Auxiliary parking system and method based on panoramic stitching data mapping table
CN109948398B (en) Image processing method for panoramic parking and panoramic parking device
JP5739584B2 (en) 3D image synthesizing apparatus and method for visualizing vehicle periphery
CN112224132B (en) Vehicle panoramic all-around obstacle early warning method
CN108965742B (en) Special-shaped screen display method and device, electronic equipment and computer readable storage medium
US8130270B2 (en) Vehicle-mounted image capturing apparatus
CN110139084B (en) Vehicle surrounding image processing method and device
CN105894549A (en) Panorama assisted parking system and device and panorama image display method
CN112070886B (en) Image monitoring method and related equipment for mining dump truck
CN113362228A (en) Method and system for splicing panoramic images based on improved distortion correction and mark splicing
EP3690799A1 (en) Vehicle lane marking and other object detection using side fisheye cameras and three-fold de-warping
CN110796711B (en) Panoramic system calibration method and device, computer readable storage medium and vehicle
CN111768332A (en) Splicing method of vehicle-mounted all-around real-time 3D panoramic image and image acquisition device
CN109883433B (en) Vehicle positioning method in structured environment based on 360-degree panoramic view
CN110689506A (en) Panoramic stitching method, automotive panoramic stitching method and panoramic system thereof
CN112233188A (en) Laser radar-based roof panoramic camera and calibration method thereof
CN107492125A (en) The processing method of automobile fish eye lens panoramic view picture
CN112348741A (en) Panoramic image splicing method, panoramic image splicing equipment, storage medium, display method and display system
CN110400255B (en) Vehicle panoramic image generation method and system and vehicle
CN111243034A (en) Panoramic auxiliary parking calibration method, device, equipment and storage medium
CN111652937A (en) Vehicle-mounted camera calibration method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination