CN106952311B - Auxiliary parking system and method based on panoramic stitching data mapping table - Google Patents


Info

Publication number
CN106952311B
CN106952311B (application CN201710120841.XA)
Authority
CN
China
Prior art keywords
image
calibration
images
area
panoramic
Prior art date
Legal status
Active
Application number
CN201710120841.XA
Other languages
Chinese (zh)
Other versions
CN106952311A (en)
Inventor
闫旭琴
李研强
王磊
Current Assignee
Institute of Automation Shandong Academy of Sciences
Original Assignee
Institute of Automation Shandong Academy of Sciences
Priority date
Filing date
Publication date
Application filed by Institute of Automation Shandong Academy of Sciences filed Critical Institute of Automation Shandong Academy of Sciences
Priority application: CN201710120841.XA
Publication of CN106952311A
Application granted
Publication of CN106952311B
Legal status: Active

Classifications

    • H ELECTRICITY › H04 ELECTRIC COMMUNICATION TECHNIQUE › H04N PICTORIAL COMMUNICATION, e.g. TELEVISION › H04N 7/00 Television systems › H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast › H04N 7/181 for receiving images from a plurality of remote sources
    • B PERFORMING OPERATIONS; TRANSPORTING › B60 VEHICLES IN GENERAL › B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR › B60R 1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle › B60R 2300/105 characterised by the type of camera system used, using multiple cameras
    • B60R 2300/303 characterised by the type of image processing, using joined images, e.g. multiple camera images
    • B60R 2300/806 characterised by the intended use of the viewing arrangement, for aiding parking

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention discloses an auxiliary parking method and system based on a panoramic stitching data mapping table. Calibration cloth is arranged on the ground around the vehicle body, and images in the coverage areas of adjacent cameras mounted on the vehicle body are coarsely registered using the layout and size information of the checkerboard areas and public coverage areas on the calibration cloth. The images in the coverage areas of adjacent cameras are then fine-tuned using the object features in the public coverage areas: using a dissimilarity measure on the image features, the optimal image registration position is found where the sum of the dissimilarities between the image features acquired by different cameras reaches its minimum. A panoramic stitching data mapping table is generated from the panoramic stitching template image, and the panoramic video image is generated in real time from a file containing the data mapping relation table. The invention can effectively improve the registration precision of the panoramic calibration images, improve the stitching and fusion quality of the panoramic video image, and meet the real-time requirements of the hardware.

Description

Auxiliary parking system and method based on panoramic stitching data mapping table
Technical Field
The invention relates to the technical field of image processing, in particular to an auxiliary parking system and method based on a panoramic stitching data mapping table.
Background
The panoramic auxiliary parking system acquires images around the vehicle body in real time through 4 fisheye cameras mounted on the front, rear, left and right of the vehicle body. The image processing unit performs distortion correction on the four fisheye images and then applies a perspective transformation into a common spatial coordinate system; the resulting panoramic image is transmitted to the center-console display device, so that the user can see the environment around the vehicle body intuitively from inside the vehicle, which effectively improves driving safety.
The panoramic auxiliary parking system is difficult to realize because the input images are acquired by several ultra-wide-angle vehicle-mounted cameras and must undergo distortion correction and homography transformation during system calibration; this introduces large object deformation and color differences, so that no single registration algorithm on its own yields an ideal registration result, making it hard to find the optimal image registration position. In addition, because of its special application requirements, the system's hardware must meet real-time constraints in practical use.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an auxiliary parking method based on a panoramic stitching data mapping table, and aims to generate the panoramic stitching data mapping table by using a panoramic stitching template image.
Another object of the present invention is to provide an auxiliary parking system based on a panorama stitching data mapping table, which can generate a panorama video image in real time according to a file containing the data mapping table.
An auxiliary parking method based on a panoramic stitching data mapping table comprises the following steps:
arranging calibration cloth on the ground around the vehicle body, and performing coarse registration on images in the coverage areas of adjacent cameras mounted on the vehicle body by using the layout and size information of checkerboard areas and public coverage areas on the calibration cloth;
then, fine adjustment is carried out on the images in the coverage areas of the adjacent cameras by using the object characteristics in the public coverage area, and the optimal image registration position is found when the sum of the image characteristic non-similarities acquired by different cameras reaches the minimum by using the non-similarity of the image characteristics;
and generating a panoramic stitching data mapping table by using the panoramic stitching template image, and generating a panoramic video image in real time according to a file containing the data mapping relation table.
Furthermore, four fisheye ultra-wide-angle cameras are mounted on the front, left, rear and right of the vehicle body to be calibrated.
Furthermore, the four fisheye calibration images A0(x00,y00), A1(x10,y10), A2(x20,y20), A3(x30,y30) of the front, left, rear and right of the automobile are obtained using the cameras; distortion correction is performed on them to obtain the four distortion-corrected fisheye calibration images B0(x01,y01), B1(x11,y11), B2(x21,y21), B3(x31,y31).
Further, the checkerboard inner corner points are identified in each of B0(x01,y01), B1(x11,y11), B2(x21,y21), B3(x31,y31) to obtain the world coordinates and image coordinates of the inner corner points; from these, the homography transformation matrices H0, H1, H2, H3 of the images B0(x01,y01), B1(x11,y11), B2(x21,y21), B3(x31,y31) are calculated. Based on the homography transformation matrices, bird's-eye-view transformation is applied to each image, yielding the four bird's-eye-view calibration images C0(x02,y02), C1(x12,y12), C2(x22,y22), C3(x32,y32).
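As a rough illustration of how such homography matrices can be estimated from corner correspondences, here is a minimal numpy sketch of the standard direct linear transform (DLT); the patent does not specify the estimation algorithm, and the function names are ours:

```python
import numpy as np

def homography_from_points(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    via the direct linear transform (DLT); needs >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right singular vector of A associated with
    # the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2,2] == 1

def warp_point(H, x, y):
    """Apply homography H to a single point (x, y)."""
    px, py, pw = H @ np.array([x, y, 1.0])
    return px / pw, py / pw
```

Warping a whole image to the bird's-eye view would apply the (inverse) mapping to every pixel; in practice a library routine such as OpenCV's `warpPerspective` does this.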
Further, homography transformation needs to ensure that the image sizes of the single checkerboard grids involved in the four bird's-eye view calibration images are the same, and the direction of the checkerboard grids in the bird's-eye view calibration images is consistent with the direction of panorama stitching.
Further, the panoramic mosaic template image is generated in the following manner: and generating a panoramic spliced template image according to the actual size of the calibrated checkerboard areas, the actual vertical distance between the checkerboard areas, the image size information of the checkerboard areas and the display distance of the panoramic aerial view calibrated image.
Further, the panoramic stitching template image consists of five blocks: a central region Rcent, a front region RA, a left region RB, a rear region RC and a right region RD. The central region Rcent is an image of the vehicle model; the front, left, rear and right regions RA, RB, RC, RD correspond respectively to the bird's-eye-view calibration images C0(x02,y02), C1(x12,y12), C2(x22,y22), C3(x32,y32) in those directions. The stitching seams between two bird's-eye-view calibration images are determined by the size of the checkerboard area, and the rectangular area corresponding to each stitching seam is the overlapping area of the two bird's-eye-view calibration images.
Further, according to the positions of the stitching seams and the image positions of the checkerboards in the bird's-eye-view images, the four bird's-eye-view calibration images are stitched into a complete panoramic bird's-eye-view calibration image D(x3,y3).
Further, the finding of the optimal image registration position specifically includes: determining the overlapping areas of the adjacent aerial view calibration images, respectively extracting the characteristics of the overlapping area images of the adjacent aerial view calibration images, finely adjusting the positions of the two aerial view calibration images according to the image characteristics of the two overlapping areas, and searching the optimal registration positions of every two adjacent aerial view images, so that the sum of the non-similarities of the image characteristics of all the images which belong to different cameras in 4 overlapping areas is minimized.
Further, a panoramic stitching mapping table file is generated; the panoramic stitching mapping table consists of the following parts:
F0: coordinate mapping table M0 between the pixel points of the bird's-eye-view calibration image C0(x02,y02) and the front fisheye calibration image A0(x00,y00);
F1: coordinate mapping table M1 between the pixel points of the bird's-eye-view calibration image C1(x12,y12) and the left fisheye calibration image A1(x10,y10);
F2: coordinate mapping table M2 between the pixel points of the bird's-eye-view calibration image C2(x22,y22) and the rear fisheye calibration image A2(x20,y20);
F3: coordinate mapping table M3 between the pixel points of the bird's-eye-view calibration image C3(x32,y32) and the right fisheye calibration image A3(x30,y30);
F4: mapping table M4 giving, for each pixel point of the panoramic bird's-eye-view calibration image D(x3,y3), the region to which it belongs among the four directional bird's-eye-view calibration images. M4 can be expressed as equation (1), where R0, R1, R2, R3 are sub-images of the bird's-eye-view calibration images C0(x02,y02), C1(x12,y12), C2(x22,y22), C3(x32,y32) respectively, and R4, R5, R6, R7 belong to the fusion regions of adjacent bird's-eye-view calibration images, in which the images of the corresponding overlapping areas are fused: R4 is the overlapping-area sub-image of C0(x02,y02) and C1(x12,y12), R5 of C1(x12,y12) and C2(x22,y22), R6 of C2(x22,y22) and C3(x32,y32), and R7 of C0(x02,y02) and C3(x32,y32);
[Equation (1): piecewise region-label function M4(x3,y3) taking values in {R0, ..., R7}; original equation image not reproduced]
F5: coordinate mapping table M5 between the pixel points of the panoramic bird's-eye-view calibration image D(x3,y3) and the four directional bird's-eye-view calibration images C0(x02,y02), C1(x12,y12), C2(x22,y22), C3(x32,y32); for any pixel point (x'3,y'3) in a fusion region of adjacent bird's-eye-view calibration images, M5 records the coordinate mapping between (x'3,y'3) and the pixel points of the two bird's-eye-view images;
F6: fusion factors h1, h2 for the pixel points in the fusion regions R4, R5, R6, R7 of adjacent bird's-eye-view calibration images, where h1 and h2 are the weights of the pixel at the corresponding coordinates of the overlapping parts of the two bird's-eye-view images; the coordinates (x'3,y'3) of each pixel point in R4, R5, R6, R7 and its fusion factors are stored in mapping table M6.
Furthermore, the calibration cloth consists of a checkerboard area and a public coverage area, the public coverage area serving mainly as a marker for panoramic stitching. The front, rear, left and right calibration cloths are placed or drawn on the ground on the corresponding sides of the automobile at a set distance from the vehicle body, ensuring that the rectangular checkerboard areas of adjacent calibration cloths are mutually perpendicular and that each cloth lies within the visible range of the fisheye cameras on the front, rear, left and right of the automobile; the cameras then acquire the calibration images around the automobile from the calibration cloths so placed or drawn.
Further, the front region RA of the panoramic stitching template image consists of regions R0 and R7; the left region RB consists of regions R1 and R4; the rear region RC consists of regions R2 and R5; and the right region RD consists of regions R3 and R6.
Further, the panoramic video image is generated in real time: after the 4 fisheye real-time images around the vehicle body are obtained, the pixels in regions R0, R1, R2, R3 are mapped to the coordinates of the original fisheye images according to the data mapping table file generated by the calibration system, and the output pixel values are determined; for pixels in regions R4, R5, R6, R7, the original fisheye image coordinates involved in the calculation are obtained from the data mapping table file, the pixels are fused according to the fusion factors given by the mapping table, and the output pixel values are determined.
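The real-time step just described, a pure table look-up for R0 to R3 and a two-source blend for R4 to R7, can be sketched as follows. The array layout of the tables is our assumption for illustration; the patent stores this information across files M0 to M6:

```python
import numpy as np

def render_panorama(fisheye, region, src_cam, src_yx, src2_cam, src2_yx, h1):
    """Table-driven panorama generation (illustrative sketch).

    fisheye:  list of 4 grayscale source images (H, W).
    region:   (Hp, Wp) int array; 0..3 = single-camera, 4..7 = fusion.
    src_cam / src_yx:   camera index and (y, x) source coords per pixel.
    src2_cam / src2_yx: second source, used only for fusion pixels.
    h1:       (Hp, Wp) blend weight of the first source (h2 = 1 - h1).
    """
    Hp, Wp = region.shape
    out = np.zeros((Hp, Wp), dtype=float)
    for i in range(Hp):
        for j in range(Wp):
            y, x = src_yx[i, j]
            v = fisheye[src_cam[i, j]][y, x]        # single look-up
            if region[i, j] >= 4:                   # fusion region: blend
                y2, x2 = src2_yx[i, j]
                v2 = fisheye[src2_cam[i, j]][y2, x2]
                v = h1[i, j] * v + (1.0 - h1[i, j]) * v2
            out[i, j] = v
    return out
```

A real implementation would vectorize the look-ups (e.g. fancy indexing or a GPU remap) rather than loop per pixel; the loop form just mirrors the per-pixel description above.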
An auxiliary parking system based on a panoramic stitching data mapping table comprises:
the image acquisition module acquires images by utilizing 4-way fisheye super-wide-angle cameras arranged in front of, on the left, on the rear and on the right of a vehicle body and synchronously processes the images;
the judging module is used for judging whether the images after synchronous processing need to be calibrated or not, transmitting the images after synchronous processing to the system calibrating module if the images need to be calibrated, and otherwise, switching to the panorama generating module;
the system calibration module is used for respectively carrying out distortion correction on the front video image, the rear video image, the left video image and the right video image which are subjected to synchronous processing, then respectively carrying out aerial view transformation, carrying out panoramic stitching on the images subjected to aerial view transformation and generating a panoramic stitching data mapping table;
and the panorama generating module is used for remapping and transforming the video image, processing the transformed image and transmitting the processed image to the display module for image display.
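The module flow above (acquire, decide whether calibration is needed, then either calibrate or generate and display) can be sketched in a few lines; every function name here is illustrative, not from the patent:

```python
def run_cycle(frames, needs_calibration, calibrate, generate_panorama, display):
    """One processing cycle over the 4 synchronized fisheye images.

    If calibration is required, run the system calibration module and
    return the resulting mapping table; otherwise generate the panorama
    from the mapping tables and hand it to the display module.
    """
    if needs_calibration():
        mapping = calibrate(frames)        # system calibration module
        return ("calibrated", mapping)
    image = generate_panorama(frames)      # panorama generation module
    display(image)                         # display module
    return ("displayed", image)
```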
Compared with the prior art, the invention has the beneficial effects that:
the panoramic auxiliary parking system performs coarse registration on the aerial view calibration image by using the calibration distribution information, fine-tunes the image by using the object characteristics in the public coverage area, realizes the optimal registration of the aerial view calibration image, and generates a panoramic stitching data mapping table by using the automatically generated stitching template and the panoramic stitching data mapping table.
The invention can effectively improve the registration precision of the panoramic calibration image, improve the splicing and fusing effect of the panoramic video image and effectively meet the real-time requirement of hardware.
The invention combines the actual requirements in the development of panoramic parking products and solves the key problem in the panoramic parking system.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application.
FIG. 1 is a block diagram of a panoramic assisted parking system;
FIG. 2 is a schematic view of the calibration arrangement position;
FIG. 3 is a panorama stitching template diagram;
fig. 4 is a display range diagram of the panoramic stitched bird's-eye view image.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As described in the background art, the present application provides a method and a system for assisting parking based on a panorama stitching data mapping table, in order to solve the above technical problems.
The working process of the panoramic auxiliary parking system can be divided into two subsystems. The first is the calibration subsystem, run after the system is first installed: the 4 fisheye images are processed to generate a panoramic stitched image, with the final aim of producing a data mapping table between the original fisheye images and the panoramic stitched image. The second subsystem operates after calibration: once the 4 real-time fisheye images are acquired, it generates the panoramic real-time image using the data mapping table produced by the calibration subsystem.
In a typical implementation of the present application, the following system design is adopted in the embodiment of the invention. A PC is used to calibrate the panoramic auxiliary parking system and to determine the data mapping table between the calibration images acquired by the fisheye ultra-wide-angle cameras and the panoramic stitched image. The data mapping table is written into a text file, which is then copied onto a USB storage device; the hardware of the panoramic auxiliary parking system completes the initialization of the device by reading the file containing the data mapping table from the USB port. On each subsequent start-up, the system reads the information unit containing the data mapping table from flash memory, processes the real-time four-way fisheye video images with the data mapping table, and generates and displays the real-time panoramic video image on the display screen. An operation block diagram of the image system for panoramic-assisted parking according to an embodiment of the invention is shown in FIG. 1.
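A toy version of this write-to-text-file, read-at-start-up exchange might look like the following; the file format shown is invented for illustration, since the patent does not specify one:

```python
import numpy as np

def write_mapping(path, table):
    """Write a (H, W, 2) int array of per-pixel (y, x) source
    coordinates to a plain text file: header line 'H W', then one
    'y x' pair per output pixel in row-major order."""
    H, W, _ = table.shape
    with open(path, "w") as f:
        f.write(f"{H} {W}\n")
        for row in table.reshape(-1, 2):
            f.write(f"{row[0]} {row[1]}\n")

def read_mapping(path):
    """Read the mapping table back at device start-up."""
    with open(path) as f:
        H, W = map(int, f.readline().split())
        data = [tuple(map(int, f.readline().split())) for _ in range(H * W)]
    return np.array(data, dtype=int).reshape(H, W, 2)
```

In a production system a binary format (or the device's flash layout) would be far more compact, but the round-trip idea is the same.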
Four fisheye ultra-wide-angle vehicle-mounted cameras are fixed on the front, rear, left and right of the vehicle body: the front and rear cameras can be mounted above the license plates, angled obliquely downwards, and the left and right cameras can be mounted near the rearview mirrors, pointing vertically downwards. The size and layout of the calibration cloth are determined by the length-width ratio of the vehicle. Checkerboards and stitching markers are drawn on the calibration cloth, which is placed around the vehicle: the checkerboard calibration markers in front of, behind and on both sides of the vehicle, and the stitching markers in the public coverage area of each pair of cameras. The placement of the calibration cloth is shown in FIG. 2.
A calibration tool is required for the fisheye cameras used in the system; either the OpenCV or the MATLAB calibration toolbox can be used, and the MATLAB calibration toolbox is chosen here.
In another exemplary embodiment of the present application, the calibration step: the 4-path fisheye calibration image is subjected to image processing to generate a panoramic stitching image, the final purpose is to generate a data mapping table between the original fisheye image and the panoramic stitching image, and the implementation steps are described as follows:
step (1): 4 paths of fisheye cameras are well fixed on the front, the left, the rear and the right of a vehicle body to be calibrated, calibration cloth is placed (drawn) on the ground according to the attached drawing 2, and the layout and the size of four checkerboard areas on the calibration cloth are completely the same.
Step (2): Obtain the four fisheye calibration images A0(x00,y00), A1(x10,y10), A2(x20,y20), A3(x30,y30) of the front, left, rear and right of the automobile; perform distortion correction on the front, left, rear and right fisheye calibration images according to the distortion correction model to obtain the four distortion-corrected fisheye calibration images B0(x01,y01), B1(x11,y11), B2(x21,y21), B3(x31,y31).
And (3): Identify the checkerboard inner corner points in each distortion-corrected fisheye calibration image to obtain the image coordinates of the four vertices of the rectangular area formed by the inner corner points; the image coordinates of the top-left, top-right, bottom-left and bottom-right vertices of this rectangle are P0, P1, P2, P3 respectively. Suppose the image contains a checkerboard squares in the horizontal direction and b squares in the vertical direction, and let P4(x0,y0) be the coordinates of the top-left vertex of the rectangle formed by the inner corner points in the bird's-eye-view image; then the top-right, bottom-left and bottom-right vertices in the bird's-eye-view image are P5((a-2)×u+x0, y0), P6(x0, (b-2)×u+y0), P7((a-2)×u+x0, (b-2)×u+y0) respectively, where u is the image size of a single checkerboard square after bird's-eye-view transformation. Let C'0, C'1, C'2, C'3 be the images obtained by rotating B0, B1, B2, B3 counterclockwise by 0°, 90°, 180° and 270° about the top-left vertex, and compute the pixel points P'0, P'1, P'2, P'3 corresponding to P4, P5, P6, P7 in C'0, C'1, C'2, C'3. From the image coordinates P0, P1, P2, P3 of the inner corner points and the world coordinates P'0, P'1, P'2, P'3, the homography transformation matrix of each image can be calculated. According to the homography transformation matrices, bird's-eye-view transformation is applied to each distortion-corrected fisheye calibration image, yielding the four bird's-eye-view images C0(x02,y02), C1(x12,y12), C2(x22,y22), C3(x32,y32), and the fusion areas of the four bird's-eye-view images are determined respectively.
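The coordinate arithmetic of step (3) for the bird's-eye-view target corners is simple enough to sketch directly (a, b, u, x0, y0 as defined above; the function name is ours):

```python
def birdseye_corner_targets(x0, y0, a, b, u):
    """Target image coordinates of the four outer inner-corner vertices
    of an a x b checkerboard after bird's-eye-view transformation:
    the inner corners span (a-2) x (b-2) squares of side u pixels,
    anchored at the top-left vertex (x0, y0)."""
    p4 = (x0, y0)                                   # top-left
    p5 = ((a - 2) * u + x0, y0)                     # top-right
    p6 = (x0, (b - 2) * u + y0)                     # bottom-left
    p7 = ((a - 2) * u + x0, (b - 2) * u + y0)       # bottom-right
    return p4, p5, p6, p7
```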
And (4): Generate the stitching template image shown in FIG. 3 according to equations (2) to (10), where wlr, wrr, hfr and hrr relate to the display distance of the panoramic bird's-eye-view image, as shown in FIG. 4.
[Equations (2)-(9): panorama-stitching template dimension formulas; original equation images not reproduced]
wv=a×u (10)
The four bird's-eye-view images are stitched into a complete panoramic bird's-eye-view image D(x3,y3), and the overlapping areas of the bird's-eye-view calibration images are determined.
And (5): Binarize the overlapping areas of adjacent bird's-eye-view calibration images. On one of the feature images, select a target template image Amh of size Mmh×Nmh; on the other feature image, take the search sub-image Bmh covered by the template Amh, of size Lmh×Wmh. Let r be the maximum of Mmh and Lmh, and c the maximum of Nmh and Wmh; pixel points that do not exist on an image take the value 0, meaning they are not feature points on that image. The dissimilarity Pmh between the target template Amh and the search sub-image Bmh is calculated using equation (11); when the template Amh matches the sub-image Bmh under its coverage, Pmh reaches its minimum value, the registration between the two images is completed, and the registration mapping transformation between the images is established through the matching relation of the features. Finally, in the 4 overlapping areas of the panoramic stitched image, the sum of the dissimilarities of the image features of all sub-images acquired by different cameras reaches the minimum.
[Equation (11): dissimilarity measure Pmh between Amh and Bmh; original equation image not reproduced]
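Since the original image of equation (11) is not reproduced, here is a plausible sum-of-absolute-differences stand-in for the dissimilarity, together with a brute-force search over small integer shifts in the spirit of the fine-tuning step. Both functions are our sketch, not the patent's exact formula:

```python
import numpy as np

def dissimilarity(A, B):
    """Sum of absolute differences between two binary feature images,
    zero-padding the smaller one to r x c (missing pixels count as
    non-feature points), as a stand-in for equation (11)."""
    r = max(A.shape[0], B.shape[0])
    c = max(A.shape[1], B.shape[1])
    Ap = np.zeros((r, c)); Ap[:A.shape[0], :A.shape[1]] = A
    Bp = np.zeros((r, c)); Bp[:B.shape[0], :B.shape[1]] = B
    return np.abs(Ap - Bp).sum()

def best_offset(template, image, max_shift=2):
    """Brute-force search over small shifts for the offset minimizing
    the dissimilarity: a toy version of the registration fine-tuning."""
    best = (None, float("inf"))
    h, w = template.shape
    for dy in range(max_shift + 1):
        for dx in range(max_shift + 1):
            sub = image[dy:dy + h, dx:dx + w]
            p = dissimilarity(template, sub)
            if p < best[1]:
                best = ((dy, dx), p)
    return best
```

The full method would run this jointly over all four overlap areas and minimize the sum of the four dissimilarities, not each one independently.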
And (6): Generate the panoramic stitching mapping tables M0, M1, M2 and M3 from the homography transformation matrices calculated in step (3) and the fisheye image distortion correction model; the mapping tables M4 and M5 can be generated from the panoramic stitching template image. Fusion-region images are fused by feathering, as in equations (12), (13) and (14), where d1 and d2 are the distances from a pixel point in the stitching fusion region to the overlap boundaries AC and AB respectively. When calculating the pixel values of pixel points in the fusion regions R4, R5, R6, R7, the overlapping images Is participating in the calculation are A0, A1, A2 and A3 respectively, and the overlapping images It are A1, A2, A3 and A0 respectively. The coordinates of the pixel points in the fusion regions R4, R5, R6, R7 and their fusion gradient factors are stored in mapping table M6. The information of the panoramic stitching mapping tables M0, M1, M2, M3, M4, M5 and M6 is written into the panoramic stitching mapping table file.
[Equation (12): definition of the fusion factors h1, h2 in terms of the distances d1, d2; original equation image not reproduced]
h1+h2=1 (13)
E(x,y)=h1Is(x,y)+h2It(x,y) (14)
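Equations (13) and (14) constrain the feathering weights to sum to one and to blend the two overlapping images linearly. A standard choice consistent with that, which we assume for the unreproduced equation (12), is distance-proportional weights:

```python
def feather_weights(d1, d2):
    """Feathering fusion factors: weights proportional to the distances
    d1, d2 from the pixel to the two overlap boundaries, summing to one.
    This is the standard feathering choice, assumed here since the
    patent's equation (12) image is not reproduced."""
    h1 = d1 / (d1 + d2)
    return h1, 1.0 - h1

def fuse(ps, pt, d1, d2):
    """Blend the overlapping pixel values Is(x,y)=ps and It(x,y)=pt
    per equation (14): E = h1*Is + h2*It."""
    h1, h2 = feather_weights(d1, d2)
    return h1 * ps + h2 * pt
```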
Let (x'3,y'3) be a pixel point in the panoramic video image. After calibration is completed and the 4 fisheye real-time images are obtained, the panoramic real-time image is generated using the data mapping table produced by the calibration system; the implementation steps are as follows:
step (1): reading a panoramic stitching mapping table file, and initializing variables related to the panoramic stitching mapping table;
step (2): according to a mapping table M4Pixel point (x 'can be determined'3,y′3) The region to which it belongs;
and (3): if pixel point (x'3,y′3) Is of the formula R0、R1、R2、R3To determine pixel point (x'3,y′3) Coordinate mapping table M for fisheye imagek. According to the coordinate mapping table M5And pixel point (x ') may be determined'3,y′3) Corresponding aerial view image pixel point coordinate Ck(x′k2,y′k2) According to a coordinate mapping table MkCan determine the pixel point C of the aerial view imagek(x′k2,y′k2) Corresponding fish-eye image pixel point coordinate Ak(x′k0,y′k0) Thus determining the pixel value of the final pixel point;
step (4) if pixel point (x'3,y′3) Is of the formula R4、R5、R6、R7Determining an overlapping region pixel point (x'3,y′3) Coordinate mapping table M for fisheye images、Mt. According to the coordinate mapping table M5Of pixel points (x ') can be determined to participate in the computation'3,y′3) Bird's-eye view image pixel point coordinate C corresponding to pixel values(x′s2,y′s2)、Ct(x′t2,y′t2) According to a coordinate mapping table Ms、MtCan determine the pixel point C of the aerial view images(x′s2,y′s2)、Ct(x′t2,y′t2) Corresponding fish-eye image pixel point coordinate As(x′s0,y′s0)、At(x′t0,y′t0). According to a mapping table M6Determining (x'3,y′3) And the fusion factor carries out fusion processing on the pixel point to obtain a finally output pixel value.
The above description covers only preferred embodiments of the present application and is not intended to limit it; those skilled in the art may make various modifications and changes. Any modification, equivalent replacement or improvement made within the spirit and principle of the present application shall fall within its protection scope.

Claims (9)

1. An auxiliary parking method based on a panoramic stitching data mapping table is characterized by comprising the following steps:
arranging calibration cloth on the ground around the vehicle body, and coarsely registering the images in the coverage areas of adjacent cameras mounted on the vehicle body using the layout and size information of the checkerboard areas and common coverage areas on the calibration cloth;
then finely adjusting the images in the coverage areas of the adjacent cameras using the object features in the common coverage areas, the optimal image registration position being found where the sum of the dissimilarities of the image features acquired by different cameras reaches a minimum;
generating a panoramic stitching data mapping table by using the panoramic stitching template image, and generating a panoramic video image in real time according to a file containing the data mapping relation table;
generating a panoramic stitching mapping table file, wherein the panoramic stitching mapping table consists of the following parts:
F0: coordinate mapping table M0 of the pixel points between the bird's-eye-view calibration image C0(x02, y02) and the front fisheye calibration image A0(x00, y00);
F1: coordinate mapping table M1 of the pixel points between the bird's-eye-view calibration image C1(x12, y12) and the left fisheye calibration image A1(x10, y10);
F2: coordinate mapping table M2 of the pixel points between the bird's-eye-view calibration image C2(x22, y22) and the rear fisheye calibration image A2(x20, y20);
F3: coordinate mapping table M3 of the pixel points between the bird's-eye-view calibration image C3(x32, y32) and the right fisheye calibration image A3(x30, y30);
F4: mapping table M4 of the regions to which the pixel points of the panoramic bird's-eye-view calibration image D(x3, y3) belong among the 4 directional bird's-eye-view calibration images. Mapping table M4 can be expressed as formula (1), where R0, R1, R2, R3 are respectively the sub-images of the bird's-eye-view calibration images C0, C1, C2, C3 with the overlap areas of the adjacent bird's-eye-view calibration images removed, and R4, R5, R6, R7 are the fusion areas of adjacent bird's-eye-view calibration images, in which the images of the corresponding overlap areas are fused: R4 is the overlap-area sub-image of C0 and C1, R5 that of C1 and C2, R6 that of C2 and C3, and R7 that of C0 and C3;
M4(x3, y3) = k, if (x3, y3) ∈ Rk, k = 0, 1, …, 7 (1)
F5: coordinate mapping table M5 of the pixel points between the panoramic bird's-eye-view calibration image D(x3, y3) and the 4 directional bird's-eye-view calibration images C0(x02, y02), C1(x12, y12), C2(x22, y22), C3(x32, y32). For any pixel point (x'3, y'3) in a fusion area of adjacent bird's-eye-view calibration images, the coordinate mapping relations between (x'3, y'3) and the two bird's-eye-view image pixel points are recorded;
f6: fusion region R for adjacent aerial view calibration images4、R5、R6、R7Fusion factor h of internal determined pixel point1、h2Wherein h is1、h2Respectively, the weight of the pixel under the corresponding coordinate of the overlapped part of the bird's-eye view image, and fusing the regions R4、R5、R6、R7The coordinates (x ') of any pixel point mentioned in (1)'3,y′3) The fusion factor with the pixel point is stored in a mapping table M6
2. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 1, wherein four fisheye calibration images A0(x00, y00), A1(x10, y10), A2(x20, y20), A3(x30, y30) of the front, left, rear and right of the automobile are obtained using the cameras, and distortion correction is performed on them to obtain four distortion-corrected fisheye calibration images B0(x01, y01), B1(x11, y11), B2(x21, y21), B3(x31, y31).
3. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 2, wherein the checkerboards in B0(x01, y01), B1(x11, y11), B2(x21, y21), B3(x31, y31) are identified respectively to obtain the world coordinates and image coordinates of the checkerboard inner corner points; the homography transformation matrices H0, H1, H2, H3 of the images B0, B1, B2, B3 are calculated from those world coordinates and image coordinates; and, based on the homography transformation matrices, bird's-eye-view transformation is performed on B0, B1, B2, B3 respectively to obtain the four bird's-eye-view calibration images C0(x02, y02), C1(x12, y12), C2(x22, y22), C3(x32, y32).
4. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 3, wherein the homography transformation is required to ensure that the image size of a single checkerboard square is the same in all four bird's-eye-view calibration images, and that the direction of the checkerboard squares in the bird's-eye-view calibration images is consistent with the panoramic stitching direction.
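A homography maps an undistorted image point into the bird's-eye plane by a 3×3 matrix followed by a perspective division. A minimal sketch; the matrix values are illustrative, not the patent's calibrated ones:

```python
def apply_homography(H, x, y):
    # Projective map into the bird's-eye plane:
    # [u, v, w]^T = H · [x, y, 1]^T, followed by division by w.
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

# A pure scaling homography: every checkerboard square of unit side maps
# to a square of side u = 5, illustrating this claim's constraint that
# the square size is uniform across all four bird's-eye views.
H_scale = [[5, 0, 0], [0, 5, 0], [0, 0, 1]]
```

Choosing all four H_k so that the squares come out at the same side length u is exactly what makes the four bird's-eye images mutually stitchable.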
5. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 1, wherein the panoramic stitching template image is generated according to the actual size of the calibration checkerboard areas, the actual vertical distance between the checkerboard areas, the image size information of the checkerboard areas, and the display distance of the panoramic bird's-eye-view calibration image.
6. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 5, wherein the panoramic stitching template image consists of five blocks: a central region Rcent, a front region RA, a left region RB, a rear region RC and a right region RD. The central region Rcent is an image of the vehicle model; the front region RA, left region RB, rear region RC and right region RD correspond respectively to the bird's-eye-view calibration images C0(x02, y02), C1(x12, y12), C2(x22, y22), C3(x32, y32) in the corresponding directions. The stitching seams of adjacent bird's-eye-view calibration images are determined according to the size of the checkerboard areas, and the rectangular area corresponding to each stitching seam is the overlap area of the adjacent bird's-eye-view calibration images;
the four bird's-eye-view calibration images are stitched into a complete panoramic bird's-eye-view calibration image D(x3, y3) according to the positions of the stitching lines and the image positions of the checkerboards in the bird's-eye-view images.
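Assembling the five template regions reduces to pasting each bird's-eye image onto the template canvas at the offsets fixed by the stitching lines. A toy sketch with illustrative sizes (the real offsets come from the formulas of the calibration step):

```python
def paste(canvas, image, ox, oy):
    # Copy one bird's-eye image (a 2-D list) onto the panorama template
    # canvas at offset (ox, oy), following the stitching-line layout.
    for j, row in enumerate(image):
        for i, v in enumerate(row):
            canvas[oy + j][ox + i] = v

# Illustrative 4x4 panorama: front strip on top, rear strip on bottom,
# the middle rows left free for the vehicle-model image Rcent.
canvas = [[0] * 4 for _ in range(4)]
front = [[1, 1, 1, 1]]          # stands in for C0
rear = [[2, 2, 2, 2]]           # stands in for C2
paste(canvas, front, 0, 0)
paste(canvas, rear, 0, 3)
```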
7. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 1, wherein finding the optimal image registration position specifically comprises: determining the overlap areas of adjacent bird's-eye-view calibration images, extracting features from the overlap-area images of the adjacent bird's-eye-view calibration images respectively, finely adjusting the positions of the two bird's-eye-view calibration images according to the image features of the two overlap areas, and searching for the optimal registration position of each pair of adjacent bird's-eye-view images, so that the sum of the dissimilarities of the image features acquired by different cameras over the 4 overlap areas is minimized.
8. The auxiliary parking method based on the panoramic stitching data mapping table as claimed in claim 1, wherein generating the panoramic video image in real time comprises: after the 4 real-time fisheye images around the vehicle body are obtained, the pixels in the regions R0, R1, R2, R3 are mapped to original-fisheye-image coordinates according to the data mapping table file generated by the calibration system to determine the output pixel values; the pixels in the regions R4, R5, R6, R7 are mapped, according to the data mapping table file, to the original-fisheye-image coordinates participating in the calculation, fused according to the fusion factors determined by the mapping table, and the output pixel values are determined.
9. An auxiliary parking system based on a panoramic stitching data mapping table, characterized by comprising:
the image acquisition module, which acquires images using the 4 fisheye ultra-wide-angle cameras arranged at the front, left, rear and right of the vehicle body and synchronizes the images;
the judging module, which judges whether the synchronized images need to be calibrated, transmits them to the system calibration module if calibration is needed, and otherwise passes control to the panorama generating module;
the size and layout of the calibration cloth being determined according to the length-width ratio of the vehicle, with checkerboards and stitching markers drawn on the calibration cloth; the calibration cloth is placed around the vehicle, the checkerboard markers for calibration are placed at the front, rear and both sides of the vehicle respectively, and the stitching markers are placed in the common coverage area of each pair of cameras;
the system calibration module, which performs distortion correction on the synchronized front, rear, left and right video images respectively, then performs bird's-eye-view transformation on each, stitches the transformed images into a panorama, and generates the panoramic stitching data mapping table;
Step (1): fix the 4 fisheye cameras at the front, left, rear and right of the vehicle body to be calibrated, and place or draw the calibration cloth on the ground, the layout and size of the four checkerboard areas on the calibration cloth being identical;
Step (2): obtain four fisheye calibration images A0(x00, y00), A1(x10, y10), A2(x20, y20), A3(x30, y30) of the front, left, rear and right of the automobile; perform distortion correction on the front, left, rear and right fisheye calibration images according to the distortion correction model to obtain four distortion-corrected fisheye calibration images B0(x01, y01), B1(x11, y11), B2(x21, y21), B3(x31, y31);
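The patent does not fix a particular distortion-correction model, so the following stand-in uses a simple polynomial radial model purely for illustration; the centre (cx, cy) and coefficients k1, k2 are assumed parameters, not values from the patent:

```python
def undistort_point(x, y, cx, cy, k1, k2):
    """Map a pixel toward its corrected position with a polynomial
    radial model: scale = 1 + k1*r^2 + k2*r^4 about the centre (cx, cy).
    This is a stand-in for the patent's unspecified fisheye model."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return cx + dx * scale, cy + dy * scale
```

With all coefficients zero the mapping is the identity; nonzero k1, k2 push points radially outward or inward, which is the qualitative effect any fisheye correction must produce.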
Step (3): identify the checkerboard inner corner points in each distortion-corrected fisheye calibration image to obtain the image coordinates of the four vertices of the rectangular area formed by the checkerboard inner corner points; the image coordinates corresponding to the upper-left, upper-right, lower-left and lower-right vertices of that rectangular area are P0, P1, P2, P3 respectively. Suppose there are a checkerboard squares in the horizontal direction of the image and b checkerboard squares in the vertical direction, and let P4(x0, y0) be the coordinates of the upper-left vertex of the rectangular area formed by the checkerboard inner corner points in the bird's-eye-view image; then the coordinates of the upper-right, lower-left and lower-right vertices of that rectangular area in the bird's-eye-view image are P5((a-2)×u+x0, y0), P6(x0, (b-2)×u+y0), P7((a-2)×u+x0, (b-2)×u+y0) respectively, where u is the image size of a single checkerboard square after bird's-eye-view transformation. C'0, C'1, C'2, C'3 are the images obtained by rotating B0, B1, B2, B3 counterclockwise by 0, 90, 180 and 270 degrees about the upper-left vertex; the pixel points corresponding to P4, P5, P6, P7 in C'0, C'1, C'2, C'3 are calculated as P'0, P'1, P'2, P'3. From the image coordinates P0, P1, P2, P3 of the checkerboard inner corner points and the world coordinates P'0, P'1, P'2, P'3 of the checkerboard inner corner points, the homography transformation matrix of each image can be calculated. According to the homography transformation matrices, bird's-eye-view transformation is performed on each distortion-corrected fisheye calibration image to obtain the four bird's-eye-view images C0(x02, y02), C1(x12, y12), C2(x22, y22), C3(x32, y32), and the fusion areas of the four bird's-eye-view images are determined respectively;
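The bird's-eye target coordinates P5, P6, P7 derived from P4 in step (3) follow directly from the stated formulas; a small sketch (the function name is illustrative):

```python
def bird_eye_corner_targets(p4, a, b, u):
    """Target bird's-eye coordinates P5, P6, P7 of the rectangle spanned
    by the checkerboard inner corner points, given its upper-left vertex
    P4 = (x0, y0): with a x b squares, the inner-corner rectangle spans
    (a-2) x (b-2) squares of bird's-eye side length u."""
    x0, y0 = p4
    p5 = ((a - 2) * u + x0, y0)
    p6 = (x0, (b - 2) * u + y0)
    p7 = ((a - 2) * u + x0, (b - 2) * u + y0)
    return p5, p6, p7
```

For a 6×4 board with u = 5 and P4 = (10, 20), the inner-corner rectangle spans 4×2 squares, i.e. 20×10 pixels.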
Step (4): generate the stitching template image according to formulas (2)–(10), where wlr, wrr, hfr and hrr are related to the display distance of the panoramic bird's-eye-view image;
[Formulas (2)–(9) appear only as images in the source; they define the stitching-template distances w, h, h0, h1, h2, w0, w1 and hv described below.]
wv=a×u (10)
In the formulas, L0 denotes the length of a checkerboard area; L1 denotes the length of a common coverage area; L2 denotes the distance between the upper common coverage area and the left checkerboard area, which is equal to the distance between the upper common coverage area and the right checkerboard area; L3 denotes the distance between the lower common coverage area and the left checkerboard area, which is equal to the distance between the lower common coverage area and the right checkerboard area; L4 denotes the width of a common coverage area, which is equal to the width of a checkerboard area; w and h denote the horizontal and vertical distances of the stitching template respectively; h0 denotes the distance between the upper side of the left fusion area of the front bird's-eye-view calibration image and the upper edge of the panoramic stitching template image, which is equal to the corresponding distance for the right fusion area; h1 denotes the distance between the lower side of the left fusion area of the rear bird's-eye-view calibration image and the lower edge of the panoramic stitching template image, which is equal to the corresponding distance for the right fusion area; h2 denotes the vertical distance of a fusion area of a bird's-eye-view calibration image; w0 denotes the horizontal distance of the upper fusion area of the left bird's-eye-view calibration image, which is equal to that of its lower fusion area; w1 denotes the horizontal distance of the upper fusion area of the right bird's-eye-view calibration image, which is equal to that of its lower fusion area; hv and wv denote the vertical and horizontal distances of the central region of the panoramic stitching template image respectively; wlr denotes the distance between the left boundary of the panoramic stitched bird's-eye-view image and the left boundary of the calibration layout area; wrr the corresponding distance on the right side; hfr the corresponding distance at the upper side; and hrr the corresponding distance at the lower side;
the four bird's-eye-view images are stitched into a complete panoramic bird's-eye-view image D(x3, y3), and the overlap area of each pair of adjacent bird's-eye-view calibration images is determined;
Step (5): binarize the overlap areas of the adjacent bird's-eye-view calibration images respectively. On one of the feature images, select a target template image Amh of size Mmh×Nmh; on the other feature image, take the search sub-image Bmh covered by the template Amh, of size Lmh×Wmh. Let r be the maximum of Mmh and Lmh, and c the maximum of Nmh and Wmh; a pixel point that does not exist on an image has pixel value 0, i.e. it is not a feature point of that image. The dissimilarity Pmh between the target template Amh and the search sub-image Bmh is calculated using formula (11); when the template Amh matches the sub-image Bmh under the template coverage, Pmh attains its minimum, the registration between the two images is completed, and the registration mapping transformation between the images is established through the matching relation of the features. Finally, over the 4 overlap areas of the panoramic stitched image, the sum of the dissimilarities of the image features of all sub-images acquired by different cameras reaches a minimum;
[Formula (11), given as an image in the source, defines the dissimilarity Pmh between the target template Amh and the search sub-image Bmh over the r×c comparison window.]
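The exact form of formula (11) is not reproduced here, so this sketch assumes the dissimilarity is the sum of absolute differences of the binarized pixels over the r×c window, with missing pixels taken as 0 as the claim states, and a brute-force search for the minimizing offset:

```python
def dissimilarity(A, B):
    """Assumed P_mh: sum of absolute differences of binary pixels over
    the r x c window, where r, c are the larger of the two image sizes
    and out-of-range pixels count as 0 (not feature points)."""
    r = max(len(A), len(B))
    c = max(max(map(len, A)), max(map(len, B)))
    def get(img, i, j):
        return img[i][j] if i < len(img) and j < len(img[i]) else 0
    return sum(abs(get(A, i, j) - get(B, i, j))
               for i in range(r) for j in range(c))

def best_offset(template, image, max_shift):
    """Fine registration: slide the template over the larger feature
    image and keep the offset with minimum dissimilarity."""
    best = None
    for dy in range(max_shift + 1):
        for dx in range(max_shift + 1):
            window = [row[dx:dx + len(template[0])]
                      for row in image[dy:dy + len(template)]]
            p = dissimilarity(template, window)
            if best is None or p < best[0]:
                best = (p, (dx, dy))
    return best[1]
```

In the patent this minimization is carried out jointly over the 4 overlap areas, whereas the sketch searches a single pair for clarity.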
Step (6): generate the panoramic stitching mapping tables M0, M1, M2 and M3 from the homography transformation matrices calculated in step (3) and the fisheye image distortion correction model, and the mapping tables M4 and M5 from the panoramic stitching template image; image fusion in the fusion regions adopts the feathering method, specified by formulas (12), (13) and (14), where d1 and d2 are the distances from a pixel point in the stitching fusion region to the overlap boundaries AC and AB respectively; when the pixel values of pixel points in the fusion regions R4, R5, R6 and R7 are calculated, the source overlap images Is participating in the calculation are A0, A1, A2 and A3 respectively, and the target overlap images It are A1, A2, A3 and A0 respectively; the coordinates of the pixel points involved in the fusion regions R4, R5, R6 and R7 and their fusion gradient factors are stored in mapping table M6; the information of the panoramic stitching mapping tables M0, M1, M2, M3, M4, M5 and M6 is written into the panoramic stitching mapping table file;
h1 = d1/(d1 + d2), h2 = d2/(d1 + d2) (12)
h1+h2=1 (13)
E(x, y) = h1·Is(x, y) + h2·It(x, y) (14)
and the panorama generating module, which remaps and transforms the video images, processes the transformed images and transmits them to the display module for image display.
CN201710120841.XA 2017-03-02 2017-03-02 Auxiliary parking system and method based on panoramic stitching data mapping table Active CN106952311B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710120841.XA CN106952311B (en) 2017-03-02 2017-03-02 Auxiliary parking system and method based on panoramic stitching data mapping table


Publications (2)

Publication Number Publication Date
CN106952311A CN106952311A (en) 2017-07-14
CN106952311B true CN106952311B (en) 2020-04-07

Family

ID=59468022


Families Citing this family (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107492125A (en) * 2017-07-28 2017-12-19 哈尔滨工业大学深圳研究生院 The processing method of automobile fish eye lens panoramic view picture
CN107633545A (en) * 2017-08-25 2018-01-26 包谦 A kind of processing method of panoramic picture
CN107590773A (en) * 2017-08-25 2018-01-16 包谦 A kind of preparation method of panoramic picture
CN109429013A (en) * 2017-08-28 2019-03-05 华利纳企业股份有限公司 Image correcting system and image correcting method
CN107958440A (en) * 2017-12-08 2018-04-24 合肥工业大学 Double fish eye images real time panoramic image split-joint methods and system are realized on GPU
CN109948398B (en) * 2017-12-20 2024-02-13 深圳开阳电子股份有限公司 Image processing method for panoramic parking and panoramic parking device
US10482626B2 (en) * 2018-01-08 2019-11-19 Mediatek Inc. Around view monitoring systems for vehicle and calibration methods for calibrating image capture devices of an around view monitoring system using the same
CN108712604B (en) * 2018-05-07 2022-02-01 维沃移动通信有限公司 Panoramic shooting method and mobile terminal
CN109068207B (en) * 2018-07-04 2023-05-09 广州希脉创新科技有限公司 Earphone and earphone system
CN109204141A (en) * 2018-09-19 2019-01-15 深圳市众鸿科技股份有限公司 Method for early warning and device in vehicle travel process
CN109544647A (en) * 2018-11-30 2019-03-29 郑州天迈科技股份有限公司 Calibration cloth, place and method for 360 ° of panoramic parking assist systems
CN109600565B (en) * 2018-12-13 2021-02-26 广州路派电子科技有限公司 Parking system-based video recording method
CN109613920B (en) * 2018-12-27 2022-02-11 睿驰达新能源汽车科技(北京)有限公司 Method and device for determining vehicle position
CN109767473B (en) * 2018-12-30 2022-10-28 惠州华阳通用电子有限公司 Panoramic parking device calibration method and device
CN110264395B (en) * 2019-05-20 2023-11-28 深圳市森国科科技股份有限公司 Lens calibration method and related device of vehicle-mounted monocular panoramic system
CN110288527B (en) * 2019-06-24 2023-10-24 北京智行者科技股份有限公司 Panoramic aerial view generation method of vehicle-mounted panoramic camera
CN110838086B (en) * 2019-11-07 2021-07-16 上海大学 Outdoor image splicing method based on correlation template matching
CN111086455B (en) * 2020-03-19 2020-09-11 吉利汽车研究院(宁波)有限公司 Data processing method and device, electronic equipment and storage medium
CN111559314B (en) * 2020-04-27 2021-08-24 长沙立中汽车设计开发股份有限公司 Depth and image information fused 3D enhanced panoramic looking-around system and implementation method
CN112435161A (en) * 2020-11-12 2021-03-02 蘑菇车联信息科技有限公司 Panoramic all-around image splicing method and system, electronic equipment and storage medium
CN112606829A (en) * 2020-12-16 2021-04-06 广州市车智连电子有限公司 Auxiliary parking system based on panoramic stitching data mapping
CN113674363B (en) * 2021-08-26 2023-05-30 龙岩学院 Panoramic parking image stitching calibration method and calibration object thereof
CN114881863B (en) * 2022-06-30 2022-09-30 湖北芯擎科技有限公司 Image splicing method, electronic equipment and computer readable storage medium

Citations (4)

Publication number Priority date Publication date Assignee Title
CN102298779A (en) * 2011-08-16 2011-12-28 淮安盈科伟力科技有限公司 Image registering method for panoramic assisted parking system
CN103177439B (en) * 2012-11-26 2015-10-28 惠州华阳通用电子有限公司 A kind of automatic calibration method based on black and white lattice corners Matching
CN105894549A (en) * 2015-10-21 2016-08-24 乐卡汽车智能科技(北京)有限公司 Panorama assisted parking system and device and panorama image display method
CN106373091A (en) * 2016-09-05 2017-02-01 山东省科学院自动化研究所 Automatic panorama parking aerial view image splicing method, system and vehicle


Non-Patent Citations (2)

Title
Research on the auxiliary panoramic parking technology based on fast image mosaic;Chunbao Huo 等;《2016 3rd International Conference on Informative and Cybernetics for Computational Social Systems (ICCSS)》;20161010;第216-219页 *
Research on Multi-Viewpoint Panoramic Image Assisted Parking System; Sui Haotian; China Master's Theses Full-Text Database, Information Science and Technology; 20141025; I138-853 *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant