CN112308927B - Fusion device of panoramic camera and laser radar and calibration method thereof - Google Patents

Fusion device of panoramic camera and laser radar and calibration method thereof

Info

Publication number
CN112308927B
Authority
CN
China
Prior art keywords
laser radar
camera
calibration
fisheye
coordinate system
Prior art date
Legal status
Active
Application number
CN202011153660.5A
Other languages
Chinese (zh)
Other versions
CN112308927A (en)
Inventor
邓振文
熊璐
罗永昌
舒强
牛志
Current Assignee
Nanchang Intelligent New Energy Vehicle Research Institute
Original Assignee
Nanchang Intelligent New Energy Vehicle Research Institute
Priority date
Filing date
Publication date
Application filed by Nanchang Intelligent New Energy Vehicle Research Institute filed Critical Nanchang Intelligent New Energy Vehicle Research Institute
Priority to CN202011153660.5A
Publication of CN112308927A
Application granted
Publication of CN112308927B
Legal status: Active


Classifications

    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S7/497: Means for monitoring or calibrating
    • G06N3/045: Combinations of networks
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T2200/32: Indexing scheme involving image mosaicing
    • G06T2207/10028: Range image; Depth image; 3D point clouds
    • G06T2207/20084: Artificial neural networks [ANN]
    • G06T2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computing Systems (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biomedical Technology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a fusion device of a panoramic camera and a laser radar and a calibration method thereof. The fusion device comprises fisheye cameras (1), a camera bracket (2), a circular scanning laser radar (3), a bracket (4), a transparent shell (5), fastening bolts (6) and a base (7). The invention also discloses a calibration method for the fusion device: the panoramic camera formed by stitching the two fisheye cameras (1) covers every direction in space, yielding a 360° × 180° spherical panoramic image. The circular scanning laser radar has a 360° horizontal field of view and is fixed below the panoramic camera, so that the image and the circular scanning laser radar together form a panoramic environment sensing system. The calibration method completes the extrinsic calibration of both fisheye cameras (1) and the circular scanning laser radar (3) simultaneously, improving calibration efficiency while ensuring the accuracy of the calibration results.

Description

Fusion device of panoramic camera and laser radar and calibration method thereof
Technical Field
The invention relates to a fusion device of a panoramic camera and a laser radar and a calibration method thereof, in particular to a fusion device of a panoramic camera and a laser radar for an intelligent driving automobile environment sensing system and a calibration method of the fusion device.
Background
Intelligent driving technology plays an important role in preventing and avoiding traffic accidents. To replace human drivers in environmental sensing and cognition, intelligent driving vehicles are usually equipped with perception sensors such as cameras, laser radar and millimeter-wave radar. However, environmental perception based on a single sensor has inherent defects: an image provides rich color and semantic information, but owing to the imaging principle its pixels carry no depth information; a laser radar provides spatial three-dimensional point cloud information, but the point cloud is usually sparse, so small objects are easily missed; millimeter-wave radar is robust to environmental interference but has lower accuracy and often produces clutter. To compensate for the deficiencies of any single sensor, sensor fusion is receiving increasing attention.
Intelligent automobiles are often equipped with multiple sensors to form accurate and redundant sensing systems. For the specific problems of intelligent driving, various fusion systems have been proposed in recent years, such as laser radar and camera, camera and camera, laser radar and laser radar, camera and IMU, laser radar and IMU, surround-view camera and ultrasonic sensor; in particular, the fusion of laser radar and camera has greatly improved the accuracy of environmental target detection. However, most current fusion of images and laser point clouds either uses a planar camera with a laser radar, or builds the panoramic camera from more than two cameras. In addition, the precondition of multi-sensor fusion is spatial alignment between the sensor data, and the accuracy of data alignment directly determines the fusion result. Existing multi-sensor fusion devices and calibration methods mainly have the following defects:
(1) Panoramic cameras assembled from several planar cameras facing different directions rely on image stitching to obtain a panoramic image, but because the optical centers of the cameras are far apart, severe texture discontinuities appear in the stitched image. Even with a least-squares image-warping algorithm, only part of the texture can be aligned, and the large pixel displacements distort some objects or place pixels at inaccurate spatial positions.
(2) Some existing fusion devices place the laser radar above the panoramic camera. Although this prevents the lidar point cloud from being blocked by the support, the area above the panoramic camera is blocked by the bottom of the laser radar, so a large region above the fusion device is not covered by any sensor.
(3) The extrinsic parameters between sensors express the data conversion relation between two sensor coordinate systems; if extrinsics are chained directly for data conversion among multiple sensors, accumulated errors arise. At present there is no suitable calibration method that calibrates a fusion device consisting of two fisheye cameras and one laser radar at the same time.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provide a fusion device of a panoramic camera and a laser radar and a calibration method thereof.
The aim of the invention is achieved by the following technical scheme: a fusion device of a panoramic camera and a laser radar comprises a fisheye camera (1), a camera bracket (2), a circular scanning laser radar (3), a bracket (4), a transparent shell (5), a fastening bolt (6) and a base (7); two fisheye cameras (1) are fastened back to back in the camera support (2) so as to construct a panoramic camera; the circular scanning type laser radar (3) is fixed on the bracket (4), the side edge of the bracket (4) is connected with the camera bracket (2), and the camera bracket (2) is fixed on the base (7) by adopting the fastening bolt (6); the circular scanning type laser radar (3) is fixed below the panoramic camera, so that the whole area above the panoramic camera is not shielded.
Further, the field angle of the fisheye camera (1) is larger than 185 degrees.
Further, the external parameter calibration environment between the fisheye camera (1) and one circular scanning laser radar (3) is an indoor calibration environment with round black spots stuck on four sides, calibration data are synchronously acquired in the indoor calibration environment, and the external parameter between the fisheye camera (1) and one circular scanning laser radar (3) is obtained through calculation.
Further, the interior shape of the indoor calibration environment is cuboid, the colors of the periphery are obviously distinguished from black, a certain corner of the ground is set as an origin of an environment coordinate system, and intersecting lines between three wall surfaces connected with the origin and the ground are used as three directions of the environment coordinate system; the adhered round black dots are uniformly arranged on the wall surface, and the three-dimensional position of each black dot under the environment coordinate system is known.
As an improvement, the fusion device is provided with a transparent shell (5), the transparent shell (5) is made of transparent materials, the transparent materials do not affect the receiving and transmitting of laser beams of the laser radar and the sensitization of the panoramic camera, the top of the transparent shell (5) is hemispherical, the center of the hemispherical is coincident with the center of the spherical panoramic camera, and the distortion of imaging of the panoramic camera is reduced.
The invention also discloses a calibration method of the fusion device of the panoramic camera and the laser radar, which comprises the following steps:
S1: the internal parameters of the fish-eye camera (1) are marked through a checkerboard pattern: the calibration of the two fish-eye cameras (1) is completed by adopting a black-white checkerboard calibration plate and a Zhang Zhengyou camera calibration method, the circular scanning laser radar (3) performs data calibration before delivery, and provides corresponding fish-eye camera (1) internal references, if the internal references are provided with enough precision, the internal references are not calibrated;
S2: setting up a rectangular calibration environment, regularly adhering black dots on the wall surface of the calibration environment, and measuring the three-dimensional coordinates of each black dot in an environment coordinate system; the calibration environment is composed of ground and wall surfaces: a large number of black points are stuck on the wall surface, the coordinates of the black points under a calibration environment coordinate system are known, and in order to make the black points more obvious in the image, the color of the wall surface is distinguished from the color of the black points;
S3: placing the fusion device in a position close to the middle of the calibration environment, and enabling the right front of the circular scanning laser radar (3) to deviate to corners as much as possible; when the calibration data are collected, the fusion device is placed in the calibration environment and close to the middle area, and the front of the circular scanning laser radar (3) is deviated to a certain corner;
S4: collecting current image data of the fish-eye cameras (1), measuring pixel coordinates of black points of an image and three-dimensional coordinates of a corresponding environment coordinate system, and obtaining external parameters T g2c1 and T g2c2 between a calibrated environment coordinate system and the two fish-eye cameras (1) by minimizing a target loss function;
S5: fitting the wall surface and the ground of a calibration environment according to the data point cloud of the circular scanning laser radar (3), and acquiring a conversion relation T l2g from the coordinate system of the circular scanning laser radar (3) to the coordinate system of the calibration environment;
S6: and calculating to obtain external parameter conversion relations T l2c1 and T c22c1 between the two fisheye cameras (1) and the three sensors of the circular scanning laser radar (3) in the whole fusion device according to the obtained external parameter conversion relations T l2g、Tg2c1 and T g2c2.
Further, after the external parameter conversion matrix T_l2g between the calibration environment coordinate system and the circular scanning laser radar (3) and the external parameter conversion matrices T_g2c1 and T_g2c2 of the two fisheye cameras (1) have been obtained, the external parameters T_l2c1 and T_c22c1 between the circular scanning laser radar (3) and the two fisheye cameras (1) are obtained according to step S6. A laser point (x_l, y_l, z_l) of the circular scanning laser radar (3) is projected to the pixel coordinates (u_c, v_c) of the fisheye image of the fisheye camera (1) as follows:
The laser point is first transformed into the fisheye camera coordinates (x_c, y_c, z_c); f_K then denotes the projection of a three-dimensional point in the fisheye camera (1) coordinate system onto the image plane, whose distortion term is

θ_dist = θ·(1 + p1·θ² + p2·θ⁴ + p3·θ⁶ + p4·θ⁸)

wherein f_x, f_y, u_0, v_0, p1, p2, p3, p4 are the internal parameters of the fisheye camera (1), obtained by the intrinsic calibration method.
The external parameter conversion matrices T_g2c1 and T_g2c2 of the fisheye cameras (1) are calculated as

T_g2c = argmin_{R_g2c, t_g2c} Σ_i ‖ f(K, R_g2c·X_i + t_g2c) − p_c,i ‖²

wherein R_g2c and t_g2c denote the rotation matrix and translation vector, X denotes the three-dimensional coordinates of a black dot, f(K, R_g2c·X + t_g2c) the coordinates of that dot after projection onto the image plane, f the projection function of the fisheye camera, p_c the pixel coordinates of the corresponding black dot in the image, and K the internal parameters of the fisheye camera (1).
Compared with the prior art, the invention has the following advantages:
(1) According to the invention, two fisheye cameras (1) with field angles larger than 185 degrees are arranged back to back to form a panoramic camera whose sensing range covers a 360° × 180° spherical panoramic area. The fisheye cameras (1) adopted by the invention are small, so their optical centers are close together after installation, which greatly reduces texture discontinuities of the stitched panoramic image in the transition area.
(2) The panoramic camera is mounted above the laser radar, its stitching transition area points towards the two sides of the vehicle, and the key areas in front of and behind the vehicle therefore retain higher pixel quality. Meanwhile, the whole area above the panoramic camera is unobstructed, so traffic signs such as signal lights and signboards can be recognized.
(3) The fusion device completes the extrinsic calibration of all sensors synchronously in the constructed calibration environment; the environment is easy to build and the calibration steps are concise. Because the wall surfaces and the black dots on them serve as geometric constraints, and pairwise alignment of all sensors is considered jointly, the nonlinearly optimized extrinsic parameters are more accurate.
Drawings
FIG. 1 is a schematic diagram of a fusion device of a panoramic camera and a lidar according to the present invention;
FIG. 2 is a flow chart of a calibration method of a fusion device of a panoramic camera and a laser radar according to the present invention;
FIG. 3 is a diagram showing the data conversion of each sensor of a fusion device of a panoramic camera and a laser radar according to the present invention;
Fig. 4 is a schematic diagram of a calibration environment of a fusion device of a panoramic camera and a laser radar.
Reference numerals: 1. a fish-eye camera; 2. a camera mount; 3. circular scanning type laser radar; 4. a bracket; 5. a transparent housing; 6. a fastening bolt; 7. a base.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
As shown in fig. 1, the fusion device is assembled according to the structure of the fusion device of the panoramic camera and the laser radar, mainly in the following steps: first, the two fisheye cameras 1 are fastened back to back in the camera support 2, and the circular scanning laser radar 3 is fastened to the bracket 4; second, the bracket 4 carrying the circular scanning laser radar 3 is connected at its side edge to the camera support 2, and the camera support 2 is fixed on the base 7 with the fastening bolts 6; finally, the data lines of the fisheye cameras 1 and the laser radar are routed along the side surface of the camera support so that the wire harness does not interfere with the laser radar point cloud.
As an improvement, the fusion device is provided with a transparent shell 5, the transparent shell 5 is made of transparent materials, the transparent materials do not affect the receiving and transmitting of laser beams of the laser radar and the sensitization of the panoramic camera, the top of the transparent shell 5 is hemispherical, the center of the hemispherical is coincident with the center of the spherical panoramic camera, and the distortion of imaging of the panoramic camera is reduced.
When the fusion device is used, only the data lines need to be connected to the corresponding ports of the computing unit; programs for space-time alignment, data fusion and the application algorithms are integrated in the computing unit.
Space-time alignment includes time synchronization and spatial synchronization. Spatial synchronization finds the transformation matrices T that mutually align the data; the data transformation matrices between the sensors of the fusion device are described in detail later. For time synchronization, the two fisheye cameras 1 are of the same model and share the same frame rate, so it suffices to set the same image capture time and start multiple threads that acquire the two fisheye images synchronously. For time synchronization between the heterogeneous sensors, i.e. the circular scanning laser radar 3 and the fisheye cameras 1, the lower frame rate of the circular scanning laser radar 3 is used as the data update rate of the fusion device: whenever a laser point cloud is output, the current images of the fisheye cameras 1 are output with it, forming the time synchronization, as sketched below.
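A schematic sketch of this synchronization policy, assuming hypothetical capture interfaces (grab_frame, lidar_queue) rather than any specific driver API: the lower-rate laser radar drives the update, and the most recent frame of each fisheye camera is attached whenever a point cloud arrives.

```python
import threading

image_buf = {"front": None, "rear": None}   # latest frame from each fisheye camera
buf_lock = threading.Lock()

def camera_thread(name, grab_frame):
    """Continuously store the newest frame; both cameras share the same capture settings."""
    while True:
        frame = grab_frame()                 # hypothetical per-camera capture call
        with buf_lock:
            image_buf[name] = frame

def fused_frames(lidar_queue):
    """Yield (point_cloud, front_image, rear_image) at the lidar frame rate."""
    while True:
        cloud = lidar_queue.get()            # blocks until the next lidar sweep arrives
        with buf_lock:
            yield cloud, image_buf["front"], image_buf["rear"]
```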
For target-level data fusion between the heterogeneous camera and laser radar sensors, the time synchronization problem can be avoided: the observation data z(t) of each sensor is fused with the prediction of the global system in a sensor-to-global Kalman filter manner, and the fused data x̂(t) is

x̂(t|t−1) = F·x̂(t−1)
x̂(t) = x̂(t|t−1) + K·(z(t) − H·x̂(t|t−1))

wherein K is the Kalman gain, H is the space conversion (observation) matrix, F is the state transition matrix, and x̂(t−1) is the system data at the previous time.
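A minimal sketch of one such sensor-to-global fusion step; the noise covariances Q and R are additions needed to compute the Kalman gain K, which the text above treats as given.

```python
import numpy as np

def kalman_fuse(x_prev, P_prev, z, F, H, Q, R):
    """One sensor-to-global Kalman fusion step: predict with F, correct with observation z."""
    x_pred = F @ x_prev                      # prediction of the global system state
    P_pred = F @ P_prev @ F.T + Q
    # Kalman gain, then fusion of the prediction with the sensor observation z(t)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x_prev)) - K @ H) @ P_pred
    return x_new, P_new
```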
The data fusion methods adopted by the fusion device mainly concern the fusion of the panoramic image with the panoramic laser point cloud and are divided into pixel-level fusion, feature-level fusion and target-level fusion, with fusion formulas of the following form:

pixel-level fusion: y = Conv_E([x_c, x_l])

feature-level fusion: y = Conv_M([Conv_c(x_c), Conv_l(x_l)])

target-level fusion: y = fuse(Conv_c(x_c), Conv_l(x_l))

wherein x_c is the original camera image, x_l is the original laser radar point cloud, Conv_E denotes a deep convolutional neural network for pixel-level fusion, Conv_M is a deep convolutional neural network for feature-level fusion, Conv_c and Conv_l denote the neural networks that process the camera image and the laser radar data respectively, [·, ·] denotes concatenation of the inputs, and fuse(·) denotes combining the two independent detection outputs (for example with the Kalman filter described above).
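A toy PyTorch sketch of the pixel-level and feature-level variants; the network depths, channel counts and concatenation-based fusion are illustrative assumptions, only the roles of Conv_E, Conv_M, Conv_c and Conv_l follow the text.

```python
import torch
import torch.nn as nn

class PixelLevelFusion(nn.Module):
    """Conv_E: a single network consumes the channel-wise concatenation of the
    camera image and the point cloud projected onto the image plane."""
    def __init__(self, img_ch=3, pc_ch=1, out_ch=16):
        super().__init__()
        self.conv_e = nn.Sequential(nn.Conv2d(img_ch + pc_ch, out_ch, 3, padding=1),
                                    nn.ReLU())

    def forward(self, x_c, x_l):
        return self.conv_e(torch.cat([x_c, x_l], dim=1))

class FeatureLevelFusion(nn.Module):
    """Conv_c / Conv_l extract per-modality features; Conv_M fuses their concatenation."""
    def __init__(self, img_ch=3, pc_ch=1, feat=8):
        super().__init__()
        self.conv_c = nn.Conv2d(img_ch, feat, 3, padding=1)
        self.conv_l = nn.Conv2d(pc_ch, feat, 3, padding=1)
        self.conv_m = nn.Conv2d(2 * feat, feat, 3, padding=1)

    def forward(self, x_c, x_l):
        return self.conv_m(torch.cat([self.conv_c(x_c), self.conv_l(x_l)], dim=1))
```

Target-level fusion would instead run two independent detectors on x_c and x_l and associate their outputs, for example with the Kalman step shown earlier.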
Besides obstacle detection, the application algorithms include passable (drivable) area detection, traffic sign detection, panoramic image positioning, lane line detection, and the like.
Before the fusion device is used, the extrinsic calibration between its sensors must be completed so that the data can be spatially aligned and data fusion becomes feasible. The device takes the coordinate system of the front-view fisheye camera 1 as the coordinate system of the fusion device, so the transformation matrices T_l2c1 (from the circular scanning laser radar 3 to the front-view fisheye camera 1) and T_c22c1 (from the rear-view fisheye camera 1 to the front-view fisheye camera 1) need to be calculated; the data transformation diagram of the fusion device is shown in fig. 3.
A calibration method of a fusion device of a panoramic camera and a laser radar is shown in fig. 2, and comprises the following steps:
S1: the internal reference of the fish-eye camera 1 is marked by a checkerboard pattern: the calibration of the two fish-eye cameras 1 is completed by adopting a black-and-white checkerboard calibration plate and a Zhang Zhengyou camera calibration method, the circular scanning laser radar 3 performs data calibration before delivery, and provides corresponding internal references of the fish-eye cameras 1, if the internal references are provided with enough precision, the internal references are not calibrated;
S2: setting up a rectangular calibration environment, regularly adhering black dots on the wall surface of the calibration environment, and measuring the three-dimensional coordinates of each black dot in an environment coordinate system; the calibration environment is composed of ground and wall surfaces: a large number of black points are stuck on the wall surface, the coordinates of the black points under a calibration environment coordinate system are known, and in order to make the black points more obvious in the image, the color of the wall surface is distinguished from the color of the black points;
S3: placing the fusion device in a position close to the middle of the calibration environment, and enabling the right front of the circular scanning laser radar 3 to deviate to the corners as much as possible; when the calibration data are collected, the fusion device is placed in the calibration environment close to the middle area, and the front of the circular scanning laser radar 3 is deviated to a certain corner, as shown in fig. 4;
S4: collecting the current image data of the fisheye cameras 1, measuring the pixel coordinates of the black dots in the images and the corresponding three-dimensional coordinates in the environment coordinate system, and obtaining the external parameters T_g2c1 and T_g2c2 between the calibration environment coordinate system and the two fisheye cameras 1 by minimizing a target loss function;
S5: fitting the wall surface and the ground of the calibration environment according to the data point cloud of the circular scanning laser radar 3, and obtaining a conversion relation T l2g from the circular scanning laser radar 3 coordinate system to the calibration environment coordinate system;
s6: and calculating to obtain the external parameter conversion relations T l2c1 and T c22c1 between the two fisheye cameras 1 and the three sensors of the circular scanning laser radar 3 in the whole fusion device according to the obtained external parameter conversion relations T l2g、Tg2c1 and T g2c2.
From the two fisheye images acquired by the two fisheye cameras 1, the pixel positions of the black dots in each fisheye image are obtained, while the three-dimensional positions of the same dots in the environment coordinate system are already known. Minimizing the reprojection-error target loss function with a nonlinear optimization method then yields the external parameter transformation matrices T_g2c1 and T_g2c2 between the environment coordinate system and the two fisheye cameras 1. The target loss function is

T_g2c = argmin_{R_g2c, t_g2c} Σ_i ‖ f(K, R_g2c·X_i + t_g2c) − p_c,i ‖²

wherein R_g2c and t_g2c represent the rotation matrix and translation vector, X represents the three-dimensional coordinates of a black dot, f(K, R_g2c·X + t_g2c) represents the coordinates of that dot after projection onto the image plane, f represents the projection function of the fisheye camera, p_c represents the pixel coordinates of the corresponding black dot in the image, and K represents the internal parameters of the fisheye camera.
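A minimal sketch of this nonlinear optimization for one camera, using OpenCV's fisheye projection (whose distortion polynomial matches the θ_dist expression used here) and SciPy's least_squares; the function names and the rotation-vector parameterization are illustrative choices, not the patent's implementation.

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def reprojection_residuals(params, X, p_c, K, D):
    """Residuals f(K, R·X + t) - p_c for all black dots; params = rvec (3) + tvec (3)."""
    rvec, tvec = params[:3], params[3:]
    proj, _ = cv2.fisheye.projectPoints(X.reshape(-1, 1, 3), rvec, tvec, K, D)
    return (proj.reshape(-1, 2) - p_c).ravel()

def solve_T_g2c(X, p_c, K, D):
    """Nonlinearly optimize one environment -> camera extrinsic T_g2c.

    X   : Nx3 black-dot coordinates in the calibration environment coordinate system
    p_c : Nx2 measured pixel coordinates of the same dots in the fisheye image
    K, D: fisheye intrinsics from step S1 (D holds the distortion coefficients p1..p4)
    """
    res = least_squares(reprojection_residuals, np.zeros(6), args=(X, p_c, K, D))
    T = np.eye(4)
    T[:3, :3] = cv2.Rodrigues(res.x[:3])[0]
    T[:3, 3] = res.x[3:]
    return T
```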
After the external parameter conversion matrices T_l2g, T_g2c1 and T_g2c2 between the calibration environment coordinate system and, respectively, the circular scanning laser radar 3 and the two fisheye cameras 1 have been obtained, the external parameters T_l2c1 and T_c22c1 between the sensors of the fusion device are obtained. A laser point (x_l, y_l, z_l) of the circular scanning laser radar 3 is projected to the pixel coordinates (u_c, v_c) of the fisheye image of the fisheye camera 1 as follows:
The laser point is first transformed into the fisheye camera coordinates (x_c, y_c, z_c); f_K then denotes the projection of a three-dimensional point in the fisheye camera coordinate system onto the image plane, whose distortion term is

θ_dist = θ·(1 + p1·θ² + p2·θ⁴ + p3·θ⁶ + p4·θ⁸)

wherein f_x, f_y, u_0, v_0, p1, p2, p3, p4 are the internal parameters of the fisheye camera, obtained by the intrinsic calibration method.
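A minimal sketch of projecting one lidar point to fisheye pixel coordinates. The distortion polynomial follows the θ_dist expression above; the remaining steps assume the standard equidistant fisheye model, since the original projection formula image is not reproduced here, and the intrinsic values in the usage line are placeholders.

```python
import numpy as np

def project_lidar_point(p_l, T_l2c1, fx, fy, u0, v0, p1, p2, p3, p4):
    """Project one lidar point (x_l, y_l, z_l) to fisheye pixel coordinates (u_c, v_c)."""
    # lidar -> fisheye camera coordinates via the calibrated extrinsic T_l2c1
    x_c, y_c, z_c = (T_l2c1 @ np.append(p_l, 1.0))[:3]
    # angle of the incoming ray from the optical axis, then the distortion polynomial
    r = np.hypot(x_c, y_c)
    if r < 1e-9:                       # point lies on the optical axis
        return u0, v0
    theta = np.arctan2(r, z_c)
    theta_d = theta * (1 + p1*theta**2 + p2*theta**4 + p3*theta**6 + p4*theta**8)
    # equidistant fisheye mapping followed by the affine intrinsic step
    u_c = fx * (theta_d / r) * x_c + u0
    v_c = fy * (theta_d / r) * y_c + v0
    return u_c, v_c

# usage: a point a few metres ahead, identity extrinsics, placeholder intrinsics
print(project_lidar_point(np.array([1.0, 0.5, 5.0]), np.eye(4),
                          300.0, 300.0, 640.0, 640.0, 0.0, 0.0, 0.0, 0.0))
```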
The two fisheye cameras 1 can also perform panoramic image stitching, giving the device the function of a panoramic camera. According to the data conversion matrix between the two fisheye cameras 1, the image of the rear-view fisheye camera 1 is projected into the frame of the front-view fisheye camera 1 to form a spherical panorama: each pixel of the rear-view fisheye camera 1 is first projected onto a sphere of radius 1, and the point on the sphere is then rotated so that it is expressed in the coordinate system of the front-view fisheye camera 1, with the conversion

X_c1 = R_g2c1 · R_g2c2⁻¹ · X_c2

wherein R_g2c2⁻¹ is the inverse of R_g2c2.
The converted points are then projected into the spherical fisheye image according to the f_K projection formula, forming the panoramic image.
In the region where the pixels of the two fisheye cameras 1 overlap after panoramic expansion, the images are fused with an alpha blending algorithm, so the texture and brightness of the fused panoramic image are more continuous and even at the transition, and the constructed panoramic camera produces a seamless panoramic image, as sketched below.
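A minimal sketch of the blended spherical lookup, assuming the front fisheye looks along the +z axis of its own frame, a simple linear alpha ramp across a narrow band around the 90° seam, and nearest-neighbour sampling; none of these choices are specified by the patent.

```python
import numpy as np

def fisheye_sample(img, d, fx, fy, u0, v0, dist):
    """Nearest-neighbour lookup of the pixel hit by unit ray d, using the equidistant
    fisheye model with the distortion polynomial shown above."""
    p1, p2, p3, p4 = dist
    r = np.hypot(d[0], d[1])
    theta = np.arctan2(r, d[2])
    theta_d = theta * (1 + p1*theta**2 + p2*theta**4 + p3*theta**6 + p4*theta**8)
    s = theta_d / r if r > 1e-9 else 0.0
    u, v = int(round(fx * s * d[0] + u0)), int(round(fy * s * d[1] + v0))
    if 0 <= v < img.shape[0] and 0 <= u < img.shape[1]:
        return img[v, u].astype(np.float64)
    return np.zeros(img.shape[2] if img.ndim == 3 else 1)

def panorama_ray(dir_c1, img_front, img_rear, R_c22c1, intr1, intr2, blend_deg=5.0):
    """Colour of one panorama ray (unit vector in the front-fisheye frame),
    alpha-blended across the seam at 90 degrees from the front optical axis."""
    ang = np.degrees(np.arccos(np.clip(dir_c1[2], -1.0, 1.0)))
    front = fisheye_sample(img_front, dir_c1, *intr1)
    rear = fisheye_sample(img_rear, R_c22c1.T @ dir_c1, *intr2)   # front frame -> rear frame
    if ang < 90.0 - blend_deg:
        return front
    if ang > 90.0 + blend_deg:
        return rear
    alpha = (90.0 + blend_deg - ang) / (2.0 * blend_deg)          # 1 on the front side
    return alpha * front + (1.0 - alpha) * rear
```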
In summary, the panoramic camera constructed from the two fisheye cameras perceives the full 360° surroundings and is mounted above the laser radar, so the whole area above the fusion device is free from any occlusion. After the fusion device is installed on the roof of an intelligent automobile, the important areas in front of and behind the vehicle are covered by the camera and the laser radar at the same time. In addition, the calibration method provided by the invention calibrates the external parameters of all sensors of the fusion device simultaneously, and is more stable and reliable.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes and substitutions of equivalents may be made and equivalents will be apparent to those skilled in the art without departing from the scope of the invention. Therefore, the protection scope of the invention is subject to the protection scope of the claims.

Claims (1)

1. A calibration method of a fusion device of a panoramic camera and a laser radar, wherein the fusion device of the panoramic camera and the laser radar is used, two fisheye cameras (1) are fastened back to back in a camera bracket (2) so as to construct a panoramic camera, the circular scanning laser radar (3) is fixed on the bracket (4), and the circular scanning laser radar (3) is fixed below the panoramic camera so that the whole area above the panoramic camera is not blocked; the method is characterized by comprising the following steps:
S1: the internal parameters of the fish-eye camera (1) are marked through a checkerboard pattern: the calibration of the two fish-eye cameras (1) is completed by adopting a black-white checkerboard calibration plate and a Zhang Zhengyou camera calibration method, and the circular scanning laser radar (3) performs data calibration before delivery and provides corresponding internal references of the fish-eye cameras (1);
S2: setting up a rectangular calibration environment, regularly adhering black dots on the wall surface of the calibration environment, and measuring the three-dimensional coordinates of each black dot in an environment coordinate system; the calibration environment is composed of ground and wall surfaces: a large number of black points are stuck on the wall surface, the coordinates of the black points under a calibration environment coordinate system are known, and in order to make the black points more obvious in the image, the color of the wall surface is distinguished from the color of the black points;
S3: placing the fusion device in a position close to the middle of the calibration environment, and enabling the right front of the circular scanning laser radar (3) to deviate to corners as much as possible; when the calibration data are collected, the fusion device is placed in the calibration environment and close to the middle area, and the front of the circular scanning laser radar (3) is deviated to a certain corner;
S4: collecting current image data of the fish-eye cameras (1), measuring pixel coordinates of black points of an image and three-dimensional coordinates of a corresponding environment coordinate system, and obtaining external parameters T g2c1 and T g2c2 between a calibrated environment coordinate system and the two fish-eye cameras (1) by minimizing a target loss function;
S5: fitting the wall surface and the ground of a calibration environment according to the data point cloud of the circular scanning laser radar (3), and acquiring a conversion relation T l2g from the coordinate system of the circular scanning laser radar (3) to the coordinate system of the calibration environment;
s6: according to the obtained external parameter conversion relations T l2g、Tg2c1 and T g2c2, external parameter conversion relations T l2c1 and T c22c1 between the two fisheye cameras (1) and the three sensors of the circular scanning laser radar (3) in the whole fusion device are obtained through calculation:
from the two fisheye images acquired by the two fisheye cameras (1), the pixel positions of the black dots in each fisheye image are obtained, while the three-dimensional positions of the same dots in the environment coordinate system are known; minimizing the reprojection-error target loss function with a nonlinear optimization method then yields the external parameter transformation matrices T_g2c1 and T_g2c2 between the environment coordinate system and the two fisheye cameras (1), the target loss function being

T_g2c = argmin_{R_g2c, t_g2c} Σ_i ‖ f(K, R_g2c·X_i + t_g2c) − p_c,i ‖²

wherein R_g2c and t_g2c represent the rotation matrix and translation vector, X represents the three-dimensional coordinates of a black dot, f(K, R_g2c·X + t_g2c) represents the coordinates of that dot after projection onto the image plane, f represents the projection function of the fisheye camera, p_c represents the pixel coordinates of the corresponding black dot in the image, and K represents the internal parameters of the fisheye camera;
the formula by which a laser data point (x_l, y_l, z_l) of the circular scanning laser radar (3) is projected to the pixel coordinates (u_c, v_c) of the fisheye image of the fisheye camera (1) is as follows:
the laser data point is first converted into the fisheye camera coordinates (x_c, y_c, z_c); f_K then denotes the projection of a three-dimensional point in the fisheye camera (1) coordinate system onto the image plane, whose distortion term is

θ_dist = θ·(1 + p1·θ² + p2·θ⁴ + p3·θ⁶ + p4·θ⁸)

wherein f_x, f_y, u_0, v_0, p1, p2, p3, p4 are the internal parameters of the fisheye camera (1), obtained by the intrinsic calibration method;
the external parameter conversion matrices T_g2c1 and T_g2c2 of the fisheye cameras (1) are calculated as

T_g2c = argmin_{R_g2c, t_g2c} Σ_i ‖ f(K, R_g2c·X_i + t_g2c) − p_c,i ‖²

wherein R_g2c and t_g2c represent the rotation matrix and translation vector, X represents the three-dimensional coordinates of a black dot, f(K, R_g2c·X + t_g2c) the coordinates of that dot after projection onto the image plane, f the projection function of the fisheye camera, p_c the pixel coordinates of the corresponding black dot in the image, and K the internal parameters of the fisheye camera (1).
CN202011153660.5A 2020-10-26 2020-10-26 Fusion device of panoramic camera and laser radar and calibration method thereof Active CN112308927B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011153660.5A CN112308927B (en) 2020-10-26 2020-10-26 Fusion device of panoramic camera and laser radar and calibration method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011153660.5A CN112308927B (en) 2020-10-26 2020-10-26 Fusion device of panoramic camera and laser radar and calibration method thereof

Publications (2)

Publication Number Publication Date
CN112308927A CN112308927A (en) 2021-02-02
CN112308927B true CN112308927B (en) 2024-05-17

Family

ID=74331183

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011153660.5A Active CN112308927B (en) 2020-10-26 2020-10-26 Fusion device of panoramic camera and laser radar and calibration method thereof

Country Status (1)

Country Link
CN (1) CN112308927B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113075683B (en) * 2021-03-05 2022-08-23 上海交通大学 Environment three-dimensional reconstruction method, device and system
CN113034615B (en) * 2021-03-30 2023-05-23 南方电网电力科技股份有限公司 Equipment calibration method and related device for multi-source data fusion
CN113219479A (en) * 2021-05-13 2021-08-06 环宇智行科技(苏州)有限公司 Camera and laser radar synchronization method and system of intelligent driving control system
CN113298878A (en) * 2021-05-19 2021-08-24 的卢技术有限公司 Calibration method and device for vehicle-mounted all-around camera, electronic equipment and readable storage medium
CN117406185B (en) * 2023-12-14 2024-02-23 深圳市其域创新科技有限公司 External parameter calibration method, device and equipment between radar and camera and storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN202782968U (en) * 2012-09-21 2013-03-13 纵横皆景(武汉)信息技术有限公司 Vehicle-mounted measure integrated system based on laser scanning and panorama images
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
CN106443687A (en) * 2016-08-31 2017-02-22 欧思徕(北京)智能科技有限公司 Piggyback mobile surveying and mapping system based on laser radar and panorama camera
CN107274336A (en) * 2017-06-14 2017-10-20 电子科技大学 A kind of Panorama Mosaic method for vehicle environment
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN110677599A (en) * 2019-09-30 2020-01-10 西安工程大学 System and method for reconstructing 360-degree panoramic video image
CN110889829A (en) * 2019-11-09 2020-03-17 东华大学 Monocular distance measurement method based on fisheye lens
CN111145269A (en) * 2019-12-27 2020-05-12 武汉大学 Calibration method for external orientation elements of fisheye camera and single-line laser radar

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016116859A1 (en) * 2016-09-08 2018-03-08 Knorr-Bremse Systeme für Nutzfahrzeuge GmbH Sensor arrangement for an autonomously operated commercial vehicle and a method for round imaging
CN107392851A (en) * 2017-07-04 2017-11-24 上海小蚁科技有限公司 Method and apparatus for generating panoramic picture
US11435456B2 (en) * 2017-12-28 2022-09-06 Lyft, Inc. Sensor calibration facility

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN202782968U (en) * 2012-09-21 2013-03-13 纵横皆景(武汉)信息技术有限公司 Vehicle-mounted measure integrated system based on laser scanning and panorama images
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
CN106443687A (en) * 2016-08-31 2017-02-22 欧思徕(北京)智能科技有限公司 Piggyback mobile surveying and mapping system based on laser radar and panorama camera
CN107274336A (en) * 2017-06-14 2017-10-20 电子科技大学 A kind of Panorama Mosaic method for vehicle environment
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN110677599A (en) * 2019-09-30 2020-01-10 西安工程大学 System and method for reconstructing 360-degree panoramic video image
CN110889829A (en) * 2019-11-09 2020-03-17 东华大学 Monocular distance measurement method based on fisheye lens
CN111145269A (en) * 2019-12-27 2020-05-12 武汉大学 Calibration method for external orientation elements of fisheye camera and single-line laser radar

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
Coal Mine Rescue Robots Based on Binocular Vision: A Review of the State of the Art; Guodong Zhai; IEEE; 2020-07-15; full text *
Calibration of omnidirectional vision sensors (全向视觉传感器标定); 林颖; China Doctoral Dissertations Full-text Database; 2014-06-30; full text *
Obstacle detection method for intelligent vehicles based on information fusion (基于信息融合的智能车障碍物检测方法); 陆峰, 徐友春, 李永乐, 王德宇, 谢德胜; Journal of Computer Applications; 2017-12-20 (S2); full text *
Calibration method for camera and lidar based on a trapezoidal checkerboard (基于梯形棋盘格的摄像机和激光雷达标定方法); 贾子永, 任国全, 李冬伟, 程子阳; Journal of Computer Applications; 2017-07-10 (07); full text *
Joint calibration of lidar and camera (Camera-LiDAR Calibration) with Autoware (激光雷达和相机的联合标定之Autoware); W_Tortoise; https://blog.csdn.net/learning_tortosie/article/details/82347694; 2018-09-03; full text *

Also Published As

Publication number Publication date
CN112308927A (en) 2021-02-02

Similar Documents

Publication Publication Date Title
CN112308927B (en) Fusion device of panoramic camera and laser radar and calibration method thereof
CN110264520B (en) Vehicle-mounted sensor and vehicle pose relation calibration method, device, equipment and medium
CN112233188B (en) Calibration method of data fusion system of laser radar and panoramic camera
US9858639B2 (en) Imaging surface modeling for camera modeling and virtual view synthesis
US10445928B2 (en) Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
WO2021098608A1 (en) Calibration method for sensors, device, system, vehicle, apparatus, and storage medium
WO2021098448A1 (en) Sensor calibration method and device, storage medium, calibration system, and program product
CN111559314B (en) Depth and image information fused 3D enhanced panoramic looking-around system and implementation method
WO2022088103A1 (en) Image calibration method and apparatus
US11977167B2 (en) Efficient algorithm for projecting world points to a rolling shutter image
JP2011215063A (en) Camera attitude parameter estimation device
US20220276360A1 (en) Calibration method and apparatus for sensor, and calibration system
EP3206184A1 (en) Apparatus, method and system for adjusting predefined calibration data for generating a perspective view
CN111260539A (en) Fisheye pattern target identification method and system
CN114445592A (en) Bird view semantic segmentation label generation method based on inverse perspective transformation and point cloud projection
CN110750153A (en) Dynamic virtualization device of unmanned vehicle
CN114782548B (en) Global image-based radar data calibration method, device, equipment and medium
JP2001091649A (en) Ground control point device for synthetic aperture radar image precise geometrical correction
CN115447568A (en) Data processing method and device
KR101816068B1 (en) Detection System for Vehicle Surroundings and Detection Method for Vehicle Surroundings Using thereof
CN116245722A (en) Panoramic image stitching system and method applied to heavy high-speed vehicle
CN115936995A (en) Panoramic splicing method for four-way fisheye cameras of vehicle
US11648888B2 (en) Surround view monitoring system and providing method of the same
JP2001091650A (en) Active ground control point device for synthetic aperture radar image precise geometrical correction
WO2021172264A1 (en) Device for detecting posture/position of detector

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant