Panoramic camera and multi-line laser radar external parameter calibration system
Technical Field
The invention relates to the field of external parameter calibration systems, in particular to an external parameter calibration system for a panoramic camera and a multi-line laser radar.
Background
Developing high-end technical products based on the fusion of laser radar and cameras has become a domestic trend, and the related fusion technology is therefore developing rapidly. The most basic technology underlying this fusion is the calibration of the external parameters between the two sensors, which is the basis for ensuring that the environmental information they perceive is consistent. A number of effective technical schemes have appeared for this calibration, but they mainly address cameras with small distortion and ordinary laser radars; external parameter calibration methods for a panoramic camera and a multi-line laser radar are rarely reported.
The Chinese patent with publication number CN105678783B discloses a data fusion calibration method for a refraction-reflection panoramic camera and a laser radar, wherein the data fusion calibration structure comprises a laser radar and a single-viewpoint refraction-reflection panoramic camera arranged on an environment sensing system body. The combined calibration method comprises the following steps: 1. calibrating the internal parameter K of the camera; 2. solving the refraction point parameters {X_m, Y_m, Z_m} of the refraction mirror surface; 3. solving the world coordinate point parameters {X_w, Y_w, Z_w} of the panoramic camera; 4. measuring the laser radar world coordinate point parameters; 5. jointly calibrating the panoramic camera and the laser radar. That invention is reasonably designed, fuses the data of the laser radar and the panoramic camera, can effectively calibrate the internal parameters of the panoramic camera, and provides a reasonable, quick and effective scheme for the distance measurement and positioning problems in an environment sensing system.
However, the method disclosed above has three limitations. First, although the panoramic camera and the laser radar can be calibrated effectively, the correspondence between camera pixels and radar points cannot be found automatically, so manual intervention is required to find the corresponding points. Second, the method uses a single-line laser radar, so the multiple beams of a multi-line laser radar cannot be fully calibrated. Third, the method is not suitable for calibrating products such as a three-dimensional scanner, in which the panoramic camera remains stationary while the multi-line laser radar continuously rotates through 360 degrees.
Against this technical background, no external parameter calibration method for a multi-line laser radar and a panoramic camera is currently available. Similar calibration methods exist for a single-line laser radar and a panoramic camera, but they cannot find corresponding points automatically, cannot guarantee calibration of all beams of a multi-line laser radar, and cannot be applied to equipment such as scanners. It is therefore necessary to provide a panoramic camera and multi-line laser radar external parameter calibration system to solve the above problems.
Disclosure of Invention
The invention aims to provide a panoramic camera and multi-line laser radar external parameter calibration system so as to solve the problems set forth in the background art above.
In order to achieve the above purpose, the present invention provides the following technical solution: a panoramic camera and multi-line laser radar external parameter calibration system comprises a marked calibration room, a rotating device with an encoder, an algorithm for generating a laser intensity map from accumulated multi-line laser radar data, an algorithm for automatically matching features between the depth map and the panorama, and an algorithm for obtaining the external parameters by optimizing the constraint relations;
two calibration plates for acquiring image feature information are respectively attached to the two side walls and to the front and rear walls of the marked calibration room, and a reflector used by the multi-line laser radar to acquire feature information is arranged at the exact center of each calibration plate;
the rotating device comprises a fixed base, a motor with an encoder, and a placing table arranged at the top end of the fixed base; the motor with the encoder is mounted inside the fixed base, and its output end extends to the top end of the fixed base and is connected to the placing table;
the intensity map generation includes the following steps:
S1, placing the panoramic camera to be calibrated and the multi-line laser radar on the placing table, then rotating the placing table through the motor with the encoder, controlling the motor with the encoder to drive the placing table through a full 360-degree rotation, and recording the encoder data information, the multi-line laser radar data information, and the panoramic image img captured by the panoramic camera together with the corresponding position of the motor with the encoder;
S2, accumulating the point cloud information:
each frame of point cloud information is converted into the world coordinate system according to the following formulas:
X_wij = X_ij × cos(θ_i) - Y_ij × sin(θ_i);
Y_wij = X_ij × sin(θ_i) + Y_ij × cos(θ_i);
Z_wij = Z_ij;
K_wij = K_ij;
wherein {X_wij, Y_wij, Z_wij, K_wij} is the information of the radar laser point in the world coordinate system, {X_ij, Y_ij, Z_ij, K_ij} is the j-th laser point in the point cloud acquired at the i-th moment, and θ_i is the rotation angle obtained from the encoder at the i-th moment;
S3, projecting the point cloud onto the panorama expansion image according to longitude and latitude, wherein the specific projection formulas are as follows:
longitude_ij = arctan(y_ij, x_ij);
v_ij = latitude_ij/360 × img_rows;
u_ij = longitude_ij/180 × img_cols;
wherein {img_rows, img_cols} are the numbers of pixel rows and columns of the panorama expansion image, and {v_ij, u_ij} is the pixel position in the panorama of the j-th point at the i-th moment;
S4, finally mapping the intensity of the laser points onto the panorama according to the pixel positions obtained in step S3;
the depth map and panorama feature automatic matching algorithm comprises panorama feature extraction, laser point cloud intensity map feature extraction and feature automatic matching.
Preferably, the encoder data information in step S1 is {θ_0, θ_1, θ_2, … θ_n}, and the data information of the multi-line laser radar is {C_0, C_1, C_2, … C_n}, where C_i = {P_0, P_1, … P_m} and P_j = {x, y, z, k};
wherein: θ_i is the rotation angle read from the encoder for each frame;
C_i is the single-frame point cloud acquired at the i-th moment;
{x, y, z, k} are the position information and intensity information of each point in the point cloud.
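For illustration only, the data recorded in step S1 can be held in simple containers such as the following sketch; the class names LidarPoint and LidarFrame are hypothetical and not part of the invention.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LidarPoint:
    """P_j = {x, y, z, k}: the position of one laser point and its reflection intensity."""
    x: float
    y: float
    z: float
    k: float

@dataclass
class LidarFrame:
    """C_i: the single-frame point cloud acquired at the i-th moment, stored together
    with the encoder angle theta_i (in degrees) read for that frame."""
    theta_deg: float
    points: List[LidarPoint]

# One full rotation of the placing table is then a list of frames {C_0 ... C_n}
# plus the panoramic image img captured by the panoramic camera.
```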
Preferably, the panorama feature extraction captures the texture information of the whole calibration room, in which the positions of the calibration plates can be recognized visually; meanwhile, the center of each calibration plate is determined by identifying the plate and is taken as a feature point of the panorama, and the center feature point identification process is as follows:
S5, first performing panorama binarization: setting a threshold T and traversing each pixel of the panoramic image;
S6, when the pixel value is greater than T, setting the pixel to black, otherwise setting it to white;
S7, identifying rectangular frames: counting the number of vertices along the edge of each black region of the panoramic image; when the number of vertices equals 4, a rectangular frame is identified, which is the position of a calibration plate;
S8, finally calculating the position of the center point of the calibration plate by the following formulas:
u = (u_1 + u_2 + u_3 + u_4)/4;
v = (v_1 + v_2 + v_3 + v_4)/4;
where {u, v} is the position of the center point of the calibration plate and {u_i, v_i}, i ∈ {1, 2, 3, 4}, are the pixel positions of the four vertices of the calibration plate.
Preferably, the laser point cloud intensity map feature extraction includes clustering of the high-reflection-intensity pixels and determination of the feature center points, specifically as follows:
first, clustering is performed: because each reflector has a certain area, the laser points that hit the same reflector fall at different positions scattered within a certain range, so in order to determine the center position of each reflector, the laser reflection points belonging to the same reflector must be clustered together; the clustering method is a threshold method, in which the distance d_ji between the i-th pixel point and the j-th pixel point, whose positions are {u_i, v_i} and {u_j, v_j} respectively, is calculated;
a threshold T′ is then set, and points whose distance is smaller than the threshold are gathered into one cluster, so that the high-reflection laser points returned by the same reflector are classified together in the intensity map;
the position of the reflector center in the intensity map is then calculated by the following formulas:
u′ = (u′_1 + u′_2 + … + u′_n)/n;
v′ = (v′_1 + v′_2 + … + v′_n)/n;
where {u′, v′} is the position of the reflector center in the intensity map and {u′_i, v′_i}, i ∈ {1, 2 … n}, are the positions of the n pixels in the cluster.
Preferably, the automatic feature matching uses a nearest-neighbor method to automatically match the calibration plate centers with the reflector centers; specifically, the distance d′_ji between the center point of the i-th calibration plate and the center point of the j-th reflector is calculated, and then, for each calibration plate center point, the nearest reflector center point is selected as its matching point, generating the matching point set {{(u_i, v_i), (u′_i, v′_i)} | i ∈ {1, 2 … n}}, so that the matching of the feature points is completed automatically.
Preferably, the algorithm for obtaining the external parameters by optimizing the constraint relations must account for both the rotation and the translation between the camera and the radar; the external parameters are set as the rotation {roll, pitch, yaw} and the translation {tx, ty, tz}, and the following equation set F is then established for each pair of matching points in the matching point set produced by the depth map and panorama feature automatic matching algorithm:
x′ = cp·cy·x + (sr·sp·cy - cr·sy)·y + (cr·sp·cy + sr·sy)·z + tx;
y′ = cp·sy·x + (sr·sp·sy + cr·cy)·y + (cr·sp·sy - sr·cy)·z + ty;
z′ = -sp·x + sr·cp·y + cr·cp·z + tz;
u′ = arctan(y′, x′)/180 × img_cols;
u = u′;
v = v′;
where cp = cos(pitch), sp = sin(pitch), cr = cos(roll), sr = sin(roll), cy = cos(yaw), sy = sin(yaw);
thus the constraint relation equations constructed from one pair of matching points are established; with n pairs of matching points in total, n equation sets {F_1, F_2 … F_n} are established, and solving these equation sets yields the external parameters {roll, pitch, yaw} and {tx, ty, tz}, thereby realizing the calibration of the external parameters between the panoramic camera and the multi-line laser radar.
The technical effects and advantages of the invention are as follows: a calibration room with calibration plates and reflectors is designed, and a rotating device with an encoder is arranged at the center of the calibration room; the device combining a panoramic camera and a multi-line laser radar rotates together with this rotating device, the first panoramic camera photo is captured, and the accumulated multi-line laser radar information is used to generate a laser intensity image. The panoramic photo and the depth (intensity) image are then automatically matched to establish the constraint relations, which is made possible by the marked calibration room and the dedicated device, and the constraints are optimized and solved by the least-squares method. The result is a brand-new external parameter calibration system for the multi-line laser radar and the panoramic camera, with automatic calibration, multi-line radar support, applicability to calibrating scanning devices, and other functions.
Drawings
FIG. 1 is a schematic view of a calibration room structure of the present invention.
Fig. 2 is a schematic view of a rotary device according to the present invention.
Fig. 3 is a schematic diagram of the device to be calibrated (panoramic camera and lidar) of the present invention.
Fig. 4 is a flow chart of laser intensity map generation according to the present invention.
FIG. 5 is a schematic view of a laser point cloud of a calibration room of the present invention.
Fig. 6 is a graph of laser point cloud intensity according to the present invention.
Fig. 7 is a flow chart of the feature automatic matching of the present invention.
Fig. 8 is a panorama of the calibration room according to the present invention.
Fig. 9 is a flowchart for determining the position of the feature point of the panorama of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The invention provides a panoramic camera and multi-line laser radar external parameter calibration system as shown in figs. 1-9, which comprises a marked calibration room, a rotating device with an encoder, an algorithm for generating a laser intensity map from accumulated multi-line laser radar data, an algorithm for automatically matching features between the depth map and the panorama, and an algorithm for obtaining the external parameters by optimizing the constraint relations;
as shown in fig. 1, two calibration plates for acquiring image feature information are respectively attached to the two side walls and to the front and rear walls of the marked calibration room, and a reflector used by the multi-line laser radar to acquire feature information is arranged at the exact center of each calibration plate;
as shown in fig. 2, the rotating device comprises a fixed base, a motor with an encoder, and a placing table arranged at the top end of the fixed base; the motor with the encoder is mounted inside the fixed base, and its output end extends to the top end of the fixed base and is connected to the placing table. Specifically, the panoramic camera and multi-line laser radar device to be calibrated is placed on the placing table during calibration, and the motor with the encoder is then controlled to drive the placing table to rotate in place by more than 180 degrees.
Algorithm for generating a laser intensity map from accumulated multi-line laser radar data: as shown in fig. 3, the panoramic camera and the laser radar are mounted together on the rotary table and rotated for calibration; this arrangement is the same as the structure of a scanner device, so the system is also suitable for calibrating scanner devices. An intensity map is then generated according to the flowchart of fig. 4.
The intensity map generation includes the following steps:
S1, placing the panoramic camera to be calibrated and the multi-line laser radar on the placing table, then rotating the placing table through the motor with the encoder, controlling the motor with the encoder to drive the placing table through a full 360-degree rotation, and recording during the whole process the encoder data information {θ_0, θ_1, θ_2, … θ_n}, the multi-line laser radar data information {C_0, C_1, C_2, … C_n}, where C_i = {P_0, P_1, … P_m} and P_j = {x, y, z, k}, and the panoramic image img captured by the panoramic camera together with the corresponding position of the motor with the encoder;
wherein: θ_i is the rotation angle read from the encoder for each frame;
C_i is the single-frame point cloud acquired at the i-th moment;
{x, y, z, k} are the position information and intensity information of each point in the point cloud;
S2, accumulating the point cloud information:
each frame of point cloud information is converted into the world coordinate system (whose origin is the exact center of the turntable and whose orientation corresponds to the zero position of the turntable rotation) according to the following formulas:
X_wij = X_ij × cos(θ_i) - Y_ij × sin(θ_i);
Y_wij = X_ij × sin(θ_i) + Y_ij × cos(θ_i);
Z_wij = Z_ij;
K_wij = K_ij;
wherein {X_wij, Y_wij, Z_wij, K_wij} is the information of the radar laser point in the world coordinate system, {X_ij, Y_ij, Z_ij, K_ij} is the j-th laser point in the point cloud acquired at the i-th moment, and θ_i is the rotation angle obtained from the encoder at the i-th moment. In this way all the laser point information can be converted into the world coordinate system, the accumulation of the point clouds is achieved, and a complete calibration room point cloud image as shown in fig. 5 is generated. In the figure, the black dots represent strong reflection points and the white dots represent low reflection points. Strong reflection points occur because the reflectors in the calibration room have high reflectivity, so the reflection intensity of the laser points is large where a reflector is present and small elsewhere.
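As an illustrative aid only (not part of the claimed method), step S2 can be sketched as follows; the function name accumulate_frame is hypothetical.

```python
import math

def accumulate_frame(points, theta_deg):
    """Rotate one lidar frame C_i, acquired at encoder angle theta_i (degrees),
    into the world coordinate system of the turntable (step S2).
    points: iterable of (x, y, z, k); returns a list of (Xw, Yw, Zw, Kw)."""
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    world = []
    for x, y, z, k in points:
        xw = x * c - y * s   # X_wij = X_ij*cos(theta_i) - Y_ij*sin(theta_i)
        yw = x * s + y * c   # Y_wij = X_ij*sin(theta_i) + Y_ij*cos(theta_i)
        world.append((xw, yw, z, k))   # Z and the intensity K are unchanged
    return world
```

Running this for every frame and concatenating the results yields the accumulated calibration room point cloud of fig. 5.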
S3, projecting the point cloud onto the panorama expansion chart according to longitude and latitude, wherein a projection formula is as follows:
longitude_ij = arctan(y_ij, x_ij);
v_ij = latitude_ij/360 × img_rows;
u_ij = longitude_ij/180 × img_cols;
wherein {img_rows, img_cols} are the numbers of pixel rows and columns of the panorama expansion image, and {v_ij, u_ij} is the pixel position in the panorama of the j-th point at the i-th moment;
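A minimal sketch of the projection in step S3 is given below. The text does not define how latitude_ij is computed; the sketch assumes it is the elevation angle arctan(z, sqrt(x^2 + y^2)) in degrees and simply wraps out-of-range pixel indices; the function name project_point is illustrative.

```python
import math

def project_point(xw, yw, zw, img_rows, img_cols):
    """Map one world-frame point to a pixel (v, u) of the panorama expansion image."""
    longitude = math.degrees(math.atan2(yw, xw))                 # in (-180, 180]
    latitude = math.degrees(math.atan2(zw, math.hypot(xw, yw)))  # assumed elevation angle
    v = int(latitude / 360.0 * img_rows) % img_rows              # v_ij = latitude_ij/360 * img_rows
    u = int(longitude / 180.0 * img_cols) % img_cols             # u_ij = longitude_ij/180 * img_cols
    return v, u
```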
s4, finally mapping the intensity of the laser points onto the panorama according to the pixel position in the step S3 to form an intensity diagram shown in FIG. 6;
the depth map and panorama feature automatic matching algorithm comprises panorama feature extraction, laser point cloud intensity map feature extraction and feature automatic matching, and the algorithm flow is shown in a block diagram 7.
Preferably, the panorama feature extraction captures the texture information of the whole calibration room, as shown in fig. 8, in which the positions of the calibration plates can be seen directly; meanwhile, the center of each calibration plate is determined by identifying the plate and is taken as a feature point of the panorama, and the center feature point identification process is as follows (see fig. 9):
S5, first performing panorama binarization: setting a threshold T and traversing each pixel of the panoramic image;
S6, when the pixel value is greater than T, setting the pixel to black, otherwise setting it to white;
S7, identifying rectangular frames: counting the number of vertices along the edge of each black region of the panoramic image; when the number of vertices equals 4, a rectangular frame is identified, which is the position of a calibration plate;
S8, finally calculating the position of the center point of the calibration plate by the following formulas:
u = (u_1 + u_2 + u_3 + u_4)/4;
v = (v_1 + v_2 + v_3 + v_4)/4;
where {u, v} is the position of the center point of the calibration plate and {u_i, v_i}, i ∈ {1, 2, 3, 4}, are the pixel positions of the four vertices of the calibration plate.
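A possible implementation of steps S5 to S8 is sketched below using OpenCV; the contour functions cv2.findContours and cv2.approxPolyDP stand in for the edge-walking and vertex-counting of step S7, and the default threshold T = 128 and the function name find_plate_centers are illustrative assumptions.

```python
import cv2
import numpy as np

def find_plate_centers(panorama_gray, T=128):
    """panorama_gray: 8-bit grayscale panorama. Returns a list of (u, v) plate centers."""
    # S5/S6: binarize - pixels brighter than T become black (0), the rest white (255)
    binary = np.where(panorama_gray > T, 0, 255).astype(np.uint8)
    # S7: find the outer contour of every black region (black regions become the
    # foreground by inverting the image) and keep quadrilaterals only
    contours, _ = cv2.findContours(255 - binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    centers = []
    for cnt in contours:
        approx = cv2.approxPolyDP(cnt, 0.02 * cv2.arcLength(cnt, True), True)
        if len(approx) == 4:  # four vertices -> rectangular frame of a calibration plate
            # S8: the center is the mean of the four vertex pixel positions
            u = float(np.mean(approx[:, 0, 0]))
            v = float(np.mean(approx[:, 0, 1]))
            centers.append((u, v))
    return centers
```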
Preferably, the laser point cloud intensity map feature extraction includes clustering of the high-reflection-intensity pixels and determination of the feature center points, specifically as follows:
first, clustering is performed: because each reflector has a certain area, the laser points that hit the same reflector fall at different positions scattered within a certain range, so in order to determine the center position of each reflector, the laser reflection points belonging to the same reflector must be clustered together; the clustering method is a threshold method, in which the distance d_ji between the i-th pixel point and the j-th pixel point, whose positions are {u_i, v_i} and {u_j, v_j} respectively, is calculated;
a threshold T′ is then set, and points whose distance is smaller than the threshold are gathered into one cluster, so that the high-reflection laser points returned by the same reflector are classified together in the intensity map;
the position of the reflector center in the intensity map is then calculated by the following formulas:
u′ = (u′_1 + u′_2 + … + u′_n)/n;
v′ = (v′_1 + v′_2 + … + v′_n)/n;
where {u′, v′} is the position of the reflector center in the intensity map and {u′_i, v′_i}, i ∈ {1, 2 … n}, are the positions of the n pixels in the cluster.
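The threshold clustering and the centroid computation can be sketched as follows; the Euclidean pixel distance and the simple cluster-growing loop are assumptions, since the text only states that pixels closer than the threshold T′ are gathered into one cluster.

```python
import math

def cluster_reflector_centers(pixels, T_prime=5.0):
    """pixels: (u, v) positions of the high-reflection-intensity pixels in the intensity map.
    Groups pixels whose distance to a cluster member is below T_prime and returns
    one centroid (u', v') per cluster, i.e. one candidate reflector center per cluster."""
    clusters = []
    for u, v in pixels:
        placed = False
        for cluster in clusters:
            # assumed Euclidean pixel distance d_ji
            if any(math.hypot(u - cu, v - cv) < T_prime for cu, cv in cluster):
                cluster.append((u, v))
                placed = True
                break
        if not placed:
            clusters.append([(u, v)])
    # centroid of each cluster: u' = mean(u'_i), v' = mean(v'_i)
    return [(sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c)) for c in clusters]
```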
Since the intensity map and the panorama have the same size, and the reflector is attached to the exact center of the calibration plate when the calibration room is designed, the calibration plate center and the reflector center obtained by the above calculations should lie at the same position if there were no error in the external parameters between the multi-line radar and the panoramic camera. However, because the initial set of external parameters contains a small error, there is a small deviation between the positions of the calibration plate center and the reflector center. Based on this analysis, the calibration plate centers can be automatically matched with the reflector centers using a nearest-neighbor method: the distance d′_ji between the center point of the i-th calibration plate and the center point of the j-th reflector is calculated; then, for each calibration plate center point, the nearest reflector center point is selected as its matching point, generating the matching point set {{(u_i, v_i), (u′_i, v′_i)} | i ∈ {1, 2 … n}}, so that the matching of the feature points is completed automatically.
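A sketch of the nearest-neighbor matching is given below; the function name match_centers is illustrative.

```python
import math

def match_centers(plate_centers, reflector_centers):
    """plate_centers: (u, v) calibration plate centers from the panorama;
    reflector_centers: (u', v') reflector centers from the intensity map.
    For every plate center the nearest reflector center is chosen, giving the
    matching point set {((u_i, v_i), (u'_i, v'_i))}."""
    matches = []
    for u, v in plate_centers:
        nearest = min(reflector_centers, key=lambda c: math.hypot(u - c[0], v - c[1]))
        matches.append(((u, v), nearest))
    return matches
```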
Preferably, the algorithm for obtaining the external parameters by optimizing the constraint relations must account for both the rotation and the translation between the camera and the radar; the external parameters are set as the rotation {roll, pitch, yaw} and the translation {tx, ty, tz}, and the following equation set F is then established for each pair of matching points in the matching point set produced by the depth map and panorama feature automatic matching algorithm:
x′ = cp·cy·x + (sr·sp·cy - cr·sy)·y + (cr·sp·cy + sr·sy)·z + tx;
y′ = cp·sy·x + (sr·sp·sy + cr·cy)·y + (cr·sp·sy - sr·cy)·z + ty;
z′ = -sp·x + sr·cp·y + cr·cp·z + tz;
u′ = arctan(y′, x′)/180 × img_cols;
u = u′;
v = v′;
where cp = cos(pitch), sp = sin(pitch), cr = cos(roll), sr = sin(roll), cy = cos(yaw), sy = sin(yaw);
thus the constraint relation equations constructed from one pair of matching points are established; with n pairs of matching points in total, n equation sets {F_1, F_2 … F_n} are established, and solving these equation sets yields the external parameters {roll, pitch, yaw} and {tx, ty, tz}, thereby realizing the calibration of the external parameters between the panoramic camera and the multi-line laser radar.
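The constraint system can be solved with a least-squares optimizer, consistent with the least-squares solution mentioned in the summary of the invention. The sketch below is illustrative only: it assumes that each matched intensity-map pixel can be traced back to its 3D lidar point (lidar_points), and it constrains only the panorama column through the longitude projection given above; the row constraint would be added in the same way once the latitude projection of the transformed point is specified. All names are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

def solve_extrinsics(lidar_points, panorama_pixels, img_cols):
    """lidar_points: (n, 3) array of 3D points behind the matched reflector centers.
    panorama_pixels: (n, 2) array of the matched calibration plate centers (u, v).
    Returns (roll, pitch, yaw, tx, ty, tz) minimising the constraint residuals."""
    P = np.asarray(lidar_points, dtype=float)
    U = np.asarray(panorama_pixels, dtype=float)

    def residuals(params):
        roll, pitch, yaw, tx, ty, tz = params
        cr, sr = np.cos(roll), np.sin(roll)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        x, y, z = P[:, 0], P[:, 1], P[:, 2]
        # equation set F: rotate and translate the lidar point into the camera frame
        xp = cp * cy * x + (sr * sp * cy - cr * sy) * y + (cr * sp * cy + sr * sy) * z + tx
        yp = cp * sy * x + (sr * sp * sy + cr * cy) * y + (cr * sp * sy - sr * cy) * z + ty
        # project to the panorama column as in step S3 and compare with the camera pixel u
        u_pred = np.degrees(np.arctan2(yp, xp)) / 180.0 * img_cols
        return u_pred - U[:, 0]

    result = least_squares(residuals, x0=np.zeros(6))
    return result.x  # roll, pitch, yaw, tx, ty, tz
```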
Finally, it should be noted that: the foregoing description is only illustrative of the preferred embodiments of the present invention, and although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that modifications may be made to the embodiments described, or equivalents may be substituted for elements thereof, and any modifications, equivalents, improvements or changes may be made without departing from the spirit and principles of the present invention.