CN114140534A - Combined calibration method for laser radar and camera
- Publication number: CN114140534A (application CN202111350523.5A)
- Authority: CN (China)
- Prior art keywords: camera, calibration plate, calibration, frame, image
- Prior art date: 2021-11-15
- Legal status: Pending
Classifications
- G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G01S7/497: Means for monitoring or calibrating
- G06T2207/10024: Color image
- G06T2207/10032: Satellite or aerial image; Remote sensing
- G06T2207/10044: Radar image
- G06T2207/30204: Marker
- G06T2207/30208: Marker matrix
Abstract
The invention discloses a combined calibration method for a laser radar and a camera, comprising the following steps: 1) the exposure and focal length parameters of the camera are detected and adjusted by a camera detection module until they fall within preset threshold ranges, ensuring the accuracy of the calibration result; 2) static frame detection is performed by a static frame detection module on the calibration plate, which is kept in motion, and the camera image frames meeting the requirements are selected to form a static frame set {S}, in preparation for the subsequent data acquisition by the laser radar and the camera; 3) the combined calibration module first calibrates the camera intrinsic parameters and then, after the 3D space coordinates and the 2D pixel coordinates of the calibration plate are acquired, obtains the final combined calibration result from the conversion relation between the two sets of coordinates. The invention selects a square monochromatic plate as the calibration plate, and device parameter adjustment, data frame selection and calibration result computation are carried out automatically throughout, without manual intervention.
Description
Technical Field
The invention relates to the field of multi-sensor fusion calibration, in particular to a combined calibration method for a laser radar and a camera.
Background
With the rapid development of computer and machine vision technology, research on mobile robots has increasingly become a hotspot and a challenge in the field of robotics. A mobile robot's perception of the external environment during motion must be obtained through sensors, among which cameras and laser radars are the two most frequently used. A visual camera offers rich color information and high resolution but is particularly sensitive to illumination; a laser radar, while immune to lighting conditions and able to provide accurate geometric information, has low resolution and a low refresh rate. The data collected by a single sensor therefore often cannot provide sufficiently clear and accurate environmental information for the mobile robot, so a multi-sensor fusion scheme must be adopted, and the prerequisite of such fusion is parameter calibration among the sensors.
Most existing combined calibration methods do not explicitly describe the detection and adjustment of the vision equipment before data acquisition, neglecting the important influence of hardware parameters on the calibration result. During calibration, the calibration plate is usually placed statically at several preset positions for data acquisition, which is inefficient and yields calibration results of limited generality. Some calibration methods require a complicated calibration device, or a calibration plate with a special geometric shape, to assist the calibration operation, and their calculation process is cumbersome and hard to use.
Disclosure of Invention
Aiming at the above technical defects in the prior art, the invention provides a combined calibration method for a laser radar and a camera. Through a modular design it provides an implementation for detecting the camera parameters, ensuring the accuracy of the image data; it replaces the traditional data acquisition mode of a stationary calibration plate with real-time data screening and acquisition while the plate is in motion; finally, it solves the transformation parameter matrix between the laser radar and the camera using a coordinate conversion formula, achieving the joint calibration of the two sensors. The whole calibration method is practical and efficient, achieves high calibration precision, and better meets actual requirements.
In order to achieve the purpose, the invention adopts the following technical scheme:
a combined calibration method for a laser radar and a camera specifically comprises the following steps:
1) the exposure and focal length parameters of the camera are detected and adjusted by a camera detection module until they fall within preset threshold ranges, ensuring the accuracy of the calibration result;
2) static frame detection is performed by a static frame detection module on the calibration plate, which is kept in motion, and the camera image frames meeting the requirements are selected to form a static frame set {S}, in preparation for the subsequent data acquisition by the laser radar and the camera;
3) the combined calibration module first calibrates the camera intrinsic parameters; then, after the 3D space coordinates and the 2D pixel coordinates of the calibration plate are acquired, the final combined calibration result is obtained from the conversion relation between the two sets of coordinates.
Further, the step 1) comprises the following steps:
1-1) shooting an initial image containing a calibration plate by using a camera;
1-2) carrying out corner point detection on the acquired camera image to obtain position information of each corner point in a pixel coordinate system; connecting the coordinates of each corner point to form a polygonal pattern of the calibration plate, namely obtaining the area of the calibration plate in the initial camera image;
1-3) calculating the average value RGB of pixels in the area where the calibration plate is located in the initial image, comparing the average value RGB with a preset pixel threshold value TH, and if the pixel average value RGB is within the threshold value TH range, turning to the step 1-4); otherwise, sending a signal to the camera to automatically adjust the exposure parameter of the camera and collecting the image again;
1-4) shooting with the exposure-adjusted camera to obtain a new image containing the calibration plate;
1-5) carrying out angular point detection on the acquired camera image again to obtain the area information of the calibration plate in the camera image;
1-6) scoring the sharpness of the region of the camera image where the calibration plate is located using the Tenengrad gradient method; gradient values in the horizontal and vertical directions are computed with the Sobel operator, and the average gray value of the processed gradient image is taken as the measure of image sharpness; if this value lies within the preset threshold range [T1, T2], the camera is well focused and the detection and adjustment of the camera exposure and sharpness is complete; otherwise, a signal is sent to the camera to adjust its focal length, and the image is acquired again.
Further, the step 2) comprises the following steps:
2-1) fixing the laser radar and the camera on the same base to form a sensor module to be calibrated, ensuring that the base is fixed and the relative positions of the laser radar and the camera are not changed, and ensuring that the detection range of the laser radar and the visual field of the camera have an overlapping area of more than 60%;
2-2) selecting a square monochromatic plate as the calibration plate and moving it within the overlapping range of the laser radar and camera fields of view, the motion including both translation and rotation;
2-3) defining the calibration plate coordinate system, denoted O_B, which also serves as the world coordinate system O_W; the corner point at the upper left of the calibration plate is selected as the origin of the plate coordinate system, the two perpendicular edges meeting there point along its X and Y axes, and the Z axis is determined by the right-hand rule; from the actual size of the square calibration plate, the coordinates P_B (also P_W) of the plate corner points in O_B are calculated;
2-4) from the coordinates P_B and the coordinates P_px of the corner points in the pixel coordinate system, the conversion matrix T from the calibration plate coordinate system to the camera coordinate system is solved with the EPnP algorithm; processing each frame of image data acquired by the camera during the motion yields a list {T_i} of plate-to-camera conversion matrices;
2-5) determining the amount of calibration plate motion between the acquisition instants of any two camera frames; first, an arbitrary point P_Bi on the calibration plate coordinate system in the i-th camera frame is selected; by the equation P'_Bj = T_j^(-1) * T_i * P_Bi, the projected position P'_Bj of the selected point in the j-th camera frame is found; if the calibration plate has not moved between the two instants, then T_i = T_j, i.e. P'_Bj = P_Bi; otherwise the same equation yields P'_Bj, and the difference between it and its actual position P_Bj in the j-th camera frame is the displacement of the calibration plate between the two instants;
2-6) comparing adjacent data frames and selecting those whose motion change meets the requirement as static frames; for the calibration plate with its list of corner points C = {X_n}, the displacement is determined by the equation m(C, i, j) = max_n d(X_ni, X_nj); each camera frame is traversed according to the criterion m(C, i-1, i) < x && m(C, i, i+1) < x, and a frame is selected as a static frame if and only if its motion relative to both adjacent frames is below the displacement threshold; in addition, the selected static frames are required to be distributed over the overlapping area of the laser radar and camera fields of view, i.e. m(C, s_i, s_j) > y for s_i, s_j ∈ {S}, where x = 1 cm, y = 300 cm, and S is the selected set of static frames.
Further, the step 3) comprises the following steps:
3-1) calibrating the camera intrinsics to obtain the matrix K = [f_x 0 c_x; 0 f_y c_y; 0 0 1], where K represents the intrinsic parameters of the camera, f_x and f_y denote the focal length, and c_x and c_y the principal point offset;
3-2) collecting the laser radar three-dimensional point cloud data and the camera image data of the frame while selecting the static frame, and adding the collected selected frame data into a static frame set { S } if the collected selected frame data can clearly and comprehensively describe the calibration plate information; otherwise, skipping the frame, reselecting the next static frame and carrying out data acquisition;
3-3) processing the point cloud data collected by the laser radar to obtain the spatial coordinates of the calibration plate corner points; first, plane fitting is performed on the laser point cloud with the RANSAC algorithm to extract the plane of the calibration plate in the laser radar coordinate system; all points are then projected onto the fitted plane, and the boundary point set of the point cloud is extracted by a grid-division method; straight lines are fitted to the boundary points with the least squares method, giving the outer edge contour lines and their line equations; the intersection points of the four outer edge lines form the four corner points of the calibration plate, and the coordinates of each corner point, obtained by solving the line equations, serve as the 3D space coordinates of the calibration plate at this position;
3-4) performing corner detection on the image information corresponding to each static frame; the acquired camera image is first converted to a grayscale image, the positions of strong corner points in the image are determined with the Shi-Tomasi algorithm, and their sub-pixel-accurate coordinates are then extracted and used as the 2D pixel coordinates of the calibration plate corner points at this position;
3-5) from the conversion formula between the world coordinate system and the camera coordinate system,

s * p_px = K * [R | t] * P_W

where P_W represents the 3D space coordinates of the calibration plate corner points obtained in step 3-3), p_px represents the 2D pixel coordinates of the calibration plate corner points obtained in step 3-4) (s being a scale factor), [R | t] represents the extrinsic matrix of the camera, and K represents the camera intrinsic matrix obtained in step 3-1); the EPnP algorithm is used to solve for the rotation matrix R and the translation vector t between the laser radar coordinate system and the camera coordinate system, thereby realizing the joint calibration of the laser radar and the camera.
The calibration plate is a square monochromatic plate with a side length of 0.5 m and a thickness of 5 mm, made of a diffuse reflection material, which facilitates detection by the laser radar.
Compared with the existing calibration method, the invention has the advantages and beneficial effects that:
1. The invention adopts a modular design, dividing the whole calibration process into three steps: hardware parameter optimization, dynamic data acquisition, and joint sensor calibration; on the basis of keeping the fused information correct, corresponding modules can be added or removed as required, and the method supports the combined calibration of at least one laser radar and one camera.
2. The invention provides a concrete implementation of camera parameter detection: the exposure and focal length parameters of the camera are automatically detected, adjusted and optimized before formal calibration, ensuring the integrity and accuracy of the image data.
3. The invention does not acquire data with the calibration plate standing still at preset positions; instead, it creatively performs real-time data screening and acquisition while the calibration plate is in motion, selecting the data frames that meet the requirement as acquisition objects by comparing the displacement between adjacent frames; the whole process is carried out automatically, without manual intervention.
4. The invention selects a square monochromatic plate as the calibration plate and takes the four corner points of the square plate as the points to be detected; the calibration plate is made of a diffuse reflection material, which gives a better detection effect for the laser radar.
The calibration method provided by the invention does not require manually identifying and matching corresponding detection points of the laser radar and the camera; instead, it uses the conversion relation between the two coordinate systems and a data-fitting optimization method to compute the solution, which ensures the rigour and accuracy of the calibration process and avoids arbitrariness in the calibration result.
Drawings
FIG. 1 is a flow chart of the calibration method of the present invention.
FIG. 2 is a flowchart of a camera detection module in the calibration method of the present invention.
FIG. 3 is a flowchart of the step of detecting the stationary frame in the calibration method of the present invention.
FIG. 4 is a schematic structural diagram of the calibration method of the present invention.
FIG. 5 is a flowchart of the combined calibration steps of the lidar and the camera in the calibration method of the present invention.
FIG. 6 is a schematic plan view of a calibration plate fitted from a calibration plate laser point cloud.
Fig. 7 is a schematic diagram of the fitted outer edge profile and corner points of the calibration plate.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. The following examples are only for illustrating the technical solutions of the present invention more clearly, and the protection scope of the present invention is not limited thereby.
As shown in fig. 1, a joint calibration method for a laser radar and a camera specifically includes the following steps:
1) The exposure and focal length parameters of the camera are detected and adjusted by the camera detection module until they fall within the preset threshold ranges, ensuring the accuracy of the calibration result.
As shown in fig. 2, the step 1) includes the following steps:
1-1) shooting an initial image containing a calibration plate by using a camera;
1-2) carrying out corner point detection on the acquired camera image to obtain position information of each corner point in a pixel coordinate system; connecting the coordinates of each corner point to form a polygonal pattern of the calibration plate, namely obtaining the area of the calibration plate in the initial camera image;
1-3) calculating the average pixel value RGB of the area where the calibration plate is located in the initial image and comparing it with a preset pixel threshold TH (for example, TH = 128 or close to it, each pixel taking values in [0, 255]); if the pixel average RGB is within the threshold range (RGB = TH ± 5), go to step 1-4); otherwise, a signal is sent to the camera to automatically adjust its exposure parameter, and the image is acquired again;
1-4) shooting with the exposure-adjusted camera to obtain a new image containing the calibration plate;
1-5) carrying out angular point detection on the acquired camera image again to obtain the area information of the calibration plate in the camera image;
1-6) scoring the sharpness of the region of the camera image where the calibration plate is located using the Tenengrad gradient method; gradient values in the horizontal and vertical directions are computed with the Sobel operator, and the average gray value of the processed gradient image is taken as the measure of image sharpness; if this value lies within the preset threshold range [T1, T2], the camera is well focused and the detection and adjustment of the camera exposure and sharpness is complete; otherwise, a signal is sent to the camera to adjust its focal length, and the image is acquired again.
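For illustration, the exposure check of step 1-3) and the Tenengrad sharpness score of step 1-6) could be sketched as follows in Python with OpenCV and numpy; the threshold values th, tol, t1, t2 and the board_mask input are placeholder assumptions, since the patent only states that preset ranges are used:

```python
import cv2
import numpy as np

def board_region_ok(image_bgr, board_mask, th=128, tol=5, t1=500.0, t2=5000.0):
    """Check exposure and Tenengrad sharpness of the calibration-plate region.

    board_mask: uint8 mask of the plate region found by corner detection.
    th/tol and [t1, t2] are illustrative thresholds, not values from the patent.
    """
    # Step 1-3): mean pixel value inside the plate region vs. TH +/- tol
    mean_bgr = cv2.mean(image_bgr, mask=board_mask)[:3]
    exposure_ok = abs(float(np.mean(mean_bgr)) - th) <= tol

    # Step 1-6): Sobel gradients in x and y, averaged over the plate region
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    tenengrad = np.sqrt(gx ** 2 + gy ** 2)
    score = float(tenengrad[board_mask > 0].mean())
    sharpness_ok = t1 <= score <= t2
    return exposure_ok, sharpness_ok
```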
2) Static frame detection is performed on the calibration plate, which is kept in motion, by the static frame detection module; the camera image frames meeting the requirements are selected to form the static frame set {S}, in preparation for the subsequent data acquisition by the laser radar and the camera.
As shown in fig. 3, the step 2) includes the following steps:
2-1) fixing the laser radar and the camera on the same base to form a sensor module to be calibrated, ensuring that the base is fixed and the relative positions of the laser radar and the camera are not changed, and ensuring that the detection range of the laser radar and the visual field of the camera have an overlapping area of more than 60%;
2-2) selecting a square monochromatic plate as the calibration plate and moving it within the overlapping range of the laser radar and camera fields of view, the motion including both translation and rotation. The calibration plate is a fixed square with a side length of 0.5 m and a thickness of 5 mm, made of a diffuse reflection material that favors reflection of the laser radar; the positions of and correspondence between the sensors in the module are shown in FIG. 4.
2-3) defining the calibration plate coordinate system, denoted O_B, which also serves as the world coordinate system O_W; the corner point at the upper left of the calibration plate is selected as the origin of the plate coordinate system, the two perpendicular edges meeting there point along its X and Y axes, and the Z axis is determined by the right-hand rule; from the actual size of the square calibration plate, the coordinates P_B (also P_W) of the plate corner points in O_B are calculated;
2-4) from the coordinates P_B and the coordinates P_px of the corner points in the pixel coordinate system, the conversion matrix T from the calibration plate coordinate system to the camera coordinate system is solved with the EPnP algorithm; processing each frame of image data acquired by the camera during the motion yields a list {T_i} of plate-to-camera conversion matrices;
2-5) determining the amount of calibration plate motion between the acquisition instants of any two camera frames; first, an arbitrary point P_Bi on the calibration plate coordinate system in the i-th camera frame is selected; by the equation P'_Bj = T_j^(-1) * T_i * P_Bi, the projected position P'_Bj of the selected point in the j-th camera frame is found; if the calibration plate has not moved between the two instants, then T_i = T_j, i.e. P'_Bj = P_Bi; otherwise the same equation yields P'_Bj, and the difference between it and its actual position P_Bj in the j-th camera frame is the displacement of the calibration plate between the two instants;
2-6) comparing adjacent data frames and selecting those whose motion change meets the requirement as static frames; for the calibration plate with its list of corner points C = {X_n}, the displacement is determined by the equation m(C, i, j) = max_n d(X_ni, X_nj); each camera frame is traversed according to the criterion m(C, i-1, i) < x && m(C, i, i+1) < x, and a frame is selected as a static frame if and only if its motion relative to both adjacent frames is below the displacement threshold; in addition, the selected static frames are required to be distributed over the overlapping area of the laser radar and camera fields of view, i.e. m(C, s_i, s_j) > y for s_i, s_j ∈ {S}, where x = 1 cm, y = 300 cm, and S is the selected set of static frames.
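A minimal sketch of the static-frame test of steps 2-5) and 2-6) is given below; it assumes the plate-to-camera transforms {T_i} are already available as 4x4 numpy matrices (e.g. from OpenCV's solvePnP, as sketched further below), and the function names are illustrative:

```python
import numpy as np

def corner_motion(T_i, T_j, corners_board):
    """m(C, i, j): maximum corner displacement between frames i and j.

    corners_board: (N, 3) plate corner coordinates in the plate frame;
    T_i, T_j: 4x4 plate-to-camera transforms estimated by EPnP."""
    P = np.hstack([corners_board, np.ones((len(corners_board), 1))])
    # Project frame-i corners into frame-j plate coordinates: P' = T_j^-1 T_i P
    P_proj = (np.linalg.inv(T_j) @ T_i @ P.T).T[:, :3]
    return float(np.max(np.linalg.norm(P_proj - corners_board, axis=1)))

def select_static_frames(transforms, corners_board, x=0.01, y=3.0):
    """Keep frame i when its motion vs. both neighbours is below x (1 cm) and
    it lies at least y (300 cm) away from every frame already selected, so
    the set spreads over the shared field of view (units here in metres)."""
    static = []
    for i in range(1, len(transforms) - 1):
        if (corner_motion(transforms[i - 1], transforms[i], corners_board) < x
                and corner_motion(transforms[i], transforms[i + 1], corners_board) < x
                and all(corner_motion(transforms[s], transforms[i], corners_board) > y
                        for s in static)):
            static.append(i)
    return static
```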
3) The combined calibration module first calibrates the camera intrinsic parameters; then, after the 3D space coordinates and the 2D pixel coordinates of the calibration plate are acquired, the final combined calibration result is obtained from the conversion relation between the two sets of coordinates.
As shown in fig. 5, the step 3) includes the following steps:
3-1) calibrating the camera intrinsics to obtain the matrix K = [f_x 0 c_x; 0 f_y c_y; 0 0 1], where K represents the intrinsic parameters of the camera, f_x and f_y denote the focal length (i.e. the distance from the optical center of the camera lens to the imaging plane), and c_x and c_y the principal point offset (i.e. the actual intersection of the camera's principal axis with the image, in pixels);
3-2) collecting the laser radar three-dimensional point cloud data and the camera image data of the frame while selecting the static frame, and adding the collected selected frame data into a static frame set { S } if the collected selected frame data can clearly and comprehensively describe the calibration plate information; otherwise, skipping the frame, reselecting the next static frame and carrying out data acquisition;
3-3) The point cloud data acquired by the laser radar are processed to obtain the spatial coordinates of the calibration plate corner points. First, plane fitting is performed on the laser point cloud with the RANSAC algorithm to extract the plane of the calibration plate in the laser radar coordinate system. Let the spatial plane equation be Ax + By + Cz + D = 0, so that the plane normal vector is n = (A, B, C); three points are randomly selected from the point cloud each time to define a plane, and the distance from every other point (x_0, y_0, z_0) to this plane is computed as d = |A*x_0 + B*y_0 + C*z_0 + D| / sqrt(A^2 + B^2 + C^2); if the distance d is smaller than a threshold T, the point is considered to lie in the same plane and is called an inlier. After multiple iterations, the plane with the most inliers is selected as the best-fit plane, as shown in FIG. 6. All points are then projected onto the fitted plane, and the boundary point set of the point cloud is extracted by a grid-division method. Straight lines are fitted to the boundary points with the least squares method: assuming the line expression y = a + bx, for the n boundary points (x_i, y_i) the normal equations n*a + b*Σx_i = Σy_i and a*Σx_i + b*Σx_i^2 = Σ(x_i*y_i) are set up and solved for a and b, giving the outer edge contour lines and their line equations. The intersection points of the four outer edge lines form the four corner points of the calibration plate; the coordinates of each corner point, obtained by solving the line equations, serve as the 3D space coordinates of the calibration plate at this position. The fitted outer edge contour and corner points of the calibration plate are shown schematically in FIG. 7.
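The RANSAC plane fit and least-squares line fit of this step could be sketched as follows; the iteration count and the distance threshold T are assumed values, and fit_line presumes the boundary points have already been projected into 2D coordinates within the fitted plane:

```python
import numpy as np

def ransac_plane(points, dist_thresh=0.02, iters=500, seed=0):
    """RANSAC plane fit: sample 3 points, count inliers by point-to-plane
    distance, keep the plane with the most inliers (threshold T assumed 2 cm)."""
    rng = np.random.default_rng(seed)
    best_inliers, best_plane = None, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                    # degenerate (collinear) sample
            continue
        n = n / norm                       # unit normal (A, B, C)
        d = -n.dot(sample[0])              # plane: n . p + d = 0
        dist = np.abs(points @ n + d)      # |Ax0 + By0 + Cz0 + D| with |n| = 1
        inliers = dist < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (n, d)
    return best_plane, best_inliers

def fit_line(xy):
    """Least-squares fit of y = a + b*x to the boundary points, equivalent to
    solving the normal equations given in the text."""
    A = np.column_stack([np.ones(len(xy)), xy[:, 0]])
    (a, b), *_ = np.linalg.lstsq(A, xy[:, 1], rcond=None)
    return a, b
```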
3-4) performing corner detection on the image information corresponding to each static frame; the acquired camera image is first converted to a grayscale image, the positions of strong corner points in the image are determined with the Shi-Tomasi algorithm, and their sub-pixel-accurate coordinates are then extracted and used as the 2D pixel coordinates of the calibration plate corner points at this position;
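Step 3-4) maps directly onto OpenCV's Shi-Tomasi detector and sub-pixel refiner; in the sketch below the quality, distance and window parameters are illustrative assumptions:

```python
import cv2
import numpy as np

def board_corners_2d(image_bgr, max_corners=4):
    """Strong corners via Shi-Tomasi, then sub-pixel refinement (step 3-4)."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=0.05, minDistance=30)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 40, 0.001)
    corners = cv2.cornerSubPix(gray, np.float32(corners), (11, 11), (-1, -1),
                               criteria)
    return corners.reshape(-1, 2)  # 2D pixel coordinates of the plate corners
```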
3-5) from the conversion formula between the world coordinate system and the camera coordinate system,

s * p_px = K * [R | t] * P_W

where P_W represents the 3D space coordinates of the calibration plate corner points obtained in step 3-3), p_px represents the 2D pixel coordinates of the calibration plate corner points obtained in step 3-4) (s being a scale factor), [R | t] represents the extrinsic matrix of the camera, and K represents the camera intrinsic matrix obtained in step 3-1); the EPnP algorithm is used to solve for the rotation matrix R and the translation vector t between the laser radar coordinate system and the camera coordinate system, thereby realizing the joint calibration of the laser radar and the camera. In this embodiment, a square monochromatic plate is selected as the calibration plate, and device parameter adjustment, data frame selection and calibration result computation are carried out automatically throughout, without manual intervention.
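The final EPnP solution can be sketched with OpenCV's solvePnP, stacking the 3D lidar corners from step 3-3) and the matching 2D image corners from step 3-4) over all static frames; assuming an undistorted camera, the distortion coefficients are set to zero:

```python
import cv2
import numpy as np

def solve_extrinsics(pts3d_lidar, pts2d_px, K):
    """Solve s * p = K [R | t] P_W with EPnP, giving the rotation R and
    translation t from the lidar frame to the camera frame (step 3-5).

    pts3d_lidar: (N, 3) plate corners measured by the lidar;
    pts2d_px:    (N, 2) matching image corners; K: intrinsics from step 3-1."""
    ok, rvec, tvec = cv2.solvePnP(np.float32(pts3d_lidar),
                                  np.float32(pts2d_px),
                                  np.float64(K), np.zeros(5),
                                  flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        raise RuntimeError("EPnP failed to find a pose")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    return R, tvec
```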
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the invention is not limited to these embodiments; various changes and modifications can be made according to the purpose of the invention. Any changes, modifications, substitutions, combinations or simplifications made according to the spirit and principle of the technical solution of the invention shall be regarded as equivalent substitutions and shall fall within the protection scope of the invention, as long as they meet the purpose of the invention and do not depart from its technical principle and inventive concept.
Claims (5)
1. A combined calibration method for a laser radar and a camera is characterized by comprising the following steps:
1) the exposure and focal length parameters of the camera are detected and adjusted by a camera detection module until they fall within preset threshold ranges, ensuring the accuracy of the calibration result;
2) static frame detection is performed by a static frame detection module on the calibration plate, which is kept in motion, and the camera image frames meeting the requirements are selected to form a static frame set {S}, in preparation for the subsequent data acquisition by the laser radar and the camera;
3) the combined calibration module first calibrates the camera intrinsic parameters; then, after the 3D space coordinates and the 2D pixel coordinates of the calibration plate are acquired, the final combined calibration result is obtained from the conversion relation between the two sets of coordinates.
2. The combined calibration method for a laser radar and a camera according to claim 1, wherein step 1) comprises the following steps:
1-1) shooting an initial image containing a calibration plate by using a camera;
1-2) carrying out corner point detection on the acquired camera image to obtain position information of each corner point in a pixel coordinate system; connecting the coordinates of each corner point to form a polygonal pattern of the calibration plate, namely obtaining the area of the calibration plate in the initial camera image;
1-3) calculating the average value RGB of pixels in the area where the calibration plate is located in the initial image, comparing the average value RGB with a preset pixel threshold value TH, and if the pixel average value RGB is within the threshold value TH range, turning to the step 1-4); otherwise, sending a signal to the camera to automatically adjust the exposure parameter of the camera and collecting the image again;
1-4) shooting with the exposure-adjusted camera to obtain a new image containing the calibration plate;
1-5) carrying out angular point detection on the acquired camera image again to obtain the area information of the calibration plate in the camera image;
1-6) scoring the sharpness of the region of the camera image where the calibration plate is located using the Tenengrad gradient method; gradient values in the horizontal and vertical directions are computed with the Sobel operator, and the average gray value of the processed gradient image is taken as the measure of image sharpness; if this value lies within the preset threshold range [T1, T2], the camera is well focused and the detection and adjustment of the camera exposure and sharpness is complete; otherwise, a signal is sent to the camera to adjust its focal length, and the image is acquired again.
3. The combined calibration method for a laser radar and a camera according to claim 1, wherein step 2) comprises the following steps:
2-1) fixing the laser radar and the camera on the same base to form a sensor module to be calibrated, ensuring that the base is fixed and the relative positions of the laser radar and the camera are not changed, and ensuring that the detection range of the laser radar and the visual field of the camera have an overlapping area of more than 60%;
2-2) selecting a square monochromatic plate as the calibration plate and moving it within the overlapping range of the laser radar and camera fields of view, the motion including both translation and rotation;
2-3) defining the calibration plate coordinate system, denoted O_B, which also serves as the world coordinate system O_W; the corner point at the upper left of the calibration plate is selected as the origin of the plate coordinate system, the two perpendicular edges meeting there point along its X and Y axes, and the Z axis is determined by the right-hand rule; from the actual size of the square calibration plate, the coordinates P_B (also P_W) of the plate corner points in O_B are calculated;
2-4) from the coordinates P_B and the coordinates P_px of the corner points in the pixel coordinate system, the conversion matrix T from the calibration plate coordinate system to the camera coordinate system is solved with the EPnP algorithm; processing each frame of image data acquired by the camera during the motion yields a list {T_i} of plate-to-camera conversion matrices;
2-5) determining the amount of calibration plate motion between the acquisition instants of any two camera frames; first, an arbitrary point P_Bi on the calibration plate coordinate system in the i-th camera frame is selected; by the equation P'_Bj = T_j^(-1) * T_i * P_Bi, the projected position P'_Bj of the selected point in the j-th camera frame is found; if the calibration plate has not moved between the two instants, then T_i = T_j, i.e. P'_Bj = P_Bi; otherwise the same equation yields P'_Bj, and the difference between it and its actual position P_Bj in the j-th camera frame is the displacement of the calibration plate between the two instants;
2-6) comparing adjacent data frames and selecting those whose motion change meets the requirement as static frames; for the calibration plate with its list of corner points C = {X_n}, the displacement is determined by the equation m(C, i, j) = max_n d(X_ni, X_nj); each camera frame is traversed according to the criterion m(C, i-1, i) < x && m(C, i, i+1) < x, and a frame is selected as a static frame if and only if its motion relative to both adjacent frames is below the displacement threshold; in addition, the selected static frames are required to be distributed over the overlapping area of the laser radar and camera fields of view, i.e. m(C, s_i, s_j) > y for s_i, s_j ∈ {S}, where x = 1 cm, y = 300 cm, and S is the selected set of static frames.
4. The combined calibration method for a laser radar and a camera according to claim 1, wherein step 3) comprises the following steps:
3-1) calibrating the camera intrinsics to obtain the matrix K = [f_x 0 c_x; 0 f_y c_y; 0 0 1], where K represents the intrinsic parameters of the camera, f_x and f_y denote the focal length, and c_x and c_y the principal point offset;
3-2) collecting the laser radar three-dimensional point cloud data and the camera image data of the frame while selecting the static frame, and adding the collected selected frame data into a static frame set { S } if the collected selected frame data can clearly and comprehensively describe the calibration plate information; otherwise, skipping the frame, reselecting the next static frame and carrying out data acquisition;
3-3) processing the point cloud data collected by the laser radar to obtain the spatial coordinates of the calibration plate corner points; first, plane fitting is performed on the laser point cloud with the RANSAC algorithm to extract the plane of the calibration plate in the laser radar coordinate system; all points are then projected onto the fitted plane, and the boundary point set of the point cloud is extracted by a grid-division method; straight lines are fitted to the boundary points with the least squares method, giving the outer edge contour lines and their line equations; the intersection points of the four outer edge lines form the four corner points of the calibration plate, and the coordinates of each corner point, obtained by solving the line equations, serve as the 3D space coordinates of the calibration plate at this position;
3-4) performing corner detection on the image information corresponding to each static frame; the acquired camera image is first converted to a grayscale image, the positions of strong corner points in the image are determined with the Shi-Tomasi algorithm, and their sub-pixel-accurate coordinates are then extracted and used as the 2D pixel coordinates of the calibration plate corner points at this position;
3-5) from the conversion formula between the world coordinate system and the camera coordinate system,

s * p_px = K * [R | t] * P_W

where P_W represents the 3D space coordinates of the calibration plate corner points obtained in step 3-3), p_px represents the 2D pixel coordinates of the calibration plate corner points obtained in step 3-4) (s being a scale factor), [R | t] represents the extrinsic matrix of the camera, and K represents the camera intrinsic matrix obtained in step 3-1); the EPnP algorithm is used to solve for the rotation matrix R and the translation vector t between the laser radar coordinate system and the camera coordinate system, thereby realizing the joint calibration of the laser radar and the camera.
5. The combined calibration method for a laser radar and a camera according to any of claims 1 to 4, characterized in that the calibration plate is a square monochromatic plate with a side length of 0.5 m and a thickness of 5 mm, made of a diffuse reflective material.
Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN202111350523.5A | 2021-11-15 | 2021-11-15 | Combined calibration method for laser radar and camera
Publications (1)

Publication Number | Publication Date
---|---
CN114140534A | 2022-03-04
Family ID: 80393209
Family Applications (1)

Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN202111350523.5A | CN114140534A (pending) | 2021-11-15 | 2021-11-15

Country Status (1)

Country | Link
---|---
CN | CN114140534A (en)
Cited By (2)

Publication number | Priority date | Publication date | Assignee | Title
---|---|---|---|---
CN114758005A | 2022-03-23 | 2022-07-15 | Institute of Automation, Chinese Academy of Sciences | Laser radar and camera external parameter calibration method and device
CN115994955A | 2023-03-23 | 2023-04-21 | Shenzhen Youjia Innovation Technology Co., Ltd. | Camera external parameter calibration method and device and vehicle
Similar Documents
Publication | Title |
---|---|
Chen et al. | High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm | |
CN110230998B (en) | Rapid and precise three-dimensional measurement method and device based on line laser and binocular camera | |
CN105716542B (en) | A kind of three-dimensional data joining method based on flexible characteristic point | |
CN108986070B (en) | Rock crack propagation experiment monitoring method based on high-speed video measurement | |
CN111369630A (en) | Method for calibrating multi-line laser radar and camera | |
CN107869954B (en) | Binocular vision volume weight measurement system and implementation method thereof | |
CN114140534A (en) | Combined calibration method for laser radar and camera | |
CN110940295B (en) | High-reflection object measurement method and system based on laser speckle limit constraint projection | |
CN107941153B (en) | Visual system for optimizing calibration of laser ranging | |
CN112880562A (en) | Method and system for measuring pose error of tail end of mechanical arm | |
Xu et al. | An omnidirectional 3D sensor with line laser scanning | |
CN113643436B (en) | Depth data splicing and fusion method and device | |
CN208254424U (en) | A kind of laser blind hole depth detection system | |
CN108917640A (en) | A kind of laser blind hole depth detection method and its system | |
Okarma et al. | The 3D scanning system for the machine vision based positioning of workpieces on the CNC machine tools | |
CN115082538A (en) | System and method for three-dimensional reconstruction of surface of multi-view vision balance ring part based on line structure light projection | |
CN208350997U (en) | A kind of object movement monitoring system | |
CN112785647A (en) | Three-eye stereo image detection method and system | |
WO2022078437A1 (en) | Three-dimensional processing apparatus and method between moving objects | |
CN110766740B (en) | Real-time high-precision binocular range finding system and method based on pedestrian tracking | |
EP4071578A1 (en) | Light source control method for vision machine, and vision machine | |
CN107941147B (en) | Non-contact online measurement method for three-dimensional coordinates of large-scale system | |
CN113450415A (en) | Imaging device calibration method and device | |
Sheng et al. | Research on point-cloud collection and 3D model reconstruction | |
WO2022171003A1 (en) | Camera calibration method and apparatus, and electronic device |
Legal Events

Date | Code | Title
---|---|---
| PB01 | Publication
| SE01 | Entry into force of request for substantive examination