CN117630892B - Combined calibration method and system for visible light camera, infrared camera and laser radar - Google Patents


Info

Publication number: CN117630892B
Application number: CN202410106754.9A
Authority: CN (China)
Prior art keywords: calibration plate, plane, point, calibration, visible light
Legal status: Active (granted)
Inventors: 马惠敏 (Ma Huimin), 王艺霖 (Wang Yilin), 刘海壮 (Liu Haizhuang), 傅豪杰 (Fu Haojie)
Assignee: University of Science and Technology Beijing (USTB)
Events: application filed by University of Science and Technology Beijing (USTB); priority to CN202410106754.9A; published as CN117630892A (Chinese, zh); granted and published as CN117630892B

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention provides a combined calibration method and system for a visible light camera, an infrared camera and a laser radar, relating to the technical field of multi-sensor joint calibration. The method comprises the following steps: set up a rectangular calibration plate and refrigerate it; with the calibration plate vertical, adjust its distance to the visible light camera, the infrared camera and the laser radar so that the plate lies entirely within each sensor's field of view, and synchronously collect data; adjust the orientation angle and pitch angle of the calibration plate, collecting data at each angle; take one time-synchronized frame of data per angle and obtain, respectively, the two-dimensional coordinates of the calibration plate's corner points in the visible light image, their two-dimensional coordinates in the infrared image, and their three-dimensional coordinates in the laser radar point cloud data; and solve the rotation-translation matrices between coordinate systems from the correspondence between the two-dimensional and three-dimensional coordinates, completing the joint calibration. The invention achieves simple, low-cost and high-precision calibration of multi-modal sensors.

Description

Combined calibration method and system for visible light camera, infrared camera and laser radar
Technical Field
The invention relates to the technical field of multi-sensor combined calibration, in particular to a combined calibration method and system for a visible light camera, an infrared camera and a laser radar.
Background
Calibration of cameras and lidars is a critical technology in the fields of computer vision and robotics. It involves measuring and calibrating the geometric relationship between the camera and the lidar so that two- or three-dimensional data can be accurately mapped to physical objects in the real world. The importance of this technology is that it provides accurate perception for a variety of applications, including autonomous driving, robotic navigation, industrial automation, medical imaging, augmented reality, and the like.
Cameras and lidars play a vital role in different application scenarios. Cameras are commonly used to acquire two-dimensional images, while lidar can provide more accurate three-dimensional spatial information. By calibrating the camera and lidar, an accurate geometric relationship between them can be established, enabling the system to accurately estimate the position, shape and size of the object.
For camera calibration, the main objective is to determine the camera's internal and external parameters. Internal parameters, such as focal length and optical center, describe the camera's internal properties. External parameters describe the camera's position and pose in three-dimensional space. By capturing images of objects of known geometry, such as checkerboards or spheres, and then analyzing the correspondence between these images and the real-world geometry, the internal and external parameters of the camera can be calculated, achieving accurate calibration of the camera.
The laser radar calibration process is similar, but the main focus is to determine the internal and external parameters of the laser radar. Internal parameters typically include scan angle, distortion parameters, etc. of the lidar. The external parameters then include the position and attitude of the lidar in three-dimensional space. By scanning an object of known geometry and comparing it to the actual object, the internal and external parameters of the lidar can be calculated.
The joint calibration of the camera and the laser radar involves combining calibration results of the camera and the laser radar so as to establish an accurate corresponding relationship between the two. The combined calibration can accurately position the data acquired by the camera and the laser radar in a three-dimensional space, and realizes accurate three-dimensional reconstruction and environment perception.
In practical applications, calibration of cameras and lidars is critical to achieving accurate sensing and positioning. For example, in an autonomous vehicle, accurate camera and lidar calibration helps the vehicle correctly perceive roads and obstacles, enabling reliable environmental awareness and navigation. In industrial automation, accurate calibration ensures that a robot can correctly identify and manipulate objects, enabling high-precision manufacturing. In medical imaging, camera and lidar calibration helps doctors diagnose and treat diseases accurately, improving the quality and efficiency of medical services. In augmented reality applications, accurate calibration ensures that virtual objects align exactly with the real world, providing a more realistic augmented reality experience.
Thus, calibration of cameras and lidars is a key step in achieving accurate sensing and positioning capabilities. However, the existing calibration method has the problems of complex system structure, high cost, poor sensing effect when facing low-illumination or extreme weather scenes, and the like, and is difficult to meet the requirements of practical application.
Disclosure of Invention
In view of these problems, the invention aims to provide a combined calibration method and system for a visible light camera, an infrared camera and a laser radar. The infrared modality is added for fused sensing in low-illumination or extreme weather scenes, that is, scenes in which the visible light modality struggles to obtain effective information, thereby realizing three-dimensional joint calibration of visible light, infrared and laser radar and improving calibration accuracy.
In order to solve the technical problems, the invention provides the following technical scheme:
in one aspect, a method for jointly calibrating a visible light camera, an infrared camera and a laser radar is provided, and the method comprises the following steps:
s1, setting a rectangular calibration plate, distinguishing the color of the calibration plate from the background, and refrigerating the calibration plate before collecting data;
s2, adjusting the distance between the calibration plate and the visible light camera, the infrared camera and the laser radar in the vertical state of the calibration plate, so that the calibration plate is completely arranged in the angles of view of the visible light camera, the infrared camera and the laser radar, and synchronously collecting data for more than one second;
the data comprise visible light images, infrared images and laser radar point cloud data;
s3, adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles among the calibration plate, the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle;
s4, taking a frame of data of an angle of time synchronization, and respectively obtaining a two-dimensional coordinate of an angular point of the calibration plate in a visible light image, a two-dimensional coordinate of the angular point of the calibration plate in an infrared image and a three-dimensional coordinate of the angular point of the calibration plate in laser radar point cloud data;
and S5, solving a coordinate system rotation translation matrix based on the corresponding relation among the two-dimensional coordinates of the corner points of the calibration plate in the visible light image, the two-dimensional coordinates of the corner points of the calibration plate in the infrared image and the three-dimensional coordinates of the corner points of the laser radar point cloud data, and completing the joint calibration.
Preferably, in step S1, the calibration plate is a rectangular wood board, and the length and the width of the calibration plate are both greater than 40cm, and the thickness of the calibration plate is 5mm.
Preferably, in step S3, adjusting the orientation angle and the pitch angle of the calibration plate specifically includes:
rotating the calibration plate along a longitudinal central axis of the calibration plate to adjust an orientation angle of the calibration plate;
and rotating the calibration plate along a transverse central shaft of the calibration plate so as to adjust the pitch angle of the calibration plate.
Preferably, in step S4, the obtaining two-dimensional coordinates of the corner point of the calibration plate in the visible light image specifically includes:
converting the visible light image from an RGB color space to an HSV color space;
setting a first threshold according to the color of the calibration plate, and dividing the calibration plate from the visible light image based on the first threshold;
fitting a first quadrangle according to the segmentation result, and taking four corner coordinates of the first quadrangle, namely two-dimensional coordinates of the corner of the calibration plate in the visible light image.
Preferably, in step S4, the obtaining two-dimensional coordinates of the corner point of the calibration plate in the infrared image specifically includes:
after the calibration plate is refrigerated, the temperature is lower than the background, so that the pixel value in the infrared image is distinguished from the background;
setting a second threshold according to the pixel values of the calibration plate: if the pixel value at a point is larger than the second threshold, that point is considered background, and if it is smaller than the second threshold, it is considered part of the calibration plate; the calibration plate is thus segmented from the infrared image;
fitting a second quadrangle according to the segmentation result, and taking four corner coordinates of the second quadrangle, namely, two-dimensional coordinates of the corner points of the calibration plate in the infrared image.
Preferably, in step S4, the obtaining a three-dimensional coordinate of the corner point of the calibration board in the laser radar point cloud data specifically includes:
step a: obtaining a rough calibration plate plane;
randomly selecting three points from the point cloud data, fitting a plane, calculating the distance between all the points and the plane, if the distance is smaller than a threshold value d, considering the points to be on the plane, and counting to obtain the number of the points on the plane;
repeating the random selection multiple times, and taking the plane P1 with the most points on it; respectively calculating the included angle alpha between plane P1 and the ground and the included angle beta between plane P1 and the forward axis of the sensor's three-dimensional coordinate system; when alpha is greater than or equal to a first angle threshold and beta is greater than or equal to a second angle threshold, plane P1 is considered a rough calibration plate plane;
if alpha is smaller than the first angle threshold and/or beta is smaller than the second angle threshold, plane P1 is considered background or ground; all points of plane P1 are deleted from the point cloud data and the random selection is restarted, until a rough calibration plate plane whose two included angles meet the conditions is obtained;
step b: acquiring a corrected calibration plate plane and plane point cloud;
after the rough calibration plate plane is obtained, returning to the original point cloud data and calculating the distance from every point to the rough calibration plate plane; first screening out the points on the rough calibration plate plane with a larger threshold m, then randomly selecting points again within this point cloud, fitting a plane, and judging whether points lie on it using a smaller threshold n; repeating this multiple times and taking the plane P2 with the most points on it, obtaining the corrected calibration plate plane and its plane point cloud;
step c: acquiring three-dimensional coordinates of corner points of the calibration plate;
projecting the point cloud onto the corrected calibration plate plane according to the plane equation and the plane point cloud coordinates, and converting the coordinate system to obtain two-dimensional point coordinates on the calibration plate plane; averaging the abscissas and ordinates of the projected point set to compute its center, setting a third threshold according to the size of the calibration plate, and deleting any point whose distance from the center exceeds the third threshold;
fitting a minimum rectangular bounding box to the remaining points to obtain the line equations of its four sides, computing and sorting the distances from all points to the sides, deleting the point closest to a side, computing the minimum rectangular bounding box again, and comparing the included angle and area difference between the two bounding boxes before and after the deletion: if the preset condition is met, the deleted point is determined to be noise; if not, the deleted point is added back; the preset condition is that both the included angle and the area difference between the two minimum rectangular bounding boxes, before and after deleting the closest point, are larger than their respective thresholds;
continuing with the next closest point and repeating this operation until a preset proportion of the total points has been processed, where the operation is: delete the closest point, recompute the minimum rectangular bounding box, compare the included angle and area difference of the two bounding boxes before and after the deletion, determine the deleted point to be noise if the preset condition is met, and add it back otherwise;
the corner points of the resulting minimum rectangular bounding box are the projections of the calibration plate's corner points; mapping them back into three-dimensional space according to the plane equation yields the three-dimensional coordinates of the calibration plate's corner points in the laser radar point cloud data.
Preferably, in step a, the random selection is repeated 200 times, and the plane P1 with the most points on it during this process is taken.
Plane P1 forms two included angles with the ground; the acute one is recorded as included angle alpha. Plane P1 forms two included angles with the forward axis of the sensor's three-dimensional coordinate system; the acute one is taken as included angle beta.
In another aspect, a combined calibration system of a visible light camera, an infrared camera and a laser radar is provided, the system comprising:
the setting module is used for setting a rectangular calibration plate, distinguishing the color of the calibration plate from the background, and refrigerating the calibration plate before collecting data;
the data acquisition module is used for adjusting the distance between the calibration plate and the visible light camera, the infrared camera and the laser radar in the vertical state of the calibration plate, so that the calibration plate is completely arranged in the angles of view of the visible light camera, the infrared camera and the laser radar, and synchronously acquiring data of more than one second; the data comprise visible light images, infrared images and laser radar point cloud data;
adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles among the calibration plate, the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle;
the data processing module is used for taking a frame of data of an angle of time synchronization and respectively acquiring two-dimensional coordinates of the corner point of the calibration plate in the visible light image, two-dimensional coordinates of the corner point of the calibration plate in the infrared image and three-dimensional coordinates of the corner point of the calibration plate in the laser radar point cloud data;
and solving a coordinate system rotation translation matrix based on the corresponding relation among the two-dimensional coordinates of the corner points of the calibration plate in the visible light image, the two-dimensional coordinates of the corner points of the calibration plate in the infrared image and the three-dimensional coordinates of the corner points of the laser radar point cloud data, so as to finish joint calibration.
In another aspect, there is provided an electronic device including:
a processor;
and the memory is stored with computer readable instructions which, when loaded and executed by the processor, implement the steps of the joint calibration method.
In another aspect, a computer readable storage medium having stored therein at least one instruction loaded and executed by a processor to implement the steps of a joint calibration method as described above is provided.
The technical scheme provided by the invention has the beneficial effects that at least:
according to the combined calibration method for the visible light camera, the infrared camera and the laser radar, provided by the invention, the infrared mode is fused on the basis of the visible light mode to sense, and the three-dimensional combined calibration is realized with the laser radar, so that the simple, low-cost and high-precision multi-mode sensor calibration can be realized in a low-illumination or extreme weather scene.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of a method for joint calibration of a visible light camera, an infrared camera and a laser radar according to an embodiment of the present invention;
FIG. 2 is a schematic view of different angles of a calibration plate according to an embodiment of the present invention;
FIG. 3 is a schematic view of a calibration plate clip angle provided by an embodiment of the present invention;
FIG. 4 is a schematic structural diagram of a combined calibration system for a visible light camera, an infrared camera and a laser radar according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more clear, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. It will be apparent that the described embodiments are some, but not all, embodiments of the invention. All other embodiments, which can be made by a person skilled in the art without creative efforts, based on the described embodiments of the present invention fall within the protection scope of the present invention.
The embodiment of the invention provides a combined calibration method of a visible light camera, an infrared camera and a laser radar, as shown in fig. 1, the method comprises the following steps:
s1, setting a rectangular calibration plate, distinguishing the color of the calibration plate from the background, and refrigerating the calibration plate before collecting data.
The color of the calibration plate is distinguished from the background, so that the image can be conveniently collected by the visible light camera, and the calibration plate is refrigerated, so that the image can be conveniently collected by the infrared camera.
As an alternative implementation of the invention, the calibration site is an indoor space cleared of clutter. The calibration plate is a rectangular wood board with length and width both greater than 40 cm and a thickness of about 5 mm. For example, a standard rectangular wood board of 45 cm x 92 cm and 5 mm thickness is used as the calibration plate.
S2, adjusting the distance between the calibration plate and the visible light camera, the infrared camera and the laser radar in the vertical state of the calibration plate, so that the calibration plate is completely arranged in the angles of view of the visible light camera, the infrared camera and the laser radar, and synchronously collecting data for more than one second;
the data includes visible light images, infrared images, and lidar point cloud data.
In the step, the initial state of the calibration plate is adjusted, and the data acquisition of an angle is completed.
S3, adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles between the calibration plate and the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle.
As an optional implementation manner of the present invention, adjusting the orientation angle and the pitch angle of the calibration plate specifically includes:
rotating the calibration plate along a longitudinal central axis of the calibration plate to adjust an orientation angle of the calibration plate; and rotating the calibration plate along a transverse central shaft of the calibration plate so as to adjust the pitch angle of the calibration plate.
As shown in fig. 2, the calibration plate is shown in schematic view at different angles.
S4, taking time-synchronized frame data of an angle, and respectively obtaining two-dimensional coordinates of the corner point of the calibration plate in the visible light image, two-dimensional coordinates of the corner point of the calibration plate in the infrared image and three-dimensional coordinates of the corner point of the calibration plate in the laser radar point cloud data.
Specifically, as an optional implementation manner of the present invention, the obtaining the two-dimensional coordinates of the corner point of the calibration plate in the visible light image specifically includes:
converting the visible light image from an RGB color space to an HSV color space;
setting a first threshold according to the color of the calibration plate, and dividing the calibration plate from the visible light image based on the first threshold;
fitting a first quadrangle according to the segmentation result, and taking four corner coordinates of the first quadrangle, namely two-dimensional coordinates of the corner of the calibration plate in the visible light image.
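The HSV segmentation above can be sketched as follows. This is a minimal numpy-only illustration (the patent names no library): `rgb_to_hsv` and `board_corners` are hypothetical helper names, the hue window and saturation floor are assumed example thresholds, and an axis-aligned bounding box stands in for the fitted quadrilateral, which is exact only when the board is not rotated in the image.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorised RGB (floats in 0..1) -> HSV with h, s, v in 0..1."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    mx = img.max(axis=-1)
    mn = img.min(axis=-1)
    diff = mx - mn
    h = np.zeros_like(mx)
    mask = diff > 0
    rm = mask & (mx == r)                      # hue sector depends on max channel
    gm = mask & (mx == g) & ~rm
    bm = mask & (mx == b) & ~rm & ~gm
    h[rm] = ((g[rm] - b[rm]) / diff[rm]) % 6
    h[gm] = (b[gm] - r[gm]) / diff[gm] + 2
    h[bm] = (r[bm] - g[bm]) / diff[bm] + 4
    h /= 6
    s = np.where(mx > 0, diff / np.maximum(mx, 1e-12), 0)
    return np.stack([h, s, mx], axis=-1)

def board_corners(img, h_lo, h_hi, s_min=0.3):
    """Threshold the board by hue, return four corner pixel coordinates."""
    hsv = rgb_to_hsv(img)
    seg = (hsv[..., 0] >= h_lo) & (hsv[..., 0] <= h_hi) & (hsv[..., 1] >= s_min)
    ys, xs = np.nonzero(seg)
    return [(int(xs.min()), int(ys.min())), (int(xs.max()), int(ys.min())),
            (int(xs.max()), int(ys.max())), (int(xs.min()), int(ys.max()))]

# synthetic check: a saturated blue board (hue 240 deg -> 2/3) on a grey background
img = np.full((120, 160, 3), 0.5)
img[30:90, 40:120] = [0.0, 0.0, 1.0]
corners = board_corners(img, 0.6, 0.7)
```

In a real pipeline the segmentation mask would be cleaned up and a rotated quadrilateral fitted (for instance with a minimum-area-rectangle routine) before taking the four corners.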
As an optional implementation manner of the present invention, obtaining two-dimensional coordinates of the corner point of the calibration board in the infrared image specifically includes:
after the calibration plate is refrigerated, the temperature is lower than the background, so that the pixel value in the infrared image is distinguished from the background;
setting a second threshold according to the pixel values of the calibration plate: if the pixel value at a point is larger than the second threshold, that point is considered background, and if it is smaller than the second threshold, it is considered part of the calibration plate; the calibration plate is thus segmented from the infrared image;
fitting a second quadrangle according to the segmentation result, and taking four corner coordinates of the second quadrangle, namely, two-dimensional coordinates of the corner points of the calibration plate in the infrared image.
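A minimal sketch of the infrared segmentation, assuming an 8-bit thermal image in which the chilled board reads darker than the background; the function name and the threshold value are illustrative, and as in the visible-light case an axis-aligned box stands in for the fitted quadrilateral.

```python
import numpy as np

def segment_cold_board(ir_img, thresh):
    """Pixels BELOW the threshold are the chilled board; above are background."""
    seg = ir_img < thresh
    ys, xs = np.nonzero(seg)
    return [(int(xs.min()), int(ys.min())), (int(xs.max()), int(ys.min())),
            (int(xs.max()), int(ys.max())), (int(xs.min()), int(ys.max()))]

ir = np.full((100, 100), 200, dtype=np.uint8)   # warm background
ir[20:60, 30:80] = 60                           # refrigerated board appears dark
ir_corners = segment_cold_board(ir, 128)
```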
As an optional implementation manner of the present invention, the obtaining the three-dimensional coordinates of the corner point of the calibration board in the laser radar point cloud data specifically includes:
step a: obtaining a rough calibration plate plane;
randomly selecting three points from the point cloud data, fitting a plane, calculating the distance between all the points and the plane, if the distance is smaller than a threshold value d, considering the points to be on the plane, and counting to obtain the number of the points on the plane;
repeating the random selection multiple times (for example, 200 times), and taking the plane P1 with the most points on it; respectively calculating the included angle alpha between plane P1 and the ground and the included angle beta between plane P1 and the forward axis of the sensor's three-dimensional coordinate system.
As shown in fig. 3, included angle alpha is illustrated in fig. 3 (a) and included angle beta in fig. 3 (b). Plane P1 forms two included angles with the ground, and the acute one is recorded as alpha; plane P1 forms two included angles with the forward axis of the sensor's three-dimensional coordinate system, and the acute one is taken as beta.
When alpha is greater than or equal to the first angle threshold and beta is greater than or equal to the second angle threshold, plane P1 is considered a rough calibration plate plane;
if alpha is smaller than the first angle threshold and/or beta is smaller than the second angle threshold, plane P1 is considered background or ground; all points of plane P1 are deleted from the point cloud data and the random selection is restarted, until a rough calibration plate plane whose two included angles meet the conditions is obtained.
The principle of the algorithm is as follows: the greater the angle alpha, i.e. the closer it is to 90°, the more nearly perpendicular the calibration plate is to the ground, where ground refers to the ground plane of the three-dimensional coordinate system. The greater the angle beta, i.e. the closer it is to 90°, the more directly the calibration plate faces the sensor; the sensor here is the lidar. These two angles are calculated mainly to remove interference in the scene, such as walls, tabletops and the ground.
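Step a is a RANSAC-style plane search. The sketch below, with hypothetical helper names and an assumed z-up, x-forward sensor frame, shows the sampling loop and the two angle checks on a synthetic vertical board facing the sensor:

```python
import numpy as np

def ransac_plane(points, d_thresh=0.02, iters=200, seed=0):
    """Sample 3 points, fit a plane n.x + d = 0, count inliers within
    d_thresh, and keep the plane with the most inliers."""
    rng = np.random.default_rng(seed)
    best_n, best_d, best_count = None, None, -1
    for _ in range(iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(n)
        if norm < 1e-12:               # degenerate (collinear) sample
            continue
        n /= norm
        d = -n.dot(p1)
        count = int((np.abs(points @ n + d) < d_thresh).sum())
        if count > best_count:
            best_n, best_d, best_count = n, d, count
    return best_n, best_d, best_count

def plane_angles(n, ground_normal=(0.0, 0.0, 1.0), forward=(1.0, 0.0, 0.0)):
    """alpha: acute dihedral angle between the plane and the ground;
    beta: acute angle between the plane and the sensor's forward axis."""
    g, f = np.asarray(ground_normal), np.asarray(forward)
    alpha = np.degrees(np.arccos(min(abs(float(n @ g)), 1.0)))
    beta = np.degrees(np.arcsin(min(abs(float(n @ f)), 1.0)))
    return alpha, beta

# synthetic vertical board: all points on the plane x = 2, facing the sensor
rng = np.random.default_rng(1)
board = np.column_stack([np.full(300, 2.0),
                         rng.uniform(-0.5, 0.5, 300),
                         rng.uniform(0.2, 1.1, 300)])
n, d, count = ransac_plane(board)
alpha, beta = plane_angles(n)       # both ~90 deg: vertical and facing forward
```

The angle thresholds from the text would then be applied to `alpha` and `beta` to reject walls, tabletops and the ground.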
Step b: acquiring a corrected calibration plate plane and plane point cloud;
after the rough calibration plate plane is obtained, returning to the original point cloud data and calculating the distance from every point to the rough calibration plate plane; first screening out the points on the rough calibration plate plane with a larger threshold m, then randomly selecting points again within this point cloud, fitting a plane, judging whether points lie on it using a smaller threshold n, repeating multiple times, and taking the plane P2 with the most points on it, obtaining the corrected calibration plate plane and its plane point cloud.
The larger threshold m and the smaller threshold n refer to the distance from each point to the plane of the coarse calibration plate, and m is larger than n and represents a process from coarse screening to finishing.
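The coarse-to-fine idea of step b can be illustrated with the two distance gates alone; the intermediate re-fitting is elided here, and the names and threshold values are illustrative:

```python
import numpy as np

def two_stage_select(points, normal, d, m=0.10, n=0.01):
    """Loose gate m for coarse screening, then tight gate n for refinement.
    (In the full method the plane is re-fitted on the coarse set before
    applying n; this sketch reuses the same plane for brevity.)"""
    dist = np.abs(points @ normal + d)
    coarse = points[dist < m]
    fine = coarse[np.abs(coarse @ normal + d) < n]
    return coarse, fine

# toy plane x = 0: normal (1, 0, 0), offset d = 0
normal, d = np.array([1.0, 0.0, 0.0]), 0.0
pts = np.array([[0.005, 0.3, 0.4],    # truly on the board (within n)
                [0.050, 0.1, 0.2],    # near the board (within m only)
                [0.500, 0.0, 0.9]])   # background clutter
coarse, fine = two_stage_select(pts, normal, d)
```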
Step c: acquiring three-dimensional coordinates of corner points of the calibration plate;
projecting the point cloud on the corrected calibration plate plane according to the plane equation and the plane point cloud coordinate, and converting a coordinate system to obtain a two-dimensional point coordinate of the point cloud on the calibration plate plane; averaging the abscissa and the ordinate of the projection point set, calculating the point set center, setting a third threshold according to the size of the calibration plate, and deleting the point with the distance from the point set center exceeding the third threshold;
fitting the rest points with the minimum rectangular bounding boxes to obtain linear equations of four sides, calculating and sequencing the distances from all points to the sides, deleting the point closest to the points, calculating the minimum rectangular bounding boxes again, calculating the included angles and the area differences of the two minimum rectangular bounding boxes before and after deleting the closest point, if the preset condition is met, determining that the deleted point is noise, and if the preset condition is not met, adding the deleted point back; the preset condition means that the included angle and the area difference of the two minimum rectangular surrounding frames before and after the closest point is deleted are larger than the respective threshold value;
continuing to process the next nearest point, and cycling the operation until the point accounting for the preset proportion of the total point is processed, wherein the operation is that: deleting the point closest to the point, calculating the minimum rectangular bounding box again, calculating the included angle and the area difference of the two minimum rectangular bounding boxes before and after deleting the point closest to the point, if the point is in accordance with the preset condition, determining that the deleted point is noise, and if the point is not in accordance with the preset condition, adding the deleted point back again;
the corner points of the resulting minimum rectangular bounding box are the projected corner points of the calibration plate; mapping them back into three-dimensional space according to the plane equation yields the three-dimensional coordinates of the calibration plate corner points in the laser radar point cloud data.
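The minimum-rectangle noise-pruning loop of step c can be sketched in pure NumPy. This is an illustrative reconstruction, not the patented implementation: the helper names (`min_area_rect`, `prune_noise`) and the default thresholds are assumptions, and in practice OpenCV's `cv2.minAreaRect` would normally replace the hand-rolled rotating-calipers search.

```python
import numpy as np

def _cross(o, a, b):
    # z-component of (a - o) x (b - o)
    return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

def convex_hull(pts):
    """Andrew's monotone chain; returns hull vertices in order."""
    P = pts[np.lexsort((pts[:, 1], pts[:, 0]))]
    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and _cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h
    lower, upper = half(P), half(P[::-1])
    return np.array(lower[:-1] + upper[:-1])

def min_area_rect(pts):
    """Minimum-area bounding rectangle: one hull edge is always collinear
    with a rectangle side, so try each edge orientation."""
    hull = convex_hull(pts)
    best = None
    for i in range(len(hull)):
        e = hull[(i + 1) % len(hull)] - hull[i]
        ang = np.arctan2(e[1], e[0])
        c, s = np.cos(-ang), np.sin(-ang)
        R = np.array([[c, -s], [s, c]])
        rot = pts @ R.T
        mn, mx = rot.min(0), rot.max(0)
        area = (mx[0]-mn[0]) * (mx[1]-mn[1])
        if best is None or area < best[1]:
            box = np.array([[mn[0], mn[1]], [mx[0], mn[1]],
                            [mx[0], mx[1]], [mn[0], mx[1]]]) @ R
            best = (ang, area, box)
    return best

def edge_distances(pts, box):
    """Distance from every point to the nearest rectangle side."""
    d = np.full(len(pts), np.inf)
    for i in range(4):
        a, b = box[i], box[(i + 1) % 4]
        t = b - a
        num = np.abs(t[0]*(pts[:, 1]-a[1]) - t[1]*(pts[:, 0]-a[0]))
        d = np.minimum(d, num / np.linalg.norm(t))
    return d

def prune_noise(pts, angle_thr=0.05, area_thr=0.02, frac=0.2):
    """Test candidate points nearest the sides; keep a deletion only when
    it changes both the rectangle angle and area beyond the thresholds."""
    pts = np.asarray(pts, float)
    tested = np.zeros(len(pts), bool)
    for _ in range(max(1, int(len(pts) * frac))):
        ang0, area0, box = min_area_rect(pts)
        d = edge_distances(pts, box)
        d[tested] = np.inf
        i = int(np.argmin(d))
        cand = np.delete(pts, i, axis=0)
        ang1, area1, _ = min_area_rect(cand)
        if abs(ang1 - ang0) > angle_thr and abs(area0 - area1) > area_thr:
            pts, tested = cand, np.delete(tested, i)  # point was noise
        else:
            tested[i] = True                          # keep the point
    return pts, min_area_rect(pts)[2]
```

The corners returned by `prune_noise` would then be mapped back to 3D through the plane frame, as the description states.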
S5, solving the coordinate-system rotation-translation matrix based on the correspondences among the two-dimensional coordinates of the calibration plate corner points in the visible light image, their two-dimensional coordinates in the infrared image, and their three-dimensional coordinates in the laser radar point cloud data, thereby completing the joint calibration.
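For illustration only (the patent does not specify a solver), the rotation-translation matrix for one camera can be recovered from the 2D-3D corner correspondences with a Direct Linear Transform, given the camera intrinsic matrix K. In practice OpenCV's `cv2.solvePnP` is the usual choice; the function name `solve_extrinsics_dlt` is an assumption.

```python
import numpy as np

def solve_extrinsics_dlt(pts3d, pts2d, K):
    """Estimate [R|t] mapping laser radar coordinates into the camera frame
    from n >= 6 corner correspondences, via the Direct Linear Transform."""
    # normalize pixel coordinates with the intrinsics
    uv1 = np.hstack([pts2d, np.ones((len(pts2d), 1))])
    xy = (np.linalg.inv(K) @ uv1.T).T[:, :2]
    # each correspondence gives two linear equations in the 12 entries of P
    A = []
    for (X, Y, Z), (x, y) in zip(pts3d, xy):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    P = Vt[-1].reshape(3, 4)                # least-squares null vector
    R_raw, t = P[:, :3], P[:, 3]
    # fix the unknown scale (a rotation's rows are unit length) and the
    # sign (points must have positive depth in the camera frame)
    s = 1.0 / np.linalg.norm(R_raw[2])
    if s * (R_raw[2] @ pts3d[0] + t[2]) < 0:
        s = -s
    R_raw, t = s * R_raw, s * t
    # project R_raw onto the nearest rotation matrix
    U, _, Vt2 = np.linalg.svd(R_raw)
    return U @ Vt2, t
```

With the lidar-to-visible and lidar-to-infrared extrinsics both solved this way, the three sensors share one calibrated coordinate chain.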
Compared with the prior art, the joint calibration method provided by the invention adds infrared-modality sensing on top of the visible-light modality and realizes three-dimensional joint calibration with the laser radar, enabling simple, low-cost, and high-precision multi-modal sensor calibration in low-illumination or extreme-weather scenes.
Correspondingly, the embodiment of the invention also provides a combined calibration system of the visible light camera, the infrared camera and the laser radar, as shown in fig. 4, the system comprises:
the setting module 201 is configured to set a rectangular calibration plate, wherein the color of the calibration plate is distinguished from the background, and the calibration plate is refrigerated before data is collected;
the data acquisition module 202 is configured to adjust the distance between the calibration plate and the visible light camera, the infrared camera, and the laser radar while the calibration plate is vertical, so that the calibration plate lies completely within the fields of view of the visible light camera, the infrared camera, and the laser radar, and to synchronously acquire more than one second of data; the data comprise visible light images, infrared images and laser radar point cloud data;
adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles among the calibration plate, the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle;
the data processing module 203 is configured to take one time-synchronized frame of data at one angle, and respectively acquire the two-dimensional coordinates of the calibration plate corner points in the visible light image, their two-dimensional coordinates in the infrared image, and their three-dimensional coordinates in the laser radar point cloud data;
and solving a coordinate system rotation translation matrix based on the corresponding relation among the two-dimensional coordinates of the corner points of the calibration plate in the visible light image, the two-dimensional coordinates of the corner points of the calibration plate in the infrared image and the three-dimensional coordinates of the corner points of the laser radar point cloud data, so as to finish joint calibration.
The system of the present embodiment may be used to implement the technical solution of the method embodiment shown in fig. 1, and its implementation principle and technical effects are similar, and are not described here again.
In an exemplary embodiment, the present invention also provides an electronic device including:
a processor;
and the memory is stored with computer readable instructions which, when loaded and executed by the processor, implement the steps of the joint calibration method.
Fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and as shown in fig. 5, the electronic device 300 may include a processor 3001 and a memory 3002. Optionally, the electronic device 300 may also include a transceiver 3003. The processor 3001 may be connected to the memory 3002 and the transceiver 3003, for example, by a communication bus. The memory 3002 has stored thereon computer readable instructions which, when executed by the processor 3001, implement the steps of the joint calibration method as described above.
In a particular implementation, as one embodiment, the processor 3001 may include one or more CPUs, such as CPU0 and CPU1 shown in fig. 5.
In a particular implementation, as one embodiment, the electronic device 300 may also include multiple processors, such as the processor 3001 and the processor 3004 shown in FIG. 5. Each of these processors may be a single-core processor (single-CPU) or a multi-core processor (multi-CPU). A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
The memory 3002 is configured to store a software program for executing the solution of the present invention, and the processor 3001 controls the execution of the software program, and the specific implementation may refer to the above method embodiment, which is not described herein.
A transceiver 3003 for communicating with a network device or with a terminal device.
Alternatively, the transceiver 3003 may include a receiver and a transmitter. The receiver is used for realizing the receiving function, and the transmitter is used for realizing the transmitting function.
Alternatively, the transceiver 3003 may be integrated with the processor 3001, or may exist separately, and be coupled to the processor 3001 through an interface circuit of the electronic device 300, which is not specifically limited in this embodiment of the present invention.
It should be noted that the structure shown in fig. 5 does not constitute a limitation on the electronic device; an actual electronic device may include more or fewer components than shown, combine some components, or arrange the components differently. In addition, for the technical effects of the electronic device 300, refer to the technical effects of the above method embodiments, which are not repeated here.
In an exemplary embodiment, the invention also provides a computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the steps of the joint calibration method as described above. For example, the computer readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article or terminal device comprising the element.
References in the specification to "one embodiment," "an example embodiment," "some embodiments," etc., indicate that the embodiment described may include a particular feature, structure, or characteristic, but every embodiment may not necessarily include the particular feature, structure, or characteristic. Further, when a particular feature, structure, or characteristic is described in connection with an embodiment, it is submitted that it is within the knowledge of one skilled in the relevant art to effect such feature, structure, or characteristic in connection with other embodiments whether or not explicitly described.
It should be understood that the term "and/or" merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. In addition, the character "/" herein generally indicates an "or" relationship between the associated objects, but may also indicate an "and/or" relationship, as can be understood from the context.
In the present invention, "at least one" means one or more, and "a plurality" means two or more. "At least one of" and similar expressions mean any combination of the listed items, including any combination of single or plural items. For example, at least one of a, b, or c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where a, b, and c may each be singular or plural.
It should be understood that, in various embodiments of the present invention, the sequence numbers of the foregoing processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic thereof, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus, device and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The invention is intended to cover any alternatives, modifications, equivalents, and variations that fall within its spirit and scope. In the foregoing description of preferred embodiments of the invention, specific details were set forth in order to provide a thorough understanding of the invention; however, those skilled in the art can fully understand the invention without some of these details. In other instances, well-known methods, procedures, flows, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the present invention.
The foregoing description of the preferred embodiments of the invention is not intended to limit the invention to the precise form disclosed; any modifications, equivalents, and alternatives falling within the spirit and scope of the invention are intended to be included within its scope.

Claims (9)

1. The combined calibration method of the visible light camera, the infrared camera and the laser radar is characterized by comprising the following steps of:
s1, setting a rectangular calibration plate, distinguishing the color of the calibration plate from the background, and refrigerating the calibration plate before collecting data;
s2, adjusting the distance between the calibration plate and the visible light camera, the infrared camera and the laser radar while the calibration plate is vertical, so that the calibration plate lies completely within the fields of view of the visible light camera, the infrared camera and the laser radar, and synchronously collecting more than one second of data;
the data comprise visible light images, infrared images and laser radar point cloud data;
s3, adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles among the calibration plate, the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle;
s4, taking one time-synchronized frame of data at one angle, and respectively obtaining the two-dimensional coordinates of the calibration plate corner points in the visible light image, their two-dimensional coordinates in the infrared image, and their three-dimensional coordinates in the laser radar point cloud data;
in step S4, obtaining three-dimensional coordinates of the corner point of the calibration board in the laser radar point cloud data specifically includes:
step a: obtaining a rough calibration plate plane;
randomly selecting three points from the point cloud data and fitting a plane; calculating the distance from every point to the plane, considering a point to be on the plane if its distance is smaller than a threshold d, and counting the number of points on the plane;
repeating the random selection multiple times, and taking the plane P1 with the most points on it during the process; respectively calculating the included angle alpha between plane P1 and the ground and the included angle beta between plane P1 and the forward axis of the sensor's three-dimensional coordinate system; when the included angle alpha is greater than or equal to a first angle threshold and the included angle beta is greater than or equal to a second angle threshold, plane P1 is considered to be a rough calibration plate plane;
if the included angle alpha is smaller than the first angle threshold and/or the included angle beta is smaller than the second angle threshold, plane P1 is considered to be background or ground; all points in plane P1 are then deleted from the point cloud data and the random selection is performed again, until a rough calibration plate plane whose two included angles meet the conditions is obtained;
step b: acquiring a corrected calibration plate plane and plane point cloud;
after the rough calibration plate plane is obtained, returning to the original point cloud data and calculating the distance from every point to the rough calibration plate plane; first screening out the points on the rough calibration plate plane with a larger threshold m, then randomly selecting points again within the point cloud data of the rough calibration plate plane, fitting a plane, and judging whether points lie on it with a smaller threshold n; repeating this multiple times and taking the plane P2 with the most points on it during the process, thereby obtaining the corrected calibration plate plane and the planar point cloud;
step c: acquiring three-dimensional coordinates of corner points of the calibration plate;
projecting the point cloud onto the corrected calibration plate plane according to the plane equation and the planar point cloud coordinates, and converting the coordinate system to obtain two-dimensional coordinates of the points on the calibration plate plane; averaging the abscissas and ordinates of the projected point set to compute the point-set center, setting a third threshold according to the size of the calibration plate, and deleting every point whose distance from the point-set center exceeds the third threshold;
fitting a minimum rectangular bounding box to the remaining points to obtain the linear equations of its four sides, computing and sorting the distances from all points to the sides, deleting the point closest to a side, computing the minimum rectangular bounding box again, and computing the included angle and the area difference between the two minimum rectangular bounding boxes before and after the deletion; if a preset condition is met, the deleted point is determined to be noise; if not, the deleted point is added back; the preset condition is that both the included angle and the area difference between the two minimum rectangular bounding boxes before and after deleting the closest point exceed their respective thresholds;
continuing with the next-closest point and repeating the operation until a preset proportion of the total points has been processed, the operation being: delete the closest point, compute the minimum rectangular bounding box again, and compute the included angle and the area difference between the two minimum rectangular bounding boxes before and after the deletion; if the preset condition is met, the deleted point is determined to be noise; if not, the deleted point is added back;
the corner points of the resulting minimum rectangular bounding box are the projected corner points of the calibration plate; mapping them back into three-dimensional space according to the plane equation yields the three-dimensional coordinates of the calibration plate corner points in the laser radar point cloud data;
and S5, solving a coordinate system rotation translation matrix based on the corresponding relation among the two-dimensional coordinates of the corner points of the calibration plate in the visible light image, the two-dimensional coordinates of the corner points of the calibration plate in the infrared image and the three-dimensional coordinates of the corner points of the laser radar point cloud data, and completing the joint calibration.
2. The combined calibration method according to claim 1, wherein in step S1, the calibration plate is a rectangular wood plate, the length and width of the calibration plate are both greater than 40 cm, and the thickness of the calibration plate is 5 mm.
3. The combined calibration method according to claim 1, wherein in step S3, adjusting the orientation angle and the pitch angle of the calibration plate specifically includes:
rotating the calibration plate along a longitudinal central axis of the calibration plate to adjust an orientation angle of the calibration plate;
and rotating the calibration plate along a transverse central shaft of the calibration plate so as to adjust the pitch angle of the calibration plate.
4. The joint calibration method according to claim 1, wherein in step S4, two-dimensional coordinates of the corner point of the calibration plate in the visible light image are obtained, and specifically comprising:
converting the visible light image from an RGB color space to an HSV color space;
setting a first threshold according to the color of the calibration plate, and dividing the calibration plate from the visible light image based on the first threshold;
fitting a first quadrangle according to the segmentation result, and taking four corner coordinates of the first quadrangle, namely two-dimensional coordinates of the corner of the calibration plate in the visible light image.
5. The joint calibration method according to claim 1, wherein in step S4, two-dimensional coordinates of the corner points of the calibration plate in the infrared image are obtained, and specifically comprising:
since the calibration plate has been refrigerated, its temperature is lower than that of the background, so its pixel values in the infrared image are distinguishable from the background;
setting a second threshold according to the pixel values of the calibration plate; if the pixel value at a point is larger than the second threshold, the point is considered background, and if it is smaller than the second threshold, the point is considered calibration plate, thereby segmenting the calibration plate from the infrared image;
fitting a second quadrangle according to the segmentation result, and taking four corner coordinates of the second quadrangle, namely, two-dimensional coordinates of the corner points of the calibration plate in the infrared image.
6. The joint calibration method according to claim 1, wherein in step a, the random selection is repeated 200 times, and the plane P1 with the most points on it during the process is taken;
plane P1 forms two included angles with the ground, and the acute one is taken as the included angle alpha; plane P1 forms two included angles with the forward axis of the sensor's three-dimensional coordinate system, and the acute one is taken as the included angle beta.
7. A combined calibration system of a visible light camera, an infrared camera and a lidar for implementing the method according to any of claims 1 to 6, characterized in that the system comprises:
the setting module is used for setting a rectangular calibration plate, distinguishing the color of the calibration plate from the background, and refrigerating the calibration plate before collecting data;
the data acquisition module is used for adjusting the distance between the calibration plate and the visible light camera, the infrared camera and the laser radar while the calibration plate is vertical, so that the calibration plate lies completely within the fields of view of the visible light camera, the infrared camera and the laser radar, and for synchronously acquiring more than one second of data; the data comprise visible light images, infrared images and laser radar point cloud data;
adjusting the orientation angle and the pitch angle of the calibration plate, changing the angles among the calibration plate, the visible light camera, the infrared camera and the laser radar, and collecting data of more than one second at each angle;
the data processing module is used for taking one time-synchronized frame of data at one angle and respectively acquiring the two-dimensional coordinates of the calibration plate corner points in the visible light image, their two-dimensional coordinates in the infrared image, and their three-dimensional coordinates in the laser radar point cloud data;
and solving a coordinate system rotation translation matrix based on the corresponding relation among the two-dimensional coordinates of the corner points of the calibration plate in the visible light image, the two-dimensional coordinates of the corner points of the calibration plate in the infrared image and the three-dimensional coordinates of the corner points of the laser radar point cloud data, so as to finish joint calibration.
8. An electronic device, the electronic device comprising:
a processor;
a memory having stored thereon computer readable instructions which, when loaded and executed by the processor, implement the method of any of claims 1 to 6.
9. A computer readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of any one of claims 1 to 6.
CN202410106754.9A 2024-01-25 2024-01-25 Combined calibration method and system for visible light camera, infrared camera and laser radar Active CN117630892B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410106754.9A CN117630892B (en) 2024-01-25 2024-01-25 Combined calibration method and system for visible light camera, infrared camera and laser radar

Publications (2)

Publication Number Publication Date
CN117630892A CN117630892A (en) 2024-03-01
CN117630892B true CN117630892B (en) 2024-03-29

Family

ID=90032444

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410106754.9A Active CN117630892B (en) 2024-01-25 2024-01-25 Combined calibration method and system for visible light camera, infrared camera and laser radar

Country Status (1)

Country Link
CN (1) CN117630892B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110349221A (en) * 2019-07-16 2019-10-18 北京航空航天大学 A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor
WO2020233443A1 (en) * 2019-05-21 2020-11-26 菜鸟智能物流控股有限公司 Method and device for performing calibration between lidar and camera
CN113902809A (en) * 2021-09-14 2022-01-07 立得空间信息技术股份有限公司 Method for jointly calibrating infrared camera and laser radar
WO2022142759A1 (en) * 2020-12-31 2022-07-07 中国矿业大学 Lidar and camera joint calibration method
CN115267747A (en) * 2022-07-19 2022-11-01 北京理工大学 Calibration method for sparse laser radar and visible light/infrared imaging system
CN116027343A (en) * 2023-02-09 2023-04-28 沈阳航空航天大学 Multi-source information ranging method based on infrared laser radar and visible light camera
CN116430403A (en) * 2022-01-04 2023-07-14 航天图景(北京)科技有限公司 Real-time situation awareness system and method based on low-altitude airborne multi-sensor fusion


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant