CN117665730A - Multi-sensor joint calibration method and system - Google Patents
- Publication number: CN117665730A
- Application number: CN202311467311.4A
- Authority: CN (China)
- Legal status: Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention provides a multi-sensor joint calibration method and system. The method comprises the following steps: acquiring a target radar point cloud and a target two-dimensional image; acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point; determining a three-dimensional attitude angle adjustment angle according to the included angle between the connecting line of the first center points and the connecting line of the second center points and the positional relationship between the first and second center points; and acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle. The invention completes the multi-sensor joint calibration process automatically and, compared with the existing manual approach, improves calibration efficiency and accuracy.
Description
Technical Field
The invention relates to the technical field of sensor calibration, and in particular to a multi-sensor joint calibration method and system.
Background
The existing perception system applied to rail transit mainly comprises an industrial camera and a laser radar. Different types of sensors have different characteristics and measurement errors, so fusing their data requires solving problems such as data alignment and coordinate-system conversion.
The current multi-sensor calibration process in rail transit still has a relatively low degree of automation: because sensor types and system configurations differ, manual adjustment and configuration are still required, so the efficiency and accuracy of multi-sensor calibration for a train perception system are low.
Therefore, there is a need for a multi-sensor joint calibration method and system that solves the above problems.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a multi-sensor joint calibration method and system.
The invention provides a multi-sensor joint calibration method, which comprises the following steps:
acquiring a target radar point cloud and a target two-dimensional image, wherein the target radar point cloud is acquired by a radar sensor from a target calibration plate, the target two-dimensional image is acquired by a camera sensor from the target calibration plate, and the target calibration plate consists of two calibration plates;
acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point;
determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point;
acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle;
the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
According to the multi-sensor joint calibration method provided by the invention, the forward distance parameter between the target radar point cloud and the target two-dimensional image is obtained according to the distance between the first center point and the second center point, and the method comprises the following steps:
calculating coordinate information of corresponding center points in the target radar point cloud and the target two-dimensional image according to the predetermined key point coordinate information in the target calibration plate to obtain the first center point and the second center point;
according to the coordinate information of the corresponding center points in the target radar point cloud and the target two-dimensional image, calculating to obtain a first Euclidean distance and a second Euclidean distance, wherein the first Euclidean distance is the Euclidean distance between two center points in the target radar point cloud, and the second Euclidean distance is the Euclidean distance between two center points in the target two-dimensional image;
and judging whether the forward distance difference between the first Euclidean distance and the second Euclidean distance is larger than a preset distance threshold, and if so, acquiring the forward distance parameter between the target radar point cloud and the target two-dimensional image according to the forward distance difference.
According to the multi-sensor joint calibration method provided by the invention, the three-dimensional attitude angle adjustment angle is determined according to the included angle between the connecting line of the first center point and the connecting line of the second center point and the position relationship between the first center point and the second center point, and the method comprises the following steps:
judging whether an included angle between a connecting line of the first center point and a connecting line of the second center point is larger than a preset attitude angle threshold value, and if so, determining a roll angle adjusting value according to the included angle and the preset attitude angle threshold value;
judging whether the height distance difference between the first center point and the second center point is larger than a preset height distance difference threshold value or not based on the center point coordinate information, and if so, determining a pitch angle adjusting value according to the height distance difference and the preset height distance difference threshold value;
judging whether the horizontal distance difference between the first center point and the second center point is larger than a preset horizontal distance difference threshold, and if so, determining a yaw angle adjustment value according to the horizontal distance difference and the preset horizontal distance difference threshold;
and acquiring the three-dimensional attitude angle adjustment angle according to the roll angle adjustment value, the pitch angle adjustment value and the yaw angle adjustment value.
According to the multi-sensor joint calibration method provided by the invention, the acquisition of the target two-dimensional image comprises the following steps:
shooting the target calibration plate through the camera sensor to obtain a calibration plate image;
acquiring all contours to be detected in the calibration plate image according to the number of vertices of each contour in the calibration plate image, and determining a target contour in the calibration plate image according to the aspect ratio information of each contour to be detected;
and acquiring vertex pixel coordinate information in the target contour, and drawing a corresponding contour line in the calibration plate image based on the target contour to obtain the target two-dimensional image.
According to the multi-sensor joint calibration method provided by the invention, the calculation of the corresponding center point coordinate information in the target two-dimensional image according to the predetermined key point coordinate information in the target calibration plate comprises the following steps:
based on OpenCV, acquiring a projection matrix of the camera sensor according to a plurality of calibration plate images shot by the camera sensor, wherein the projection matrix comprises the internal parameter matrix and the external parameter matrix of the camera sensor, and the calibration plates corresponding to the plurality of calibration plate images are marked with the key points;
and determining the coordinate information of the key point pixels in the target two-dimensional image according to the projection matrix and the coordinate information of the key point, and acquiring the coordinate information of the corresponding central point in the target two-dimensional image according to the coordinate information of the key point pixels.
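The projection of known key-point coordinates into pixel coordinates can be illustrated with a plain pinhole model; the intrinsic matrix, pose and key-point coordinates below are made-up numbers, and a hand-rolled `K @ [R|t]` product stands in for the projection matrix that OpenCV calibration would deliver:

```python
import numpy as np

# Hypothetical internal parameter matrix K and camera pose [R|t].
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])  # board 5 m ahead
P = K @ Rt  # 3x4 projection matrix

# Known 3-D key points of one calibration plate, in homogeneous coordinates.
keypoints = np.array([[-0.5, -0.5, 0.0, 1.0],
                      [ 0.5, -0.5, 0.0, 1.0],
                      [ 0.5,  0.5, 0.0, 1.0],
                      [-0.5,  0.5, 0.0, 1.0]])
uvw = (P @ keypoints.T).T
pixels = uvw[:, :2] / uvw[:, 2:3]  # perspective divide -> pixel coordinates
center_px = pixels.mean(axis=0)    # center point of the plate in the image
```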
According to the multi-sensor joint calibration method provided by the invention, after the projection matrix of the camera sensor is acquired based on OpenCV according to the plurality of calibration plate images shot by the camera sensor, the method further comprises the following steps:
evaluating the projection matrix through the back-projection (reprojection) error, and if the evaluation result does not meet a preset condition, calibrating the projection matrix of the camera sensor again.
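A minimal sketch of that evaluation criterion, taken as an RMS back-projection error compared against a pixel threshold; the threshold value and the point coordinates are illustrative assumptions:

```python
import numpy as np

def reprojection_rmse(projected_px, observed_px):
    """Root-mean-square back-projection error in pixels."""
    d = np.asarray(projected_px) - np.asarray(observed_px)
    return float(np.sqrt((d ** 2).sum(axis=1).mean()))

# Hypothetical numbers: if the error exceeds the threshold,
# the camera should be re-calibrated.
MAX_RMSE_PX = 1.0
projected = [[320.2, 240.1], [400.0, 240.0]]
observed  = [[320.0, 240.0], [400.3, 239.8]]
ok = reprojection_rmse(projected, observed) <= MAX_RMSE_PX
```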
According to the multi-sensor joint calibration method provided by the invention, the acquisition of the target radar point cloud comprises the following steps:
and comparing the reflectivity value of each point cloud data acquired by the radar sensor with a preset reflectivity threshold value, and filtering the point cloud data with the reflectivity value smaller than the preset reflectivity threshold value to obtain the target radar point cloud.
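The reflectivity filter reduces to a single mask over the point-cloud array; the column layout (x, y, z, reflectivity) and the threshold value below are assumptions for illustration:

```python
import numpy as np

# Hypothetical point cloud: columns are x, y, z, reflectivity.
cloud = np.array([[5.0, 0.1, 1.0, 0.92],
                  [5.1, 0.2, 1.1, 0.15],   # low-reflectivity background point
                  [5.0, 0.3, 0.9, 0.88]])

REFLECTIVITY_THRESHOLD = 0.5  # assumed preset threshold
# Keep only points at or above the threshold; the high-reflectivity
# calibration plate survives, the background is filtered out.
target_cloud = cloud[cloud[:, 3] >= REFLECTIVITY_THRESHOLD]
```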
The invention also provides a multi-sensor joint calibration system, which comprises:
the sensor data acquisition module is used for acquiring a target radar point cloud and a target two-dimensional image, wherein the target radar point cloud is acquired by a radar sensor from a target calibration plate, the target two-dimensional image is acquired by a camera sensor from the target calibration plate, and the target calibration plate consists of two calibration plates;
the first processing module is used for acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point;
the second processing module is used for determining a three-dimensional attitude angle adjustment angle according to the included angle between the connecting line of the first center point and the connecting line of the second center point and the position relationship between the first center point and the second center point;
the joint calibration module is used for acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle;
the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
The invention also provides an electronic device, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the multi-sensor joint calibration method according to any one of the above when executing the program.
The invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a multi-sensor joint calibration method as described in any one of the above.
According to the multi-sensor joint calibration method and system provided by the invention, the forward distance parameter between the radar point cloud and the two-dimensional image is determined from the distance between their corresponding center points, and the three-dimensional attitude angle adjustment angle is determined from the included angle between the center-point connecting lines and the positional relationship of the corresponding center points, so that the multi-sensor joint calibration process is completed automatically; compared with the existing manual calibration approach, calibration efficiency and accuracy are improved.
Drawings
In order to illustrate the technical solutions of the invention or of the prior art more clearly, the drawings required in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show some embodiments of the invention, and that a person of ordinary skill in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic flow chart of a multi-sensor joint calibration method provided by the invention;
FIG. 2 is a schematic diagram of the conversion from a radar coordinate system to a camera coordinate system according to the present invention;
FIG. 3 is a schematic diagram of an overall flow of multi-sensor joint calibration provided by the invention;
FIG. 4 is a schematic view of the automatic selection of a calibration plate in an image according to the present invention;
FIG. 5 is a schematic diagram of an interface of the multi-sensor calibration application software provided by the present invention;
FIG. 6 is a schematic structural diagram of the multi-sensor joint calibration system provided by the invention;
fig. 7 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The train operation control system is a key technology and a core item of equipment for guaranteeing railway operating safety and improving transport efficiency, and plays a vital role in the rail transit system. With the development of new-generation information technology, perception technology is beginning to be applied to railways, for purposes such as detecting intrusion by obstacles ahead and constructing high-precision maps. The existing perception system applied to railways comprises an industrial camera and a laser radar, and the internal parameters of the camera and the external parameters of the radar are essential for the efficient operation of the perception system. The purpose of camera calibration is to obtain the internal parameter matrix that converts spatial point coordinates in the camera coordinate system into the pixel coordinate system, i.e. to solve the conversion relationship between the camera coordinate system and the pixel coordinate system. The external parameters form the transformation matrix that converts spatial points from the radar coordinate system into the camera coordinate system and are the basic guarantee of joint radar-camera calibration. The main existing method is to solve these parameters by registering images with point clouds, but the matching relationship between sensors placed at multiple angles has been little studied.
The degree of automation of multi-sensor calibration in existing train perception systems is not high: calibration still relies on manual solving, no tool software integrating internal parameter generation and external parameter calibration has emerged, and the calibration work depends heavily on the Robot Operating System (ROS), which is an obstacle for cross-platform architectures.
FIG. 1 is a schematic flow chart of the multi-sensor joint calibration method provided by the invention. As shown in FIG. 1, the invention provides a multi-sensor joint calibration method comprising the following steps:
Step 101, acquiring a target radar point cloud and a target two-dimensional image, wherein the target radar point cloud is acquired by a radar sensor from a target calibration plate, the target two-dimensional image is acquired by a camera sensor from the target calibration plate, and the target calibration plate consists of two calibration plates.
According to the invention, the radar sensor detects the target calibration plate by sending and receiving radio-frequency signals and extracts the spatial position information of the target from the returns, thereby performing high-precision three-dimensional point cloud acquisition of the target calibration plate to obtain the radar point cloud corresponding to it. Meanwhile, the camera sensor converts the visual information of the target into a digital image by photographing the target calibration plate; the features and position information of the target can then be extracted by image processing and computer vision algorithms to obtain the two-dimensional image corresponding to the target calibration plate.
And 102, acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point.
In the invention, 8 key points on the target calibration plate are anchored in advance (generally the 4 vertices of each of the two calibration plates), and their spatial three-dimensional coordinates are known, so the respective center-point coordinates of the target two-dimensional image and the target radar point cloud can be obtained from the key-point coordinates. In an embodiment, a two-dimensional image is obtained by the camera sensor, the 8 anchor points on the calibration plates (4 for each plate) are located and identified by image processing and computer vision algorithms, and the x-axis and y-axis coordinates of each plate's anchor points are averaged to obtain the center point of each of the two calibration plates in the image. For the point cloud data, the cloud acquired by the radar sensor is first preprocessed (e.g. filtered and denoised); the point cloud is then segmented, using a point cloud processing library or a custom algorithm, to extract the feature points of the 8 anchors (4 per plate); finally, the x-, y- and z-axis coordinates of each plate's feature points are averaged to obtain the center point of each of the two calibration plates in the point cloud.
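The two averaging steps can be sketched as follows; the anchor coordinates are invented for illustration:

```python
import numpy as np

# Hypothetical pixel coordinates of the 4 anchor points of one plate
# in the two-dimensional image (units: pixels).
board_anchors_2d = np.array([[100.0,  80.0], [300.0,  80.0],
                             [300.0, 280.0], [100.0, 280.0]])
center_2d = board_anchors_2d.mean(axis=0)  # average x and y per plate

# Hypothetical 3-D coordinates of the same 4 anchors in the radar
# point cloud (units: metres).
board_anchors_3d = np.array([[5.0, -0.5, 1.0], [5.0, 0.5, 1.0],
                             [5.0,  0.5, 0.0], [5.0, -0.5, 0.0]])
center_3d = board_anchors_3d.mean(axis=0)  # average x, y and z per plate
```

Repeating this for the second plate yields the two image center points and the two point-cloud center points used in the following steps.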
Further, based on the Euclidean distance between the two center points in the two-dimensional image and the Euclidean distance between the two center points in the point cloud, the front-back relationship between the point cloud and the two-dimensional image can be judged by comparing these distances, and the corresponding forward distance parameter Y is then adjusted according to the difference between the Euclidean distances until a preset distance threshold is satisfied. In a train perception system, the forward distance parameter refers to the forward displacement of the radar sensor relative to the camera sensor.
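A sketch of that distance comparison, assuming both center pairs are already expressed in comparable units; the function name and threshold value are hypothetical:

```python
import numpy as np

def forward_distance_parameter(img_c1, img_c2, pc_c1, pc_c2, threshold=0.05):
    """Compare the two center-to-center Euclidean distances and return
    the forward distance parameter Y (0.0 when within the threshold)."""
    d_img = np.linalg.norm(np.asarray(img_c1) - np.asarray(img_c2))
    d_pc = np.linalg.norm(np.asarray(pc_c1) - np.asarray(pc_c2))
    diff = d_pc - d_img  # sign encodes the front-back relationship
    return diff if abs(diff) > threshold else 0.0
```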
Step 103, determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point;
104, acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle;
the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
In the invention, the external parameters are defined as the projection of the radar coordinate system into the camera coordinate system, so that the two share the same coordinate system in subsequent use. In an actual train perception scene, the relative position and angle of the radar and the camera are not set in a fixed manner: the sensors may be tilted, mounted upside down, and so on. In a train perception system, the three-dimensional attitude angle refers to the rotation of the radar sensor relative to the camera sensor in three-dimensional space, which describes the relative orientation between radar and camera.
Fig. 2 is a schematic diagram of the conversion from the radar coordinate system to the camera coordinate system. As shown in Fig. 2, because the radar and the camera are mounted in different positions, their attitude angles usually differ and must be adjusted. The positional relationship between the centers of the two-dimensional image and of the radar point cloud is then judged: if there is a height difference between radar and camera, the pitch angle Pitch is rotated accordingly; the horizontal offset between the image and point-cloud centers is compared and the yaw angle Yaw is adjusted; the three-dimensional attitude angle is thereby adjusted. Finally, based on the obtained forward distance parameter and the three-dimensional attitude angle adjustment angle, the complete external parameter matrix is obtained with the corresponding calibration algorithm.
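The three threshold checks (roll from the included angle of the connecting lines, pitch from the height difference, yaw from the horizontal offset) can be sketched as below. Mapping each excess difference directly to the adjustment value is a simplification, since the patent leaves the exact mapping unspecified, and all coordinates are assumed to be (horizontal, height) pairs in a common frame:

```python
import math

def attitude_adjustment(pc_c1, pc_c2, img_c1, img_c2,
                        ang_t=math.radians(0.5), h_t=0.05, x_t=0.05):
    """Return (roll, pitch, yaw) adjustment values from the two center
    pairs, each given as (horizontal, height) in a common frame."""
    # Roll: included angle between the two center-connecting lines.
    a_pc = math.atan2(pc_c2[1] - pc_c1[1], pc_c2[0] - pc_c1[0])
    a_img = math.atan2(img_c2[1] - img_c1[1], img_c2[0] - img_c1[0])
    included = a_pc - a_img
    roll = included if abs(included) > ang_t else 0.0

    # Pitch: height difference between corresponding center points.
    dh = pc_c1[1] - img_c1[1]
    pitch = dh if abs(dh) > h_t else 0.0

    # Yaw: horizontal difference between corresponding center points.
    dx = pc_c1[0] - img_c1[0]
    yaw = dx if abs(dx) > x_t else 0.0
    return roll, pitch, yaw
```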
According to the multi-sensor joint calibration method provided by the invention, the forward distance parameter between the radar point cloud and the two-dimensional image is determined from the distance between their corresponding center points, and the three-dimensional attitude angle adjustment angle is determined from the included angle between the center-point connecting lines and the positional relationship of the corresponding center points, so that the multi-sensor joint calibration process is completed automatically; compared with the existing manual calibration approach, calibration efficiency and accuracy are improved.
On the basis of the foregoing embodiment, the acquiring, according to a distance between the first center point and the second center point, a forward distance parameter between the target radar point cloud and the target two-dimensional image includes:
calculating coordinate information of corresponding center points in the target radar point cloud and the target two-dimensional image according to the predetermined key point coordinate information in the target calibration plate to obtain the first center point and the second center point;
according to the coordinate information of the corresponding center points in the target radar point cloud and the target two-dimensional image, calculating to obtain a first Euclidean distance and a second Euclidean distance, wherein the first Euclidean distance is the Euclidean distance between two center points in the target radar point cloud, and the second Euclidean distance is the Euclidean distance between two center points in the target two-dimensional image;
and judging whether the forward distance difference between the first Euclidean distance and the second Euclidean distance is larger than a preset distance threshold, and if so, acquiring the forward distance parameter between the target radar point cloud and the target two-dimensional image according to the forward distance difference.
According to the method, from the coordinates of the key points predetermined on the target calibration plate (generally the four vertices of each plate, 8 key points in total), the Euclidean distance between the two center points in the two-dimensional image captured by the camera sensor is calculated first, as is the Euclidean distance between the two center points in the point cloud acquired by the radar sensor. The two Euclidean distances are then compared, so that the front-back relationship between the two-dimensional image and the point cloud can be judged from their difference (the forward distance difference). The forward distance difference is compared with a preset distance threshold; when it is determined to be larger than the threshold, it is taken as the forward distance parameter and the front-back relationship between the two-dimensional image and the point cloud is adjusted, so that the translation matrix can be acquired more quickly and accurately during calibration.
On the basis of the above embodiment, the determining the three-dimensional attitude angle adjustment angle according to the included angle between the line connecting the first center point and the line connecting the second center point, and the positional relationship between the first center point and the second center point includes:
judging whether an included angle between a connecting line of the first center point and a connecting line of the second center point is larger than a preset attitude angle threshold value, and if so, determining a roll angle adjusting value according to the included angle and the preset attitude angle threshold value;
judging whether the height distance difference between the first center point and the second center point is larger than a preset height distance difference threshold value or not based on the center point coordinate information, and if so, determining a pitch angle adjusting value according to the height distance difference and the preset height distance difference threshold value;
judging whether the horizontal distance difference between the first center point and the second center point is larger than a preset horizontal distance difference threshold value, and if so, determining a yaw angle adjustment value according to the horizontal distance difference and the preset horizontal distance difference threshold value;
and acquiring the three-dimensional attitude angle adjustment angle according to the roll angle adjustment value, the pitch angle adjustment value and the yaw angle adjustment value.
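The three threshold checks above can be sketched as follows (a minimal sketch; the assumption that each adjustment value is the excess over its threshold is illustrative, since the patent only states that the adjustment is determined from the measured value and the threshold):

```python
def attitude_adjustment(angle_between_lines, height_diff, horiz_diff,
                        angle_thresh, height_thresh, horiz_thresh):
    """Roll from the included angle of the center-point lines, pitch from
    the height difference, yaw from the horizontal difference; each check
    only fires when its preset threshold is exceeded."""
    roll = angle_between_lines - angle_thresh if angle_between_lines > angle_thresh else 0.0
    pitch = height_diff - height_thresh if height_diff > height_thresh else 0.0
    yaw = horiz_diff - horiz_thresh if horiz_diff > horiz_thresh else 0.0
    return roll, pitch, yaw
```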
In the invention, when joint calibration of the radar sensor and the camera sensor is performed, the sensors are mounted at multiple relative angles, and the different installation positions of the radar sensor and the camera sensor can lead to large differences in attitude angle. The corresponding adjustment angles are therefore determined in the calibration process from the included angle between the line connecting the point cloud center points and the line connecting the two-dimensional image center points, together with the positional relationship between the center points, and the original three-dimensional attitude angle is adjusted according to the roll angle adjustment value, the pitch angle adjustment value and the yaw angle adjustment value to obtain the final three-dimensional attitude angle adjustment angle. These adjustment angles can be used to correct the extrinsic matrix from the radar coordinate system to the camera coordinate system, completing accurate registration between the radar and the camera. Fig. 3 is a schematic overall flow chart of the multi-sensor joint calibration provided by the invention. Referring to Fig. 3, it should be noted that in the process of adjusting the pitch angle Pitch and the yaw angle Yaw, the judgment can be performed according to the distance between any one center point in the point cloud image and any one center point in the two-dimensional image, and the specific values of the preset thresholds (such as the preset attitude angle threshold, the preset height distance difference threshold and the preset horizontal distance difference threshold) can be set according to the actual calibration accuracy requirements.
Through the above multi-sensor joint calibration process, the included angle and the positional relationship allow the relative pose between the radar and the camera to be determined more accurately, eliminating position and attitude errors and improving the accuracy of the calibration result; the calculation and optimization steps in the calibration process are also simplified and the parameter search space is reduced, thereby improving calibration efficiency and speed.
On the basis of the above embodiment, the acquiring the target two-dimensional image includes:
shooting the target calibration plate through the camera sensor to obtain a calibration plate image;
acquiring all contours to be detected in the calibration plate image according to the number of vertices of each contour in the calibration plate image, and determining a target contour in the calibration plate image according to the aspect ratio information of each contour to be detected;
and acquiring vertex pixel coordinate information in the target contour, and drawing a corresponding contour line in the calibration plate image based on the target contour to obtain the target two-dimensional image.
In the invention, camera calibration and projection matrix acquisition are implemented based on OpenCV. During camera calibration, the calibration plate in the image is selected automatically; the calibration plate is assumed to be approximately square and is ensured to appear in the image. In one embodiment, the following steps are adopted to automatically identify the position of the calibration plate in the two-dimensional image and frame its anchor points:
In step S1, the calibration plate in the image is detected by detecting all contours in the image;
In step S2, the input image is read using imread() from the OpenCV third-party library and converted into a grayscale image; binarization is applied to the grayscale image to create a binary image, and the parameters are adjusted to obtain better contour detection.
In step S3, all contours are checked in a loop and an approximate contour is found for each. The cv2.findContours() function is used to find the contours in the image, and all contours are traversed. An approximate polygon for each contour cnt is calculated using the cv2.approxPolyDP() function, and if the number of vertices in the approximate contour is 4, the contour is drawn on the image. The aspect ratio of the contour cnt is then calculated, and an aspect ratio range is set to detect squares, here [0.9, 1.1]: if the ratio is between 0.9 and 1.1, the detected contour is approximately square; otherwise it is rectangular;
In step S4, the four vertices of the obtained approximate square are marked in the image and a square outline is drawn for display. Fig. 4 is a schematic diagram of the calibration plate automatically selected in the image provided by the invention; referring to Fig. 4, the square outline can be drawn in the image (for ease of viewing, the square outline is shown schematically by a dotted-line frame in Fig. 4);
In step S5, cv2.imshow() is used to display the detected rectangles and squares and the image with the drawn outline, and parameter adjustment and optimization are performed;
In step S6, the pixel coordinates of the four vertices of the square in the image are finally recorded as input data for the joint calibration of the radar and the camera.
According to the invention, anchor-point framing of the calibration plate in the image is performed automatically through OpenCV. Compared with a manually drawn frame, automatic detection can locate the anchor points on the calibration plate more accurately, improving the accuracy of the calibration result; a large number of images can be processed rapidly with a consistent calibration standard, improving calibration efficiency and reducing labor cost.
On the basis of the foregoing embodiment, the calculating, according to the predetermined key point coordinate information in the target calibration plate, the corresponding center point coordinate information in the target two-dimensional image includes:
based on OpenCV, acquiring a projection matrix of the camera sensor according to a plurality of calibration plate images shot by the camera sensor, wherein the projection matrix comprises an internal reference matrix and an external reference matrix of the camera sensor, and the calibration plates corresponding to the plurality of calibration plate images are marked with key points;
and determining the coordinate information of the key point pixels in the target two-dimensional image according to the projection matrix and the coordinate information of the key point, and acquiring the coordinate information of the corresponding central point in the target two-dimensional image according to the coordinate information of the key point pixels.
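The projection of the key points and their averaging into a center point can be sketched as follows (a minimal numpy sketch assuming an undistorted pinhole model; the function and parameter names are illustrative):

```python
import numpy as np

def board_center_pixel(K, R, t, keypoints_world):
    """Project the board key points into the image with the projection
    matrix K[R|t] and average them to obtain the board's center pixel.
    A real implementation would also apply distortion correction."""
    P = K @ np.hstack([R, t.reshape(3, 1)])                     # 3x4 projection
    pts = np.hstack([keypoints_world, np.ones((len(keypoints_world), 1))])
    proj = (P @ pts.T).T                                        # homogeneous pixels
    pix = proj[:, :2] / proj[:, 2:3]                            # perspective divide
    return pix.mean(axis=0)                                     # center point
```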
Camera imaging is a three-dimensional-to-two-dimensional (perspective projection) process that can be described by a mathematical model, and calibration is the computation of the parameters of that mathematical model, through which the three-dimensional world can be restored from the two-dimensional image.
According to the invention, the internal reference matrix and the external reference matrix of the camera (together with the rotation matrix and translation matrix of each calibration image) are obtained through OpenCV; with the internal and external parameters, images captured by the camera can afterwards be corrected to obtain images with relatively small distortion.
In the camera calibration process, the same camera is used to capture multiple pictures of the calibration plate from different positions and different angles, and the image coordinates of all inner corner points on each calibration image are obtained, together with the spatial three-dimensional coordinates of all inner corner points on the calibration plate (generally the plate is assumed to lie on the Z=0 plane).
Further, since the photographed objects are in the three-dimensional world coordinate system, while the lens defines the three-dimensional camera coordinate system when the camera shoots, the three-dimensional camera coordinate system is converted into the two-dimensional image coordinate system when imaging. The transformation matrix differs for different lenses, and distortion is also introduced; the purpose of calibration is to approximately estimate this transformation matrix and the distortion coefficients. For the estimation, the coordinates of several points in the three-dimensional world coordinate system and their coordinates in the two-dimensional image coordinate system must be known. Influences such as the manufacturing process, the camera installation environment or the object placement positions can cause the image formed to differ from the actual scene. The influences of the design and manufacturing process cannot be changed and constitute the internal parameters of the camera; the influences of the environment or the installation can be changed and constitute the external parameters of the camera. In one embodiment, the camera calibration process is illustrated using Zhang Zhengyou's calibration method:
First, the in-camera parameters are calculated. The checkerboard used for calibration is specially made, so the coordinates of its corner points are known. The calibration checkerboard is a plane π in the three-dimensional scene, and its image on the imaging plane is a plane π*; since the corresponding point coordinates of π and π* are known, the homography matrix H relating the two planes can be solved. According to the camera imaging model, with P a point in calibration checkerboard coordinates and p the corresponding pixel point coordinate:
p = K[R|T]P = HP;
After H is solved from the corresponding point coordinates, it can be used to solve for the camera internal parameters K, the rotation matrix R and the translation matrix T.
Further, if the plane of the checkerboard is taken as the XOY plane of the world coordinate system, any corner P on the checkerboard has world coordinates (X, Y, 0), and with s a scale factor and r1, r2 the first two columns of R, the projection reduces to:

s·p = K[r1 r2 T]·[X, Y, 1]ᵀ = H·[X, Y, 1]ᵀ
wherein the internal reference matrix of the camera can be calculated through the related functions provided by OpenCV, and further, with the internal parameters of the camera, the spatial coordinates of the calibration plate and the corresponding image coordinates known, the rotation matrix R and the translation vector T are solved based on the above formula.
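As a sketch of this last step, H = K[r1 r2 T] can be decomposed with plain numpy once K is known (a minimal sketch: sign selection and noise handling are omitted, and a real implementation would re-orthogonalize R, e.g. via SVD):

```python
import numpy as np

def pose_from_homography(K, H):
    """Recover R and T from a plane homography H = s*K[r1 r2 T].
    The scale s is fixed by requiring r1 to be a unit vector."""
    M = np.linalg.inv(K) @ H            # proportional to [r1 r2 T]
    lam = 1.0 / np.linalg.norm(M[:, 0])  # normalize so |r1| = 1
    r1 = lam * M[:, 0]
    r2 = lam * M[:, 1]
    r3 = np.cross(r1, r2)               # complete the rotation basis
    t = lam * M[:, 2]
    return np.column_stack([r1, r2, r3]), t
```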
According to the invention, the internal reference matrix and the external reference matrix of the camera are solved through OpenCV, so that the automatic calibration process of the camera is simplified and efficient, and an accurate camera parameter basis is provided for the subsequent multi-sensor joint calibration.
On the basis of the above embodiment, after the projection matrix of the camera sensor is obtained based on OpenCV from the plurality of calibration plate images captured by the camera sensor, the method further includes:
and evaluating the projection matrix through the back projection error, and if the evaluation result does not meet the preset condition, calibrating the projection matrix of the camera sensor again.
In the invention, for each known three-dimensional space point, projection is computed using the internal and external reference matrices obtained by calibration, and the calibration quality can be determined rapidly by calculating the back-projection (reprojection) error: the closer the back-projection error is to 0, the better the calibration. If the calibration result is not ideal, the calibration process can be optimized and calibration performed again.
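The back-projection error check can be sketched as follows (a minimal numpy sketch using the mean pixel distance; distortion terms and the choice of averaging are illustrative assumptions):

```python
import numpy as np

def reprojection_error(P, points_3d, pixels_observed):
    """Mean pixel distance between observed points and the projections of
    their known 3D positions under the 3x4 projection matrix P; values
    close to 0 indicate a good calibration."""
    homog = np.hstack([points_3d, np.ones((len(points_3d), 1))])
    proj = (P @ homog.T).T
    pix = proj[:, :2] / proj[:, 2:3]                 # perspective divide
    return float(np.mean(np.linalg.norm(pix - pixels_observed, axis=1)))
```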
On the basis of the above embodiment, the acquiring the target radar point cloud includes:
and comparing the reflectivity value of each point cloud data acquired by the radar sensor with a preset reflectivity threshold value, and filtering the point cloud data with the reflectivity value smaller than the preset reflectivity threshold value to obtain the target radar point cloud.
In the invention, each point cloud data has a corresponding reflectivity value to represent the reflection intensity of the point, and the invention filters out the point clouds with the reflectivity lower than the threshold value by setting the threshold value of the reflectivity, so that some noise or useless information can be removed, meaningful point cloud data with higher reflectivity is reserved, the quality and the effectiveness of the point cloud data are improved, and a better foundation is provided for the subsequent point cloud processing and application.
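A minimal sketch of this reflectivity filter, assuming each point is stored as an (x, y, z, reflectivity) row (the field layout and the keep-at-or-above convention are assumptions):

```python
import numpy as np

def filter_by_reflectivity(points, threshold):
    """Drop point cloud rows whose reflectivity (column 3) is below the
    preset threshold, keeping the higher-reflectivity points."""
    return points[points[:, 3] >= threshold]
```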
In an embodiment, based on the multi-sensor joint calibration method, a multi-sensor calibration application software is provided. Fig. 5 is an interface schematic diagram of the multi-sensor calibration application software provided by the invention; referring to Fig. 5, the software has three display interfaces divided by function: a radar interface (i.e., a point cloud rendering interface), an image interface (i.e., an image rendering interface) and a fusion interface (i.e., a fusion calibration interface). The right side of the interface corresponds to the operations that can be performed, for example device connection (e.g., radar link, camera link); operations such as point cloud and image rotation, filtering and automatic point selection; and generation of the internal and external parameters. When the user side is connected to the radar and the camera, the camera internal parameters can be generated automatically after photographing according to the requirements of the calibration plate; after the internal parameters are selected, the point cloud and the image can be matched automatically through the multi-sensor joint calibration process, and finally the cross-correlation ratio of the point cloud and the image is calculated to obtain the joint calibration score.
The development and use of the software are both carried out under Linux, but the libraries and configuration files required by the X86 and ARM platform architectures are different, so two sets of programs need to be built. The library and header file paths contained in the Qt .pro file control the compile options, the linked static library versions, and so on. In one embodiment, by setting the compile options in the .pro file for the X86 platform or the ARM platform, the library paths and library file versions of both platforms are included, and in the source code the SDK version function interfaces under the different platforms are distinguished by the compile options.
According to the invention, based on the open-source OpenCV, PCL and VTK libraries, the image and point cloud are operated on and displayed; OpenCV-based calibration is used to automatically identify and match multiple sensors and multiple poses, the edge points of the calibration plate and the pose of the matched sensor are automatically depicted, and usability of the software tool under different architectures is achieved. In this embodiment, the point cloud operations are completed through the PCL library, the image operations through the OpenCV library, and the display through the VTK library.
The multi-sensor joint calibration system provided by the invention is described below; the multi-sensor joint calibration system described below and the multi-sensor joint calibration method described above may be referred to correspondingly.
Fig. 6 is a schematic structural diagram of a multi-sensor joint calibration system provided by the invention. As shown in Fig. 6, the invention provides a multi-sensor joint calibration system comprising a sensor data acquisition module 601, a first processing module 602, a second processing module 603 and a joint calibration module 604. The sensor data acquisition module 601 is used for acquiring a target radar point cloud and a target two-dimensional image, wherein the target radar point cloud is acquired by a radar sensor on a target calibration plate, the target two-dimensional image is acquired by a camera sensor on the target calibration plate, and the target calibration plate is composed of two calibration plates. The first processing module 602 is configured to obtain a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point. The second processing module 603 is configured to determine a three-dimensional attitude angle adjustment angle according to the included angle between the line connecting the first center point and the line connecting the second center point, and the positional relationship between the first center point and the second center point. The joint calibration module 604 is configured to obtain an extrinsic matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle. The first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
According to the multi-sensor joint calibration system provided by the invention, the forward distance parameter between the radar point cloud and the two-dimensional image is determined through the distance between the corresponding center points of the radar point cloud and the two-dimensional image, and the three-dimensional attitude angle adjustment angle is determined according to the included angle between their connecting lines and the positional relationship of the corresponding center points, so that the multi-sensor joint calibration process is completed automatically; compared with conventional manual calibration, calibration efficiency and accuracy are improved.
The system provided by the invention is used for executing the method embodiments, and specific flow and details refer to the embodiments and are not repeated herein.
Fig. 7 is a schematic structural diagram of an electronic device according to the present invention, as shown in fig. 7, the electronic device may include: a Processor (Processor) 701, a communication interface (Communications Interface) 702, a Memory (Memory) 703 and a communication bus 704, wherein the Processor 701, the communication interface 702 and the Memory 703 communicate with each other through the communication bus 704. The processor 701 may invoke logic instructions in the memory 703 to perform a multi-sensor joint calibration method comprising: acquiring target radar point clouds and target two-dimensional images, wherein the target radar point clouds are acquired by a radar sensor on a target calibration plate, the target two-dimensional images are acquired by a camera sensor on the target calibration plate, and the target calibration plate consists of two calibration plates; acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point; determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point; acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle; the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
Further, the logic instructions in the memory 703 may be implemented in the form of software functional units and may be stored in a computer-readable storage medium when sold or used as a standalone product. Based on this understanding, the technical solution of the present invention may be embodied essentially, or in the part contributing to the prior art, or in part, in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the execution of the multi-sensor joint calibration method provided by the above methods, the method comprising: acquiring target radar point clouds and target two-dimensional images, wherein the target radar point clouds are acquired by a radar sensor on a target calibration plate, the target two-dimensional images are acquired by a camera sensor on the target calibration plate, and the target calibration plate consists of two calibration plates; acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point; determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point; acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle; the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
In yet another aspect, the present invention further provides a non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor, is implemented to perform the multi-sensor joint calibration method provided in the above embodiments, the method comprising: acquiring target radar point clouds and target two-dimensional images, wherein the target radar point clouds are acquired by a radar sensor on a target calibration plate, the target two-dimensional images are acquired by a camera sensor on the target calibration plate, and the target calibration plate consists of two calibration plates; acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point; determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point; acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle; the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
The apparatus embodiments described above are merely illustrative; the units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, i.e., they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the invention without creative effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.
Claims (10)
1. The multi-sensor joint calibration method is characterized by comprising the following steps of:
acquiring target radar point clouds and target two-dimensional images, wherein the target radar point clouds are acquired by a radar sensor on a target calibration plate, the target two-dimensional images are acquired by a camera sensor on the target calibration plate, and the target calibration plate consists of two calibration plates;
acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point;
determining a three-dimensional attitude angle adjustment angle according to an included angle between a connecting line of the first center point and a connecting line of the second center point and a position relationship between the first center point and the second center point;
Acquiring an external parameter matrix of the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle;
the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point of the two-dimensional image corresponding to each of the two calibration plates in the target two-dimensional image.
2. The multi-sensor joint calibration method according to claim 1, wherein the obtaining a forward distance parameter between the target radar point cloud and the target two-dimensional image according to a distance between a first center point and a second center point includes:
calculating coordinate information of corresponding center points in the target radar point cloud and the target two-dimensional image according to the predetermined key point coordinate information in the target calibration plate to obtain the first center point and the second center point;
according to the coordinate information of the corresponding center points in the target radar point cloud and the target two-dimensional image, calculating to obtain a first Euclidean distance and a second Euclidean distance, wherein the first Euclidean distance is the Euclidean distance between two center points in the target radar point cloud, and the second Euclidean distance is the Euclidean distance between two center points in the target two-dimensional image;
and judging whether the forward distance difference between the first Euclidean distance and the second Euclidean distance is larger than a preset distance threshold, and if so, acquiring a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the forward distance difference.
3. The multi-sensor joint calibration method according to claim 2, wherein the determining the three-dimensional attitude angle adjustment angle according to the included angle between the line connecting the first center point and the line connecting the second center point and the positional relationship between the first center point and the second center point includes:
judging whether an included angle between a connecting line of the first center point and a connecting line of the second center point is larger than a preset attitude angle threshold value, and if so, determining a roll angle adjusting value according to the included angle and the preset attitude angle threshold value;
judging whether the height distance difference between the first center point and the second center point is larger than a preset height distance difference threshold value or not based on the center point coordinate information, and if so, determining a pitch angle adjusting value according to the height distance difference and the preset height distance difference threshold value;
judging whether the horizontal distance difference between the first center point and the second center point is larger than a preset horizontal distance difference threshold value, and if so, determining a yaw angle adjustment value according to the horizontal distance difference and the preset horizontal distance difference threshold value;
and acquiring the three-dimensional attitude angle adjustment angle according to the roll angle adjustment value, the pitch angle adjustment value and the yaw angle adjustment value.
4. The multi-sensor joint calibration method according to claim 2, wherein the acquiring the target two-dimensional image includes:
shooting the target calibration plate through the camera sensor to obtain a calibration plate image;
acquiring all contours to be detected in the calibration plate image according to the number of vertices of each contour in the calibration plate image, and determining a target contour in the calibration plate image according to the aspect ratio information of each contour to be detected;
and acquiring vertex pixel coordinate information in the target contour, and drawing a corresponding contour line in the calibration plate image based on the target contour to obtain the target two-dimensional image.
5. The multi-sensor joint calibration method according to claim 2 or 4, wherein the calculating of the corresponding center point coordinate information in the target two-dimensional image according to the predetermined key point coordinate information of the target calibration plate comprises:
acquiring, based on OpenCV, a projection matrix of the camera sensor from a plurality of calibration plate images captured by the camera sensor, wherein the projection matrix comprises an intrinsic matrix and an extrinsic matrix of the camera sensor, and key points are marked on the calibration plates corresponding to the plurality of calibration plate images;
and determining the key point pixel coordinate information in the target two-dimensional image according to the projection matrix and the key point coordinate information, and acquiring the corresponding center point coordinate information in the target two-dimensional image from the key point pixel coordinate information.
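The projection of claim 5 might look like this as a pure-NumPy pinhole model. In practice K, R, t would come from OpenCV's `calibrateCamera`; averaging the projected key points into a plate center is an assumption about how the center point is derived.

```python
import numpy as np

def project_keypoints(K, R, t, pts_world):
    """Project 3-D key points on the plate into the image with the
    camera projection matrix P = K [R | t], then take the mean of the
    projected pixels as the plate's center point in the image."""
    P = K @ np.hstack([R, t.reshape(3, 1)])              # 3x4 projection matrix
    pts_h = np.hstack([pts_world, np.ones((len(pts_world), 1))])
    uvw = (P @ pts_h.T).T                                # homogeneous pixels
    uv = uvw[:, :2] / uvw[:, 2:3]                        # key point pixel coords
    return uv, uv.mean(axis=0)                           # pixels, center point
```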
6. The multi-sensor joint calibration method according to claim 5, wherein, after the acquiring of the projection matrix of the camera sensor based on OpenCV from the plurality of calibration plate images captured by the camera sensor, the method further comprises:
evaluating the projection matrix through the reprojection error, and if the evaluation result does not meet a preset condition, re-calibrating the projection matrix of the camera sensor.
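The reprojection-error check of claim 6, sketched minimally. The RMS form is the conventional OpenCV-style metric; the pass condition itself is left unspecified by the patent, so any threshold applied to this value is an assumption.

```python
import numpy as np

def rms_reprojection_error(uv_observed, uv_projected):
    """RMS pixel distance between detected key points and key points
    re-projected with the estimated projection matrix. A large value
    signals that the camera should be re-calibrated."""
    d = np.linalg.norm(uv_observed - uv_projected, axis=1)
    return float(np.sqrt((d ** 2).mean()))
```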
7. The multi-sensor joint calibration method according to claim 1, wherein the acquiring of the target radar point cloud comprises:
comparing the reflectivity value of each point of the point cloud data acquired by the radar sensor with a preset reflectivity threshold, and filtering out the point cloud data whose reflectivity value is smaller than the preset reflectivity threshold to obtain the target radar point cloud.
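Claim 7's reflectivity filter reduces to one mask over an N×4 point array; the column layout (x, y, z, reflectivity) is an assumption about the radar driver's output format.

```python
import numpy as np

def filter_by_reflectivity(cloud, thresh):
    """cloud: (N, 4) array of x, y, z, reflectivity. Points whose
    reflectivity falls below the preset threshold are discarded, leaving
    the high-reflectivity returns from the calibration plates."""
    return cloud[cloud[:, 3] >= thresh]
```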
8. A multi-sensor joint calibration system, comprising:
a sensor data acquisition module, configured to acquire a target radar point cloud and a target two-dimensional image, wherein the target radar point cloud is acquired by a radar sensor from a target calibration plate, the target two-dimensional image is acquired by a camera sensor from the target calibration plate, and the target calibration plate consists of two calibration plates;
a first processing module, configured to acquire a forward distance parameter between the target radar point cloud and the target two-dimensional image according to the distance between the first center point and the second center point;
a second processing module, configured to determine a three-dimensional attitude angle adjustment angle according to the included angle between the line connecting the first center points and the line connecting the second center points and the positional relationship between the first center point and the second center point; and
a joint calibration module, configured to acquire an extrinsic matrix between the radar sensor and the camera sensor according to the forward distance parameter and the three-dimensional attitude angle adjustment angle;
wherein the first center point comprises the center point of the calibration plate point cloud corresponding to each of the two calibration plates in the target radar point cloud, and the second center point comprises the center point corresponding to each of the two calibration plates in the target two-dimensional image.
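One way the joint calibration module of claim 8 might assemble the final radar-camera extrinsic matrix from the forward distance parameter and the three attitude adjustment angles. The Z-Y-X Euler composition and the axis convention (x forward) are assumptions for illustration; the patent fixes neither.

```python
import numpy as np

def extrinsic_from_adjustment(roll, pitch, yaw, forward):
    """Compose a 4x4 radar-to-camera extrinsic matrix [R | t] from the
    three attitude adjustment angles (radians) and the forward distance
    parameter, assuming x is the forward axis."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])    # roll
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])    # pitch
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])    # yaw
    E = np.eye(4)
    E[:3, :3] = Rz @ Ry @ Rx
    E[:3, 3] = np.array([forward, 0.0, 0.0])                 # forward translation
    return E
```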
9. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the multi-sensor joint calibration method according to any one of claims 1 to 7.
10. A non-transitory computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the multi-sensor joint calibration method according to any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311467311.4A CN117665730A (en) | 2023-11-06 | 2023-11-06 | Multi-sensor joint calibration method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117665730A true CN117665730A (en) | 2024-03-08 |
Family
ID=90085348
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202311467311.4A Pending CN117665730A (en) | 2023-11-06 | 2023-11-06 | Multi-sensor joint calibration method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117665730A (en) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||