CN111965624B - Laser radar and camera calibration method, device, equipment and readable storage medium
- Publication number: CN111965624B (application CN202010784817.8A)
- Authority: CN (China)
- Prior art keywords: point cloud data; calibration plate; coordinate system; camera
- Legal status: Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
The embodiment of the application discloses a laser radar and camera calibration method, device, equipment and readable storage medium, relating to artificial intelligence technology, in particular to the technical fields of intelligent transportation and autonomous driving. The specific implementation scheme is as follows: acquiring point cloud data obtained by scanning a calibration plate with a laser radar, and an image of the calibration plate captured by a camera; extracting a target point cloud data set located on the calibration plate plane from the point cloud data, and determining first pose information of the calibration plate under the laser radar coordinate system according to the position of the target point cloud data set; determining second pose information of the calibration plate under the camera coordinate system according to the features of the calibration plate plane in the image; and calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information. The process is simple, the amount of calculation is small, and computer resources are saved.
Description
Technical Field
The application relates to artificial intelligence technology, and in particular to the technical fields of intelligent transportation and autonomous driving.
Background
While an unmanned vehicle is running, the laser radar and the camera must work together to complete the perception and positioning of the vehicle body. At present, laser radar and camera calibration must be performed on each unmanned vehicle platform to achieve more accurate positioning and perception.
Existing technical solutions often need to reconstruct a three-dimensional scene from the images captured by the camera, match the reconstruction against the point cloud acquired by the laser radar, and then calibrate the camera and the laser radar; this involves a large amount of computation and consumes considerable computing resources.
Disclosure of Invention
The embodiment of the application provides a calibration method, device and equipment of a laser radar and a camera and a readable storage medium.
In a first aspect, an embodiment of the present application provides a calibration method for a laser radar and a camera, including:
acquiring point cloud data obtained by scanning a calibration plate with a laser radar, and an image of the calibration plate captured by a camera;
extracting a target point cloud data set positioned on a plane of a calibration plate from the point cloud data, and determining first pose information of the calibration plate under a laser radar coordinate system according to the position of the target point cloud data set;
determining second pose information of the calibration plate under a camera coordinate system according to the characteristics of the plane of the calibration plate in the image;
and calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information.
In a second aspect, an embodiment of the present application further provides a calibration device for a laser radar and a camera, including:
the acquisition module is used for acquiring point cloud data obtained by scanning the calibration plate with the laser radar and an image of the calibration plate captured by the camera;
the first pose determining module is used for extracting a target point cloud data set positioned on the plane of the calibration plate from the point cloud data and determining first pose information of the calibration plate under a laser radar coordinate system according to the position of the target point cloud data set;
the second pose determining module is used for determining second pose information of the calibration plate under a camera coordinate system according to the characteristics of the plane of the calibration plate in the image;
and the calibration module is used for calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information.
In a third aspect, an embodiment of the present application provides an electronic device, including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the calibration method of a laser radar and a camera provided in any of the embodiments.
In a fourth aspect, embodiments of the present application provide a non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform a method for calibrating a lidar and a camera provided in any of the embodiments.
According to the technology of the application, the calibration process is simple, the amount of calculation is small, and computer resources can be saved.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for better understanding of the present solution and do not constitute a limitation of the present application. Wherein:
FIG. 1a is a flow chart of a first method of calibrating a lidar and a camera in an embodiment of the present application;
FIG. 1b is a schematic diagram of a lidar and camera view calibration plate in an embodiment of the present application;
FIG. 2a is a flow chart of a second method of calibrating a lidar with a camera in an embodiment of the application;
FIG. 2b is a schematic illustration of a calibration plate plane in an image in an embodiment of the present application;
FIG. 3 is a flow chart of a third method of calibrating a lidar and a camera in an embodiment of the present application;
FIG. 4 is a flow chart of a fourth method of calibrating a lidar and a camera in an embodiment of the present application;
FIG. 5 is a block diagram of a laser radar and camera calibration device in an embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
Exemplary embodiments of the present application are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present application to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present application. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
According to the embodiment of the application, fig. 1a is a flowchart of a first method for calibrating a laser radar and a camera in the embodiment of the application. The embodiment is applicable to the case of performing external parameter calibration on the laser radar and the camera by using a calibration plate, that is, the case of obtaining a conversion relationship between the laser radar coordinate system and the camera coordinate system. The method is executed by a calibration device of the laser radar and the camera, and the device is implemented by software and/or hardware and is specifically configured in an electronic device with certain data computing capability.
The calibration method of the laser radar and the camera shown in fig. 1a comprises the following steps:
S110, acquiring point cloud data obtained by scanning the calibration plate with the laser radar, and an image of the calibration plate captured by the camera.
The calibration plate in this embodiment is a planar plate whose surface bears a pattern with striking colors and shapes, such as a checkerboard of alternating black and white squares. The calibration plate is placed in the fields of view of the laser radar and the camera in different positions and/or orientations (collectively, poses), so that the laser radar and the camera can observe the whole calibration plate, as shown in fig. 1b.
Specifically, for each pose of the calibration plate, the laser radar is started to scan the calibration plate to obtain point cloud data, and the camera is started to capture an image of the calibration plate. The poses of the laser radar and the camera are kept unchanged, so that multiple groups of point cloud data and multiple images under different calibration plate poses can be obtained; each group of point cloud data and its corresponding image are used for calibration, and the calibration results are then averaged. Of course, the pose of the calibration plate need not be adjusted, and calibration can also be performed with a single group of point cloud data and a single image.
S120, extracting a target point cloud data set positioned on the plane of the calibration plate from the point cloud data, and determining first pose information of the calibration plate under a laser radar coordinate system according to the position of the target point cloud data set.
The point cloud data is three-dimensional and characterizes the shape, size, depth, etc. of the scanned object, wherein part of the point cloud data characterizes the plane of the calibration plate (i.e. the plane on which the pattern is drawn), i.e. is located on the plane of the calibration plate. For convenience of description and distinction, a plurality of point cloud data located on the plane of the calibration plate form a target point cloud data set.
The position of the target point cloud data set comprises the position of each point cloud datum in the set; together, these describe the position and orientation of the calibration plate under the laser radar coordinate system, namely the first pose information.
S130, determining second pose information of the calibration plate under the camera coordinate system according to the characteristics of the plane of the calibration plate in the image.
The features of the calibration plate plane include, but are not limited to, the shape, size and angle of the calibration plate, the shape of the pattern, and the like. When the pose of the calibration plate is fixed and known, different appearances of the calibration plate plane in the image correspond to different camera poses. Based on this, the conversion relationship between the camera coordinate system and the calibration plate can be obtained from the features of the calibration plate plane in the image, and the known pose of the calibration plate can then be converted into the camera coordinate system to obtain the position and orientation of the calibration plate under the camera coordinate system, namely the second pose information.
S140, calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information.
Because the laser radar and the camera observe the calibration plate with the same pose, the first pose information and the second pose information can reflect the pose of the laser radar and the camera. Based on this, the conversion relationship between the first pose information and the second pose information reflects the conversion relationship between the lidar coordinate system and the camera coordinate system. Optionally, the conversion relation between the coordinate systems includes a rotation matrix and a translation vector between the coordinate systems, so as to complete the calibration of the laser radar and the camera.
According to this embodiment, the target point cloud data set on the calibration plate plane can be extracted from the point cloud data, and the first pose information of the calibration plate under the laser radar coordinate system is then determined automatically from the position of the target point cloud data set, without manual labeling or cropping. The second pose information of the calibration plate under the camera coordinate system can likewise be determined automatically from the features of the calibration plate plane in the image. Since the first pose information and the second pose information reflect the poses of the laser radar and the camera, the two sensors are calibrated according to the conversion relation between them. The whole process calibrates using the pose of the calibration plate plane in the two coordinate systems: the process is simple, the amount of calculation is small, computer resources are saved, and neither three-dimensional reconstruction nor complex point set registration is needed.
In the above and following embodiments, extracting the target point cloud data set located on the calibration plate plane from the point cloud data includes the following two alternative implementations: 1) extracting the target point cloud data set located on the calibration plate plane from the point cloud data by means of a deep learning model; 2) projecting the point cloud data into the image coordinate system based on initial external parameters of the laser radar and the camera, and extracting the target point cloud data set located in the calibration plate plane from the point cloud data.
For 1), the deep learning model is trained with point cloud data samples corresponding to the calibration plate, and the actually scanned point cloud data are input into the trained deep learning model to obtain the target point cloud data set, output by the model, that conforms to the features of the calibration plate.
For 2), the external parameters of the laser radar and the camera can be initially determined by actual measurement or from drawings. Because of measurement or drawing errors, these external parameters are not accurate enough; they are called initial external parameters, and more accurate external parameters must be obtained on this basis. Specifically, a rotation matrix and a translation vector between the laser radar coordinate system and the camera coordinate system are constructed from the initial external parameters and applied to the point cloud data to project them into the camera coordinate system; the point cloud data in the camera coordinate system are then projected into the image coordinate system using the known camera internal parameters. The three-dimensional point cloud data thus become two-dimensional data with a correspondence to the pixels or coordinates of the image. On this basis, the two-dimensional point cloud data covered by the calibration plate plane in the image are extracted, and the target point cloud data set is formed from the corresponding three-dimensional point cloud data.
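By way of illustration, the following is a minimal Python sketch of this projection, assuming an undistorted pinhole camera model; the function and parameter names (project_points_to_image, R_lc, t_lc, K) are illustrative and not from the patent.

```python
import numpy as np

def project_points_to_image(points_lidar, R_lc, t_lc, K):
    """Project 3-D lidar points into the image coordinate system.

    points_lidar: (N, 3) point cloud data in the lidar coordinate system.
    R_lc, t_lc:   initial extrinsics (rotation matrix and translation vector
                  from the lidar coordinate system to the camera coordinate system).
    K:            (3, 3) camera intrinsic matrix.
    """
    pts_cam = points_lidar @ R_lc.T + t_lc   # lidar frame -> camera frame
    uv = pts_cam @ K.T                       # apply intrinsics
    uv = uv[:, :2] / uv[:, 2:3]              # perspective division -> pixel coordinates
    return uv, pts_cam                       # the 2-D projections keep a 1:1 link
                                             # to the original 3-D points
```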
According to this embodiment, the point cloud data on the calibration plate plane are extracted automatically, by deep learning or by image projection, without manual labeling or cropping; furthermore, both methods ensure a certain accuracy.
Fig. 2a is a flowchart of a second calibration method of a laser radar and a camera according to an embodiment of the present application, where the method of image projection is optimized based on the technical solutions of the embodiments described above.
The calibration method of the laser radar and the camera shown in fig. 2a specifically comprises the following operations:
S210, acquiring point cloud data obtained by scanning the calibration plate with the laser radar, and an image of the calibration plate captured by the camera.
S220, based on initial external parameters of the laser radar and the camera, the point cloud data are projected to an image coordinate system.
S230, detecting the plane of the calibration plate in the image.
Optionally, a target recognition model is used to recognize the image and obtain the calibration plate plane. The target recognition model can be a deep-learning-based neural network model trained with image samples containing the calibration plate plane.
The calibration plate plane is used for point cloud data projection. Because the initial external parameters are inaccurate, some point cloud data that do not belong to the calibration plate may be projected into the calibration plate plane in the image; the actual calibration plate plane in the image is therefore appropriately shrunk. Optionally, a calibration plate region in the image is detected, and the calibration plate region is shrunk toward its center to obtain the calibration plate plane.
Specifically, fig. 2b is a schematic diagram of a calibration plate plane in an image in an embodiment of the present application. Because of the shooting angle, the originally rectangular calibration plate appears as a trapezoid in the image. The outer frame of the calibration plate plane, called the calibration plate region, is detected via image detection (e.g., recognition using the target recognition model). The diagonals of the outer frame are drawn, and their intersection point gives the center of the calibration plate region; using the relationship between the diagonal intersection and the corner points, the outer frame is shrunk to a quadrilateral whose side lengths are half those of the original.
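A minimal sketch of this shrinking step, assuming the four corner points of the detected outer frame are given so that (corners[0], corners[2]) and (corners[1], corners[3]) form the diagonals; the names and the 0.5 scale parameter are illustrative.

```python
import numpy as np

def shrink_calibration_region(corners, scale=0.5):
    """Shrink the detected outer frame toward the intersection of its diagonals.

    corners: (4, 2) float array of corner points of the quadrilateral.
    scale:   0.5 halves the side lengths, as described above.
    """
    p0, p1, p2, p3 = corners
    # Solve p0 + s*(p2 - p0) = p1 + u*(p3 - p1) for the diagonal intersection.
    A = np.column_stack([p2 - p0, p1 - p3])
    s, _ = np.linalg.solve(A, p1 - p0)
    center = p0 + s * (p2 - p0)
    # Move every corner toward the center by the given scale.
    return center + scale * (corners - center)
```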
S240, extracting a target point cloud data set positioned in the plane of the calibration plate from the projected point cloud data.
As described in the above embodiment, the three-dimensional point cloud data projected into the image coordinate system become two-dimensional data with a correspondence to the pixels or coordinates of the image. The two-dimensional point cloud data covered by the calibration plate plane in the image are extracted, and the target point cloud data set is formed from the corresponding three-dimensional point cloud data.
S250, extracting, from the remaining point cloud data, point cloud data whose distance to the target point cloud data set is within a set distance threshold, and supplementing them to the target point cloud data set.
The remaining point cloud data are the point cloud data other than the target point cloud data set among all the point cloud data; the remaining point cloud data, all the point cloud data, and the target point cloud data set are three-dimensional. The remaining point cloud data may still contain data located on the calibration plate plane: the target point cloud data set of S240 contains only points projected into the shrunken calibration plate plane, and there may also be point cloud data projected outside the detected plane that actually belong to the calibration plate. In short, the current target point cloud data set is not complete enough and needs to be expanded to fully describe the calibration plate plane, thereby improving calibration accuracy.
Based on the above analysis, a piece of remaining point cloud data is taken, and its distance to the point cloud data in the target point cloud data set is calculated. If any such distance is within the set distance threshold, the point cloud data is supplemented into the target point cloud data set and the next piece is taken; if none of the distances is within the set distance threshold, the point cloud data is discarded and the next piece is taken, until all the remaining point cloud data have been processed. Optionally, in the case shown in fig. 2b, if the distance between a piece of remaining point cloud data and any point cloud data in the target point cloud data set is less than half the diagonal length of the calibration plate, that point cloud data is supplemented to the target point cloud data set.
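A brute-force sketch of this supplementation rule, under the reading that a point is added when at least one of its distances to the (growing) target set is within the threshold; in practice a KD-tree would replace the inner distance computation. All names are illustrative.

```python
import numpy as np

def supplement_target_set(target_set, remaining, dist_threshold):
    """Grow the target point cloud data set with nearby remaining points.

    target_set:     (M, 3) current target point cloud data set.
    remaining:      (K, 3) all point cloud data outside the target set.
    dist_threshold: e.g. half the diagonal length of the calibration plate.
    """
    target = [p for p in target_set]
    for p in remaining:
        dists = np.linalg.norm(np.asarray(target) - p, axis=1)
        if np.any(dists <= dist_threshold):
            target.append(p)          # supplement, then take the next point
    return np.asarray(target)
```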
S260, determining first pose information of the calibration plate under the laser radar coordinate system according to the position of the target point cloud data set.
S270, determining second pose information of the calibration plate under the camera coordinate system according to the characteristics of the plane of the calibration plate in the image.
S280, calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information.
Fig. 3 is a flowchart of a third laser radar and camera calibration method according to an embodiment of the present application, which is further optimized based on the above embodiment.
Optionally, the first pose information includes a normal vector of the calibration plate and a position of the set point under the laser radar coordinate system; the second pose information includes a normal vector of the calibration plate in the camera coordinate system and a position of the set point.
The normal vector of the calibration plate characterizes its orientation, and the set point is any point on the calibration plate, such as a corner point or the center point; the position of the set point characterizes the position of the calibration plate. In this embodiment, the pose of the calibration plate is represented clearly and concisely by the normal vector and the set-point position, which provides a data basis for subsequent calibration and reduces the amount of calibration calculation.
Optionally, the operation "calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information" is refined into: calculating a rotation matrix between the normal vector of the calibration plate under the laser radar coordinate system and the normal vector of the calibration plate under the camera coordinate system; calculating a translation vector between the position of the set point under the laser radar coordinate system and the position of the set point under the camera coordinate system based on the rotation matrix; and calibrating the laser radar and the camera according to the rotation matrix and the translation vector.
The calibration method of the laser radar and the camera shown in fig. 3 comprises the following steps:
S310, acquiring point cloud data obtained by scanning the calibration plate with the laser radar, and an image of the calibration plate captured by the camera.
S320, extracting a target point cloud data set positioned on the plane of the calibration plate from the point cloud data, and determining first pose information of the calibration plate under a laser radar coordinate system according to the position of the target point cloud data set; the first pose information includes a normal vector of the calibration plate in the lidar coordinate system and a position of the set point.
Optionally, performing plane fitting on the target point cloud data set according to the position of the target point cloud data set; and taking the normal vector of the plane and the position of the set point of the calibration plate in the target point cloud data set as first pose information of the calibration plate under the laser radar coordinate system.
Specifically, plane fitting is performed on the target point cloud data set with the random sample consensus (RANSAC) method, as follows: 1) three non-collinear data are drawn at random from the target point cloud data set, and the plane equation through them is solved. 2) The distance from each datum in the target point cloud data set to the plane is calculated; if the distance is within a set threshold (an empirical value, generally 5 cm), the point is judged to be in the plane, and the total number of in-plane point cloud data is counted and recorded. 3) Steps 1 and 2 are repeated a number of times (for example, 50 times); the iteration with the largest total number of in-plane point cloud data is taken as the fitting result, and its in-plane point cloud data are taken as the point cloud data on the calibration plate. The normal vector of the plane is obtained from the fitted plane, and the position of the set point (e.g., the midpoint or a corner point) is selected on the plane.
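A sketch of this RANSAC plane fit, using the 5 cm threshold and 50 iterations mentioned above as defaults; the names are illustrative.

```python
import numpy as np

def ransac_plane_fit(points, dist_threshold=0.05, iterations=50, seed=None):
    """Fit a plane to the target point cloud data set by random sample consensus.

    Returns the unit normal vector of the best plane and its inlier points.
    """
    rng = np.random.default_rng(seed)
    best_normal, best_inliers = None, None
    for _ in range(iterations):
        # Step 1: sample three points and form a plane.
        sample = points[rng.choice(len(points), size=3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-9:     # collinear sample, draw again
            continue
        n = n / np.linalg.norm(n)
        # Step 2: count points within the distance threshold of the plane.
        dists = np.abs((points - sample[0]) @ n)
        inliers = points[dists <= dist_threshold]
        # Step 3: keep the plane with the largest inlier count.
        if best_inliers is None or len(inliers) > len(best_inliers):
            best_normal, best_inliers = n, inliers
    return best_normal, best_inliers
```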
By performing plane fitting on the point cloud data, this embodiment further improves the accuracy of the point cloud data on the calibration plate plane: point clouds outside the plane are eliminated, which improves the accuracy of the normal vector and of the set-point position.
S330, determining second pose information of the calibration plate under the camera coordinate system according to the characteristics of the plane of the calibration plate in the image; the second pose information includes a normal vector of the calibration plate in the camera coordinate system and a position of the set point.
Optionally, calculating a conversion relation between a camera coordinate system and a calibration plate coordinate system according to the characteristics of the calibration plate plane in the image; and calculating the normal vector of the calibration plate and the position of the set point under the camera coordinate system based on the conversion relation, and taking the normal vector and the position of the set point as second pose information.
Optionally, a calibration plate coordinate system is defined according to the actual size of the calibration plate, with the calibration plate plane generally defined as the z = 0 plane, so that the normal vector of the calibration plate under the calibration plate coordinate system can be expressed as N_b = (0, 0, 1)^T. According to the features of the calibration plate plane in the image, combined with the camera internal parameters, the conversion relationship between the camera coordinate system and the calibration plate coordinate system at the time of image capture, namely a rotation matrix and a translation vector, can be calculated, thereby obtaining the normal vector N_c of the calibration plate under the camera coordinate system:
N_c = R_b^c * N_b; (1)
where R_b^c is the rotation matrix from the calibration plate coordinate system to the camera coordinate system.
Optionally, the PnP (Perspective-N-Point) method is used to calibrate the camera: the projection relationship is computed from N feature points on the calibration plate and the corresponding N pixel points in the image, yielding the pose of the camera and hence the conversion relationship between the camera coordinate system and the calibration plate coordinate system.
In this embodiment, the conversion relationship between the camera coordinate system and the calibration plate coordinate system is obtained accurately through computer vision principles; with the calibration plate coordinate system known, accurate second pose information is then obtained by simple coordinate conversion of the normal vector and the set-point position.
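As an illustration, a sketch of this step with OpenCV's PnP solver; the inputs (board_points_3d: feature points on the z = 0 calibration plate plane, corners_2d: their detected pixel locations) are assumed to be available from corner detection, and the names are illustrative.

```python
import cv2
import numpy as np

def board_pose_in_camera(board_points_3d, corners_2d, K, dist_coeffs):
    """Estimate the calibration plate pose in the camera coordinate system.

    board_points_3d: (N, 3) feature points in the calibration plate
                     coordinate system (all on the z = 0 plane).
    corners_2d:      (N, 2) corresponding pixel points in the image.
    """
    ok, rvec, tvec = cv2.solvePnP(board_points_3d, corners_2d, K, dist_coeffs)
    R_bc, _ = cv2.Rodrigues(rvec)                   # board frame -> camera frame
    N_c = R_bc @ np.array([0.0, 0.0, 1.0])          # equation (1): N_c = R_b^c * N_b
    P_c = R_bc @ board_points_3d[0] + tvec.ravel()  # a set point, e.g. one corner
    return N_c, P_c
```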
S340, calculating a rotation matrix between the normal vector of the calibration plate under the laser radar coordinate system and the normal vector of the calibration plate under the camera coordinate system.
Specifically, multiplying the normal vector of the calibration plate under the laser radar coordinate system by the rotation matrix gives the normal vector of the calibration plate under the camera coordinate system. The representations of the calibration plate normal vector in the laser radar coordinate system and in the camera coordinate system therefore satisfy:
N_c(i) = R_l^c * N_l(i); (2)
where N_l(i) is the normal vector of the calibration plate calculated from the i-th group of point cloud data, N_c(i) is the normal vector of the calibration plate calculated from the i-th image, and R_l^c is the rotation matrix between the normal vectors. Assuming i ≤ m, there are m relationships of the form of equation (2), so m estimates of the rotation matrix R_l^c between the coordinate systems can be obtained.
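The patent obtains one estimate per pose and averages them; as one practical realization, the sketch below instead solves all m relationships of equation (2) jointly by SVD (the Kabsch construction), which needs normals from at least two, preferably three, non-parallel board poses. This is an assumption about implementation, not the patent's prescribed procedure.

```python
import numpy as np

def rotation_from_normals(N_l, N_c):
    """Solve N_c(i) = R_l^c * N_l(i) over all m poses in the least-squares sense.

    N_l, N_c: (m, 3) unit normal vectors of the calibration plate in the
    lidar and camera coordinate systems, row i from the i-th pose.
    """
    H = N_l.T @ N_c                                      # 3x3 correlation matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.linalg.det(Vt.T @ U.T)])   # enforce det(R) = +1
    return Vt.T @ D @ U.T                                # R_l^c with N_c ~ R_l^c @ N_l
```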
S350, calculating a translation vector between the position of the setpoint in the laser radar coordinate system and the position of the setpoint in the camera coordinate system based on the rotation matrix.
Specifically, solving equation (3) yields the translation vector t_l^c:
P_c(i) = R_l^c * P_l(i) + t_l^c; (3)
where P_c(i) is the position of the set point under the camera coordinate system obtained from the i-th image, and P_l(i) is the position of the set point under the laser radar coordinate system obtained from the i-th group of point cloud data; m translation vectors t_l^c are obtained by solving correspondingly.
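A sketch of solving equation (3): given R_l^c, each pose yields t_l^c = P_c(i) - R_l^c * P_l(i), and averaging the m per-pose solutions takes one line; names are illustrative.

```python
import numpy as np

def translation_from_set_points(P_l, P_c, R_lc):
    """Solve equation (3) for t_l^c, averaged over the m board poses.

    P_l, P_c: (m, 3) set-point positions in the lidar / camera coordinate systems.
    R_lc:     (3, 3) rotation matrix R_l^c from the previous step.
    """
    return np.mean(P_c - P_l @ R_lc.T, axis=0)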
S360, calibrating the laser radar and the camera according to the rotation matrix and the translation vector.
Optionally, the m rotation matrices R_l^c and the m translation vectors t_l^c are respectively averaged to calibrate the laser radar and the camera.
In this embodiment, by aligning the normal vectors and set-point positions of the calibration plate under the laser radar coordinate system and under the camera coordinate system, the two coordinate systems themselves are aligned; the rotation matrix and translation vector between the coordinate systems are calculated in turn, achieving the purpose of calibration.
Fig. 4 is a flowchart of a fourth laser radar and camera calibration method according to an embodiment of the present application, where the calibration process is optimized based on the above embodiment.
The calibration method of the laser radar and the camera shown in fig. 4 comprises the following steps:
S410, acquiring point cloud data obtained by scanning the calibration plate with the laser radar, and an image of the calibration plate captured by the camera.
S420, extracting a target point cloud data set positioned on the plane of the calibration plate from the point cloud data, and determining first pose information of the calibration plate under a laser radar coordinate system according to the position of the target point cloud data set.
S430, determining second pose information of the calibration plate under the camera coordinate system according to the characteristics of the plane of the calibration plate in the image.
S440, based on the conversion relation and the correction value of the conversion relation to be solved, the target point cloud data set is projected into a calibration plate plane under a camera coordinate system, and a vector formed by any two target point cloud data is obtained.
S450, making the vectors perpendicular to the normal vector of the calibration plate under the camera coordinate system, and solving for the correction value of the conversion relation.
S460, calibrating the laser radar and the camera according to the conversion relation and the correction value.
As described in the above embodiment, the preliminary conversion relationship between the coordinate systems, namely the rotation matrix and the translation vector, can be obtained from the conversion relationship between the first pose information and the second pose information. In consideration of errors introduced by plane fitting, image detection and projection, this embodiment further corrects the preliminarily obtained conversion relationship. The correction values comprise a correction value of the rotation matrix and a correction value of the translation vector; to stay consistent with the quantities they correct, the correction values take the same matrix and vector forms, which is convenient for calculation.
It is assumed that the conversion relation, combined with its correction value, approximates the true conversion relation. Using the conversion relation and the correction value to be solved, almost all of the target point cloud data set can be projected into the calibration plate plane under the camera coordinate system, where vectors within the plane are perpendicular to the calibration plate normal. Based on this analysis, the correction value of the conversion relation can be solved inversely.
Specifically, the target point cloud dataset is projected under the camera coordinate system using equation (4).
v_c(i)=R*R_l^c*v_l(i)+t_l^c+t; (4)
where v_c(i) is the result of projecting the target point cloud data set of the i-th group of point cloud data into the camera coordinate system, v_l(i) is the target point cloud data set of the i-th group of point cloud data, R is the correction value of the rotation matrix, and t is the correction value of the translation vector.
The data v_c(i) projected into the camera coordinate system are three-dimensional; subtracting the origin coordinates of the calibration plate in the camera coordinate system from v_c(i) yields point cloud data that are theoretically located in the calibration plate plane. Any two of these target point cloud data form a vector, so that a plurality of vectors can be obtained.
The dot product of each such vector and the normal vector of the calibration plate should be 0, as in equation (5):
(a(j) · N_c)^2 = cost; (5)
where a(j) is the j-th vector and cost is the squared dot product, which should approach 0. The cost can be minimized using nonlinear optimization theory, starting from the initial values obtained above, to obtain the correction values of the rotation matrix and the translation vector: R and t.
Thus, the final rotation matrix R*R_l^c and translation vector t_l^c + t between the laser radar coordinate system and the camera coordinate system are obtained.
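A sketch of this refinement with SciPy, for a single board pose, assuming the in-plane vectors a(j) are taken from the board origin to each projected point (one reading of the construction above) and parameterizing the rotation correction as a rotation vector; all names are illustrative.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_extrinsics(v_l, N_c, board_origin_c, R_lc, t_lc):
    """Solve for the correction values R and t of equations (4)-(5).

    v_l:            (n, 3) target point cloud data set (one board pose).
    N_c:            (3,) calibration plate normal in the camera frame.
    board_origin_c: (3,) calibration plate origin in the camera frame.
    R_lc, t_lc:     preliminary rotation matrix R_l^c and translation t_l^c.
    """
    def residuals(x):
        R = Rotation.from_rotvec(x[:3]).as_matrix()  # correction rotation R
        t = x[3:]                                    # correction translation t
        v_c = (R @ R_lc @ v_l.T).T + t_lc + t        # equation (4)
        a = v_c - board_origin_c                     # in-plane vectors a(j)
        return a @ N_c                               # dot products driven toward 0
    sol = least_squares(residuals, np.zeros(6))
    R_corr = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    # Final extrinsics: rotation R * R_l^c, translation t_l^c + t.
    return R_corr @ R_lc, t_lc + sol.x[3:]
```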
According to the embodiment of the application, fig. 5 is a block diagram of a calibration device of a laser radar and a camera in the embodiment of the application, and the embodiment of the application is suitable for the case of performing external parameter calibration on the laser radar and the camera by using a calibration plate, and the device is implemented by using software and/or hardware and is specifically configured in an electronic device with certain data computing capability.
A calibration device 500 for a lidar and a camera as shown in fig. 5, comprising: an acquisition module 501, a first pose determination module 502, a second pose determination module 503 and a calibration module 504; wherein,
the acquisition module 501 is used for acquiring point cloud data obtained by scanning the calibration plate with the laser radar and an image of the calibration plate captured by the camera;
the first pose determining module 502 is configured to extract a target point cloud data set located on a plane of the calibration plate from the point cloud data, and determine first pose information of the calibration plate under the laser radar coordinate system according to a position of the target point cloud data set;
a second pose determining module 503, configured to determine second pose information of the calibration plate under the camera coordinate system according to the features of the calibration plate plane in the image;
the calibration module 504 is used for calibrating the laser radar and the camera according to the conversion relation between the first pose information and the second pose information.
According to this embodiment, the target point cloud data set on the calibration plate plane can be extracted from the point cloud data, and the first pose information of the calibration plate under the laser radar coordinate system is then determined automatically from the position of the target point cloud data set, without manual labeling or cropping. The second pose information of the calibration plate under the camera coordinate system can likewise be determined automatically from the features of the calibration plate plane in the image. Since the first pose information and the second pose information reflect the poses of the laser radar and the camera, the two sensors are calibrated according to the conversion relation between them. The whole process calibrates using the pose of the calibration plate plane in the two coordinate systems: the process is simple, the amount of calculation is small, computer resources are saved, and neither three-dimensional reconstruction nor complex point set registration is needed.
Further, the first pose determining module 502 includes an extraction sub-module and a first pose determining sub-module. The first pose determining sub-module is used for determining first pose information of the calibration plate under the laser radar coordinate system according to the position of the target point cloud data set. The extraction sub-module includes: a first extraction unit, used for extracting the target point cloud data set located on the calibration plate plane from the point cloud data by means of a deep learning model; or a second extraction unit, used for projecting the point cloud data into the image coordinate system based on initial external parameters of the laser radar and the camera, and extracting the target point cloud data set located in the calibration plate plane from the point cloud data.
Further, the second extraction unit includes: a projection subunit and an extraction subunit; the projection subunit is used for projecting the point cloud data to an image coordinate system based on initial external parameters of the laser radar and the camera; an extraction subunit for: detecting a calibration plate plane in the image; extracting a target point cloud data set positioned in the plane of the calibration plate from the projected point cloud data; extracting point cloud data with the distance from the target point cloud data set within a set distance threshold from the rest point cloud data, and supplementing the point cloud data to the target point cloud data set; the rest point cloud data are point cloud data except the target point cloud data set in all the point cloud data.
Further, when detecting the calibration plate plane in the image, the extraction subunit is specifically configured to: detect a calibration plate region in the image; and shrink the calibration plate region toward its center to obtain the calibration plate plane.
Further, the first pose information comprises the normal vector of the calibration plate and the position of the set point under the laser radar coordinate system; the second pose information includes a normal vector of the calibration plate in the camera coordinate system and a position of the set point.
Further, the first pose determining module 502 is specifically configured to: performing plane fitting on the target point cloud data set according to the position of the target point cloud data set; and taking the normal vector of the plane and the position of the set point of the calibration plate in the target point cloud data set as first pose information of the calibration plate under the laser radar coordinate system.
Further, the second pose determining module 503 is specifically configured to: according to the characteristics of the calibration plate plane in the image, calculating the conversion relation between the camera coordinate system and the calibration plate coordinate system; and calculating the normal vector of the calibration plate and the position of the set point under the camera coordinate system based on the conversion relation, and taking the normal vector and the position of the set point as second pose information.
Further, the calibration module 504 is specifically configured to: calculating a rotation matrix between the normal vector of the calibration plate under the laser radar coordinate system and the normal vector of the calibration plate under the camera coordinate system; calculating a translation vector between the position of the setpoint in the lidar coordinate system and the position of the setpoint in the camera coordinate system based on the rotation matrix; and calibrating the laser radar and the camera according to the rotation matrix and the translation vector.
Further, the calibration module 504 is specifically configured to: based on the conversion relation and a correction value of the conversion relation to be solved, projecting the target point cloud data set into a calibration plate plane under a camera coordinate system to obtain a vector formed by any two target point cloud data; the vector is perpendicular to the normal vector of the calibration plate under the camera coordinate system, and the correction value of the conversion relation is solved; and calibrating the laser radar and the camera according to the conversion relation and the correction value.
The laser radar and camera calibration device can execute the laser radar and camera calibration method provided by any embodiment of the application, and has the corresponding functional modules and beneficial effects of executing the laser radar and camera calibration method.
According to embodiments of the present application, an electronic device and a readable storage medium are also provided.
Fig. 6 is a block diagram of an electronic device implementing a method for calibrating a lidar and a camera according to an embodiment of the present application. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the application described and/or claimed herein.
As shown in fig. 6, the electronic device includes: one or more processors 601, a memory 602, and interfaces for connecting the components, including high-speed interfaces and low-speed interfaces. The components are interconnected by different buses and may be mounted on a common motherboard or in other manners as desired. The processor may process instructions executed within the electronic device, including instructions stored in or on the memory to display graphical information of a GUI on an external input/output device, such as a display device coupled to the interface. In other embodiments, multiple processors and/or multiple buses may be used, if desired, along with multiple memories. Likewise, multiple electronic devices may be connected, with each device providing a portion of the necessary operations (e.g., as a server array, a set of blade servers, or a multiprocessor system). One processor 601 is illustrated in fig. 6.
Memory 602 is a non-transitory computer-readable storage medium provided herein. The memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for calibrating the lidar and the camera provided by the application. The non-transitory computer readable storage medium of the present application stores computer instructions for causing a computer to execute the laser radar and camera calibration method provided by the present application.
The memory 602 is used as a non-transitory computer readable storage medium, and may be used to store a non-transitory software program, a non-transitory computer executable program, and modules, such as program instructions/modules (e.g., including the acquisition module 501, the first pose determining module 502, the second pose determining module 503, and the calibration module 504 shown in fig. 5) corresponding to the calibration method of the laser radar and the camera in the embodiments of the present application. The processor 601 executes various functional applications of the server and data processing by running non-transitory software programs, instructions and modules stored in the memory 602, that is, implements the laser radar and camera calibration method in the above method embodiment.
The memory 602 may include a storage program area and a storage data area, wherein the storage program area may store an operating system, at least one application program required for a function; the storage data area may store data created by the use of an electronic device implementing a method of calibrating a lidar with a camera, etc. In addition, the memory 602 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, memory 602 may optionally include memory remotely located relative to processor 601, which may be connected via a network to an electronic device performing the laser radar and camera calibration method. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The electronic device that performs the calibration method of the lidar and the camera may further include: an input device 603 and an output device 604. The processor 601, memory 602, input device 603 and output device 604 may be connected by a bus or otherwise, for example in fig. 6.
The input device 603 may receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device performing the laser radar and camera calibration method, such as a touch screen, keypad, mouse, trackpad, touch pad, pointer stick, one or more mouse buttons, trackball, joystick, etc. input devices. The output means 604 may include a display device, auxiliary lighting means (e.g., LEDs), tactile feedback means (e.g., vibration motors), and the like. The display device may include, but is not limited to, a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display, and a plasma display. In some implementations, the display device may be a touch screen.
Various implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs executable and/or interpretable on a programmable system including at least one programmable processor, which may be special purpose or general purpose, and which can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
These computing programs (also referred to as programs, software applications, or code) include machine instructions for a programmable processor, and may be implemented in a high-level procedural and/or object-oriented programming language, and/or in assembly/machine language. As used herein, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), the internet, and blockchain networks.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or cloud host, a host product in the cloud computing service system that overcomes the defects of high management difficulty and weak service expansibility in traditional physical hosts and VPS services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present application may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions disclosed in the present application can be achieved, and are not limited herein.
The above embodiments do not limit the scope of the application. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present application are intended to be included within the scope of the present application.
Claims (16)
1. A calibration method of a laser radar and a camera comprises the following steps:
acquiring point cloud data obtained by scanning a calibration plate with a laser radar, and an image of the calibration plate captured by a camera;
extracting a target point cloud data set located on the calibration plate plane from the point cloud data, and determining first pose information of the calibration plate in a laser radar coordinate system according to the position of the target point cloud data set;
determining second pose information of the calibration plate in a camera coordinate system according to features of the calibration plate plane in the image; and
calibrating the laser radar and the camera according to a conversion relation between the first pose information and the second pose information;
wherein the extracting of the target point cloud data set located on the calibration plate plane from the point cloud data comprises:
projecting the point cloud data into an image coordinate system based on initial external parameters of the laser radar and the camera; detecting the calibration plate plane in the image; extracting, from the projected point cloud data, a target point cloud data set located within the calibration plate plane; and extracting, from the remaining point cloud data, point cloud data whose distance from the target point cloud data set is within a set distance threshold, and supplementing the extracted point cloud data to the target point cloud data set, the remaining point cloud data being the point cloud data other than the target point cloud data set among all the point cloud data;
wherein the detecting of the calibration plate plane in the image comprises: detecting a calibration plate region in the image; and shrinking the calibration plate region toward its center to obtain the calibration plate plane.
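For illustration, a minimal numpy/scipy sketch of one way to implement the extraction step of claim 1, assuming the initial external parameters `R_init`, `t_init`, camera intrinsics `K`, and a detected board polygon `board_poly_px` are given; all names are hypothetical, and the inside-plane test is simplified to a bounding-box check on the shrunken region:

```python
import numpy as np
from scipy.spatial import cKDTree

def extract_board_points(points_lidar, R_init, t_init, K,
                         board_poly_px, shrink=0.8, dist_thresh=0.05):
    """Select the lidar points lying on the calibration plate plane."""
    # Shrink the detected board region toward its center so that edge
    # pixels (and points mis-projected near the edge) are excluded.
    center = board_poly_px.mean(axis=0)
    lo = center + shrink * (board_poly_px.min(axis=0) - center)
    hi = center + shrink * (board_poly_px.max(axis=0) - center)

    # Project all lidar points into the image with the initial extrinsics.
    pts_cam = points_lidar @ R_init.T + t_init
    in_front = pts_cam[:, 2] > 0
    uv = (pts_cam @ K.T)[:, :2] / pts_cam[:, 2:3]

    # Target set: points whose projection lands in the shrunken region.
    inside = in_front & np.all((uv >= lo) & (uv <= hi), axis=1)
    target, rest = points_lidar[inside], points_lidar[~inside]

    # Supplement with remaining points within dist_thresh (in meters) of
    # the target set, recovering board points removed by the shrinking.
    if len(target) and len(rest):
        d, _ = cKDTree(target).query(rest)
        target = np.vstack([target, rest[d < dist_thresh]])
    return target
```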
2. The method of claim 1, wherein the extracting of the target point cloud data set located on the calibration plate plane from the point cloud data further comprises:
extracting, by using a deep learning model, the target point cloud data set located on the calibration plate plane from the point cloud data.
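The claim leaves the deep learning model unspecified; as a stand-in, the toy per-point classifier below (PyTorch, untrained, purely hypothetical) only shows the intended dataflow — score every lidar point and keep the points classified as belonging to the calibration plate. A real system would use a point cloud segmentation network trained on labeled board scans.

```python
import torch
import torch.nn as nn

class BoardPointClassifier(nn.Module):
    """Toy per-point classifier; a placeholder for the trained model."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(4, 64), nn.ReLU(),  # per-point features: x, y, z, intensity
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, pts):               # pts: (N, 4) float tensor
        return torch.sigmoid(self.net(pts)).squeeze(-1)

def extract_with_model(model, points):    # points: (N, 4) numpy array
    """Keep the points the model classifies as calibration plate points."""
    with torch.no_grad():
        scores = model(torch.from_numpy(points).float())
    return points[(scores > 0.5).numpy()]
```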
3. The method of claim 1, wherein,
the first pose information comprises a normal vector of the calibration plate and a position of a set point in the laser radar coordinate system; and
the second pose information comprises a normal vector of the calibration plate and a position of the set point in the camera coordinate system.
4. The method according to claim 3, wherein the determining of the first pose information of the calibration plate in the laser radar coordinate system according to the position of the target point cloud data set comprises:
performing plane fitting on the target point cloud data set according to the position of the target point cloud data set; and
taking a normal vector of the fitted plane and the position of the set point of the calibration plate in the target point cloud data set as the first pose information of the calibration plate in the laser radar coordinate system.
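A sketch of the plane fit of claim 4 via SVD; the centroid of the target set is used here as the "set point", which is only one reasonable reading since the claim leaves the choice of set point open:

```python
import numpy as np

def fit_plane(target_points):
    """Least-squares plane fit; returns (unit normal, centroid)."""
    centroid = target_points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered points is the direction of least variance, i.e. the normal.
    _, _, vt = np.linalg.svd(target_points - centroid)
    normal = vt[-1]
    # Orient the normal toward the lidar at the origin for consistency.
    if normal @ centroid > 0:
        normal = -normal
    return normal, centroid
```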
5. The method according to claim 3, wherein the determining of the second pose information of the calibration plate in the camera coordinate system according to the features of the calibration plate plane in the image comprises:
calculating a conversion relation between the camera coordinate system and a calibration plate coordinate system according to the features of the calibration plate plane in the image; and
calculating, based on the conversion relation, the normal vector of the calibration plate and the position of the set point in the camera coordinate system, and taking them as the second pose information.
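One common realization of claim 5 is a PnP solve on detected board features (e.g. checkerboard corners). A hedged OpenCV sketch, assuming the board frame's z-axis is its normal and its origin is the set point; `obj_pts`, `img_pts`, `K`, and `dist` are all assumed given:

```python
import cv2
import numpy as np

def board_pose_in_camera(obj_pts, img_pts, K, dist):
    """Board pose in the camera frame from 2D-3D feature correspondences."""
    ok, rvec, tvec = cv2.solvePnP(obj_pts.astype(np.float64),
                                  img_pts.astype(np.float64), K, dist)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)    # rotation: board frame -> camera frame
    normal = R[:, 2]              # board z-axis (assumed normal) in camera frame
    set_point = tvec.ravel()      # board origin (assumed set point) in camera frame
    return normal, set_point
```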
6. The method according to any one of claims 3-5, wherein the calibrating of the laser radar and the camera according to the conversion relation between the first pose information and the second pose information comprises:
calculating a rotation matrix between the normal vector of the calibration plate in the laser radar coordinate system and the normal vector of the calibration plate in the camera coordinate system;
calculating, based on the rotation matrix, a translation vector between the position of the set point in the laser radar coordinate system and the position of the set point in the camera coordinate system; and
calibrating the laser radar and the camera according to the rotation matrix and the translation vector.
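Under one standard reading of claim 6, normal-vector pairs collected from several board poses fix the rotation (a Kabsch/SVD alignment), after which the translation follows from the set-point pairs by least squares. A sketch assuming at least three non-parallel board orientations; names are illustrative:

```python
import numpy as np

def solve_extrinsics(normals_lidar, normals_cam, pts_lidar, pts_cam):
    """Solve R, t mapping lidar coordinates to camera coordinates."""
    N_l = np.asarray(normals_lidar)   # (K, 3), one row per board pose
    N_c = np.asarray(normals_cam)     # (K, 3)
    # Kabsch: rotation best aligning the lidar normals with the camera normals.
    H = N_l.T @ N_c
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    # With R fixed, p_cam = R p_lidar + t gives t by averaging the residuals.
    t = np.mean(np.asarray(pts_cam) - np.asarray(pts_lidar) @ R.T, axis=0)
    return R, t
```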
7. The method according to any one of claims 1-5, wherein the calibrating of the laser radar and the camera according to the conversion relation between the first pose information and the second pose information comprises:
projecting, based on the conversion relation and a to-be-solved correction value of the conversion relation, the target point cloud data set onto the calibration plate plane in the camera coordinate system, to obtain a vector formed by any two projected target point cloud data points;
solving for the correction value of the conversion relation under the constraint that the vector is perpendicular to the normal vector of the calibration plate in the camera coordinate system; and
calibrating the laser radar and the camera according to the conversion relation and the correction value.
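Claim 7 can be read as a refinement step: after mapping the target points into the camera frame, the vector between any two of them must be perpendicular to the board normal. A scipy sketch of that reading, using consecutive point pairs as a cheap subset of "any two"; note that a translation offset cancels in point differences, so only a rotation correction is observable from this constraint alone:

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def refine_rotation(R0, board_points_lidar, normal_cam):
    """Refine R0 so in-plane vectors become perpendicular to the normal."""
    P = np.asarray(board_points_lidar)

    def residuals(rotvec):
        dR = Rotation.from_rotvec(rotvec).as_matrix()  # correction rotation
        V = np.diff(P @ (dR @ R0).T, axis=0)           # vectors between mapped points
        return V @ normal_cam                          # zero when perpendicular

    rv = least_squares(residuals, np.zeros(3)).x
    return Rotation.from_rotvec(rv).as_matrix() @ R0
```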
8. A calibration device for a laser radar and a camera, comprising:
an acquisition module, configured to acquire point cloud data obtained by a laser radar scanning a calibration plate and an image of the calibration plate captured by a camera;
a first pose determining module, configured to extract a target point cloud data set located on the calibration plate plane from the point cloud data and determine first pose information of the calibration plate in a laser radar coordinate system according to the position of the target point cloud data set;
a second pose determining module, configured to determine second pose information of the calibration plate in a camera coordinate system according to features of the calibration plate plane in the image; and
a calibration module, configured to calibrate the laser radar and the camera according to a conversion relation between the first pose information and the second pose information;
wherein the first pose determining module comprises a projection subunit and an extraction subunit;
the projection subunit is configured to project the point cloud data into an image coordinate system based on initial external parameters of the laser radar and the camera;
the extraction subunit is configured to: detect the calibration plate plane in the image; extract, from the projected point cloud data, a target point cloud data set located within the calibration plate plane; and extract, from the remaining point cloud data, point cloud data whose distance from the target point cloud data set is within a set distance threshold, and supplement the extracted point cloud data to the target point cloud data set, the remaining point cloud data being the point cloud data other than the target point cloud data set among all the point cloud data; and
when detecting the calibration plate plane in the image, the extraction subunit is specifically configured to: detect a calibration plate region in the image; and shrink the calibration plate region toward its center to obtain the calibration plate plane.
9. The device of claim 8, wherein the first pose determining module comprises an extraction submodule and a first pose determining submodule;
the first pose determining submodule is configured to determine the first pose information of the calibration plate in the laser radar coordinate system according to the position of the target point cloud data set; and
the extraction submodule further comprises:
a first extraction unit, configured to extract, by using a deep learning model, the target point cloud data set located on the calibration plate plane from the point cloud data.
10. The device of claim 8, wherein
the first pose information comprises a normal vector of the calibration plate and a position of a set point in the laser radar coordinate system; and
the second pose information comprises a normal vector of the calibration plate and a position of the set point in the camera coordinate system.
11. The device of claim 10, wherein the first pose determining module is specifically configured to:
perform plane fitting on the target point cloud data set according to the position of the target point cloud data set; and
take a normal vector of the fitted plane and the position of the set point of the calibration plate in the target point cloud data set as the first pose information of the calibration plate in the laser radar coordinate system.
12. The device of claim 10, wherein the second pose determining module is specifically configured to:
calculate a conversion relation between the camera coordinate system and a calibration plate coordinate system according to the features of the calibration plate plane in the image; and
calculate, based on the conversion relation, the normal vector of the calibration plate and the position of the set point in the camera coordinate system, and take them as the second pose information.
13. The device according to any one of claims 10-12, wherein the calibration module is specifically configured to:
calculate a rotation matrix between the normal vector of the calibration plate in the laser radar coordinate system and the normal vector of the calibration plate in the camera coordinate system;
calculate, based on the rotation matrix, a translation vector between the position of the set point in the laser radar coordinate system and the position of the set point in the camera coordinate system; and
calibrate the laser radar and the camera according to the rotation matrix and the translation vector.
14. The device according to any one of claims 8-12, wherein the calibration module is specifically configured to:
project, based on the conversion relation and a to-be-solved correction value of the conversion relation, the target point cloud data set onto the calibration plate plane in the camera coordinate system, to obtain a vector formed by any two projected target point cloud data points;
solve for the correction value of the conversion relation under the constraint that the vector is perpendicular to the normal vector of the calibration plate in the camera coordinate system; and
calibrate the laser radar and the camera according to the conversion relation and the correction value.
15. An electronic device, comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the calibration method of a laser radar and a camera according to any one of claims 1-7.
16. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to perform the calibration method of a laser radar and a camera according to any one of claims 1-7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010784817.8A CN111965624B (en) | 2020-08-06 | 2020-08-06 | Laser radar and camera calibration method, device, equipment and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111965624A CN111965624A (en) | 2020-11-20 |
CN111965624B true CN111965624B (en) | 2024-04-09 |
Family
ID=73364647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010784817.8A Active CN111965624B (en) | 2020-08-06 | 2020-08-06 | Laser radar and camera calibration method, device, equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111965624B (en) |
Families Citing this family (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112180348B (en) * | 2020-11-27 | 2021-03-02 | 深兰人工智能(深圳)有限公司 | Attitude calibration method and device for vehicle-mounted multi-line laser radar |
CN112509057B (en) * | 2020-11-30 | 2024-04-12 | 北京百度网讯科技有限公司 | Camera external parameter calibration method, device, electronic equipment and computer readable medium |
CN112562009A (en) * | 2020-12-03 | 2021-03-26 | 深圳宇磐科技有限公司 | Method and system for automatically calibrating camera equipment parameters and installation attitude parameters |
CN112258590B (en) * | 2020-12-08 | 2021-04-27 | 杭州迦智科技有限公司 | Laser-based depth camera external parameter calibration method, device and storage medium thereof |
CN112462350B (en) * | 2020-12-10 | 2023-04-04 | 苏州一径科技有限公司 | Radar calibration method and device, electronic equipment and storage medium |
CN112766302B (en) * | 2020-12-17 | 2024-03-29 | 浙江大华技术股份有限公司 | Image fusion method and device, storage medium and electronic device |
CN112509062B (en) * | 2020-12-17 | 2023-09-12 | 广东工业大学 | Calibration plate, calibration system and calibration method |
CN112446927B (en) * | 2020-12-18 | 2024-08-30 | 广东电网有限责任公司 | Laser radar and camera combined calibration method, device, equipment and storage medium |
CN112710235B (en) * | 2020-12-21 | 2022-08-26 | 阿波罗智联(北京)科技有限公司 | Calibration method and device of structured light measuring sensor |
CN112581542B (en) * | 2020-12-24 | 2024-07-19 | 阿波罗智联(北京)科技有限公司 | Evaluation method, device and equipment for monocular calibration algorithm of automatic driving |
CN114693770A (en) * | 2020-12-31 | 2022-07-01 | 北京小米移动软件有限公司 | Calibration method and device |
CN112631431B (en) * | 2021-01-04 | 2023-06-16 | 杭州光粒科技有限公司 | Method, device and equipment for determining pose of AR (augmented reality) glasses and storage medium |
CN112802124B (en) * | 2021-01-29 | 2023-10-31 | 北京罗克维尔斯科技有限公司 | Calibration method and device for multiple stereo cameras, electronic equipment and storage medium |
CN112964276B (en) * | 2021-02-09 | 2022-08-05 | 中国科学院深圳先进技术研究院 | Online calibration method based on laser and vision fusion |
CN113034593B (en) * | 2021-03-09 | 2023-12-12 | 深圳市广宁股份有限公司 | 6D pose labeling method, system and storage medium |
CN113256729B (en) * | 2021-03-17 | 2024-06-18 | 广西综合交通大数据研究院 | External parameter calibration method, device and equipment for laser radar and camera and storage medium |
CN112967347B (en) * | 2021-03-30 | 2023-12-15 | 深圳市优必选科技股份有限公司 | Pose calibration method, pose calibration device, robot and computer readable storage medium |
CN113077523B (en) * | 2021-03-31 | 2023-11-24 | 商汤集团有限公司 | Calibration method, calibration device, computer equipment and storage medium |
CN113160328B (en) * | 2021-04-09 | 2024-09-13 | 上海智蕙林医疗科技有限公司 | External parameter calibration method, system, robot and storage medium |
CN113325434A (en) * | 2021-04-16 | 2021-08-31 | 盎锐(上海)信息科技有限公司 | Explosion point display method and system for actual measurement actual quantity and laser radar |
CN113138375B (en) * | 2021-04-27 | 2022-11-29 | 北京理工大学 | Combined calibration method |
CN113176557B (en) * | 2021-04-29 | 2023-03-24 | 中国科学院自动化研究所 | Virtual laser radar online simulation method based on projection |
CN115327512A (en) * | 2021-05-10 | 2022-11-11 | 北京万集科技股份有限公司 | Calibration method, device, server and storage medium for laser radar and camera |
CN113269840B (en) * | 2021-05-27 | 2024-07-09 | 深圳一清创新科技有限公司 | Combined calibration method for camera and multi-laser radar and electronic equipment |
CN115598624B (en) * | 2021-06-28 | 2023-12-12 | 苏州一径科技有限公司 | Laser radar calibration method, device and equipment |
CN113436277A (en) * | 2021-07-15 | 2021-09-24 | 无锡先导智能装备股份有限公司 | 3D camera calibration method, device and system |
CN113436278A (en) * | 2021-07-22 | 2021-09-24 | 深圳市道通智能汽车有限公司 | Calibration method, calibration device, distance measurement system and computer readable storage medium |
CN113848541B (en) * | 2021-09-22 | 2022-08-26 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN114387347B (en) * | 2021-10-26 | 2023-09-19 | 浙江视觉智能创新中心有限公司 | Method, device, electronic equipment and medium for determining external parameter calibration |
CN114332240B (en) * | 2021-12-23 | 2024-10-18 | 赛可智能科技(上海)有限公司 | Multi-sensor combined calibration method and calibration device |
CN114758005B (en) * | 2022-03-23 | 2023-03-28 | 中国科学院自动化研究所 | Laser radar and camera external parameter calibration method and device |
CN114862961B (en) * | 2022-04-13 | 2024-06-07 | 上海人工智能创新中心 | Position detection method and device for calibration plate, electronic equipment and readable storage medium |
CN115100287B (en) * | 2022-04-14 | 2024-09-03 | 美的集团(上海)有限公司 | External parameter calibration method and robot |
CN114511626B (en) * | 2022-04-20 | 2022-08-05 | 杭州灵西机器人智能科技有限公司 | Image processing device, method, device and medium based on RGBD camera system |
CN115049810A (en) * | 2022-06-10 | 2022-09-13 | 始途科技(杭州)有限公司 | Coloring method, device and equipment for solid-state laser radar point cloud and storage medium |
CN114782556B (en) * | 2022-06-20 | 2022-09-09 | 季华实验室 | Camera and laser radar registration method and system and storage medium |
CN115018935B (en) * | 2022-08-09 | 2022-10-18 | 季华实验室 | Calibration method and device for camera and vehicle, electronic equipment and storage medium |
CN115222826B (en) * | 2022-09-15 | 2022-12-27 | 深圳大学 | Three-dimensional reconstruction method and device with changeable relative poses of structured light and camera |
CN115909272A (en) * | 2022-11-09 | 2023-04-04 | 杭州枕石智能科技有限公司 | Method for acquiring obstacle position information, terminal device and computer medium |
CN115868961A (en) * | 2022-11-28 | 2023-03-31 | 杭州柳叶刀机器人有限公司 | Probe tip position calibration device and method and electronic equipment |
CN115932879B (en) * | 2022-12-16 | 2023-10-10 | 哈尔滨智兀科技有限公司 | Mine robot gesture rapid measurement system based on laser point cloud |
CN116091619A (en) * | 2022-12-27 | 2023-05-09 | 北京纳通医用机器人科技有限公司 | Calibration method, device, equipment and medium |
CN115877401B (en) * | 2023-02-07 | 2023-11-10 | 南京北路智控科技股份有限公司 | Posture detection method, device and equipment for hydraulic support and storage medium |
CN115856849B (en) * | 2023-02-28 | 2023-05-05 | 季华实验室 | Depth camera and 2D laser radar calibration method and related equipment |
CN115953484B (en) * | 2023-03-13 | 2023-07-04 | 福思(杭州)智能科技有限公司 | Parameter calibration method and device of detection equipment, storage medium and electronic device |
CN118463981B (en) * | 2024-07-09 | 2024-09-27 | 中联重科股份有限公司 | Pose calibration method for curtain wall and image acquisition equipment and curtain wall installation method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109297510B (en) * | 2018-09-27 | 2021-01-01 | 百度在线网络技术(北京)有限公司 | Relative pose calibration method, device, equipment and medium |
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3438777A1 (en) * | 2017-08-04 | 2019-02-06 | Bayerische Motoren Werke Aktiengesellschaft | Method, apparatus and computer program for a vehicle |
CN111308448A (en) * | 2018-12-10 | 2020-06-19 | 杭州海康威视数字技术股份有限公司 | Image acquisition equipment and radar external parameter determination method and device |
KR102029850B1 (en) * | 2019-03-28 | 2019-10-08 | 세종대학교 산학협력단 | Object detecting apparatus using camera and lidar sensor and method thereof |
CN109920011A (en) * | 2019-05-16 | 2019-06-21 | 长沙智能驾驶研究院有限公司 | Outer ginseng scaling method, device and the equipment of laser radar and binocular camera |
CN110161485A (en) * | 2019-06-13 | 2019-08-23 | 同济大学 | A kind of outer ginseng caliberating device and scaling method of laser radar and vision camera |
US10726579B1 (en) * | 2019-11-13 | 2020-07-28 | Honda Motor Co., Ltd. | LiDAR-camera calibration |
CN111127563A (en) * | 2019-12-18 | 2020-05-08 | 北京万集科技股份有限公司 | Combined calibration method and device, electronic equipment and storage medium |
CN111179358A (en) * | 2019-12-30 | 2020-05-19 | 浙江商汤科技开发有限公司 | Calibration method, device, equipment and storage medium |
CN111429521A (en) * | 2020-03-05 | 2020-07-17 | 深圳市镭神智能系统有限公司 | External parameter calibration method, device, medium and electronic equipment for camera and laser radar |
Non-Patent Citations (1)
Title |
---|
Research on a joint calibration method for laser radar and camera based on point cloud centers; Kang Guohua et al.; Chinese Journal of Scientific Instrument (Issue 12); full text *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111965624B (en) | Laser radar and camera calibration method, device, equipment and readable storage medium | |
US20210312209A1 (en) | Vehicle information detection method, electronic device and storage medium | |
US11875535B2 (en) | Method, apparatus, electronic device and computer readable medium for calibrating external parameter of camera | |
CN111612852B (en) | Method and apparatus for verifying camera parameters | |
CN110443205B (en) | Hand image segmentation method and device | |
US11587332B2 (en) | Method, apparatus, system, and storage medium for calibrating exterior parameter of on-board camera | |
US20210209792A1 (en) | Positioning Method, Electronic Device and Computer Readable Storage Medium | |
CN111767853B (en) | Lane line detection method and device | |
US20210240971A1 (en) | Data processing method and apparatus, electronic device and storage medium | |
CN112184828B (en) | Laser radar and camera external parameter calibration method and device and automatic driving vehicle | |
CN112184837B (en) | Image detection method and device, electronic equipment and storage medium | |
CN112967344B (en) | Method, device, storage medium and program product for calibrating camera external parameters | |
US20220358679A1 (en) | Parameter Calibration Method and Apparatus | |
US20160275359A1 (en) | Information processing apparatus, information processing method, and computer readable medium storing a program | |
CN112241716B (en) | Training sample generation method and device | |
US11557062B2 (en) | Method and apparatus for processing video frame | |
CN111652103B (en) | Indoor positioning method, device, equipment and storage medium | |
CN111080640B (en) | Hole detection method, device, equipment and medium | |
US10386930B2 (en) | Depth determining method and depth determining device of operating body | |
CN111191619A (en) | Method, device and equipment for detecting virtual line segment of lane line and readable storage medium | |
CN110619664B (en) | Laser pattern-assisted camera distance posture calculation method and server | |
CN117152270A (en) | Laser radar and camera combined calibration method, device, equipment and medium | |
CN117078767A (en) | Laser radar and camera calibration method and device, electronic equipment and storage medium | |
KR20200080409A (en) | Apparatus for analyzing microstructure | |
CN112651983B (en) | Splice graph identification method and device, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2021-10-18 | TA01 | Transfer of patent application right | Applicant after: Apollo Intelligent Connectivity (Beijing) Technology Co., Ltd., 101, floor 1, building 1, yard 7, Ruihe West 2nd Road, Beijing Economic and Technological Development Zone, Daxing District, Beijing 100176; applicant before: BEIJING BAIDU NETCOM SCIENCE AND TECHNOLOGY Co., Ltd., 2/F, Baidu Building, 10 Shangdi 10th Street, Haidian District, Beijing 100085 |
| GR01 | Patent grant | |