CN112270713A - Calibration method and device, storage medium and electronic device - Google Patents


Info

Publication number
CN112270713A
CN112270713A (application number CN202011094343.0A)
Authority
CN
China
Prior art keywords
calibration plate
point cloud
calibration
sensor
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011094343.0A
Other languages
Chinese (zh)
Inventor
欧阳真超
崔家赫
何云翔
朱进文
牛建伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Innovation Research Institute of Beihang University
Original Assignee
Hangzhou Innovation Research Institute of Beihang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Innovation Research Institute of Beihang University filed Critical Hangzhou Innovation Research Institute of Beihang University
Priority to CN202011094343.0A priority Critical patent/CN112270713A/en
Publication of CN112270713A publication Critical patent/CN112270713A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30204 Marker
    • G06T2207/30208 Marker matrix

Abstract

Embodiments of the present application provide a calibration method and apparatus, a storage medium, and an electronic device, wherein the method includes: collecting point cloud data of the solid-state laser radar to obtain the region where the calibration plate is located; extracting the calibration plate point cloud from that region; fitting, by nonlinear optimization, first corner position coordinates of the calibration plate, wherein the first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; obtaining an external parameter transformation matrix of the sensor according to second corner position coordinates of the calibration plate and the first corner position coordinates, wherein the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate; and calibrating in the sensor through the external parameter transformation matrix. With the scheme of the present application, calibration accuracy is improved and manual intervention is reduced.

Description

Calibration method and device, storage medium and electronic device
Technical Field
The present application relates to automatic driving technology, and in particular to a calibration method and apparatus, a storage medium, and an electronic apparatus.
Background
Sensors mounted at different positions on the vehicle body acquire information about roads, pedestrians, and passing vehicles in an open environment, and this information is used by the automatic driving system.
The perception scheme of an autonomous vehicle in the related art can simultaneously obtain visual information of the real world and three-dimensional spatial distance information. However, the two-dimensional planar imaging result of the camera and the sparse three-dimensional lidar point cloud must be registered, and the calibration accuracy is affected by many factors.
No effective solution has yet been proposed for the problem that calibration schemes for perception tasks in automatic driving systems in the related art perform poorly.
Disclosure of Invention
The embodiment of the application provides a calibration method and device for a solid-state laser radar-camera multi-sensor system, a storage medium and an electronic device, and aims to at least solve the problem that the effect of a calibration scheme of a perception task in an automatic driving system in the related art is poor.
According to a first aspect of embodiments of the present application, there is provided a calibration method, including: collecting point cloud data of the solid-state laser radar to obtain the region where the calibration plate is located; extracting the calibration plate from the region; solving, from the calibration plate, the coordinates of the calibration plate plane fitted according to the point cloud spatial distribution; determining first corner position coordinates of the calibration plate from the plane and those coordinates; and obtaining an external parameter transformation matrix of the sensor according to second corner position coordinates of the calibration plate and the first corner position coordinates. The first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate. Calibration in the sensor is then performed through the external parameter transformation matrix.
Optionally, after calibration in the sensor by the external reference transformation matrix, the method further includes: projecting radar point cloud data in the sensor to an imaging plane of a camera in the sensor; and/or mapping visible light image data in the sensor to three-dimensional space.
Optionally, the obtaining of the calibration plate according to the region extraction includes: calculating the probability density distribution of the Z-axis height of the point cloud in the region; converting the discrete probability density of the probability density distribution into a histogram; calculating a gradient between two adjacent bins in the histogram; and segmenting and extracting according to the gradient calculation result to obtain the calibration plate.
Optionally, the obtaining, according to the solution of the calibration plate, a fitted coordinate of the plane of the calibration plate according to the point cloud spatial distribution includes: carrying out plane fitting on the point cloud data of the calibration plate based on a plane segmentation method of random sampling consistency to obtain a fitting plane of the calibration plate; and determining the coordinates of the plane of the calibration plate according to the point cloud space distribution on the fitting plane of the calibration plate.
Optionally, the determining of the first corner position coordinates of the calibration plate according to the plane of the calibration plate and the coordinates of the point cloud spatial distribution includes: generating a preset calibration plate in a two-dimensional plane according to the actual size parameters of the calibration plate, namely its width, height, and grid size; and solving the three-dimensional pose difference between the actual point cloud data and the preset calibration plate so that the spatial distribution of the reflectivity of the actual point cloud data is consistent with that of the preset calibration plate.
Optionally, the acquiring point cloud data of the sensor, and determining the area where the calibration plate is located includes: filtering the collected point cloud data by utilizing statistical outliers, and overlapping the point cloud data in continuous time on a space; and marking the area of the initial position of the calibration plate.
Optionally, the obtaining of an external parameter transformation matrix of the sensor according to the second corner position coordinates and the first corner position coordinates of the calibration plate includes: the sensor includes the camera and the solid-state laser radar; acquiring the second corner points and the first corner points corresponding to the image-point cloud pairs in the calibration data of the calibration plate; and solving the PnP problem with a random sample consensus method according to the camera intrinsic parameters, the second corner points, and the first corner points to obtain the calibration extrinsic parameters between the camera and the solid-state laser radar.
According to a second aspect of the embodiments of the present application, there is provided a calibration apparatus including: an acquisition module for acquiring sensor point cloud data and obtaining the region where the calibration plate is located; a segmentation and extraction module for extracting the calibration plate from the region; a fitting module for solving, from the calibration plate, the coordinates of the calibration plate plane fitted according to the point cloud spatial distribution; a corner position determining module for determining first corner position coordinates of the calibration plate according to the plane of the calibration plate and the coordinates of the point cloud spatial distribution, wherein the first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud obtained by the solid-state laser radar in the sensor measuring the calibration plate; and an external parameter matrix solving module for obtaining an external parameter transformation matrix of the sensor according to second corner position coordinates of the calibration plate and the first corner position coordinates, wherein the second corner position coordinates are the two-dimensional inner corner coordinates of the image obtained by the camera in the sensor photographing the calibration plate, the external parameter matrix solving module being further used for calibrating the sensor through the solved external parameter transformation matrix.
According to a third aspect of embodiments of the present application, there is provided a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the method when executed.
According to a fourth aspect of embodiments of the present application, there is provided an electronic apparatus, comprising a memory and a processor, wherein the memory stores a computer program, and the processor is configured to execute the computer program to perform the method.
According to the calibration method and apparatus, the storage medium, and the electronic device provided by the embodiments of the present application, point clouds are first integrated over the time domain to densify the local static point cloud and obtain rich target point cloud data, and the three-dimensional coordinates of the calibration plate corner points are obtained based on region detection, plane fitting, and quasi-Newton optimization. Then, the two-dimensional coordinates of the calibration plate corner points in the image are acquired by combining camera intrinsic calibration and corner detection. Finally, the three-dimensional point cloud data to two-dimensional image projection matrix is optimized through a random sample consensus algorithm, yielding the extrinsic projection result between the different sensors. Calibration accuracy is thereby improved and manual intervention reduced.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
FIG. 1 is a schematic flow chart of a calibration method in an embodiment of the present application;
FIG. 2 is a schematic structural diagram of a calibration device in an embodiment of the present application;
fig. 3 is a schematic flow chart of detecting an angular point of a checkerboard calibration board in a 3D point cloud in the embodiment of the present application;
fig. 4 is a schematic view of a flow chart of detecting corner points of a checkerboard calibration board in a picture in the embodiment of the present application;
FIG. 5 is a schematic flowchart of computing a radar-camera extrinsic parameter matrix according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a radar point cloud top view in an embodiment of the present application;
FIG. 7 is a schematic illustration of the area of a checkerboard calibration board in an embodiment of the present application in a point cloud;
FIG. 8 is a schematic diagram of a reticle segmented from regions of a tessellated point cloud in an embodiment of the present application;
fig. 9 is a schematic diagram of detecting 2D corner points in an image corresponding to a point cloud in an embodiment of the present application;
FIG. 10 is a schematic diagram of a radar-camera calibration using an external reference matrix in an embodiment of the present application.
Detailed Description
In the process of implementing the present application, the inventors found that in an automatic driving system, a monocular vision sensor scheme is inexpensive and can provide rich image information for an autonomous vehicle, but cannot provide reliable and accurate three-dimensional spatial distance information; a binocular camera can provide short-range three-dimensional distance information through parallax calibration, but its accuracy degrades beyond a limited distance, so it is poorly suited to large outdoor scenes.
Further, lidar can provide high-precision three-dimensional distance information within a range of 200 meters, although at higher cost. The common scheme is to realize all-around vehicle body environment sensing through the cooperation of multiple sensors. Solid-state lidar is low in cost and produces dense point clouds, which favors wide adoption on unmanned platforms, but its non-repetitive scanning, susceptibility to target color, and high measurement noise mean that practical use requires targeted optimization.
In an automatic driving perception system, multi-sensor fusion not only enlarges the field of view of the autonomous vehicle but also compensates for the shortcomings of the individual sensing modalities. Current fusion strategies, however, require that the different sensors be calibrated in advance and their raw data projected into a unified coordinate system, placing high demands on calibration and timestamp synchronization.
The autonomous-vehicle perception scheme fuses information from the camera and the lidar, so visual information of the real world and three-dimensional spatial distance information can be obtained simultaneously, but the two-dimensional imaging result of the camera plane must be registered with the sparse three-dimensional lidar point cloud. In addition, because targets of different colors absorb the laser wavelength differently, the solid-state lidar's pulses form jitter noise on differently colored targets, which affects calibration accuracy. Therefore, to complete the perception task in an automatic driving system, a multi-sensor fusion method is needed, and in particular a scheme optimized for the point cloud instability of solid-state lidar.
In view of the foregoing problems, an embodiment of the present application provides a calibration method, including: acquiring point cloud data of a sensor to obtain the region where a calibration plate is located; extracting the calibration plate from the region; solving, from the calibration plate, the coordinates of the calibration plate plane fitted according to the point cloud spatial distribution; determining first corner position coordinates of the calibration plate from the plane of the calibration plate and the coordinates of the point cloud spatial distribution; and obtaining the external parameter transformation matrix of the sensor according to second corner position coordinates of the calibration plate and the first corner position coordinates. The first corner position coordinates are the three-dimensional inner corner coordinates measured by the solid-state laser radar in the sensor; the second corner position coordinates are the two-dimensional inner corner coordinates from the camera in the sensor. Calibration in the sensor is then performed through the external parameter transformation matrix.
In order to make the technical solutions and advantages of the embodiments of the present application more apparent, the following further detailed description of the exemplary embodiments of the present application with reference to the accompanying drawings makes it clear that the described embodiments are only a part of the embodiments of the present application, and are not exhaustive of all embodiments. It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict.
In this embodiment, a calibration method is provided, as shown in fig. 1, the process includes the following steps:
step S101, point cloud data of a sensor is collected, and an area where a calibration plate is located is obtained;
step S102, extracting to obtain a calibration plate according to the region;
step S103, solving according to the calibration plate to obtain a coordinate of the plane of the calibration plate which is fitted according to the point cloud space distribution;
step S104, determining first corner position coordinates of the calibration plate according to the plane of the calibration plate and the coordinates of the point cloud spatial distribution, wherein the first corner position coordinates are the three-dimensional inner corner coordinates of the point cloud measured by the solid-state laser radar in the sensor;
step S105, obtaining an external parameter transformation matrix of the sensor according to second corner position coordinates of the calibration plate and the first corner position coordinates, wherein the second corner position coordinates are the two-dimensional inner corner coordinates of the image captured by the camera in the sensor;
and S106, calibrating the sensor through the external parameter transformation matrix.
Through the steps, the following technical effects are realized:
the method comprises the steps of firstly carrying out point cloud integration through a time domain to realize the densification of local static point cloud to obtain rich target point cloud data, and obtaining three-dimensional coordinates of a calibration plate corner point based on region detection, plane fitting and quasi-Newton method optimization. And then, acquiring the two-dimensional coordinates of the corner points of the calibration plate in the image by combining camera internal reference calibration and corner point detection in the sensor. And finally, optimizing a projection matrix of the two-dimensional point cloud data-two-dimensional image through a random sampling consistency algorithm, namely obtaining an external reference projection result among different sensors. The technical effects of increasing the calibration precision and reducing manual intervention are achieved.
In step S101, the acquisition and preprocessing module performs statistical outlier removal noise reduction on each frame of solid-state lidar point cloud data and integrates the point clouds over a continuous window of T seconds; the approximate position of the checkerboard calibration plate is then manually marked in the bird's-eye view to obtain the region of interest (ROI) where the calibration plate is located.
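This preprocessing step can be sketched in Python roughly as follows. It is a simplified stand-in, not the patent's implementation: the point cloud is a plain list of (x, y, z) tuples, and the statistical-outlier rule follows the common formulation (mean k-nearest-neighbor distance thresholded at the global mean plus a multiple of the standard deviation); the patent fixes none of these details.

```python
import math

def statistical_outlier_removal(points, k=3, std_mult=1.0):
    """Drop points whose mean k-NN distance exceeds mean + std_mult * stddev.

    Brute-force O(n^2) neighbor search, fine for a sketch; a real
    implementation would use a k-d tree (e.g. PCL's filter of the same name).
    """
    mean_knn = []
    for i, p in enumerate(points):
        ds = sorted(math.dist(p, q) for j, q in enumerate(points) if j != i)
        mean_knn.append(sum(ds[:k]) / k)
    mu = sum(mean_knn) / len(mean_knn)
    var = sum((d - mu) ** 2 for d in mean_knn) / len(mean_knn)
    thresh = mu + std_mult * math.sqrt(var)
    return [p for p, d in zip(points, mean_knn) if d <= thresh]

def integrate_frames(frames):
    """Spatially superpose point cloud frames captured over consecutive time."""
    merged = []
    for f in frames:
        merged.extend(f)
    return merged
```

After filtering and integration, an operator would mark the ROI of the calibration plate in a top view of the merged cloud.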
In step S102, the calibration plate is segmented: a bounding box of the calibration plate is obtained from the label, the point cloud is clipped with the ROI, and the subsequent detection module considers only the points inside the region. To cut the calibration plate out of the region, a height threshold for segmentation is determined from the probability density distribution of the point cloud along the Z axis, and the calibration plate is segmented accordingly.
In step S103, the calibration plate plane is optimized. Plane fitting is performed on the calibration plate, and the true position of the calibration plate plane is obtained through multiple iterations. After the plane parameters are obtained, a ray-casting model is used to compute the ray parameters from the solid-state lidar origin to each point of the original calibration plate point cloud, and the original point cloud is projected onto the ideal calibration plate plane, the projected point being the intersection of each ray with the plane. Because the solid-state lidar scans back and forth along the Y axis, the point density at the scan-line edges is uneven; the fitted calibration plate plane point cloud is therefore randomly sampled on a grid, ensuring that the point density within each grid cell is evenly distributed after resampling.
In step S104, three-dimensional corner detection is performed. A quasi-Newton (L-BFGS-B) optimization method is applied to solve, using the reflectivity distribution characteristics of the point cloud, the pose difference matrix T between the ideal chessboard model and the actual calibration plate plane point cloud; the ideal chessboard corner coordinates are then transformed back to the original point cloud coordinate system with T, giving the detected three-dimensional inner corner coordinates of the calibration plate.
In step S105, steps S101 to S104 are repeated to acquire the corresponding two-dimensional and three-dimensional inner corner coordinates in all image-point cloud pairs of the calibration data; combined with the camera intrinsic parameters, the PnP (Perspective-n-Point) problem is solved with the RANSAC (Random Sample Consensus) method to finally obtain the calibration extrinsic parameters E between the camera and the solid-state lidar.
In step S106, the sensor is calibrated through the external reference transformation matrix to obtain accurate coordinates of corner points of the calibration plate in the three-dimensional space.
As a preferable example in the embodiment of the present application, after calibration is performed in the sensor by the external reference transformation matrix, the method further includes: projecting radar point cloud data in the sensor to an imaging plane of a camera in the sensor; and/or mapping visible light image data in the sensor to three-dimensional space.
Based on the above steps, the solid-state lidar and the camera are combined, with the checkerboard calibration plate as the calibration reference target. During calibration, the two-dimensional and three-dimensional positions of the calibration plate corner points in the fields of view of the camera and the solid-state lidar are computed in real time, and the whole process requires only a small amount of manual intervention and the collection of several groups of image-point cloud data pairs. The algorithm yields an accurate camera to solid-state lidar external parameter transformation matrix, after which the radar point cloud can be projected onto the camera's imaging plane, or the visible-light image mapped into three-dimensional space, realizing the fusion of multi-modal sensor data; the method can also be used to calibrate multiple radars and so extend the field of view.
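The projection of radar points onto the camera imaging plane described here follows the standard pinhole model u = K(RX + t); a minimal sketch, assuming row-major 3x3 lists for the intrinsic matrix K and rotation R (the numeric values in the test are illustrative, since the patent gives no concrete parameters):

```python
def project_points(points, K, R, t):
    """Project 3-D lidar points into the image plane via extrinsics (R, t)
    and intrinsics K; points landing behind the camera are skipped."""
    uv = []
    for X in points:
        # camera-frame coordinates: Xc = R X + t
        Xc = [sum(R[i][j] * X[j] for j in range(3)) + t[i] for i in range(3)]
        if Xc[2] <= 0:          # behind the camera
            continue
        u = K[0][0] * Xc[0] / Xc[2] + K[0][2]
        v = K[1][1] * Xc[1] / Xc[2] + K[1][2]
        uv.append((u, v))
    return uv
```

The inverse direction (mapping image pixels into three-dimensional space) additionally needs per-pixel depth, which the lidar supplies after fusion.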
As a preferable example in the embodiment of the present application, the obtaining of the calibration board according to the region extraction includes: calculating the probability density distribution of the Z-axis height of the point cloud in the region; converting the discrete probability density of the probability density distribution into a histogram; calculating a gradient between two adjacent bins in the histogram; and segmenting and extracting according to the gradient calculation result to obtain the calibration plate.
In specific implementation, the calibration plate is segmented from the vicinity of the marked chessboard calibration plate region. The probability density distribution of the point cloud Z-axis heights within the region is computed and discretized into a histogram, and the gradient between each pair of adjacent bins is computed (the more bins, the finer the histogram's resolution). Because there is no occluding object above the calibration plate, the top of the calibration plate point cloud is the maximum height value Z_max within the region. The K bins with the highest gradient values are then selected and their average heights computed; the height whose difference from the plate top is closest to the diagonal length of the calibration plate is selected as the segmentation threshold Z_min. The calibration plate can then be segmented with the height window (Z_min, Z_max).
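The height-window selection can be illustrated with the sketch below. The bin count, the K parameter, and the synthetic heights in the test are illustrative assumptions, and the heuristic is simplified to picking the high-gradient bin whose gap to the plate top best matches the board diagonal:

```python
def board_height_window(zs, board_diag, n_bins=20, k=3):
    """Return (z_min, z_max): z_max is the plate top; z_min is the
    high-gradient histogram bin height whose distance to z_max is
    closest to the board's diagonal length."""
    z_max, z_lo = max(zs), min(zs)
    width = (z_max - z_lo) / n_bins or 1.0
    hist = [0] * n_bins
    for z in zs:
        hist[min(int((z - z_lo) / width), n_bins - 1)] += 1
    # gradient between adjacent bins
    grads = [abs(hist[i + 1] - hist[i]) for i in range(n_bins - 1)]
    top = sorted(range(n_bins - 1), key=lambda i: grads[i], reverse=True)[:k]
    cands = [z_lo + (i + 0.5) * width for i in top]  # bin-center heights
    z_min = min(cands, key=lambda h: abs((z_max - h) - board_diag))
    return z_min, z_max
```

Points with z in (z_min, z_max) would then be kept as the plate segment.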
As a preferred example in the embodiment of the present application, the obtaining, according to the solution of the calibration plate, coordinates of a plane of the calibration plate, which are obtained by fitting, according to the point cloud spatial distribution includes: carrying out plane fitting on the point cloud data of the calibration plate based on a plane segmentation method of random sampling consistency to obtain a fitting plane of the calibration plate; and determining the coordinates of the plane of the calibration plate according to the point cloud space distribution on the fitting plane of the calibration plate.
In specific implementation, a plane segmentation method based on random sample consensus is applied to fit a plane to the calibration plate point cloud. With a plane distance tolerance θ, plane parameters S are obtained; the distance from each calibration plate point to the fitted plane S is computed, points farther than 2θ from S are filtered out, θ is halved, and the plane is refitted iteratively until all point-to-plane distances are below θ. The remaining point cloud set finally yields the fitted plane of the calibration plate.
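A toy version of this RANSAC plane fit with the shrinking-tolerance filter might look as follows; unlike the patent's procedure, the plane is not refitted on each refinement round (a three-point minimal fit stands in for a least-squares refit), so this is only a sketch of the control flow:

```python
import math
import random

def fit3(p0, p1, p2):
    """Plane n.x + d = 0 through three points (unit normal)."""
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    n = [c / norm for c in n]            # ZeroDivisionError if collinear
    return n, -sum(n[i] * p0[i] for i in range(3))

def point_plane_dist(p, plane):
    n, d = plane
    return abs(sum(n[i] * p[i] for i in range(3)) + d)

def ransac_plane(points, theta=0.05, iters=50, seed=0):
    rng = random.Random(seed)
    best, best_in = None, []
    for _ in range(iters):
        try:
            plane = fit3(*rng.sample(points, 3))
        except ZeroDivisionError:        # degenerate (collinear) sample
            continue
        inl = [p for p in points if point_plane_dist(p, plane) < theta]
        if len(inl) > len(best_in):
            best, best_in = plane, inl
    # shrinking-tolerance refinement: drop points beyond 2*theta, halve theta
    pts = best_in
    while any(point_plane_dist(p, best) >= theta for p in pts):
        pts = [p for p in pts if point_plane_dist(p, best) < 2 * theta]
        theta /= 2
    return best, pts
```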
Further, the projected point cloud P′_proj is resampled: P′_proj is divided into multiple grid patches and the density of each patch is computed; each patch whose density is less than D_th is then resampled, the number of points drawn from the current patch being given by the expression in the original equation image (not reproduced here).
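The grid resampling can be sketched as below. Since the patent's per-patch sample-count formula survives only as an equation image, this sketch substitutes a simple uniform-density heuristic that caps each cell at a fixed count; the cell size and cap are illustrative assumptions:

```python
import random

def grid_resample(points, cell=0.1, max_per_cell=5, seed=0):
    """Bucket plane points into fixed-size grid cells and cap each cell's
    population so the resampled density is roughly uniform."""
    rng = random.Random(seed)
    cells = {}
    for p in points:
        key = (int(p[0] // cell), int(p[1] // cell))
        cells.setdefault(key, []).append(p)
    out = []
    for pts in cells.values():
        out.extend(pts if len(pts) <= max_per_cell
                   else rng.sample(pts, max_per_cell))
    return out
```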
In some optional embodiments, since the solid-state lidar's distance measurement variance along the axial direction is large, the calibration plate point cloud obtained after temporal superposition has a large error along the X axis, and the true position of the calibration plate plane needs to be estimated.
As a preferable example in this embodiment of the present application, the determining of the first corner position coordinates of the calibration plate according to the plane of the calibration plate and the coordinates of the point cloud spatial distribution includes: generating a preset calibration plate in a two-dimensional plane according to the actual size parameters of the calibration plate, namely its width, height, and grid size; and solving the three-dimensional pose difference between the actual point cloud data and the preset calibration plate so that the spatial distribution of the reflectivity of the actual point cloud data is consistent with that of the preset calibration plate.
In specific implementation, the L-BFGS-B optimization method is applied, and the corner coordinates of the fitted calibration plate plane point cloud are solved from the reflectivity information. First, an ideal checkerboard is generated in a two-dimensional plane according to the actual size parameters of the checkerboard, its width (w), height (h), and grid size (grid_size); the optimization algorithm then solves for the three-dimensional pose difference T = [θ, x, y] between the actual point cloud and the ideal checkerboard, so that the spatial distribution of the actual point cloud's reflectivity matches that of the ideal checkerboard.
Further, the specific implementation steps of solving and obtaining the point cloud corner coordinates of the fitting calibration plate plane by using the reflectivity information by using an L-BGFS-B optimization method are as follows:
Step S1: randomly generate an initial three-dimensional pose T0 = [θ0, x0, y0] and let the current pose variable T = T0.
Step S2: transform P'_proj to a new pose using T, and compute the cost between this pose and the spatial distribution of the ideal checkerboard using the reflectivity information:
Step S21: traverse the point cloud and obtain the planar coordinates {p_x, p_y} of each point p;
Step S22: if p falls outside the ideal checkerboard, cost_p = min_{g ∈ G} dist({p_x, p_y}, {g_x, g_y}), where G is the set of ideal checkerboard corner points, and the calculation for the current point ends;
Step S23: if p falls inside the ideal checkerboard, its coordinates on the ideal checkerboard are {G_x, G_y};
Step S24: look up the color CLR_estimate corresponding to (G_x, G_y) from the color pattern of the actual checkerboard;
Step S25: binarize the actual reflectivity of the current point to obtain the corresponding color CLR_gt and compare it with the estimated color CLR_estimate; if the colors are the same, cost_p = 0, otherwise cost_p = min_{g ∈ G} dist({p_x, p_y}, {g_x, g_y});
Step S26: end the cost calculation for the current point.
Step S3: optimize T with the L-BFGS-B algorithm according to the cost value to obtain the pose parameter T' for the next iteration, let T = T', return to step S2, and repeat until the cost converges.
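Steps S1 to S3 map naturally onto scipy's L-BFGS-B minimizer. The following minimal sketch reproduces the cost of steps S21 to S26 on a synthetic 2×2 board; the helper names (`cell_color`, `apply_pose`), the board size, and the toy data are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np
from scipy.optimize import minimize

# Ideal 2x2 checkerboard on [0,2]x[0,2] with unit cells; G = grid nodes.
G = np.array([(i, j) for i in range(3) for j in range(3)], dtype=float)

def cell_color(p):
    """Binarized color (0/1) of the ideal cell containing each point."""
    return (np.floor(p[:, 0]).astype(int) + np.floor(p[:, 1]).astype(int)) % 2

def apply_pose(T, pts):
    th, x, y = T
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    return pts @ R.T + np.array([x, y])

def cost(T, cloud, refl):
    """Steps S21-S26: map the cloud into the board frame with pose T and
    penalize reflectivity mismatches by distance to the nearest corner."""
    th, x, y = T
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    p = (cloud - np.array([x, y])) @ R
    inside = (p[:, 0] >= 0) & (p[:, 0] < 2) & (p[:, 1] >= 0) & (p[:, 1] < 2)
    d = np.linalg.norm(p[:, None, :] - G[None, :, :], axis=2).min(axis=1)
    match = np.zeros(len(p), dtype=bool)
    match[inside] = cell_color(p[inside]) == refl[inside]
    return np.where(match, 0.0, d).sum()

# Synthetic "measured" cloud: ideal samples moved by a known pose T_true.
rng = np.random.default_rng(0)
ideal = rng.uniform(0, 2, (400, 2))
refl = cell_color(ideal)                     # binarized reflectivity
T_true = np.array([0.05, 0.3, -0.2])
cloud = apply_pose(T_true, ideal)

T0 = T_true + np.array([0.02, 0.05, -0.04])  # step S1: initial pose guess
res = minimize(cost, T0, args=(cloud, refl), method="L-BFGS-B")  # steps S2-S3
```

At the true pose the cost vanishes (every point lands in a cell whose color matches its binarized reflectivity), which is exactly the convergence criterion of step S3.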
As a preferred example in this embodiment of the present application, acquiring the point cloud data of the sensor and determining the area where the calibration plate is located includes: filtering the collected point cloud data with a statistical outlier filter and spatially superimposing the point cloud data over a continuous time window; and marking the area of the initial position of the calibration plate.
In a specific implementation, the collected point cloud data is filtered with Statistical Outlier Removal, the point clouds from T consecutive seconds are superimposed in space, and an operator then marks the approximate area of the checkerboard calibration plate in the point cloud top view.
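A minimal sketch of the statistical-outlier filter, built on a k-d tree rather than any particular point cloud library; the `k` and `std_ratio` defaults are assumptions for illustration, not values from the patent:

```python
import numpy as np
from scipy.spatial import cKDTree

def statistical_outlier_removal(points, k=8, std_ratio=1.0):
    """Drop points whose mean distance to their k nearest neighbours
    exceeds (global mean + std_ratio * global std) of that statistic."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)  # k+1: nearest hit is the point itself
    mean_d = dists[:, 1:].mean(axis=1)
    keep = mean_d <= mean_d.mean() + std_ratio * mean_d.std()
    return points[keep]

# demo: a thin dense slab (the "plate") plus a few far-away noise points
rng = np.random.default_rng(1)
slab = rng.uniform(0.0, 1.0, (500, 3)) * np.array([1.0, 1.0, 0.01])
noise = rng.uniform(9.0, 10.0, (5, 3))
filtered = statistical_outlier_removal(np.vstack([slab, noise]))

# superposing frames from T consecutive seconds is then just concatenation:
# dense = np.vstack([statistical_outlier_removal(f) for f in frames])
```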
As a preferred embodiment of the present application, obtaining the extrinsic transformation matrix of the sensor according to the second corner position coordinates and the first corner position coordinates of the calibration plate includes: the sensor comprising a camera and a solid-state lidar; acquiring the second corner points and the first corner points corresponding to the image-point cloud pairs in the calibration data of the calibration plate; and solving the PnP problem with a random sample consensus method according to the camera intrinsics, the second corner points, and the first corner points to obtain the calibration extrinsics between the camera and the solid-state lidar.
In a specific implementation, the two-dimensional inner corner points and the corresponding three-dimensional inner corner coordinates of all image-point cloud pairs in the calibration data are obtained, the camera intrinsics are combined, the PnP problem is solved with the Random Sample Consensus (RANSAC) method, and the calibration extrinsics E between the camera and the solid-state lidar are finally obtained.
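OpenCV's `cv2.solvePnPRansac` is the usual tool for this step. As a library-free illustration of the underlying 2-D/3-D solve, the sketch below recovers the full 3×4 projection matrix by plain DLT on synthetic correspondences; it omits the RANSAC loop, and the intrinsics and extrinsics are made-up values, not the patent's:

```python
import numpy as np

def dlt_projection(X3d, x2d):
    """Direct Linear Transform: solve the 3x4 projection matrix P
    with x ~ P X from >= 6 point correspondences via SVD."""
    A = []
    for (X, Y, Z), (u, v) in zip(X3d, x2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u*X, -u*Y, -u*Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v*X, -v*Y, -v*Z, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 4)  # recovered up to scale

# synthetic correspondences from known intrinsics K and extrinsics [R|t]
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
Rt = np.hstack([np.eye(3), np.array([[0.1], [-0.2], [2.0]])])
P_true = K @ Rt
X3d = np.random.default_rng(0).uniform(-1, 1, (12, 3)) + np.array([0, 0, 4])
xh = (P_true @ np.c_[X3d, np.ones(12)].T).T
x2d = xh[:, :2] / xh[:, 2:]

P_est = dlt_projection(X3d, x2d)
xh2 = (P_est @ np.c_[X3d, np.ones(12)].T).T
reproj = xh2[:, :2] / xh2[:, 2:]
```

Given known intrinsics K, the extrinsics [R|t] can then be factored out of P (up to scale); the RANSAC wrapper simply repeats this solve on random minimal subsets and keeps the hypothesis with the most inliers.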
Through the above description of the embodiments, those skilled in the art can clearly understand that the method according to the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but the former is a better implementation mode in many cases. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (e.g., a mobile phone, a computer, a server, or a network device) to execute the method according to the embodiments of the present invention.
In this embodiment, a calibration device is further provided, and the calibration device is used to implement the above embodiments and preferred embodiments, which have already been described and are not described again. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 2 is a block diagram of a calibration apparatus according to an embodiment of the present invention, and as shown in fig. 2, the apparatus includes:
the acquisition module 21 is used for acquiring point cloud data of the sensor and acquiring an area where the calibration plate is located;
a segmentation extraction module 22, configured to extract a calibration plate according to the region;
the fitting module 23 is configured to solve the calibration plate to obtain a coordinate of the plane of the calibration plate, which is fitted according to the point cloud spatial distribution;
an angular point position determining module 24, configured to determine a first angular point position coordinate of the calibration plate according to a coordinate of the point cloud spatial distribution on the plane of the calibration plate, where the first angular point position coordinate is a three-dimensional angular point coordinate of a solid state laser radar in the sensor;
an external reference matrix solving module 25, configured to obtain an external reference transformation matrix of the sensor according to a second corner position coordinate of the calibration board and the first corner position coordinate, where the second corner position coordinate is a two-dimensional inner corner coordinate of a camera in the sensor;
The external reference matrix solving module 25 is further configured to calibrate the sensor through the external reference transformation matrix.
The acquisition module 21 performs statistical outlier filtering and noise reduction processing on each frame of solid-state laser radar point cloud data, and overlaps point clouds in continuous T seconds in space; then, the approximate position of the checkerboard calibration plate in the bird's eye view is manually marked, and the area (area) where the calibration plate is located is acquired.
The segmentation and extraction module 22 obtains the bounding box of the calibration plate from the annotation, crops the point cloud according to the area in which it lies, and lets the subsequent detection module consider only the points inside that area. The calibration plate is then segmented from the region: a height threshold for the calibration plate is obtained from the probability density of the Z-axis point cloud distribution within the region, the width of the calibration plate is determined, and the calibration plate is finally extracted.
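The height-threshold step (cf. claim 3) can be sketched as a Z-axis histogram whose bin-to-bin gradient brackets the dense plate region; the bin count and gradient threshold below are illustrative assumptions, not values from the patent:

```python
import numpy as np

def plate_height_range(z, bins=50, grad_frac=0.3):
    """Locate the Z-range of the plate from the height histogram:
    strong positive/negative gradients between adjacent bins mark
    where the dense plate region begins and ends."""
    hist, edges = np.histogram(z, bins=bins, density=True)
    grad = np.diff(hist)                       # gradient between adjacent bins
    thresh = grad_frac * np.abs(grad).max()
    lo = edges[np.where(grad > thresh)[0].min() + 1]   # density rises into plate
    hi = edges[np.where(grad < -thresh)[0].max() + 1]  # density falls past plate
    return lo, hi

# demo: dense plate between z = 1 and z = 2, sparse background over 0..3
rng = np.random.default_rng(2)
z = np.concatenate([rng.uniform(1.0, 2.0, 2000), rng.uniform(0.0, 3.0, 100)])
lo, hi = plate_height_range(z)
```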
The fitting module 23 obtains the true position of the calibration plate plane through multiple iterations. Specifically, the operation adopts a linear projection model: it computes the ray parameters from the solid-state lidar origin to each point in the calibration plate plane, computes the set P of intersection points between the outgoing rays and the calibration plate plane, and projects the point cloud near the plane onto the ideal plane. Principal Component Analysis (PCA) is then applied to P to solve for the coordinates of the fitted calibration plate plane according to the point cloud spatial distribution.
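Assuming the plane parameters have already been roughly fitted (e.g. by RANSAC plane segmentation, as in claim 4), the ray-projection and PCA steps might look like the following numpy sketch; the function names and demo data are illustrative:

```python
import numpy as np

def project_to_plane_along_rays(points, n, d):
    """Linear projection model: slide each point along its lidar ray
    x(t) = t * p until it meets the plane n.x + d = 0."""
    t = -d / (points @ n)
    return points * t[:, None]

def pca_plane(P):
    """PCA of the intersection points: the two leading principal axes
    span the plane; the smallest one is the plane normal."""
    c = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c, full_matrices=False)
    return c, Vt  # centroid and principal axes (rows, largest first)

# demo: returns on the plane z = 2 corrupted by axial (along-ray) range noise
rng = np.random.default_rng(3)
xy = rng.uniform(-0.5, 0.5, (300, 2))
plane_pts = np.c_[xy, np.full(300, 2.0)]
noisy = plane_pts * (1.0 + 0.02 * rng.standard_normal((300, 1)))
n, d = np.array([0.0, 0.0, 1.0]), -2.0
proj = project_to_plane_along_rays(noisy, n, d)
centroid, axes = pca_plane(proj)
```

Because the noise model is axial (along the ray), sliding each point back along its own ray removes it exactly, which is why this projection is preferred over a plain orthogonal projection onto the plane.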
The corner position determining module 24 applies the quasi-Newton (L-BFGS-B) optimization method and uses the point cloud reflectivity to solve for the coordinate positions of the point cloud corners in the calibration plate plane; the obtained corner coordinates in the calibration plate plane are then inversely transformed into the original point cloud space, yielding the three-dimensional inner corner coordinates of the detected calibration plate.
Because the solid-state lidar scans back and forth along the Y axis, the density of points at the edges of scan lines is non-uniform, so the calibration plate plane must be resampled.
The external reference matrix solving module 25 obtains the radar-camera extrinsic matrix by solving the PnP problem with random sample consensus: the corresponding two-dimensional and three-dimensional inner corner coordinates of all image-point cloud pairs in the calibration data are obtained, the camera intrinsics are combined, the PnP problem is solved with the Random Sample Consensus (RANSAC) method, and the calibration extrinsics E between the camera and the solid-state lidar are finally obtained.
In order to better understand the above calibration process, the following explains the above technical solutions with reference to preferred embodiments, but the technical solutions of the embodiments of the present invention are not limited.
To address the inherent precision defects of solid-state lidar, a point cloud optimization and noise-reduction algorithm is designed for a multi-radar-camera sensing system, and a corresponding sensor calibration workflow is designed on this basis, increasing calibration precision while reducing manual intervention. The method comprises three main steps: camera two-dimensional corner detection, solid-state lidar point cloud three-dimensional corner detection, and solving the two-dimensional to three-dimensional projection transformation. First, point clouds are integrated over the time domain to densify the local static point cloud and obtain rich target point cloud data, and the three-dimensional coordinates of the calibration plate corners are obtained through region detection, plane fitting, and quasi-Newton optimization. Next, the two-dimensional coordinates of the calibration plate corners in the image are obtained by combining camera intrinsic calibration with corner detection. Finally, the projection matrix from the three-dimensional point cloud to the two-dimensional image is optimized with a random sample consensus algorithm, which yields the extrinsic projection between the different sensors. The calibration method combines the camera and the solid-state lidar sensor to simultaneously obtain visual information of the real world and three-dimensional spatial distance information; in the experiments, a solid-state lidar with a denser point cloud is specifically selected as the distance sensor, and the noise distributions of the different radar point clouds on the calibration plate are shown in front and side views.
The framework divides the radar-camera calibration task into two modules. First, the three-dimensional corner information of the checkerboard calibration plate is obtained in the lidar point cloud view; then, the two-dimensional corner information of the checkerboard calibration plate is obtained in the camera image; finally, the three-dimensional and two-dimensional corner information is combined to compute an accurate radar-camera transformation matrix. Preferably, the relative positions of the sensors and the calibration plate, together with the corresponding coordinate systems, define the sensor positions and the checkerboard calibration plate positions. The main flows of the two modules of the time-domain-integration-based solid-state lidar-camera calibration method provided in this application are shown in fig. 3 and fig. 4 respectively. Finally, the three-dimensional and two-dimensional checkerboard corner information is obtained through the modules shown in fig. 3 and fig. 4, and an accurate radar-camera transformation matrix is obtained, as shown in fig. 5.
Step 1, first, mount the solid-state lidar and the camera at fixed positions on a rigid body (such as a vehicle-mounted platform bracket), ensure that the views of the radar and the camera are unobstructed, and connect them to the computing device through data lines such as an Ethernet port or USB. At the same time, prepare the checkerboard calibration plate and support it within a certain range in front of the radar and the camera; throughout the experiment, ensure that the checkerboard calibration plate always remains in the fields of view of both the radar and the camera, place the calibration plate at multiple positions for sampling, and adjust the angle of the calibration plate appropriately.
Step 2, start the calibration program; the camera transmits the captured image to the computing device, and the computer pops up the point cloud top-view window shown in fig. 6. At this time, each frame of solid-state lidar point cloud data is denoised, and the point clouds from T consecutive seconds are superimposed in space. The operator manually marks the approximate position of the checkerboard calibration plate in the point cloud top view, and the area where the calibration plate is located is obtained through automatic processing by the algorithm, as shown in fig. 7.
Step 3, obtaining a calibration frame (bounding box) of the calibration plate from the labeling by the algorithm, cutting the point cloud of the calibration plate by combining the area where the point cloud is located, and only considering the points in the area by the subsequent detection module, namely: dividing a calibration plate from the region, obtaining a height threshold of the calibration plate by using the probability density of Z-axis point cloud distribution in the region, determining the width of the calibration plate, and dividing the calibration plate as shown in FIG. 8.
And 4, performing plane fitting on the calibration plate, and obtaining the real position of the plane of the calibration plate through multiple iterations. The specific operation adopts a linear projection model, calculates the ray parameters from the origin of the solid-state laser radar to each point in the plane of the calibration plate, calculates the intersection point space P of the emergent ray and the plane of the calibration plate, and projects the point cloud near the plane onto the ideal plane. And carrying out PCA conversion on the intersection point space P, and solving to obtain the coordinate of the fitted calibration plate plane distributed according to the point cloud space.
Step 5, resample the calibration plate plane and apply the quasi-Newton (L-BFGS-B) optimization method, using the point cloud reflectivity to solve for the coordinate positions of the point cloud corners in the calibration plate plane; inversely transform the obtained corner coordinates in the calibration plate plane into the original point cloud space to obtain the three-dimensional inner corner coordinates of the detected calibration plate;
and 6, the computer simultaneously acquires two-dimensional corner information of the calibration plate from the picture, and the detected corner is shown in FIG. 9.
Step 7, placing the calibration plates at different positions, and repeating the step 2 to the step 6 to obtain a plurality of groups of three-dimensional and two-dimensional checkerboard angular point information; and combining the camera internal reference matrix, and calculating to obtain an accurate radar-camera external reference conversion matrix. Fig. 10 is an effect diagram of using the obtained extrinsic transformation matrix to mark the radar-camera in the same coordinate system.
Specifically, a solid-state lidar reprojection-error histogram can be used for quantification: the checkerboard corners detected in the point cloud view are reprojected onto the camera plane using the radar-camera extrinsic transformation matrix solved in the steps above, and the pixel errors between the reprojected corners and the image corner information are computed to obtain the histogram.
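That quantification amounts to reprojecting each 3-D corner through K·E and binning the pixel residuals; the sketch below uses synthetic intrinsics, extrinsics, and corners (all values are illustrative, not from the patent's experiments):

```python
import numpy as np

def reprojection_errors(corners_3d, corners_2d, K, E):
    """Pixel residual of each 3-D checkerboard corner reprojected
    through extrinsics E (3x4 [R|t]) and intrinsics K, against the
    corner detected in the image."""
    Xh = np.c_[corners_3d, np.ones(len(corners_3d))]
    uvw = (K @ E @ Xh.T).T
    uv = uvw[:, :2] / uvw[:, 2:]
    return np.linalg.norm(uv - corners_2d, axis=1)

# demo: corners generated with a known E, then E perturbed
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
E = np.hstack([np.eye(3), np.array([[0.0], [0.0], [3.0]])])
pts = np.random.default_rng(4).uniform(-1, 1, (10, 3))
uv_gt = (K @ E @ np.c_[pts, np.ones(10)].T).T
uv_gt = uv_gt[:, :2] / uv_gt[:, 2:]
errs_exact = reprojection_errors(pts, uv_gt, K, E)   # ideal calibration
E_bad = E.copy()
E_bad[0, 3] += 0.05                                  # 5 cm translation error
errs_bad = reprojection_errors(pts, uv_gt, K, E_bad)
# hist, bin_edges = np.histogram(errs_bad, bins=20)  # the error histogram
```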
Embodiments of the present invention also provide a storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the above method embodiments when executed.
Alternatively, in the present embodiment, the storage medium may be configured to store a computer program for executing the steps of:
s1, collecting point cloud data of the sensor, and acquiring the area where the calibration plate is located;
s2, extracting the point cloud in the area to obtain a calibration board;
s3, solving according to the calibration plate to obtain a coordinate of the plane of the calibration plate which is fitted according to the point cloud space distribution;
s4, determining a first corner position coordinate of the calibration plate through nonlinear optimization according to the coordinate of the calibration plate plane distributed according to the point cloud space, wherein the first corner position coordinate is a three-dimensional inner corner position coordinate of the point cloud obtained by measuring the calibration plate by a solid-state laser radar in the sensor;
s5, obtaining an external reference conversion matrix between the sensors according to a second corner position coordinate of the calibration board and the first corner position coordinate, wherein the second corner position coordinate is a two-dimensional inner corner coordinate of a camera in the sensor;
and S6, calibrating the sensor through the external parameter transformation matrix.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
s31, projecting the radar point cloud data in the sensor to an imaging plane of a camera in the sensor.
Optionally, the storage medium is further arranged to store a computer program for performing the steps of:
and S32, mapping the visible light image data in the sensor to a three-dimensional space.
Optionally, in this embodiment, the storage medium may include, but is not limited to: various media capable of storing computer programs, such as a usb disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Embodiments of the present invention also provide an electronic device comprising a memory having a computer program stored therein and a processor arranged to run the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, collecting point cloud data of the sensor, and obtaining the area where the calibration plate is located;
S2, solving according to the calibration plate to obtain a coordinate of the plane of the calibration plate which is fitted according to the point cloud space distribution;
s3, determining a first corner position coordinate of the calibration plate according to the plane of the calibration plate and the coordinate of the point cloud spatial distribution, wherein the first corner position coordinate is a three-dimensional inner corner position coordinate of the point cloud obtained by measuring the calibration plate by the solid-state laser radar in the sensor;
s4, obtaining an external reference transformation matrix of the sensor according to a second corner position coordinate of the calibration board and the first corner position coordinate, wherein the second corner position coordinate is a two-dimensional inner corner coordinate of an image obtained by shooting the calibration board by a camera in the sensor;
and S5, calibrating the sensor through the external parameter transformation matrix.
Optionally, the specific examples in this embodiment may refer to the examples described in the above embodiments and optional implementation manners, and this embodiment is not described herein again.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A calibration method, comprising:
collecting point cloud data of the solid laser radar to obtain an area where the calibration plate is located;
extracting the point cloud of the calibration plate from the point cloud of the region;
solving according to the calibration plate to obtain coordinates of the calibration plate plane fitted according to the point cloud spatial distribution;
determining a first corner position coordinate of the calibration plate through nonlinear optimization according to the coordinate of the calibration plate plane distributed according to the point cloud space, wherein the first corner position coordinate is a three-dimensional inner corner position coordinate of the point cloud obtained by measuring the calibration plate by a solid-state laser radar in the sensor;
solving to obtain an external parameter transformation matrix of the sensor according to a second corner position coordinate of the calibration plate and the first corner position coordinate, wherein the second corner position coordinate is a two-dimensional inner corner coordinate of an image obtained by shooting the calibration plate by a camera in the sensor;
and calibrating the sensor through the external parameter transformation matrix.
2. The method of claim 1, wherein after calibrating the sensor by the external reference transformation matrix, further comprising verifying calibration effects by:
projecting radar point cloud data in the sensor to an imaging plane of a camera in the sensor;
and/or mapping visible light image data in the sensor to three-dimensional space.
3. The method of claim 1, wherein extracting the calibration plate from the region comprises:
calculating the probability density distribution of the Z-axis height of the point cloud in the region;
converting the discrete probability density of the probability density distribution into a histogram;
calculating a gradient between two adjacent bins in the histogram;
and segmenting and extracting according to the gradient calculation result to obtain the calibration plate.
4. The method according to claim 1, wherein solving to obtain the coordinates of the calibration plate plane fitted according to the point cloud spatial distribution comprises:
carrying out plane fitting on the point cloud data of the calibration plate based on a plane segmentation method of random sampling consistency to obtain a fitting plane of the calibration plate;
and determining the coordinates of the plane of the calibration plate according to the point cloud space distribution on the fitting plane.
5. The method of claim 1, wherein determining the first corner position coordinates of the calibration plate according to the calibration plate plane and the coordinates of the point cloud spatial distribution comprises:
generating a preset calibration plate in a two-dimensional plane according to the actual size parameter width and height of the calibration plate and the size of the calibration plate;
solving the three-dimensional pose difference value of the actual point cloud data and the preset calibration plate to ensure that the reflectivity of the actual point cloud data is consistent with that of the preset calibration plate along with the spatial distribution condition;
and according to the three-dimensional pose difference value, carrying out inverse transformation on the angular points of the preset calibration plate to a coordinate system corresponding to the actual point cloud to obtain the position coordinates of the first angular points.
6. The method of claim 1, wherein determining the area of the calibration plate from the point cloud data of the acquisition sensor comprises:
filtering the collected point cloud data by using a statistical outlier filtering method, and overlapping the point cloud data in continuous time on space;
and marking the area of the initial position of the calibration plate.
7. The method of claim 1, wherein solving the external parametric transformation matrix for the sensor based on the second corner position coordinates and the first corner position coordinates of the calibration plate comprises:
the sensor includes at least: a camera, a solid-state laser radar,
acquiring the second corner position coordinate and the first corner position coordinate corresponding to point cloud data of an image in calibration data in the calibration plate;
and solving a PNP problem by using a random sampling consistency method according to the camera internal parameters, the second angular point and the first angular point to obtain an external parameter matrix between the camera and the solid-state laser radar.
8. A calibration device, comprising:
the data acquisition module is used for acquiring point cloud data of the sensor and acquiring the area of the calibration plate;
the segmentation extraction module is used for obtaining a calibration plate according to the region extraction;
the fitting module is used for solving to obtain the fitted coordinates of the plane of the calibration plate according to the point cloud spatial distribution;
the angular point position determining module is used for determining a first angular point position coordinate of the calibration plate according to a plane of the calibration plate and a coordinate distributed in a point cloud space, wherein the first angular point position coordinate is a three-dimensional internal angular point coordinate of the point cloud obtained by measuring the calibration plate by a solid-state laser radar in the sensor;
the external reference matrix solving module is used for obtaining an external reference conversion matrix of the sensor according to a second corner position coordinate and the first corner position coordinate of the calibration plate, wherein the second corner position coordinate is a two-dimensional inner corner coordinate of an image obtained by shooting the calibration plate by a camera in the sensor;
the external parameter matrix solving module is also used for calibrating the sensor through the external parameter transformation matrix obtained by solving.
9. A storage medium, in which a computer program is stored, wherein the computer program is arranged to perform the method of any of claims 1 to 7 when executed.
10. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and wherein the processor is arranged to execute the computer program to perform the method of any of claims 1 to 7.
CN202011094343.0A 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device Pending CN112270713A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011094343.0A CN112270713A (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011094343.0A CN112270713A (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Publications (1)

Publication Number Publication Date
CN112270713A true CN112270713A (en) 2021-01-26

Family

ID=74338185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011094343.0A Pending CN112270713A (en) 2020-10-14 2020-10-14 Calibration method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN112270713A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107976668A (en) * 2016-10-21 2018-05-01 Faraday (Beijing) Network Technology Co., Ltd. Method for determining external parameters between a camera and a laser radar
CN107976669A (en) * 2016-10-21 2018-05-01 Faraday (Beijing) Network Technology Co., Ltd. Device for determining external parameters between a camera and a laser radar
KR20180133745A (en) * 2017-06-07 2018-12-17 Agency for Defense Development Flying object identification system using lidar sensors and pan/tilt zoom cameras and method for controlling the same
CN110349221A (en) * 2019-07-16 2019-10-18 Beihang University Fusion calibration method for a three-dimensional laser radar and a binocular visible-light sensor
CN110599541A (en) * 2019-08-28 2019-12-20 Beike Technology Co., Ltd. Method and device for calibrating multiple sensors and storage medium
CN111127563A (en) * 2019-12-18 2020-05-08 Beijing Wanji Technology Co., Ltd. Combined calibration method and device, electronic equipment and storage medium
CN111179358A (en) * 2019-12-30 2020-05-19 Zhejiang SenseTime Technology Development Co., Ltd. Calibration method, device, equipment and storage medium
CN111192331A (en) * 2020-04-09 2020-05-22 Zhejiang Sineva Intelligent Technology Co., Ltd. External parameter calibration method and device for laser radar and camera
CN111612845A (en) * 2020-04-13 2020-09-01 Jiangsu University Laser radar and camera combined calibration method based on mobile calibration plate

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112462348A (en) * 2021-02-01 2021-03-09 Zhixing Automotive Technology (Suzhou) Co., Ltd. Method and device for augmenting laser point cloud data and storage medium
CN113139569A (en) * 2021-03-04 2021-07-20 Shandong University of Science and Technology Target classification detection method, device and system
CN113139569B (en) * 2021-03-04 2022-04-22 Shandong University of Science and Technology Target classification detection method, device and system
WO2022193604A1 (en) * 2021-03-16 2022-09-22 Huawei Technologies Co., Ltd. Devices, systems, methods, and media for point cloud data augmentation using model injection
CN113256729A (en) * 2021-03-17 2021-08-13 Guangxi Comprehensive Transportation Big Data Research Institute External parameter calibration method, device, equipment and storage medium for laser radar and camera
CN113126115A (en) * 2021-04-06 2021-07-16 Hangzhou Innovation Research Institute of Beihang University Semantic SLAM method and device based on point cloud, electronic equipment and storage medium
CN113126115B (en) * 2021-04-06 2023-11-17 Hangzhou Innovation Research Institute of Beihang University Semantic SLAM method and device based on point cloud, electronic equipment and storage medium
CN115239815B (en) * 2021-06-23 2023-10-27 Shanghai Xiantu Intelligent Technology Co., Ltd. Camera calibration method and device
CN115239815A (en) * 2021-06-23 2022-10-25 Shanghai Xiantu Intelligent Technology Co., Ltd. Camera calibration method and device
CN113567964A (en) * 2021-06-29 2021-10-29 Suzhou Yijing Technology Co., Ltd. Laser radar automatic test method, device and system
CN113567964B (en) * 2021-06-29 2023-07-25 Suzhou Yijing Technology Co., Ltd. Laser radar automatic test method, device and system
CN113406604A (en) * 2021-06-30 2021-09-17 Shandong New Generation Information Industry Technology Research Institute Co., Ltd. Device and method for calibrating positions of laser radar and camera
CN114689106A (en) * 2022-03-31 2022-07-01 Shanghai Qinglang Intelligent Technology Co., Ltd. Sensor calibration method, robot and computer readable storage medium
CN114689106B (en) * 2022-03-31 2024-03-08 Shanghai Qinglang Intelligent Technology Co., Ltd. Sensor calibration method, robot and computer readable storage medium
CN114624683A (en) * 2022-04-07 2022-06-14 Suzhou Zhizhi Technology Co., Ltd. Calibration method for external rotating shaft of laser radar
CN114677429B (en) * 2022-05-27 2022-08-30 Shenzhen Guangcheng Innovation Technology Co., Ltd. Positioning method and device of manipulator, computer equipment and storage medium
CN114677429A (en) * 2022-05-27 2022-06-28 Shenzhen Guangcheng Innovation Technology Co., Ltd. Positioning method and device of manipulator, computer equipment and storage medium
CN116452439A (en) * 2023-03-29 2023-07-18 Institute of Computer Application, China Academy of Engineering Physics Noise reduction method and device for laser radar point cloud intensity image

Similar Documents

Publication Publication Date Title
CN112270713A (en) Calibration method and device, storage medium and electronic device
CN112907676B (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
CN107463918B (en) Lane line extraction method based on fusion of laser point cloud and image data
CN111383279B (en) External parameter calibration method and device and electronic equipment
CA3027921C (en) Integrated sensor calibration in natural scenes
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
CN111563921B (en) Underwater point cloud acquisition method based on binocular camera
CN107993263B (en) Automatic calibration method for panoramic system, automobile, calibration device and storage medium
CN110555407B (en) Pavement vehicle space identification method and electronic equipment
CN110288659B (en) Depth imaging and information acquisition method based on binocular vision
CN112902874B (en) Image acquisition device and method, image processing method and device and image processing system
WO2012126500A1 (en) 3d streets
CN112907675B (en) Calibration method, device, system, equipment and storage medium of image acquisition equipment
CN113205604A (en) Feasible region detection method based on camera and laser radar
CN111382591B (en) Binocular camera ranging correction method and vehicle-mounted equipment
CN113658279B (en) Camera internal reference and external reference estimation method, device, computer equipment and storage medium
CN114119682A (en) Laser point cloud and image registration method and registration system
CN114463303A (en) Road target detection method based on fusion of binocular camera and laser radar
CN111724432B (en) Object three-dimensional detection method and device
CN113327296A (en) Laser radar and camera online combined calibration method based on depth weighting
CN117392237A (en) Robust laser radar-camera self-calibration method
CN115546216B (en) Tray detection method, device, equipment and storage medium
CN111899277A (en) Moving object detection method and device, storage medium and electronic device
CN115407338A (en) Vehicle environment information sensing method and system
Li et al. Lane detection and road surface reconstruction based on multiple vanishing points and symmetry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination