CN117422772A - Laser radar and camera combined calibration method and device and electronic equipment


Info

Publication number
CN117422772A
Authority
CN
China
Prior art keywords
point cloud
point
coordinate
camera
image
Prior art date
Legal status
Pending
Application number
CN202311338740.1A
Other languages
Chinese (zh)
Inventor
安晓宇
曹琼
霍鑫健
曾光
荣耀
Current Assignee
Beijing Meike Tianma Automation Technology Co Ltd
Beijing Tianma Intelligent Control Technology Co Ltd
Original Assignee
Beijing Meike Tianma Automation Technology Co Ltd
Beijing Tianma Intelligent Control Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Meike Tianma Automation Technology Co Ltd and Beijing Tianma Intelligent Control Technology Co Ltd
Priority to CN202311338740.1A
Publication of CN117422772A


Classifications

    • G (PHYSICS)
      • G06 (COMPUTING; CALCULATING OR COUNTING)
        • G06T (IMAGE DATA PROCESSING OR GENERATION, IN GENERAL)
          • G06T 7/00 (Image analysis)
            • G06T 7/80 (Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration)
              • G06T 7/85 (Stereo camera calibration)
            • G06T 7/70 (Determining position or orientation of objects or cameras)
          • G06T 2207/00 (Indexing scheme for image analysis or image enhancement)
            • G06T 2207/10 (Image acquisition modality)
              • G06T 2207/10028 (Range image; Depth image; 3D point clouds)
            • G06T 2207/30 (Subject of image; Context of image processing)
              • G06T 2207/30244 (Camera pose)

Abstract

The application provides a laser radar and camera combined calibration method and device and electronic equipment, relating to the technical field of data processing. The combined calibration method of the laser radar and the camera comprises the following steps: acquiring a first image captured by a camera; identifying a first feature point in the first image, and acquiring a first coordinate of the first feature point; acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to a laser radar; acquiring a second feature point in the source point cloud and a second coordinate of the second feature point according to the template point cloud; and calibrating the laser radar and the camera based on the first coordinate and the second coordinate. Embodiments of the application can solve the problem of unstable calibration results in the prior art and achieve high-precision calibration of the laser radar and the camera.

Description

Laser radar and camera combined calibration method and device and electronic equipment
Technical Field
The application relates to the technical field of data processing, in particular to a laser radar and camera combined calibration method and device and electronic equipment.
Background
Laser radar and camera calibration refers to calibrating and aligning the coordinate systems of the two sensors, the laser radar and the camera; in general, the conversion relationship between the laser radar and the camera is obtained, and accurate registration and fusion of the three-dimensional point cloud and the image are achieved according to this conversion relationship.
In current methods for calibrating the laser radar and the camera, the edge points of the calibration object must be screened manually and the translation parameters measured manually, so these methods have certain limitations and the accuracy of the calibration result is poor.
Disclosure of Invention
The embodiment of the application provides a combined calibration method and device for a laser radar and a camera and electronic equipment.
An embodiment of a first aspect of the present application provides a method for jointly calibrating a laser radar and a camera, including:
acquiring a first image acquired by a camera;
identifying a first feature point in the first image, and acquiring a first coordinate of the first feature point;
acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to a laser radar;
acquiring a second feature point in the source point cloud and a second coordinate of the second feature point according to the template point cloud;
and based on the first coordinate and the second coordinate, calibrating the laser radar and the camera.
In one embodiment of the present application, the identifying the first feature point in the first image includes:
acquiring an associated window of any target pixel point in the first image;
determining a first matrix of the target pixel point according to the pixel values of the pixel points in the association window;
determining a response function of the target pixel point based on the first matrix;
and determining the first characteristic point of the first image according to the response function.
In one embodiment of the present application, the obtaining, according to the template point cloud, a second feature point in the source point cloud includes:
determining a corresponding third feature point in the template point cloud based on the first feature point in the first image;
registering the template point cloud and the source point cloud, and determining a second feature point corresponding to the third feature point in the source point cloud.
In one embodiment of the present application, determining the second coordinates of the second feature point includes:
acquiring a third coordinate of a third feature point in the template point cloud;
and determining the second coordinates of the second feature points according to the third coordinates.
In one embodiment of the present application, the determining the second coordinate of the second feature point according to the third coordinate includes:
determining a transformation matrix between the template point cloud and the source point cloud in response to the template point cloud being registered with the source point cloud;
and transforming the third coordinate of the third feature point based on the transformation matrix to determine the corresponding second coordinate of the second feature point.
In one embodiment of the present application, the calibrating the lidar and the camera based on the first coordinate and the second coordinate includes:
determining an extrinsic matrix of the laser radar based on the first coordinate and the second coordinate;
and calibrating the laser radar and the camera according to the extrinsic matrix.
In one embodiment of the present application, determining the first coordinates of the first feature point includes:
determining an image coordinate system of the first image based on a camera coordinate system of the camera;
and determining the first coordinate of the first feature point according to the position of the first feature point in the image coordinate system.
An embodiment of a second aspect of the present application provides a joint calibration device for a laser radar and a camera, including:
the first acquisition module is used for acquiring a first image acquired by the camera;
the second acquisition module is used for identifying a first feature point in the first image and acquiring a first coordinate of the first feature point;
the third acquisition module is used for acquiring a template point cloud corresponding to the first image and acquiring a source point cloud corresponding to the template point cloud according to a laser radar;
a fourth obtaining module, configured to obtain, according to the template point cloud, a second feature point in the source point cloud, and a second coordinate of the second feature point;
and the calibration module is used for realizing the calibration of the laser radar and the camera based on the first coordinate and the second coordinate.
An embodiment of a third aspect of the present application provides an electronic device, including: a processor; a memory for storing the processor-executable instructions; the processor is configured to execute the instructions to implement the combined calibration method of the laser radar and the camera provided by the embodiment of the first aspect of the application.
An embodiment of a fourth aspect of the present application provides a non-transitory computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the joint calibration method of a laser radar and a camera provided by the embodiment of the first aspect of the present application.
The technical scheme provided by the embodiment of the application at least brings the following beneficial effects:
By performing feature point identification on the first image acquired by the camera, a first feature point and its corresponding coordinate are obtained; a template point cloud corresponding to the first image is acquired, and a source point cloud is obtained from the template point cloud; the coordinate of the feature point in the laser radar coordinate system is determined through the correspondence between the template point cloud and the source point cloud, ensuring the accuracy of the second coordinate. High-precision joint calibration between the laser radar and the camera is then achieved based on the accurate first coordinate and second coordinate, solving the problem of low calibration precision between the laser radar and the camera.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic flow chart of a laser radar and camera joint calibration method according to an embodiment of the present application;
FIG. 2 is a schematic flow chart of another laser radar and camera joint calibration method according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of another laser radar and camera joint calibration method according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of another laser radar and camera joint calibration method according to an embodiment of the present application;
FIG. 5 is a schematic structural diagram of a laser radar and camera joint calibration device according to an embodiment of the present application;
FIG. 6 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of another electronic device according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the embodiments of the present application. Rather, they are merely examples of apparatus and methods consistent with aspects of embodiments of the present application as detailed in the accompanying claims.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the application. As used in this application in the examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in embodiments of the present application to describe various information, the information should not be limited by these terms. These terms are only used to distinguish information of the same type from one another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the embodiments of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the like or similar elements throughout. The embodiments described below by referring to the drawings are exemplary and intended for the purpose of explaining the present application and are not to be construed as limiting the present application.
It should be noted that, the combined calibration method of the laser radar and the camera provided in any embodiment of the present application may be performed alone or in combination with possible implementation methods in other embodiments, and may also be performed in combination with any technical solution in the related art.
The specific steps of the current laser radar and camera calibration are as follows:
(1) Data acquisition: acquire the relevant data of the laser radar and the camera, including three-dimensional point cloud data from the laser radar and image data from the camera; ensure that the laser radar and the camera acquire data at the same physical position, and record the time stamp of each acquisition point together with the correspondence between time stamps and acquisition points.
(2) Setting a calibration mode: before calibration, set an appropriate calibration mode; for example, place a calibration plate or spherical calibration object with known geometry and adjust its position and posture. In a feature-rich scene where enough shared feature points are visible, feature point matching can be used instead.
(3) Feature extraction and matching: extract and match features from the laser radar and camera data. For calibration-plate-based or sphere-based methods, a corresponding image processing algorithm can be used to extract and match the feature points of the specific calibration plate or sphere; for feature point matching methods, a feature descriptor algorithm can be used to identify and match the shared feature points.
(4) Parameter calculation and calibration: according to the matching relationship of the feature points and the known geometric information, calculate the conversion matrix between the laser radar and the camera, including parameters such as rotation, translation and scale, which describe the relative position and orientation between the laser radar and the camera; a calibration algorithm is used to process the data, and the calculation of the conversion matrix is optimized by minimizing the error.
(5) Verification and adjustment: verify and adjust the calibration result, including testing on a new data set to determine the accuracy and stability of the calibration result; the calibration parameters can be adjusted according to the verification result and calibration performed again to obtain a better calibration effect.
It will be appreciated that the above steps are the basic procedure for laser radar and camera calibration, but details may vary with the application scenario, devices and algorithms in the implementation process; for example, more complex calibration tasks may require more advanced algorithms to process and adjust the data to obtain accurate calibration results.
Joint calibration methods for the laser radar and the camera that are based on feature matching or on a calibration tool need to extract point or line features from a calibration object, and therefore place high precision requirements on feature identification. Feature extraction precision, however, is affected by factors such as the laser radar resolution, the feature recognition algorithm and environmental noise: when the laser radar resolution is low or the recognition algorithm is imprecise, the extraction precision of point and line features is low; when environmental noise is high, the point and line features of the scanned point cloud are unstable. As a result, the joint calibration precision between the laser radar and the camera is low.
The following describes a laser radar and camera joint calibration method, a device and electronic equipment according to the embodiments of the present application with reference to the accompanying drawings.
Fig. 1 is a flow chart of a combined calibration method of a laser radar and a camera according to an embodiment of the present application. As shown in fig. 1, the method includes, but is not limited to, the steps of:
S101, acquiring a first image acquired by a camera.
In some embodiments, the first image may be an image acquired by the camera in real time, or may be an image acquired by the camera in history. Alternatively, the first image may be read from the memory of the camera.
It will be appreciated that the first image is an image used to assist the calibration of the camera and the laser radar, and the type of the first image is not limited.
S102, identifying a first feature point in the first image, and acquiring a first coordinate of the first feature point.
Optionally, the first feature point may be a point with a relatively clear feature in the first image, which can reflect the essential feature of the image, identify an object in the image, and so on, so as to perform joint calibration based on the first feature point.
In some embodiments, a corner recognition algorithm may be used to obtain a corner in the first image as the first feature point.
In some embodiments, a specific selection rule may be preset, for example, a point where a gray value changes drastically or a point with a larger curvature on an edge of an image, that is, an intersection point of edges, etc., and the first feature point in the first image is determined by the preset selection rule.
Further, after determining the first feature point in the first image, a first coordinate corresponding to the first feature point is determined. It is understood that the first coordinates refer to coordinates of the first feature point in the first image, so that an image coordinate system corresponding to the first image needs to be determined, and the corresponding first coordinates are determined based on the position of the first feature point in the image coordinate system.
In some embodiments, the camera coordinate system O_C-x_C y_C z_C may be established as a Cartesian space coordinate system. After the camera coordinate system is determined, the corresponding image coordinate system can be derived from it; converting the camera coordinate system to the image coordinate system maps three-dimensional space to two-dimensional space. The relationship between the camera coordinate system and the image coordinate system is a perspective projection that follows the pinhole imaging principle, and the conversion formula can be obtained by solving similar triangles.
It will be appreciated that the final conversion between the camera coordinate system and the image coordinate system is:

x = f * X_c / Z_c,  y = f * Y_c / Z_c

where (X_c, Y_c, Z_c) is the coordinate of any point in the camera coordinate system, (x, y) is the coordinate of the corresponding point in the image coordinate system, Z_c is the value of the point on the Z axis of the camera coordinate system, and f is the distance from the origin O_C of the camera coordinate system to the origin O of the image coordinate system, i.e. the focal length.
After the image coordinate system is determined, according to the position of the first feature point in the image coordinate system, the first coordinate corresponding to the first feature point can be correspondingly determined.
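As a minimal illustration of this projection, the following Python sketch maps a camera-frame point onto the image plane; the function name and the numeric example values are assumptions for illustration, not taken from the application:

```python
import numpy as np

def camera_to_image(p_cam, f):
    """Pinhole projection of a camera-frame point (X_c, Y_c, Z_c):
    x = f * X_c / Z_c, y = f * Y_c / Z_c."""
    X_c, Y_c, Z_c = p_cam
    return np.array([f * X_c / Z_c, f * Y_c / Z_c])

# Hypothetical example: a point 2 m in front of the camera, f = 0.05 m.
print(camera_to_image(np.array([0.4, 0.1, 2.0]), f=0.05))  # [0.01, 0.0025]
```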
S103, acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to the laser radar.
Alternatively, a calibration object template corresponding to the first image may be determined; it may be a checkerboard, a spherical marker, or another geometric template. It can be appreciated that the calibration object template has a correspondence with the first image, so the template feature points in the calibration object template can be determined based on this correspondence.
In the embodiment of the application, a checkerboard is used as the calibration object template, and specific checkerboard corner points in the template are marked; for example, the four outermost corner points are taken as the specific checkerboard corner points. The size of the checkerboard is then determined, and a series of points is generated according to this size to serve as the template point cloud. For example, if the size of the checkerboard is 0.6 m × 0.5 m, a plane of the same size is generated in three-dimensional space to obtain the template point cloud.
It will be appreciated that the template point cloud is determined by the calibration object template corresponding to the first image, so that the template point cloud may include feature point information in the first image, for example, a point corresponding to the first feature point in the first image in the template point cloud may be determined.
Further, a source point cloud corresponding to the template point cloud is obtained according to the laser radar, for example, the laser radar scans the template point cloud to obtain the corresponding source point cloud.
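A template point cloud of this kind can be generated, for example, as a sampled plane; the sketch below assumes the 0.6 m × 0.5 m checkerboard from the example above, with the sampling step and function name as illustrative choices:

```python
import numpy as np

def make_template_cloud(width=0.6, height=0.5, step=0.005):
    """Sample a z = 0 plane of the checkerboard's size as an (N, 3) array,
    serving as the template point cloud in the template's own frame."""
    xs = np.arange(0.0, width + step, step)
    ys = np.arange(0.0, height + step, step)
    gx, gy = np.meshgrid(xs, ys)
    return np.stack([gx.ravel(), gy.ravel(), np.zeros(gx.size)], axis=1)

template = make_template_cloud()
print(template.shape)  # (121 * 101, 3) points on the plane
```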
S104, obtaining a second characteristic point in the source point cloud and a second coordinate of the second characteristic point according to the template point cloud.
It can be understood that the template point cloud is determined by the calibration object template corresponding to the first image, so the template point cloud can include the feature point information of the first image. When the laser radar scans the calibration object to obtain the source point cloud, the feature point information in the template point cloud can be obtained correspondingly, so that the second feature point in the source point cloud is determined. For example, the feature points in the template point cloud can be marked during scanning, and after scanning, the points corresponding to the marked points are taken as the second feature points in the source point cloud.
Further, a laser radar coordinate system, that is, a coordinate system corresponding to the source point cloud is determined, and a second coordinate of the second feature point is determined according to the position of the second feature point in the source point cloud in the laser radar coordinate system.
Alternatively, the extrinsic matrix of the laser radar may be determined based on the first coordinate and the second coordinate, and the calibration of the laser radar and the camera achieved according to the extrinsic matrix.
In some embodiments, to achieve joint calibration between the laser radar and the camera, the laser radar coordinate system may be calibrated against the coordinate system corresponding to the template point cloud. For example, after the coordinate system corresponding to the template point cloud is determined, a mapping relationship matrix between the template point cloud and the source point cloud is determined according to the correspondence between the coordinates of the feature points in the template point cloud and the feature points in the source point cloud; according to this mapping matrix, the second coordinate corresponding to the second feature point in the source point cloud can be obtained from the coordinates of the feature points in the template point cloud, improving the accuracy of the joint calibration between the laser radar and the camera.
S105, calibrating the laser radar and the camera based on the first coordinate and the second coordinate.
After the first coordinate corresponding to the first feature point and the second coordinate corresponding to the second feature point are determined, the conversion relationship between the image coordinate system and the laser radar coordinate system is determined based on the coordinate relationship of the corresponding feature points; combined with the relationship between the image coordinate system and the camera coordinate system, the conversion relationship between the laser radar coordinate system and the camera coordinate system is then determined.
Optionally, after the first coordinate corresponding to the first feature point and the second coordinate corresponding to the second feature point are determined, a PnP (Perspective-n-Point) algorithm may be used to solve the conversion relationship between the laser radar coordinate system and the camera coordinate system. The PnP algorithm solves the 3D-to-2D point correspondence problem: given N 3D space points and their 2D projections (N being a positive integer), it estimates the pose of the camera. The conversion relationship between the laser radar and the camera is thereby obtained, the point cloud coordinates and pixel coordinates corresponding to the laser radar and the camera are converted, the extrinsic matrix of the laser radar is obtained, and high-precision joint calibration of the laser radar and the camera is achieved.
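As a concrete, hedged instance of this step, the sketch below uses OpenCV's solvePnP, a standard PnP solver; it assumes the camera intrinsic matrix K and distortion coefficients dist are already known, and the surrounding function name is an illustrative assumption:

```python
import cv2
import numpy as np

def solve_lidar_camera_extrinsics(obj_pts, img_pts, K, dist):
    """obj_pts: (N, 3) second coordinates in the lidar frame, N >= 4;
    img_pts: (N, 2) first coordinates of the matching image feature points.
    Returns the 3x4 extrinsic matrix [R | t] mapping lidar to camera."""
    ok, rvec, tvec = cv2.solvePnP(
        obj_pts.astype(np.float64), img_pts.astype(np.float64), K, dist)
    if not ok:
        raise RuntimeError("PnP solution failed")
    R, _ = cv2.Rodrigues(rvec)   # rotation vector -> 3x3 rotation matrix
    return np.hstack([R, tvec])
```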
In the embodiment of the application, feature point identification is performed on the first image captured by the camera to obtain the first feature point and its corresponding coordinate; the template point cloud corresponding to the first image is acquired, and the source point cloud is obtained from it; and the coordinate of the feature point in the laser radar coordinate system is determined according to the transformation relationship between the template point cloud and the source point cloud, ensuring the accuracy of coordinate identification. The correspondence between the camera and the laser radar can thus be obtained, high-precision joint calibration between the laser radar and the camera is achieved, and the problem of low calibration precision between the laser radar and the camera is solved.
Fig. 2 is a flow chart of a combined calibration method of a laser radar and a camera according to an embodiment of the present application. As shown in fig. 2, the method includes, but is not limited to, the steps of:
S201, acquiring a first image acquired by a camera.
In this embodiment of the present application, the implementation method of step S201 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S202, acquiring an association window of any target pixel point in the first image.
Optionally, the association window of the target pixel point may be set adaptively, for example as a 3×3 window or a 4×4 window centered on the target pixel point, so that the target pixel point can be analyzed based on the pixel characteristics of the other pixel points in its association window.
S203, determining a first matrix of target pixel points according to the pixel values of the pixel points in the association window.
In the embodiment of the application, the joint calibration between the laser radar and the camera is performed with a checkerboard template, so the corner positions in the checkerboard need to be obtained.
Optionally, to recognize the checkerboard corners, corner detection may be performed with the Harris corner detection algorithm, and the feature points determined from the detected corner points.
In some embodiments, when a window centered at a checkerboard corner position is moved, the gray values produce large gradient changes in multiple directions, so a corresponding first matrix can be obtained according to the gradient changes of the pixels in the association window of each target pixel point.
Optionally, the first matrix is calculated as:

M = sum( w(x, y) * [I(x, y) - avg]^T * [I(x, y) - avg] )

where w(x, y) is a window function, for which a Gaussian function can be used (not limited here); I(x, y) denotes the gray value of the target pixel point; and avg denotes the average gray value of the pixel points in the association window corresponding to the target pixel point.
It can be understood that the gray value of a pixel point is the pixel value determined in the gray image; that is, the first image can be converted into a gray image and analyzed according to the gray values of its pixel points. The method of gray image conversion is well known and is not described here.
S204, determining a response function of the target pixel point based on the first matrix.
After the first matrix corresponding to the pixel point is determined, it is decomposed to obtain its two eigenvalues, denoted the first eigenvalue λ1 and the second eigenvalue λ2.
After the first eigenvalue and the second eigenvalue are determined, the response function corresponding to the target pixel point is obtained based on them. Optionally, the response function may be calculated as:

R = λ1 * λ2 - k * (λ1 + λ2)^2

where k is a constant, which may take the value 0.04 and is not particularly limited.
S205, determining a first characteristic point of the first image according to the response function.
Optionally, after the response function corresponding to each target pixel point is determined, the corner points can be determined from the response values, completing the detection of the checkerboard corner points. It can be understood that Harris corner detection is a well-known technique and is not described in detail in this embodiment.
Further, the first feature point may be determined based on the detected corner points. Alternatively, the outermost corner points may be selected as the first feature points, or the N corner points with the largest corner response values may be selected as the first feature points, where N is a positive integer.
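The Harris response computation and corner selection of steps S202 to S205 can be sketched with OpenCV's built-in Harris detector instead of the hand-written matrix computation above; the function name, window size and the choice of n_points are illustrative assumptions:

```python
import cv2
import numpy as np

def detect_first_feature_points(image_bgr, n_points=4, k=0.04, win=3):
    """Compute a per-pixel Harris response and keep the n_points pixels
    with the largest response values as candidate first feature points."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    response = cv2.cornerHarris(gray, blockSize=win, ksize=3, k=k)
    idx = np.argpartition(response.ravel(), -n_points)[-n_points:]
    ys, xs = np.unravel_index(idx, response.shape)
    return np.stack([xs, ys], axis=1)  # pixel coordinates (u, v)
```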
S206, acquiring first coordinates of the first feature points.
It will be appreciated that after the first feature point is determined, the first coordinate of the first feature point may be determined based on the location of the first feature point in the first image.
Alternatively, an image coordinate system corresponding to the first image may be acquired, and the first coordinates of the first feature point may be determined according to the image coordinate system corresponding to the first image. The conversion may be performed based on the camera coordinate system, and the image coordinate system of the first image may be obtained based on the camera parameters, so as to obtain the first coordinates of the first feature point.
The implementation method of step S206 may be implemented in any manner in each embodiment of the disclosure, which is not limited herein, and is not repeated herein.
S207, acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to the laser radar.
In this embodiment of the present application, the implementation method of step S207 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S208, obtaining a second characteristic point in the source point cloud and a second coordinate of the second characteristic point according to the template point cloud.
In this embodiment of the present application, the implementation method of step S208 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S209, calibrating the laser radar and the camera based on the first coordinate and the second coordinate.
In this embodiment of the present application, the implementation method of step S209 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
In the embodiment of the application, when the first feature points in the first image are determined, a checkerboard template is used and corner points with obvious features are taken as the first feature points, so that subsequent analysis and calibration using the first feature points are more accurate; the analysis method is simple and efficient.
Fig. 3 is a flow chart of a combined calibration method of a laser radar and a camera according to an embodiment of the present application. As shown in fig. 3, the method includes, but is not limited to, the steps of:
S301, acquiring a first image acquired by a camera.
In this embodiment of the present application, the implementation method of step S301 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S302, identifying a first feature point in the first image, and acquiring a first coordinate of the first feature point.
In this embodiment of the present application, the implementation method of step S302 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S303, acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to the laser radar.
In this embodiment of the present application, the implementation method of step S303 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S304, determining a corresponding third feature point in the template point cloud based on the first feature point in the first image.
In some implementations, a calibration object template corresponding to the first image may be set, in which the first feature point of the first image is labeled.
A three-dimensional space of the same size is generated according to the size of the calibration object template, containing a series of points, thereby obtaining the template point cloud corresponding to the calibration object template. It can be understood that, since the first feature point of the first image is marked in the calibration object template, a third feature point corresponding to the first feature point can be obtained in the template point cloud, and subsequent analysis can be performed based on this third feature point.
S305, registering the template point cloud and the source point cloud, and determining a second feature point corresponding to the third feature point in the source point cloud.
Alternatively, an ICP (Iterative Closest Point) registration algorithm may be used to register the template point cloud and the source point cloud; ICP is a classical point cloud matching algorithm with high registration accuracy.
The template point cloud and the source point cloud can be aligned based on the ICP registration algorithm, i.e. the correspondence between the two point clouds is obtained. Considering that the source point cloud scanned by the laser radar contains background unrelated to the checkerboard, which affects ICP registration accuracy and speed, the unrelated background (its noise points, outliers and the like) is removed before ICP registration to improve registration accuracy.
In some embodiments, K-D tree neighbor search may be used to remove the noise points and outliers; in other embodiments, other noise removal methods may be used, which is not limited here.
Further, considering the reverse-matching problem that may occur when registering regular objects, the direction of the point cloud can be adjusted based on its curvature. For example, a K-D tree nearest-neighbor search is used to obtain the neighborhood point set of the farthest point along the principal direction of each of the two point clouds, and curvature is estimated on the neighborhood point set by least-squares tangent plane fitting: a local coordinate system is established at any point of the point cloud, the quadric surface coefficients are solved by least squares, and the first-order and second-order partial derivatives of the quadric surface are computed to obtain the mean curvature at the point. The final transformation matrix is obtained according to the mean curvature values of the corresponding points of the source point cloud and the calibration object template point cloud. The method by which the ICP registration algorithm obtains the transformation matrix is well known and is not limited or detailed here.
In some embodiments, to improve ICP registration efficiency, a good initial registration position may also be determined manually to facilitate rapid ICP registration. A good registration position is, for example, a position corresponding to a certain feature point, or a position with obvious features that is easy to register.
It can be understood that the transformation matrix reflects the transformation relationship between the template point cloud and the source point cloud, and the coordinates of the corresponding points in the source point cloud can be obtained from the coordinates of the points in the template point cloud and the transformation matrix. During ICP registration of the template point cloud and the source point cloud, the correspondence between them is determined, so the second feature point in the source point cloud that matches the third feature point in the template point cloud can be determined.
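One possible shape of this registration step is sketched below using the Open3D library; the preprocessing parameters, distance threshold and function name are illustrative assumptions, and statistical outlier removal stands in here for the K-D-tree-based noise removal described above:

```python
import numpy as np
import open3d as o3d

def register_template_to_source(template_pts, source_pts, threshold=0.05):
    """Remove background noise from the lidar scan, then run point-to-point
    ICP; returns the 4x4 transform mapping the template cloud to the source."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_pts))
    tpl = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(template_pts))
    # Drop points with few neighbors (checkerboard-unrelated background).
    src, _ = src.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)
    result = o3d.pipelines.registration.registration_icp(
        tpl, src, threshold, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation
```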
S306, obtaining second coordinates of the second feature points.
In some embodiments, the second coordinates of the second feature point in the source point cloud may be determined based on the third coordinates of the third feature point in the template point cloud.
Optionally, according to the third coordinate of the third feature point in the template point cloud and the transformation matrix, the second coordinate of the corresponding second feature point in the source point cloud is determined as:

[X, Y, Z]^T = R * [X', Y', Z']^T + T

where [R, T] represents the transformation matrix; (X, Y, Z) is the second coordinate of the second feature point in the source point cloud, i.e. the second coordinate of the second feature point in the laser radar coordinate system; and (X', Y', Z') is the third coordinate of the third feature point in the template point cloud.
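Applying this transformation in code is straightforward; the sketch below (function names are illustrative assumptions) maps a third coordinate into the laser radar frame, either with separate R and T or with the 4x4 homogeneous matrix produced by ICP registration:

```python
import numpy as np

def template_to_lidar(p_template, R, T):
    """[X, Y, Z]^T = R * [X', Y', Z']^T + T."""
    return R @ np.asarray(p_template) + np.asarray(T).ravel()

def template_to_lidar_h(p_template, M):
    """Same mapping with a 4x4 homogeneous transform M (e.g. from ICP)."""
    p = np.append(np.asarray(p_template), 1.0)
    return (M @ p)[:3]
```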
S307, based on the first coordinate and the second coordinate, calibration of the laser radar and the camera is achieved.
In this embodiment of the present application, the implementation method of step S307 may be implemented in any manner in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
In the embodiment of the application, the source point cloud is registered against the template point cloud corresponding to the first image: the third feature point corresponding to the first feature point is obtained in the template point cloud along with its third coordinate, the corresponding second feature point in the source point cloud is determined based on the registration algorithm, and the transformation matrix between the source point cloud and the template point cloud is obtained. With the template point cloud as an intermediary, the second coordinate of the second feature point in the laser radar coordinate system is determined more accurately, and high-precision calibration between the laser radar and the camera is then performed based on the accurate second coordinate and the first coordinate.
Fig. 4 is a flow chart of a combined calibration method of a laser radar and a camera according to an embodiment of the present application. As shown in fig. 4, the method includes, but is not limited to, the steps of:
S401, acquiring a first image acquired by a camera.
In this embodiment of the present application, the implementation method of step S401 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S402, acquiring an association window of any target pixel point in the first image.
In this embodiment of the present application, the implementation method of step S402 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S403, determining a first matrix of target pixel points according to the pixel values of the pixel points in the association window.
In this embodiment of the present application, the implementation method of step S403 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S404, determining a response function of the target pixel point based on the first matrix.
In this embodiment of the present application, the implementation method of step S404 may be implemented by any one of the methods in the embodiments of the present disclosure, which is not limited herein, and is not described herein again.
S405, determining a first characteristic point of the first image according to the response function.
In this embodiment of the present application, the implementation method of step S405 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S406, acquiring first coordinates of the first feature points.
In this embodiment of the present application, the implementation method of step S406 may be implemented by any one of the methods in the embodiments of the present disclosure, which is not limited herein, and is not described herein again.
S407, acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to the laser radar.
In this embodiment of the present application, the implementation method of step S407 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S408, based on the first feature points in the first image, corresponding third feature points in the template point cloud are determined.
In this embodiment of the present application, the implementation method of step S408 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S409, registering the template point cloud and the source point cloud, and determining a second feature point corresponding to the third feature point in the source point cloud.
In this embodiment of the present application, the implementation method of step S409 may be implemented by any one of the methods in the embodiments of the present disclosure, which is not limited herein, and is not described herein again.
S410, obtaining second coordinates of the second feature points.
In this embodiment of the present application, the implementation method of step S410 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
S411, calibrating the laser radar and the camera based on the first coordinate and the second coordinate.
In this embodiment of the present application, the implementation method of step S411 may be implemented by any one of the methods in each embodiment of the present disclosure, which is not limited herein, and is not described herein again.
In the embodiment of the application, feature points are identified in the first image captured by the camera using a checkerboard template and corner detection, and corners with obvious features are taken as the first feature points, making subsequent analysis and calibration using the first feature points more accurate. With the template point cloud as an intermediary, the second coordinate of the second feature point in the laser radar coordinate system is determined more accurately, and high-precision calibration between the laser radar and the camera is then performed based on the accurate second coordinate and the first coordinate, solving the problem of low calibration precision between the laser radar and the camera.
Fig. 5 is a schematic structural diagram of a combined calibration device for a laser radar and a camera according to an embodiment of the present application. As shown in fig. 5, the combined calibration device 500 of the laser radar and the camera includes:
A first acquiring module 501, configured to acquire a first image acquired by a camera;
a second obtaining module 502, configured to identify a first feature point in the first image, and obtain a first coordinate of the first feature point;
a third obtaining module 503, configured to obtain a template point cloud corresponding to the first image, and obtain a source point cloud corresponding to the template point cloud according to the laser radar;
a fourth obtaining module 504, configured to obtain, according to the template point cloud, a second feature point in the source point cloud, and a second coordinate of the second feature point;
and the calibration module 505 is used for realizing the calibration of the laser radar and the camera based on the first coordinate and the second coordinate.
In some implementations, the second acquisition module 502 is further used for:
acquiring an association window of any target pixel point in the first image;
determining a first matrix of the target pixel point according to the pixel values of the pixel points in the association window;
determining a response function of the target pixel point based on the first matrix;
and determining the first feature point of the first image according to the response function.
In some implementations, the fourth acquisition module 504 is further used for:
determining a corresponding third feature point in the template point cloud based on the first feature point in the first image;
and registering the template point cloud and the source point cloud, and determining a second feature point corresponding to the third feature point in the source point cloud.
In some implementations, the fourth acquisition module 504 is further used for:
acquiring a third coordinate of the third feature point in the template point cloud;
and determining the second coordinate of the second feature point according to the third coordinate.
In some implementations, the fourth acquisition module 504 is further used for:
determining a transformation matrix between the template point cloud and the source point cloud in response to the template point cloud being registered with the source point cloud;
and transforming the third coordinate of the third feature point based on the transformation matrix to determine the corresponding second coordinate of the second feature point.
In some implementations, the calibration module 505 is further used for:
determining an extrinsic matrix of the laser radar based on the first coordinate and the second coordinate;
and calibrating the laser radar and the camera according to the extrinsic matrix.
In some implementations, the second acquisition module 502 is further used for:
determining an image coordinate system of the first image based on a camera coordinate system of the camera;
and determining the first coordinate of the first feature point according to the position of the first feature point in the image coordinate system.
In the embodiment of the application, feature point identification is performed on the first image captured by the camera to obtain the first feature point and its corresponding coordinate; the template point cloud corresponding to the first image is acquired, and the source point cloud is obtained from it; and the coordinate of the feature point in the laser radar coordinate system is determined according to the transformation relationship between the template point cloud and the source point cloud, ensuring the accuracy of coordinate identification. The correspondence between the camera and the laser radar can thus be obtained, high-precision joint calibration between the laser radar and the camera is achieved, and the problem of low calibration precision between the laser radar and the camera is solved.
Fig. 6 is a block diagram of an electronic device, according to an example embodiment. The electronic device shown in fig. 6 is only an example and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 6, the electronic device 600 includes a processor 601 that can perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a memory 606 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the electronic device 600. The processor 601, the ROM 602 and the RAM 603 are connected to each other through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: the memory 606, including a hard disk and the like; and a communication section 607, including a network interface card such as a LAN (local area network) card or a modem, which performs communication processing via a network such as the Internet. A drive 608 is also connected to the I/O interface 605 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program embodied on a computer readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from the network through the communication section 607. The above-described functions defined in the combined calibration method of the lidar and the camera of the present application are performed when the computer program is executed by the processor 601.
In an exemplary embodiment, a storage medium is also provided, such as a memory comprising instructions executable by the processor 601 of the electronic device 600 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer-readable storage medium, for example a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Fig. 7 is a block diagram of an electronic device, according to an example embodiment. The electronic device shown in fig. 7 is only an example and should not impose any limitation on the functionality and scope of use of the embodiments of the present application. As shown in fig. 7, the electronic device 700 includes a processor 701 and a memory 702. The memory 702 is used for storing program codes, and the processor 701 is connected with the memory 702 and is used for reading the program codes from the memory 702, so as to implement the combined calibration method of the laser radar and the camera in the embodiment.
Optionally, there may be one or more processors 701.
Optionally, the electronic device may further comprise an interface 703, and there may be a plurality of interfaces 703. The interface 703 may be connected to an application program and may receive data from an external device such as a sensor.
Other embodiments of the application will be apparent to those skilled in the art from consideration of the specification and practice of the application disclosed herein. This application is intended to cover any variations, uses, or adaptations that follow the general principles of the application, including such departures from the present disclosure as come within known or customary practice in the art to which the application pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities described above and shown in the drawings, and that various modifications and changes may be made without departing from its scope. The scope of the application is limited only by the appended claims.

Claims (10)

1. A method for joint calibration of a lidar and a camera, the method comprising:
acquiring a first image acquired by a camera;
identifying a first feature point in the first image, and acquiring a first coordinate of the first feature point;
acquiring a template point cloud corresponding to the first image, and acquiring a source point cloud corresponding to the template point cloud according to a lidar;
acquiring, according to the template point cloud, a second feature point in the source point cloud and a second coordinate of the second feature point;
and calibrating the lidar and the camera based on the first coordinate and the second coordinate.
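As a reading aid, the five steps of claim 1 chain into a short pipeline. The sketch below is a minimal Python outline under stated assumptions: the helper names (first_feature_points, second_feature_points, extrinsic_matrix) are hypothetical and correspond to the sketches given under claims 2, 3, and 6 below, and an index-wise pairing between the 2D and 3D point sets is assumed rather than recited in the claim.

```python
# Hedged end-to-end outline of the method of claim 1. All helper names
# are hypothetical (see the sketches under claims 2, 3, and 6); an
# index-wise 2D-3D correspondence between the point sets is assumed.
def joint_calibrate(image_gray, template_pts, source_pts, third_pts, K):
    first = first_feature_points(image_gray)                   # claim 2: 2D pixels
    second, _ = second_feature_points(template_pts,
                                      source_pts, third_pts)   # claims 3-5: 3D points
    return extrinsic_matrix(first, second, K)                  # claim 6: extrinsics
```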
2. The method of claim 1, wherein the identifying the first feature point in the first image comprises:
acquiring an association window of any target pixel point in the first image;
determining a first matrix of the target pixel point according to the pixel values of the pixel points in the association window;
determining a response function of the target pixel point based on the first matrix;
and determining the first feature point of the first image according to the response function.
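The window, matrix, and response structure of claim 2 matches a Harris-style corner detector. The sketch below reads it that way, assuming the first matrix is the 2x2 gradient structure tensor accumulated over the association window and the response function is the Harris score det(M) - k * trace(M)^2; the claim does not name a specific detector, so this reading is an assumption.

```python
# Harris-style reading of claim 2 (an assumption; the patent does not
# name the detector). The "first matrix" is taken as the structure
# tensor M summed over the association window, and the "response
# function" as det(M) - k * trace(M)^2.
import cv2
import numpy as np

def first_feature_points(gray, window=5, k=0.04, quantile=0.99):
    gray = np.float32(gray)
    # Image gradients
    ix = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    iy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    # Structure-tensor entries, summed over each pixel's window
    sxx = cv2.boxFilter(ix * ix, -1, (window, window))
    syy = cv2.boxFilter(iy * iy, -1, (window, window))
    sxy = cv2.boxFilter(ix * iy, -1, (window, window))
    # Harris response per pixel
    response = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
    # Keep the strongest responses as first feature points
    ys, xs = np.where(response > np.quantile(response, quantile))
    return np.stack([xs, ys], axis=1).astype(np.float64)  # first coordinates (u, v)
```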
3. The method of claim 2, wherein the acquiring the second feature point in the source point cloud according to the template point cloud comprises:
determining a corresponding third feature point in the template point cloud based on the first feature point in the first image;
registering the template point cloud with the source point cloud, and determining, in the source point cloud, a second feature point corresponding to the third feature point.
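Claim 3 registers the template point cloud onto the source point cloud and carries each third feature point across to its counterpart. The sketch below uses Open3D point-to-point ICP for the registration and a nearest-neighbour lookup to snap the transformed point to a measured lidar point; neither ICP nor Open3D is mandated by the claim, so both are illustrative choices.

```python
# Sketch of claim 3 with ICP registration (Open3D); the registration
# algorithm is an assumption, not specified by the patent.
import numpy as np
import open3d as o3d

def second_feature_points(template_pts, source_pts, third_pts,
                          max_corr_dist=0.05):
    template = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(template_pts))
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(source_pts))
    # Align the template cloud onto the lidar (source) cloud
    result = o3d.pipelines.registration.registration_icp(
        template, source, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    T = result.transformation  # 4x4 template-to-source transform
    # Map each third feature point across and snap to the nearest lidar point
    tree = o3d.geometry.KDTreeFlann(source)
    second = []
    for p in np.asarray(third_pts):
        q = (T @ np.append(p, 1.0))[:3]
        _, idx, _ = tree.search_knn_vector_3d(q, 1)
        second.append(np.asarray(source_pts)[idx[0]])
    return np.asarray(second), T
```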
4. The method of claim 3, wherein the acquiring the second coordinate of the second feature point comprises:
acquiring a third coordinate of the third feature point in the template point cloud;
and determining the second coordinate of the second feature point according to the third coordinate.
5. The method of claim 4, wherein the determining the second coordinate of the second feature point according to the third coordinate comprises:
determining a transformation matrix between the template point cloud and the source point cloud when the template point cloud is aligned with the source point cloud;
and transforming the third coordinate of the third feature point based on the transformation matrix to determine the second coordinate of the second feature point.
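Claim 5 amounts to applying, to each third coordinate, the 4x4 homogeneous transformation matrix recovered when the clouds align: the second coordinate is the first three components of T [x, y, z, 1]^T. A minimal numpy sketch:

```python
# Claim 5 as a homogeneous-coordinate transform: T is the 4x4
# template-to-source matrix from registration, third_coords is (N, 3).
import numpy as np

def transform_third_coordinates(T, third_coords):
    pts = np.asarray(third_coords, dtype=np.float64)   # (N, 3)
    homo = np.hstack([pts, np.ones((len(pts), 1))])    # (N, 4) homogeneous
    return (homo @ T.T)[:, :3]                         # second coordinates
```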
6. The method according to any one of claims 1 to 5, wherein the calibrating the lidar and the camera based on the first coordinate and the second coordinate comprises:
determining an extrinsic matrix of the lidar based on the first coordinate and the second coordinate;
and calibrating the lidar and the camera according to the extrinsic matrix.
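Estimating an extrinsic matrix from paired 2D image coordinates and 3D lidar coordinates, as in claim 6, is the classical perspective-n-point (PnP) problem. The sketch below uses OpenCV's solvePnP as one standard solver; the claim does not name a particular method, so this choice is an assumption.

```python
# PnP reading of claim 6 (solver choice is an assumption): recover the
# rotation and translation that map lidar coordinates into the camera
# frame from >= 4 paired 2D/3D feature points.
import cv2
import numpy as np

def extrinsic_matrix(first_coords, second_coords, K, dist=None):
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(second_coords, dtype=np.float64),  # 3D second coordinates
        np.asarray(first_coords, dtype=np.float64),   # 2D first coordinates
        K, dist)
    if not ok:
        raise RuntimeError("PnP estimation failed")
    R, _ = cv2.Rodrigues(rvec)  # rotation vector -> 3x3 rotation matrix
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T  # lidar-to-camera extrinsic matrix
```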
7. The method of claim 2, wherein the acquiring the first coordinate of the first feature point comprises:
determining an image coordinate system of the first image based on a camera coordinate system of the camera;
and determining the first coordinate of the first feature point according to the position of the first feature point in the image coordinate system.
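Claim 7 ties the first coordinate to the image coordinate system induced by the camera coordinate system. Under a standard pinhole model (assumed here, not recited in the claim), a camera-frame point (X, Y, Z) projects to pixel coordinates u = fx * X / Z + cx and v = fy * Y / Z + cy, i.e. the intrinsic matrix is applied and the result divided by depth:

```python
# Pinhole projection from the camera coordinate system to the image
# coordinate system (standard model, assumed rather than recited).
import numpy as np

def project_to_image(K, points_cam):
    pts = np.asarray(points_cam, dtype=np.float64)  # (N, 3), camera frame
    uv = (K @ pts.T).T                              # apply intrinsics K (3x3)
    return uv[:, :2] / uv[:, 2:3]                   # first coordinates (u, v)
```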
8. A device for joint calibration of a lidar and a camera, comprising:
a first acquisition module, configured to acquire a first image acquired by the camera;
a second acquisition module, configured to identify a first feature point in the first image and acquire a first coordinate of the first feature point;
a third acquisition module, configured to acquire a template point cloud corresponding to the first image and to acquire, according to a lidar, a source point cloud corresponding to the template point cloud;
a fourth acquisition module, configured to acquire, according to the template point cloud, a second feature point in the source point cloud and a second coordinate of the second feature point;
and a calibration module, configured to calibrate the lidar and the camera based on the first coordinate and the second coordinate.
9. An electronic device, comprising:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1 to 7.
10. A non-transitory computer readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any one of claims 1 to 7.
CN202311338740.1A (filed 2023-10-16, priority 2023-10-16): Laser radar and camera combined calibration method and device and electronic equipment. Legal status: Pending.

Publications (1)

Publication Number: CN117422772A (en); Publication Date: 2024-01-19; Country: CN; Family ID: 89525752


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination