CN112946609B - Calibration method, device and equipment for laser radar and camera and readable storage medium - Google Patents

Calibration method, device and equipment for laser radar and camera and readable storage medium

Info

Publication number
CN112946609B
CN112946609B (application CN202110140667.1A)
Authority
CN
China
Prior art keywords
point cloud
cloud data
image
matrix
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110140667.1A
Other languages
Chinese (zh)
Other versions
CN112946609A (en)
Inventor
姚艳南
黄晓延
吕吉亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Automotive Technology and Research Center Co Ltd
Automotive Data of China Tianjin Co Ltd
Original Assignee
China Automotive Technology and Research Center Co Ltd
Automotive Data of China Tianjin Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Automotive Technology and Research Center Co Ltd, Automotive Data of China Tianjin Co Ltd filed Critical China Automotive Technology and Research Center Co Ltd
Priority to CN202110140667.1A priority Critical patent/CN112946609B/en
Publication of CN112946609A publication Critical patent/CN112946609A/en
Application granted granted Critical
Publication of CN112946609B publication Critical patent/CN112946609B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Abstract

The embodiment of the invention discloses a method, a device and equipment for calibrating a laser radar and a camera and a readable storage medium, and relates to the technical field of intelligent networked automobile and equipment calibration. The method comprises the following steps: acquiring point cloud data obtained by scanning the calibration plate by a laser radar and an image obtained by shooting the calibration plate by a camera; performing initial calibration according to pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix; mapping the point cloud data into an image coordinate system according to the internal reference matrix, the rotation matrix and the translation matrix of the camera to obtain two-dimensional point cloud data; according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image, adjusting corresponding parameters of a rotation matrix and a translation matrix; the geometric variation information includes at least one of an amount of trapezoidal deformation, an angle of rotation, an amount of enlargement/reduction, and an amount of offset. The embodiment of the invention realizes the joint calibration of the laser radar and the camera by an efficient and accurate method.

Description

Calibration method, device and equipment for laser radar and camera and readable storage medium
Technical Field
The embodiment of the invention relates to intelligent networked automobile and equipment calibration technology, and in particular to a laser radar and camera calibration method, device, equipment and readable storage medium.
Background
Comprehensive environment perception technology is the basis of intelligent networked automobile technology: sensors acquire information such as roads, obstacles, and the positions of vehicles and pedestrians around the automobile and transmit this information to the vehicle decision control center, providing data support for the decision behaviors of the intelligent networked automobile. It is therefore an indispensable "eye" of the intelligent networked automobile.
The laser radar and the camera are the two core sensors of the intelligent networked automobile perception system, and fusing laser radar data with camera data to achieve accurate spatial object positioning and tracking has become a research hotspot of current intelligent networked automobile technology. The prerequisite for fusing laser radar and camera data is joint spatial calibration of the laser radar and the camera, that is, achieving spatial synchronization of the data acquired by the two sensors. Existing joint calibration methods suffer from problems such as restrictions on the brands of radar and camera that can be used, closed-source algorithms, and complicated operating procedures, and are therefore not user-friendly.
Disclosure of Invention
The embodiment of the invention provides a calibration method, device and equipment for a laser radar and a camera and a readable storage medium, so that joint calibration of the laser radar and the camera is realized by an efficient and accurate method.
In a first aspect, an embodiment of the present invention provides a method for calibrating a laser radar and a camera, including:
acquiring point cloud data obtained by scanning a calibration plate by a laser radar and an image obtained by shooting the calibration plate by a camera;
preliminarily calibrating the laser radar and the camera according to the pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of a laser radar coordinate system relative to a camera coordinate system;
according to the internal reference matrix, the rotation matrix and the translation matrix of the camera, mapping the point cloud data under the laser radar coordinate system to an image coordinate system to obtain two-dimensional point cloud data;
adjusting corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
wherein the geometric variation information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount.
In a second aspect, an embodiment of the present invention further provides a calibration apparatus for a laser radar and a camera, including:
the acquisition module is used for acquiring point cloud data obtained by scanning a calibration plate by a laser radar and an image obtained by shooting the calibration plate by a camera;
the preliminary calibration module is used for preliminarily calibrating the laser radar and the camera according to the pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of a laser radar coordinate system relative to the camera coordinate system;
the mapping module is used for mapping the point cloud data under the laser radar coordinate system into an image coordinate system according to the internal reference matrix, the rotation matrix and the translation matrix of the camera to obtain two-dimensional point cloud data;
the adjusting module is used for adjusting corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
wherein the geometric variation information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount.
In a third aspect, an embodiment of the present invention further provides an electronic device, where the electronic device includes:
one or more processors;
a memory for storing one or more programs,
when the one or more programs are executed by the one or more processors, the one or more processors implement the lidar and camera calibration method of any embodiment.
In a fourth aspect, an embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the calibration method for a laser radar and a camera according to any embodiment.
According to the method, firstly, preliminary calibration is carried out according to pose information of a laser radar and a camera, and then corresponding parameters of a rotation matrix and a translation matrix are adjusted to carry out finer secondary calibration according to geometric change information of two-dimensional point cloud data relative to a calibration plate in an image, so that the combined calibration of the laser radar and the camera is efficiently, accurately and simply realized through a two-step calibration method; in addition, in this embodiment, according to the geometric change information, the corresponding parameters of the rotation matrix and the translation matrix are adjusted, that is, the parameters causing the geometric change are specifically adjusted according to the geometric change information, so as to further improve the calibration efficiency and accuracy.
Drawings
Fig. 1a is a flowchart of a calibration method for a laser radar and a camera according to an embodiment of the present invention;
FIG. 1b is a schematic diagram of a coordinate system of a laser radar and a camera according to an embodiment of the present invention;
fig. 2a is a flowchart of another calibration method for a lidar and a camera according to an embodiment of the present invention;
FIG. 2b is a diagram illustrating the correspondence between geometric variation information and parameters provided by an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a calibration apparatus for a laser radar and a camera according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting of the invention. It should be further noted that, for the convenience of description, only some of the structures related to the present invention are shown in the drawings, not all of the structures.
The embodiment of the invention provides a calibration method of a laser radar and a camera, a flow chart of which is shown in fig. 1a, and the calibration method can be suitable for the condition of carrying out combined calibration on the laser radar and the camera through a calibration plate. The method may be performed by a calibration arrangement of a lidar and a camera, which may be constituted by software and/or hardware, and is typically integrated in an electronic device.
With reference to fig. 1a, the method provided in this embodiment specifically includes:
s110, point cloud data obtained by scanning the calibration plate by the laser radar and an image obtained by shooting the calibration plate by the camera are obtained.
The laser radar and the camera are installed at fixed positions, and the calibration plate (such as a checkerboard calibration plate) is arranged in an area that can be fully detected by both the camera and the laser radar, while the test scene is kept open and unobstructed. The calibration plate is scanned by the laser radar to obtain point cloud data and photographed by the camera under the same pose to obtain an image, so that point cloud data and an image of the calibration plate in the same pose are obtained.
And S120, preliminarily calibrating the laser radar and the camera according to the pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of a laser radar coordinate system relative to a camera coordinate system.
The pose information of the laser radar and the camera includes the position information and attitude information of the laser radar. Optionally, a laser radar coordinate system and a camera coordinate system are established in advance, as shown in fig. 1b. The laser radar coordinate system is a right-handed coordinate system: its origin is located at the center of the transmitter inside the laser radar, the X axis and the Y axis are respectively parallel to two sides of the base, the Z axis lies on the rotation axis with upward as the positive direction, and the positive direction of the Y axis is the forward direction of the transmitter inside the laser radar. The camera coordinate system is also a right-handed coordinate system: its origin is located at the optical center of the camera; the y axis passes through the optical center perpendicular to the camera imaging plane, with the direction toward the calibration plate as positive; the x axis passes through the optical center perpendicular to the y axis, lies in the horizontal direction of the imaging plane, with rightward (facing the calibration plate) as positive; and the z axis passes through the optical center perpendicular to the x axis and the y axis, with upward (facing the calibration plate) as positive.
Based on the camera coordinate system, the translations t_x, t_y, t_z of the three axes X, Y, Z of the laser radar coordinate system relative to the three axes x, y, z of the camera coordinate system are roughly measured with an ordinary measuring ruler, giving the translation matrix T of the laser radar coordinate system relative to the camera coordinate system, as shown in formula (1). Based on the camera coordinate system, the rotation angles θ1, θ2, θ3 of the three axes X, Y, Z of the laser radar coordinate system are roughly measured with an inclinometer, giving the rotation matrix R of the laser radar coordinate system relative to the camera coordinate system, as shown in formula (2).
T = [t_x, t_y, t_z]^T    (1)
R = R(θ1, θ2, θ3), the 3×3 rotation matrix formed from the three measured rotation angles    (2)
Because the accuracy of the pose information is limited, this step may be referred to as the rough calibration stage of the rigid-body transformation matrix.
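The following sketch illustrates, in Python, how the coarse T and R of formulas (1) and (2) could be assembled from the hand-measured values; it is an illustrative assumption rather than the patented implementation, and in particular the composition order of the elementary rotations is assumed, since the patent only states that R is formed from the three measured angles.

```python
import numpy as np

def coarse_extrinsics(tx, ty, tz, theta1, theta2, theta3):
    """tx, ty, tz: translations of the lidar origin measured along the camera x, y, z axes.
    theta1, theta2, theta3: rotation angles of the lidar X, Y, Z axes measured with an inclinometer (rad)."""
    T = np.array([[tx], [ty], [tz]])                       # formula (1), 3x1 translation matrix

    c1, s1 = np.cos(theta1), np.sin(theta1)
    c2, s2 = np.cos(theta2), np.sin(theta2)
    c3, s3 = np.cos(theta3), np.sin(theta3)
    Rx = np.array([[1, 0, 0], [0, c1, -s1], [0, s1, c1]])  # rotation about the X axis
    Ry = np.array([[c2, 0, s2], [0, 1, 0], [-s2, 0, c2]])  # rotation about the Y axis
    Rz = np.array([[c3, -s3, 0], [s3, c3, 0], [0, 0, 1]])  # rotation about the Z axis
    R = Rz @ Ry @ Rx                                       # formula (2); composition order is assumed
    return R, T
```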
And S130, mapping the point cloud data under the laser radar coordinate system to an image coordinate system according to the internal reference matrix, the rotation matrix and the translation matrix of the camera to obtain two-dimensional point cloud data.
Optionally, before S130, camera internal reference calibration is performed by using the calibration plate and the Zhang Zhengyou (Zhang's) calibration method, so as to obtain the internal reference matrix F of the camera.
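As a hedged illustration of this intrinsic calibration step, the sketch below uses OpenCV's implementation of Zhang's method on checkerboard images; the board size and square size are placeholder assumptions, not values taken from the patent.

```python
import cv2
import numpy as np

def calibrate_intrinsics(image_paths, board_size=(8, 6), square_size=0.05):
    # Planar checkerboard corner coordinates in the board's own coordinate system
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_size

    obj_points, img_points, image_size = [], [], None
    for path in image_paths:
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # F is the 3x3 internal reference (intrinsic) matrix; dist holds the lens distortion coefficients
    _, F, dist, _, _ = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
    return F, dist
```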
At S130, the point cloud data is mapped to the image coordinate system of the camera according to the camera internal reference matrix F, the translation matrix T from the lidar coordinate system to the camera coordinate system obtained through rough measurement, and the rotation matrix R from the lidar coordinate system to the camera coordinate system obtained through rough measurement, so as to obtain two-dimensional point cloud data, as shown in formula (3).
Y_C · [u, v, 1]^T = F · (R · [X_L, Y_L, Z_L]^T + T)    (3)
where (X_L, Y_L, Z_L) are the point cloud coordinates of the calibration plate in the laser radar coordinate system, Y_C is the y-axis coordinate of the calibration plate in the camera coordinate system, and (u, v) are the mapped coordinates of the point cloud data in the pixel coordinate system.
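A minimal sketch of this mapping step is given below; it implements formula (3) as reconstructed above, with the camera y axis acting as the depth (optical) axis, and it assumes that the layout of F is compatible with that convention.

```python
import numpy as np

def project_points(points_lidar, F, R, T):
    """points_lidar: (N, 3) array of (X_L, Y_L, Z_L) lidar coordinates.
    F: 3x3 internal reference matrix, R: 3x3 rotation matrix, T: 3x1 translation matrix.
    Returns an (N, 2) array of mapped pixel coordinates (u, v)."""
    pts_cam = R @ points_lidar.T + T     # 3 x N points in the camera coordinate system
    Y_C = pts_cam[1, :]                  # depth along the camera y axis (toward the calibration plate)
    uvw = F @ pts_cam                    # apply the internal reference matrix F
    uv = uvw[:2, :] / Y_C                # formula (3): normalize by Y_C
    return uv.T
```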
S140, adjusting corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image.
Wherein the geometric change information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount. Since not all parameters of the translation matrix and the rotation matrix are necessarily inaccurate (some parameters may already be accurate), the geometric change information may include only one of the trapezoidal deformation amount, the rotation angle, the enlargement/reduction amount and the offset amount, or a combination of several of them.
Through extensive research and experiments, the inventors discovered that deviations in different parameters of the rotation matrix and the translation matrix cause different kinds of geometric change information. Based on this correspondence, the parameters corresponding to the observed geometric change information can be adjusted in a targeted manner until the geometric change information disappears, thereby obtaining a high-precision rotation matrix and translation matrix.
According to the method, firstly, preliminary calibration is carried out according to pose information of a laser radar and a camera, and then corresponding parameters of a rotation matrix and a translation matrix are adjusted to carry out finer secondary calibration according to geometric change information of two-dimensional point cloud data relative to a calibration plate in an image, so that the combined calibration of the laser radar and the camera is efficiently, accurately and simply realized through a two-step calibration method; in addition, in this embodiment, according to the geometric change information, the corresponding parameters of the rotation matrix and the translation matrix are adjusted, that is, the parameters causing the geometric change are specifically adjusted according to the geometric change information, so as to further improve the calibration efficiency and accuracy.
Fig. 2a is a flowchart of another calibration method for a laser radar and a camera according to an embodiment of the present invention, where the refinement of the parameter adjustment process includes the following operations:
s210, point cloud data obtained by scanning the calibration plate by the laser radar and an image obtained by shooting the calibration plate by the camera are obtained.
S220, preliminarily calibrating the laser radar and the camera according to the pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of a laser radar coordinate system relative to a camera coordinate system.
And S230, mapping the point cloud data under the laser radar coordinate system to an image coordinate system according to the internal reference matrix, the rotation matrix and the translation matrix of the camera to obtain two-dimensional point cloud data.
S240, selecting one unprocessed target geometric change information from the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image according to a set sequence.
In conjunction with fig. 2b, the inventors found the following rules: 1) when the rotation angle of the Y axis in the rotation matrix is too large, the two-dimensional point cloud data shows a clockwise rotation angle relative to the calibration plate in the image; when the rotation angle of the Y axis is too small, it shows a counterclockwise rotation angle relative to the calibration plate in the image. 2) When the rotation angle of the X axis in the rotation matrix is too large, the two-dimensional point cloud data shows, relative to the calibration plate in the image, a trapezoidal deformation that is wide at the top and narrow at the bottom together with an overall upward translation; when the rotation angle of the X axis is too small, it shows a trapezoidal deformation that is narrow at the top and wide at the bottom together with an overall downward translation. 3) When the rotation angle of the Z axis in the rotation matrix is too large, the two-dimensional point cloud data shows, relative to the calibration plate in the image, a trapezoidal deformation that is wide on the left and narrow on the right together with an overall leftward offset; when the rotation angle of the Z axis is too small, it shows a trapezoidal deformation that is narrow on the left and wide on the right together with an overall rightward offset. 4) When the translation amount of the X axis in the translation matrix is too large, the two-dimensional point cloud data shows an overall rightward offset relative to the calibration plate in the image; when the translation amount of the X axis is too small, it shows an overall leftward offset. 5) When the translation amount of the Z axis in the translation matrix is too large, the two-dimensional point cloud data shows an overall upward translation relative to the calibration plate in the image; when the translation amount of the Z axis is too small, it shows an overall downward translation. 6) When the translation amount of the Y axis in the translation matrix is too large, the two-dimensional point cloud data appears reduced overall relative to the calibration plate in the image; when the translation amount of the Y axis is too small, it appears enlarged overall.
Analysis of the above rules shows that an overall translation can occur at the same time as a trapezoidal deformation, and that an overall translation can also be caused by too large or too small translation amounts of the X axis and the Z axis in the translation matrix; therefore, the trapezoidal deformation is adjusted first, and the translation amounts of the X axis and the Z axis are adjusted afterwards. The trapezoidal deformation also affects the enlargement/reduction of the two-dimensional point cloud data, so the enlargement/reduction amount should be adjusted after the trapezoidal deformation has been adjusted. Because the enlargement/reduction and the offset do not influence each other, their adjustment order is not limited. Because the rotation is centrally symmetric, the trapezoidal deformation, the offset and the enlargement/reduction do not affect the rotation angle, so the rotation angle can be adjusted before the trapezoidal deformation.
Based on the above analysis, the set order includes: rotation angle, trapezoidal deformation amount, offset amount, enlargement/reduction amount; or rotation angle, trapezoidal deformation amount, enlargement/reduction amount, offset amount. Specifically, if the geometric change information includes the trapezoidal deformation amount, the rotation angle, the enlargement/reduction amount and the offset amount, and none of them has been processed, the rotation angle is selected first as the target geometric change information; after the rotation angle has been eliminated by parameter adjustment, the trapezoidal deformation amount is selected as the target geometric change information. After the trapezoidal deformation has been eliminated by parameter adjustment, the offset amount is selected as the target geometric change information; finally, after the offset has been eliminated by parameter adjustment, the enlargement/reduction amount is selected as the target geometric change information and is likewise eliminated by parameter adjustment, so that the position, shape and size of the two-dimensional point cloud data completely match those of the calibration plate in the image.
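The loop below sketches this ordered, re-project-and-check refinement in Python. The names, the tolerance and the two callback functions are illustrative assumptions (the patent does not define such an interface), and project_points refers to the mapping sketch given earlier.

```python
# Assumed processing order of the geometric change types, following the set order above
ADJUSTMENT_ORDER = ["rotation_angle", "trapezoidal_deformation", "offset", "scale"]

def two_step_refine(points_lidar, image, F, R, T, measure_change, adjust_parameter):
    """measure_change(kind, uv, image) -> remaining magnitude of that change (0 when it has disappeared).
    adjust_parameter(kind, change, R, T) -> updated (R, T). Both callbacks are supplied by the caller."""
    for kind in ADJUSTMENT_ORDER:
        while True:
            uv = project_points(points_lidar, F, R, T)   # re-map with the current R, T
            change = measure_change(kind, uv, image)
            if abs(change) < 1e-3:                       # tolerance is an assumed value
                break
            R, T = adjust_parameter(kind, change, R, T)
    return R, T
```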
S250, adjusting parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image.
Optionally, S250 includes one or a combination of several of the following four alternative embodiments.
A first alternative embodiment: and adjusting the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix according to the rotation angle of the two-dimensional point cloud data relative to the calibration plate in the image.
If the two-dimensional point cloud data has a clockwise rotation angle relative to the calibration plate in the image, reducing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix; and if the two-dimensional point cloud data has a counterclockwise rotation angle relative to the calibration plate in the image, increasing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix.
Second alternative embodiment: and adjusting the rotation angle of the X axis and/or the Z axis in the rotation matrix according to the trapezoidal deformation amount of the two-dimensional point cloud data relative to the calibration plate in the image.
If the two-dimensional point cloud data has trapezoidal deformation with a wide top and narrow bottom relative to the calibration plate in the image, reducing the rotation angle of the X axis in the rotation matrix; if the two-dimensional point cloud data has trapezoidal deformation with a narrow top and wide bottom relative to the calibration plate in the image, increasing the rotation angle of the X axis in the rotation matrix; if the two-dimensional point cloud data has trapezoidal deformation with a wide left and narrow right relative to the calibration plate in the image, reducing the rotation angle of the Z axis in the rotation matrix; and if the two-dimensional point cloud data has trapezoidal deformation with a narrow left and wide right relative to the calibration plate in the image, increasing the rotation angle of the Z axis in the rotation matrix.
The rotation angle of the X axis in the rotation matrix may be adjusted first and then the rotation angle of the Z axis; alternatively, the rotation angle of the Z axis may be adjusted first and then the rotation angle of the X axis.
A third alternative embodiment: and adjusting the translation amount of the X axis and/or the Z axis in the translation matrix according to the offset of the two-dimensional point cloud data relative to the calibration plate in the image.
If the two-dimensional point cloud data has a rightward offset relative to the calibration plate in the image, reducing the translation amount of the X axis in the translation matrix; if the two-dimensional point cloud data has a leftward offset relative to the calibration plate in the image, increasing the translation amount of the X axis in the translation matrix; if the two-dimensional point cloud data has an upward offset relative to the calibration plate in the image, reducing the translation amount of the Z axis in the translation matrix; and if the two-dimensional point cloud data has a downward offset relative to the calibration plate in the image, increasing the translation amount of the Z axis in the translation matrix.
The translation amount of the X axis in the translation matrix may be adjusted first and then the translation amount of the Z axis; alternatively, the translation amount of the Z axis may be adjusted first and then the translation amount of the X axis.
A fourth alternative embodiment: and adjusting the translation amount of the Y axis in the translation matrix according to the enlargement/reduction amount of the two-dimensional point cloud data relative to the calibration plate in the image.
If the two-dimensional point cloud data has an amplification amount relative to the calibration plate in the image, increasing the translation amount of the Y axis in the translation matrix; and if the two-dimensional point cloud data has a reduction amount relative to the calibration plate in the image, reducing the translation amount of the Y axis in the translation matrix.
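The four adjustment rules above can be summarized as a lookup from the observed geometric change to the parameter to correct and the sign of the correction, as in the hedged sketch below; all names and the step handling are illustrative assumptions, and indices 0/1/2 stand for the X/Y/Z components.

```python
# (kind of change, observed direction) -> (which matrix, axis index, sign of the correction)
CORRECTION_RULES = {
    ("rotation", "clockwise"):        ("R", 1, -1),  # Y-axis rotation angle too large -> decrease
    ("rotation", "counterclockwise"): ("R", 1, +1),
    ("trapezoid", "top_wide"):        ("R", 0, -1),  # X-axis rotation angle too large -> decrease
    ("trapezoid", "top_narrow"):      ("R", 0, +1),
    ("trapezoid", "left_wide"):       ("R", 2, -1),  # Z-axis rotation angle too large -> decrease
    ("trapezoid", "left_narrow"):     ("R", 2, +1),
    ("offset", "right"):              ("T", 0, -1),  # X translation too large -> decrease
    ("offset", "left"):               ("T", 0, +1),
    ("offset", "up"):                 ("T", 2, -1),  # Z translation too large -> decrease
    ("offset", "down"):               ("T", 2, +1),
    ("scale", "enlarged"):            ("T", 1, +1),  # Y translation too small -> increase
    ("scale", "reduced"):             ("T", 1, -1),
}

def apply_correction(kind, direction, angles, trans, step):
    """angles: [theta_x, theta_y, theta_z]; trans: [t_x, t_y, t_z]; step: the set step length."""
    matrix, idx, sign = CORRECTION_RULES[(kind, direction)]
    if matrix == "R":
        angles[idx] += sign * step   # R is rebuilt from the adjusted angles afterwards
    else:
        trans[idx] += sign * step
    return angles, trans
```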
S260, judging whether unprocessed geometric change information exists or not, and jumping to S240 if the unprocessed geometric change information exists; if not, jump to S270.
And S270, ending the operation.
The embodiment sequentially adjusts the parameters corresponding to various geometric change information according to the set sequence, thereby completely and accurately eliminating the geometric change information caused by each parameter, and the method is simple and effective.
In the above embodiments, adjusting, according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image, the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix includes: adjusting the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to a set step length; judging whether the target geometric change information has disappeared; and if it has not disappeared, returning to the adjustment operation of the parameters corresponding to the target geometric change information until the target geometric change information disappears.
Optionally, the set step length may be 0.1 or 0.2, and may be determined based on the accuracy required for the calibration.
When judging whether the target geometric change information has disappeared, contour recognition and contour line fitting are performed on the two-dimensional point cloud data to determine the position, shape and size of the two-dimensional point cloud data. At the same time, the position, shape and size of the calibration plate in the image are identified. Whether the target geometric change information has disappeared is then determined by comparing the position, shape and size of the two.
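One way such a disappearance check could be realized is sketched below: fit a bounding rectangle to the projected two-dimensional point cloud and to the calibration plate detected in the image, then compare position, size and orientation. The use of cv2.minAreaRect and the tolerance values are our own assumptions, not prescribed by the patent.

```python
import cv2
import numpy as np

def change_has_disappeared(uv, plate_corners_px, pos_tol=3.0, size_tol=0.02, angle_tol=0.5):
    """uv: (N, 2) projected point cloud pixels; plate_corners_px: (M, 2) calibration plate contour pixels."""
    cloud_rect = cv2.minAreaRect(uv.astype(np.float32))               # ((cx, cy), (w, h), angle)
    plate_rect = cv2.minAreaRect(plate_corners_px.astype(np.float32))
    d_center = np.hypot(cloud_rect[0][0] - plate_rect[0][0],
                        cloud_rect[0][1] - plate_rect[0][1])          # position difference (px)
    plate_area = plate_rect[1][0] * plate_rect[1][1]
    d_size = abs(cloud_rect[1][0] * cloud_rect[1][1] - plate_area) / plate_area  # relative size difference
    d_angle = abs(cloud_rect[2] - plate_rect[2])                      # orientation difference (deg)
    return d_center < pos_tol and d_size < size_tol and d_angle < angle_tol
```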
It should be noted that, in each of the above embodiments, before the corresponding parameters of the rotation matrix and the translation matrix are adjusted according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image, the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image may also be determined by the above identification method, which is not described again here.
This embodiment can automatically judge whether the target geometric change information has disappeared and adjust the corresponding parameters according to the set step length, without manual operation and without needing the exact mathematical relationship between the geometric change information and the parameter adjustment amount; the accurate parameters are found by gradual adjustment, which further simplifies the calculation while still ensuring a certain accuracy.
In the above embodiments, the point cloud data other than that of the calibration plate is filtered out by a distance filtering method, which avoids interference of environmental point cloud data with the calibration and thereby improves the calibration accuracy. Specifically, before the point cloud data in the laser radar coordinate system is mapped to the image coordinate system according to the internal reference matrix, the rotation matrix and the translation matrix of the camera, the method further includes: filtering out, from the point cloud data, the point cloud data whose depth values are outside a set distance range.
Wherein the set distance range is determined according to the distance between the calibration plate and the laser radar. For example, if the distance between the calibration plate and the laser radar is measured to be 2 meters, the set distance range may be 1.9 meters to 2.1 meters.
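A minimal sketch of this filtering step is given below; treating the Euclidean range of each point as its depth is our assumption, and the 0.1 m margin simply reproduces the 1.9 m to 2.1 m example above.

```python
import numpy as np

def filter_by_distance(points_lidar, plate_distance, margin=0.1):
    """Keep only points whose distance from the laser radar lies inside the set distance range."""
    ranges = np.linalg.norm(points_lidar, axis=1)      # distance of each point from the lidar origin
    keep = (ranges > plate_distance - margin) & (ranges < plate_distance + margin)
    return points_lidar[keep]
```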
It should be noted that the premise for adopting the distance filtering method is that the test scene is kept open, as required in the above embodiment; that is, within the detection range of the laser radar, the distances between environmental objects and the laser radar are all greater than the distance between the calibration plate and the laser radar, and the calibration plate is not occluded.
Fig. 3 is a schematic structural diagram of a calibration apparatus for a laser radar and a camera according to an embodiment of the present invention, which is suitable for jointly calibrating a laser radar and a camera by means of a calibration plate. With reference to fig. 3, the calibration apparatus for the laser radar and the camera includes: an acquisition module 301, a preliminary calibration module 302, a mapping module 303, and an adjustment module 304.
The acquisition module 301 is configured to acquire point cloud data obtained by scanning a calibration plate by a laser radar and an image obtained by shooting the calibration plate by a camera;
a preliminary calibration module 302, configured to perform preliminary calibration on the lidar and the camera according to pose information of the lidar and the camera, so as to obtain a rotation matrix and a translation matrix of a lidar coordinate system relative to a camera coordinate system;
the mapping module 303 is configured to map point cloud data in the laser radar coordinate system to an image coordinate system according to the internal reference matrix, the rotation matrix, and the translation matrix of the camera to obtain two-dimensional point cloud data;
an adjusting module 304, configured to adjust corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
wherein the geometric variation information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount.
According to the method, firstly, preliminary calibration is carried out according to pose information of a laser radar and a camera, and then corresponding parameters of a rotation matrix and a translation matrix are adjusted to carry out finer secondary calibration according to geometric change information of two-dimensional point cloud data relative to a calibration plate in an image, so that the combined calibration of the laser radar and the camera is efficiently, accurately and simply realized through a two-step calibration method; in addition, in this embodiment, according to the geometric change information, the corresponding parameters of the rotation matrix and the translation matrix are adjusted, that is, the parameters causing the geometric change are specifically adjusted according to the geometric change information, so as to further improve the calibration efficiency and accuracy.
Optionally, the adjusting module 304 is specifically configured to: select one piece of unprocessed target geometric change information from the geometric change information according to a set order; adjust the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image; and return to the mapping operation of the point cloud data until all the geometric change information has been processed.
Optionally, the setting sequence includes: rotation angle, trapezoidal deformation amount, offset amount, enlargement/reduction amount; or a rotation angle, an amount of trapezoidal deformation, an amount of enlargement/reduction, and an amount of offset.
Optionally, the adjusting module 304 is specifically configured to perform at least one of the following operations when adjusting the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image: adjusting the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix according to the rotation angle of the two-dimensional point cloud data relative to the calibration plate in the image; adjusting the rotation angle of the X axis and/or the Z axis in the rotation matrix according to the trapezoidal deformation amount of the two-dimensional point cloud data relative to the calibration plate in the image; adjusting the translation amount of the X axis and/or the Z axis in the translation matrix according to the offset of the two-dimensional point cloud data relative to the calibration plate in the image; adjusting the translation amount of the Y axis in the translation matrix according to the enlargement/reduction amount of the two-dimensional point cloud data relative to the calibration plate in the image; wherein the laser radar coordinate system is a right-handed coordinate system whose origin is located at the center of the transmitter inside the laser radar, the X axis and the Y axis are respectively parallel to two sides of the base, the Z axis lies on the rotation axis with upward as the positive direction, and the positive direction of the Y axis is the forward direction of the transmitter inside the laser radar; and the camera coordinate system is a right-handed coordinate system whose origin is located at the optical center of the camera, the y axis passes through the optical center perpendicular to the camera imaging plane with the direction toward the calibration plate as positive, the x axis passes through the optical center perpendicular to the z axis, lies in the horizontal direction of the imaging plane, with rightward (facing the calibration plate) as positive, and the z axis passes through the optical center perpendicular to the x axis and the y axis, with upward (facing the calibration plate) as positive.
Optionally, when the adjusting module 304 adjusts the rotation angle of the laser radar coordinate system Y axis in the rotation matrix according to the rotation angle of the two-dimensional point cloud data relative to the calibration plate in the image, it is specifically configured to: if the two-dimensional point cloud data has a clockwise rotation angle relative to the calibration plate in the image, reducing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix; if the two-dimensional point cloud data has a counterclockwise rotation angle relative to the calibration plate in the image, increasing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix;
the adjusting module 304 is specifically configured to, when adjusting the rotation angle of the X axis and/or the Z axis in the rotation matrix according to the trapezoidal deformation amount of the two-dimensional point cloud data relative to the calibration plate in the image: if the two-dimensional point cloud data has trapezoidal deformation with a wide top and narrow bottom relative to the calibration plate in the image, reducing the rotation angle of the X axis in the rotation matrix; if the two-dimensional point cloud data has trapezoidal deformation with a narrow top and wide bottom relative to the calibration plate in the image, increasing the rotation angle of the X axis in the rotation matrix; if the two-dimensional point cloud data has trapezoidal deformation with a wide left and narrow right relative to the calibration plate in the image, reducing the rotation angle of the Z axis in the rotation matrix; if the two-dimensional point cloud data has trapezoidal deformation with a narrow left and wide right relative to the calibration plate in the image, increasing the rotation angle of the Z axis in the rotation matrix;
the adjusting module 304 is specifically configured to, when adjusting the translation amount of the X axis and/or the Z axis in the translation matrix according to the offset of the two-dimensional point cloud data relative to the calibration plate in the image: if the two-dimensional point cloud data has a rightward offset relative to the calibration plate in the image, reducing the translation amount of the X axis in the translation matrix; if the two-dimensional point cloud data has a leftward offset relative to the calibration plate in the image, increasing the translation amount of the X axis in the translation matrix; if the two-dimensional point cloud data has an upward offset relative to the calibration plate in the image, reducing the translation amount of the Z axis in the translation matrix; if the two-dimensional point cloud data has a downward offset relative to the calibration plate in the image, increasing the translation amount of the Z axis in the translation matrix; the adjusting module 304 is specifically configured to, when adjusting the translation amount of the Y axis in the translation matrix according to the enlargement/reduction amount of the two-dimensional point cloud data relative to the calibration plate in the image: if the two-dimensional point cloud data has an amplification amount relative to the calibration plate in the image, increasing the translation amount of the Y axis in the translation matrix; and if the two-dimensional point cloud data has a reduction amount relative to the calibration plate in the image, reducing the translation amount of the Y axis in the translation matrix.
Optionally, the adjusting module 304 is specifically configured to, when adjusting the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image: adjusting the parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to a set step length; judging whether the target geometric change information has disappeared; and if it has not disappeared, returning to the adjustment operation of the parameters corresponding to the target geometric change information until the target geometric change information disappears.
Optionally, the apparatus further includes a filtering module, configured to filter, before the point cloud data in the laser radar coordinate system is mapped to an image coordinate system according to the internal reference matrix, the rotation matrix, and the translation matrix of the camera to obtain two-dimensional point cloud data, point cloud data whose depth value is outside a set distance range from the point cloud data; wherein the set distance range is determined according to a distance between the calibration plate and the laser radar.
The calibration device for the laser radar and the camera provided by the embodiment of the invention can execute the calibration method for the laser radar and the camera provided by any embodiment of the invention, and has corresponding functional modules and beneficial effects of the execution method.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, as shown in fig. 4, the electronic device includes a processor 40, a memory 41, an input device 42, and an output device 43; the number of processors 40 in the device may be one or more, and one processor 40 is taken as an example in fig. 4; the processor 40, the memory 41, the input means 42 and the output means 43 in the device may be connected by a bus or other means, as exemplified by the bus connection in fig. 4.
The memory 41 serves as a computer-readable storage medium for storing software programs, computer-executable programs, and modules, such as program instructions/modules corresponding to the laser radar and camera calibration method in the embodiment of the present invention (for example, the obtaining module 301, the preliminary calibration module 302, the mapping module 303, and the adjusting module 304 in the laser radar and camera calibration device). The processor 40 executes various functional applications and data processing of the device by running software programs, instructions and modules stored in the memory 41, that is, the laser radar and camera calibration method described above is realized.
The memory 41 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory 41 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some examples, memory 41 may further include memory located remotely from processor 40, which may be connected to the device over a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The input device 42 is operable to receive input numeric or character information and to generate key signal inputs relating to user settings and function controls of the apparatus. The output device 43 may include a display device such as a display screen.
The embodiment of the invention also provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, the calibration method of the laser radar and the camera in any embodiment is realized.
Computer storage media for embodiments of the invention may employ any combination of one or more computer-readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like, as well as conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present invention and the technical principles employed. It will be understood by those skilled in the art that the present invention is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the invention. Therefore, although the present invention has been described in greater detail by the above embodiments, the present invention is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present invention, and the scope of the present invention is determined by the scope of the appended claims.

Claims (10)

1. A calibration method for a laser radar and a camera is characterized by comprising the following steps:
acquiring point cloud data obtained by scanning a calibration plate by a laser radar and an image obtained by shooting the calibration plate by a camera;
preliminarily calibrating the laser radar and the camera according to the pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of a laser radar coordinate system relative to a camera coordinate system;
according to the internal reference matrix, the rotation matrix and the translation matrix of the camera, mapping the point cloud data under the laser radar coordinate system to an image coordinate system to obtain two-dimensional point cloud data;
adjusting corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
wherein the geometric variation information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount.
2. The method of claim 1, wherein adjusting corresponding parameters of the rotation matrix and the translation matrix according to the geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
selecting an unprocessed target geometric change information from the geometric change information according to a set sequence;
adjusting parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
and returning to the mapping operation of the point cloud data until all the geometric change information has been processed.
3. The method of claim 2, wherein the set order comprises: rotation angle, trapezoidal deformation amount, offset amount, enlargement/reduction amount; or,
rotation angle, trapezoidal deformation amount, enlargement/reduction amount, and offset amount.
4. The method according to claim 2 or 3, wherein the adjusting the parameters in the rotation matrix and the translation matrix corresponding to the target geometric change information according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image comprises at least one of the following operations:
adjusting the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix according to the rotation angle of the two-dimensional point cloud data relative to the calibration plate in the image;
adjusting the rotation angle of an X axis and/or a Z axis in the rotation matrix according to the trapezoidal deformation amount of the two-dimensional point cloud data relative to the calibration plate in the image;
adjusting the translation amount of an X axis and/or a Z axis in the translation matrix according to the offset of the two-dimensional point cloud data relative to a calibration plate in the image;
adjusting the translation amount of the Y axis in the translation matrix according to the amplification/reduction amount of the two-dimensional point cloud data relative to a calibration plate in the image;
the coordinate system of the laser radar belongs to a right-hand coordinate system, the origin of the coordinate system of the laser radar is located at the center of the transmitter inside the laser radar, the X axis and the Y axis are respectively parallel to two sides of the base, the Z axis is located on the rotating shaft and represents the positive direction upwards, and the positive direction of the Y axis is the positive direction of the transmitter inside the laser radar;
the camera coordinate system belongs to the right hand coordinate system, the original point of camera coordinate system is located the optical center of camera, the y axle passes optical center perpendicular to camera imaging plane, moves towards the calibration board is the positive direction, and the x axle passes optical center and perpendicular to z axle, is located the imaging plane horizontal direction, faces the calibration board level is the positive direction right, and the z axle passes optical center and perpendicular to x axle, y axle, faces the calibration board upwards is the positive direction.
5. The method of claim 4, wherein adjusting the rotation angle of the lidar coordinate system Y axis in the rotation matrix based on the rotation angle of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
if the two-dimensional point cloud data has a clockwise rotation angle relative to the calibration plate in the image, reducing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix;
if the two-dimensional point cloud data has a counterclockwise rotation angle relative to the calibration plate in the image, increasing the rotation angle of the Y axis of the laser radar coordinate system in the rotation matrix;
the adjusting the rotation angle of the X axis and/or the Z axis in the rotation matrix according to the trapezoidal deformation amount of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
if the two-dimensional point cloud data has trapezoidal deformation with a wide top and narrow bottom relative to the calibration plate in the image, reducing the rotation angle of the X axis in the rotation matrix;
if the two-dimensional point cloud data has trapezoidal deformation with a narrow top and wide bottom relative to the calibration plate in the image, increasing the rotation angle of the X axis in the rotation matrix;
if the two-dimensional point cloud data has trapezoidal deformation with a wide left and narrow right relative to the calibration plate in the image, reducing the rotation angle of the Z axis in the rotation matrix;
if the two-dimensional point cloud data has trapezoidal deformation with a narrow left and wide right relative to the calibration plate in the image, increasing the rotation angle of the Z axis in the rotation matrix;
the adjusting the translation amount of the X axis and/or the Z axis in the translation matrix according to the offset of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
if the two-dimensional point cloud data has a rightward offset relative to the calibration plate in the image, reducing the translation amount of the X axis in the translation matrix;
if the two-dimensional point cloud data has a leftward offset relative to the calibration plate in the image, increasing the translation amount of the X axis in the translation matrix;
if the two-dimensional point cloud data has an upward offset relative to the calibration plate in the image, reducing the translation amount of the Z axis in the translation matrix;
if the two-dimensional point cloud data has a downward offset relative to the calibration plate in the image, increasing the translation amount of the Z axis in the translation matrix;
the adjusting the translation amount of the Y axis in the translation matrix according to the enlargement/reduction amount of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
if the two-dimensional point cloud data has an enlargement amount relative to the calibration plate in the image, increasing the translation amount of the Y axis in the translation matrix;
and if the two-dimensional point cloud data has a reduction amount relative to the calibration plate in the image, reducing the translation amount of the Y axis in the translation matrix.
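A minimal sketch of the sign rules in claim 5, assuming the observed geometric cue is encoded as a string and the six extrinsic parameters are kept in a dict; the parameter names, cue labels and fixed step size are illustrative and are not part of the claims.

```python
def adjust_parameters(params, cue, step=0.01):
    """Apply one adjustment from claim 5. `params` holds the three rotation
    angles (rx, ry, rz) and three translations (tx, ty, tz); `cue` is the
    observed geometric change of the projected point cloud relative to the
    calibration plate in the image. Names and step size are illustrative."""
    rules = {
        "rotated_clockwise":        ("ry", -step),
        "rotated_counterclockwise": ("ry", +step),
        "trapezoid_top_wide":       ("rx", -step),
        "trapezoid_bottom_wide":    ("rx", +step),
        "trapezoid_left_wide":      ("rz", -step),
        "trapezoid_right_wide":     ("rz", +step),
        "shifted_right":            ("tx", -step),
        "shifted_left":             ("tx", +step),
        "shifted_up":               ("tz", -step),
        "shifted_down":             ("tz", +step),
        "enlarged":                 ("ty", +step),
        "reduced":                  ("ty", -step),
    }
    key, delta = rules[cue]
    params[key] += delta
    return params
```

For example, a projection that appears rotated clockwise and enlarged would trigger two calls, one reducing the Y-axis rotation angle and one increasing the Y-axis translation amount.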
6. The method of claim 2, wherein the adjusting the parameters of the rotation matrix and the translation matrix corresponding to the target geometric change information according to the target geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image comprises:
adjusting parameters corresponding to the target geometric change information in the rotation matrix and the translation matrix according to a set step length;
judging whether the target geometric change information disappears;
and if the target geometric change information does not disappear, returning to the operation of adjusting the parameters corresponding to the target geometric change information, until the target geometric change information disappears.
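Claim 6 describes a simple fixed-step search: nudge the relevant parameter by a set step and re-check until the geometric change disappears. A sketch of that loop is shown below; it reuses the hypothetical adjust_parameters helper from the previous sketch, and observe_cue stands in for re-projecting the point cloud and measuring the remaining geometric change.

```python
def iterate_until_aligned(params, observe_cue, step=0.01, max_iters=1000):
    """Repeatedly adjust the parameter tied to the observed cue by a set
    step until the cue disappears. `observe_cue` is a hypothetical callback
    that re-projects the point cloud and returns the remaining geometric
    change, or None once it has disappeared."""
    for _ in range(max_iters):
        cue = observe_cue(params)
        if cue is None:                      # target geometric change gone
            break
        params = adjust_parameters(params, cue, step)
    return params
```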
7. The method of claim 1, further comprising, before the mapping the point cloud data in the laser radar coordinate system into an image coordinate system according to the internal reference matrix of the camera, the rotation matrix, and the translation matrix to obtain two-dimensional point cloud data:
filtering point cloud data with depth values outside a set distance range from the point cloud data;
wherein the set distance range is determined according to a distance between the calibration plate and the laser radar.
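A minimal numpy sketch of the depth filtering in claim 7; treating the lidar Y axis as the depth (forward) direction, the 0.5 m margin, and the function name are assumptions for illustration.

```python
import numpy as np

def filter_by_depth(points, d_plate, margin=0.5):
    """Keep only points whose depth lies within a set range around the
    measured lidar-to-calibration-plate distance d_plate. `points` is an
    (N, 3) array in the lidar frame; margin and depth axis are assumptions."""
    depth = points[:, 1]                     # Y as the forward axis, per claim 4
    mask = np.abs(depth - d_plate) <= margin
    return points[mask]
```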
8. A calibration device for a laser radar and a camera is characterized by comprising:
an acquisition module, configured to acquire point cloud data obtained by scanning a calibration plate with a laser radar and an image obtained by photographing the calibration plate with a camera;
a preliminary calibration module, configured to preliminarily calibrate the laser radar and the camera according to pose information of the laser radar and the camera to obtain a rotation matrix and a translation matrix of the laser radar coordinate system relative to the camera coordinate system;
a mapping module, configured to map the point cloud data in the laser radar coordinate system into an image coordinate system according to the internal reference matrix of the camera, the rotation matrix and the translation matrix to obtain two-dimensional point cloud data;
an adjusting module, configured to adjust corresponding parameters of the rotation matrix and the translation matrix according to geometric change information of the two-dimensional point cloud data relative to the calibration plate in the image;
wherein the geometric variation information includes at least one of a trapezoidal deformation amount, a rotation angle, an enlargement/reduction amount, and an offset amount.
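A sketch of what the mapping module computes, assuming a conventional pinhole model in which the optical axis is the third camera coordinate (the claims label the camera axes differently, so an axis permutation may be needed in practice); K is the camera's internal reference (intrinsic) matrix and (R, t) are the extrinsics from the preliminary calibration.

```python
import numpy as np

def project_to_image(points, K, R, t):
    """Map (N, 3) lidar points to (N, 2) pixel coordinates: rigid transform
    into the camera frame, then pinhole projection with the intrinsic matrix.
    Assumes the third camera coordinate is the viewing depth."""
    cam = points @ R.T + t                   # lidar frame -> camera frame
    uvw = cam @ K.T                          # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]          # divide by depth to get pixels
```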
9. An electronic device, comprising:
one or more processors;
a memory for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the calibration method for a laser radar and a camera according to any one of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out a method for lidar and camera calibration according to any one of claims 1 to 7.
CN202110140667.1A 2021-02-02 2021-02-02 Calibration method, device and equipment for laser radar and camera and readable storage medium Active CN112946609B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110140667.1A CN112946609B (en) 2021-02-02 2021-02-02 Calibration method, device and equipment for laser radar and camera and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110140667.1A CN112946609B (en) 2021-02-02 2021-02-02 Calibration method, device and equipment for laser radar and camera and readable storage medium

Publications (2)

Publication Number Publication Date
CN112946609A CN112946609A (en) 2021-06-11
CN112946609B true CN112946609B (en) 2022-03-15

Family

ID=76241366

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110140667.1A Active CN112946609B (en) 2021-02-02 2021-02-02 Calibration method, device and equipment for laser radar and camera and readable storage medium

Country Status (1)

Country Link
CN (1) CN112946609B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113848541B (en) * 2021-09-22 2022-08-26 深圳市镭神智能系统有限公司 Calibration method and device, unmanned aerial vehicle and computer readable storage medium
CN115291196B (en) * 2022-07-06 2023-07-25 南京牧镭激光科技股份有限公司 Calibration method for mounting posture of laser clearance radar
CN115082572B (en) * 2022-07-22 2023-11-03 南京慧尔视智能科技有限公司 Automatic calibration method and system combining radar and camera

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102054455B1 (en) * 2018-09-28 2019-12-10 재단법인대구경북과학기술원 Apparatus and method for calibrating between heterogeneous sensors
US11105905B2 (en) * 2018-11-30 2021-08-31 Lyft, Inc. LiDAR and camera rotational position calibration using multiple point cloud comparisons
CN109946680B (en) * 2019-02-28 2021-07-09 北京旷视科技有限公司 External parameter calibration method and device of detection system, storage medium and calibration system
CN110189381B (en) * 2019-05-30 2021-12-03 北京眸视科技有限公司 External parameter calibration system, method, terminal and readable storage medium
US10841483B1 (en) * 2019-07-11 2020-11-17 Denso International America, Inc. System and method for calibrating at least one camera and a light detection and ranging sensor
US10726579B1 (en) * 2019-11-13 2020-07-28 Honda Motor Co., Ltd. LiDAR-camera calibration
GB2594111B (en) * 2019-12-18 2023-06-07 Motional Ad Llc Camera-to-LiDAR calibration and validation
CN112017251B (en) * 2020-10-19 2021-02-26 杭州飞步科技有限公司 Calibration method and device, road side equipment and computer readable storage medium

Also Published As

Publication number Publication date
CN112946609A (en) 2021-06-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant