CN110221275B - Calibration method and device between laser radar and camera

Calibration method and device between laser radar and camera

Info

Publication number
CN110221275B
CN110221275B (application CN201910425720.5A)
Authority
CN
China
Prior art keywords
camera
determining
point cloud
preset
calibration plate
Prior art date
Legal status
Active
Application number
CN201910425720.5A
Other languages
Chinese (zh)
Other versions
CN110221275A (en)
Inventor
温英杰
孙孟孟
李凯
张斌
李吉利
林巧
曹丹
李卫斌
周光祥
余辉
蓝天翔
顾敏奇
吴紫薇
梁庆羽
毛非一
刘宿东
张善康
李文桐
张成华
Current Assignee
Cainiao Smart Logistics Holding Ltd
Original Assignee
Cainiao Smart Logistics Holding Ltd
Priority date
Filing date
Publication date
Application filed by Cainiao Smart Logistics Holding Ltd
Priority to CN201910425720.5A
Publication of CN110221275A
Priority to PCT/CN2020/089722 (WO2020233443A1)
Application granted
Publication of CN110221275B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The embodiments of the present application provide a calibration method and device between a laser radar and a camera. The method comprises the following steps: acquiring an image captured by the camera of a calibration plate and a point cloud captured by the laser radar of the calibration plate; determining a plurality of first rotation vectors within a preset first rotation vector interval; calculating, for each first rotation vector, the degree of overlap between the corresponding image and the point cloud; and determining the first rotation vector corresponding to the maximum degree of overlap as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, the calibration accuracy required by an unmanned vehicle can be met even when a medium- or low-accuracy laser radar is calibrated to the camera.

Description

Calibration method and device between laser radar and camera
Technical Field
The present disclosure relates to the field of computer technologies, and in particular to a calibration method between a laser radar and a camera, a calibration method, a calibration device between a laser radar and a camera, and a calibration device.
Background
With the development of unmanned-driving technology, almost all unmanned vehicles currently adopt a multi-sensor fusion scheme and are equipped with multiple sensors such as laser radars and industrial cameras. In such a scheme, the coordinate systems of the sensors must be transformed into a unified coordinate system in order to achieve spatial fusion of the multi-sensor data.
At present, multi-sensor calibration falls mainly into two types, manual calibration and automatic calibration. Manual calibration is performed by a professional with calibration experience on offline-collected sensor data using a specific calibration method, and is not suitable for batch calibration.
Automatic calibration is realized by selecting a specific calibration scene and calibration tool and applying a specific algorithm.
The automatic calibration schemes currently on the market are mainly suited to unmanned vehicles that use high-end laser radars, and are not suitable for unmanned vehicles that use medium- and low-end laser radars.
Because the ranging accuracy and the number of laser lines of a medium- or low-end laser radar are far lower than those of a high-end laser radar, the environment point cloud it obtains is neither as rich nor as accurate as that of a high-end radar; if a calibration algorithm designed for high-end radars is used, the calibration accuracy required by an unmanned vehicle with a medium- or low-end laser radar cannot be met.
Disclosure of Invention
In view of the above problems, embodiments of the present application have been proposed to provide a calibration method between a lidar and a camera, a calibration method, a calibration device between a lidar and a camera, and a calibration device that overcome or at least partially solve the above problems.
In order to solve the above problems, an embodiment of the present application discloses a calibration method between a laser radar and a camera, including:
acquiring an image captured by the camera of a calibration plate and a point cloud captured by the laser radar of the calibration plate;
determining a plurality of first rotation vectors within a preset first rotation vector interval;
calculating, for each first rotation vector, the degree of overlap between the corresponding image and the point cloud; and
determining the first rotation vector corresponding to the maximum degree of overlap as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
Optionally, the calculating, for each first rotation vector, the degree of overlap between the corresponding image and the point cloud includes:
acquiring a translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and acquiring the internal parameters of the camera;
determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector, respectively; and
calculating, for one first transformation matrix, the degree of overlap between the corresponding image and the point cloud using the first transformation matrix and the internal parameters of the camera.
Optionally, the calculating, using the first transformation matrix and the internal parameters of the camera, the degree of overlap between the corresponding image and the point cloud includes:
acquiring a camera coordinate system of the camera;
determining the outline of the calibration plate in the image, and determining the three-dimensional coordinates of the calibration plate point cloud, i.e. the points of the point cloud located on the calibration plate;
projecting the calibration plate point cloud to the image using the first transformation matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud, to obtain a first projection point cloud;
determining, in the first projection point cloud, the number of first target projection points falling within the outline of the calibration plate in the image; and
determining the degree of overlap between the image and the point cloud using the number of first target projection points.
Optionally, the determining the degree of overlap between the image and the point cloud using the number of first target projection points includes:
calculating, for a calibration plate, the proportion of the number of its first target projection points to the number of points in its calibration plate point cloud; and
determining the degree of overlap between the image and the point cloud using the first target projection point proportion.
Optionally, the determining a plurality of first rotation vectors in the preset first rotation vector interval includes:
and determining a plurality of first rotation vectors according to the preset radian intervals in a preset first rotation vector interval.
Optionally, the preset first rotation vector interval includes a preset first rolling angle interval, a preset first pitch angle interval and a preset first yaw angle interval; the determining a plurality of first rotation vectors at a preset radian interval within the preset first rotation vector interval includes:
determining a plurality of rolling angles according to a preset radian interval in the preset first rolling angle interval;
determining a plurality of pitch angles according to the preset radian intervals in the preset first pitch angle interval;
determining a plurality of yaw angles according to the preset radian intervals in the preset first yaw angle interval;
and selecting one rolling angle from the rolling angles, selecting one pitch angle from the pitch angles, and selecting one yaw angle from the yaw angles for combination to obtain a plurality of first rotation vectors.
Optionally, the method further comprises:
acquiring a horizontal view angle and a vertical view angle of the camera and resolution of the image;
dividing the horizontal view angle by the width of the resolution to obtain a first radian;
dividing the vertical view angle by the height of the resolution to obtain a second radian; and
taking the smaller of the first radian and the second radian as the preset radian interval.
Optionally, the method further comprises:
determining a reference rotation vector;
and determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
Optionally, the determining the reference rotation vector includes:
acquiring a preset second rotation vector interval, wherein the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
adjusting a pitch angle in the preset second pitch angle interval and adjusting a yaw angle in the preset second yaw angle interval;
determining a target pitch angle and a target yaw angle when the center of the calibration plate of the image is coincident with the center of the first projection point cloud;
adjusting the rolling angle in the preset second rolling angle interval under the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
and determining a reference rotation vector from the plurality of second rotation vectors.
Optionally, the determining a reference rotation vector from the plurality of second rotation vectors includes:
determining a plurality of second transformation matrices by using the plurality of second rotation vectors and translation vectors between the coordinate system of the laser radar and the coordinate system of the camera respectively;
calculating, for one second transformation matrix, the degree of overlap between the corresponding image and the point cloud using the second transformation matrix and the internal parameters of the camera; and
determining the second rotation vector corresponding to the maximum degree of overlap as the reference rotation vector.
Optionally, the determining the three-dimensional coordinates of the calibration plate point cloud located in the calibration plate in the point cloud includes:
extracting a calibration plate point cloud positioned in the calibration plate from the point cloud by adopting a point cloud clustering algorithm;
and determining the three-dimensional coordinates of the calibration plate point cloud.
Optionally, the determining the three-dimensional coordinates of the calibration plate point cloud located in the calibration plate in the point cloud includes:
acquiring the reflectivity of each point in the point cloud;
determining a calibration plate point cloud positioned in the calibration plate by adopting a point with reflectivity larger than a preset reflectivity threshold value;
and determining the three-dimensional coordinates of the calibration plate point cloud.
Optionally, the determining the three-dimensional coordinates of the calibration plate point cloud located in the calibration plate in the point cloud includes:
acquiring the size information of the calibration plate;
determining a calibration plate point cloud positioned in the calibration plate in the point cloud by adopting the size information of the calibration plate;
and determining the three-dimensional coordinates of the calibration plate point cloud.
The embodiment of the application also discloses a calibration method, applied to an unmanned vehicle, the unmanned vehicle comprising at least one camera and at least one laser radar, the at least one camera and the at least one laser radar each having their own coordinate system, the method comprising the following steps:
selecting a target camera from the at least one camera, and taking a coordinate system of the target camera as a reference coordinate system;
determining a first laser radar associated with the target camera in the at least one laser radar, and calibrating a coordinate system of the first laser radar to the reference coordinate system;
determining, among the cameras other than the target camera, a first camera corresponding to the first lidar, and calibrating the coordinate system of the first camera to the coordinate system of the corresponding first lidar;
determining a second lidar not associated with the target camera, and determining a second camera corresponding to the second lidar; and
calibrating the coordinate system of the second camera to the coordinate system of the associated first lidar, and calibrating the coordinate system of the second lidar to the coordinate system of the second camera.
Optionally, the at least one camera comprises at least one industrial camera and at least one looking-around camera; the selecting a target camera from the at least one camera comprises:
selecting one from the at least one industrial camera as a target camera.
Optionally, the determining, in the cameras other than the target camera, a first camera corresponding to the first lidar includes:
in the at least one looking-around camera, a first looking-around camera corresponding to the first lidar is determined.
Optionally, the determining a second camera corresponding to the second lidar includes:
and determining a second looking-around camera corresponding to the second laser radar.
The embodiment of the application also discloses a calibration device between the laser radar and the camera, comprising:
the image acquisition module is used for acquiring an image acquired by the camera aiming at the calibration plate and a point cloud acquired by the laser radar aiming at the calibration plate;
the first rotation vector determining module is used for determining a plurality of first rotation vectors within a preset first rotation vector interval;
the first degree-of-overlap calculation module is used for calculating, for each first rotation vector, the degree of overlap between the corresponding image and the point cloud; and
the rotation vector calibration module is used for determining the first rotation vector corresponding to the maximum degree of overlap as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
Optionally, the first degree-of-overlap calculation module includes:
the parameter acquisition sub-module is used for acquiring a translation vector between the coordinate system of the laser radar and the coordinate system of the camera and acquiring the internal parameters of the camera;
the first transformation matrix determining sub-module is used for determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector, respectively; and
the first degree-of-overlap calculating sub-module is used for calculating, for one first transformation matrix, the degree of overlap between the corresponding image and the point cloud using the first transformation matrix and the internal parameters of the camera.
Optionally, the first degree-of-overlap calculating sub-module includes:
the camera coordinate system acquisition unit is used for acquiring the camera coordinate system of the camera;
the image information determining unit is used for determining the outline of the calibration plate in the image and determining the three-dimensional coordinates of the calibration plate point cloud, i.e. the points of the point cloud located on the calibration plate;
the projection unit is used for projecting the calibration plate point cloud to the image using the first transformation matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud, to obtain a first projection point cloud;
the target projection point determining unit is used for determining, in the first projection point cloud, the number of first target projection points falling within the outline of the calibration plate in the image; and
the first degree-of-overlap determining unit is used for determining the degree of overlap between the image and the point cloud using the number of first target projection points.
Optionally, the first degree-of-overlap determining unit includes:
the projection proportion calculating subunit is used for calculating, for a calibration plate, the proportion of the number of its first target projection points to the number of points in its calibration plate point cloud; and
the first degree-of-overlap determining subunit is used for determining the degree of overlap between the image and the point cloud using the first target projection point proportion.
Optionally, the first rotation vector determining module includes:
the first rotation vector determining sub-module is used for determining a plurality of first rotation vectors according to a preset radian interval in a preset first rotation vector interval.
Optionally, the preset first rotation vector interval includes a preset first rolling angle interval, a preset first pitch angle interval and a preset first yaw angle interval; the first rotation vector determination sub-module includes:
the rolling angle determining unit is used for determining a plurality of rolling angles according to a preset radian interval in the preset first rolling angle interval;
the pitch angle determining unit is used for determining a plurality of pitch angles according to the preset radian intervals in the preset first pitch angle interval;
a yaw angle determining unit configured to determine a plurality of yaw angles at the preset radian intervals within the preset first yaw angle section;
and the first rotation vector determining unit is used for selecting one rolling angle from the rolling angles, selecting one pitch angle from the pitch angles and selecting one yaw angle from the yaw angles to be combined to obtain a plurality of first rotation vectors.
Optionally, the method further comprises:
the camera parameter acquisition module is used for acquiring the horizontal view angle and the vertical view angle of the camera and the resolution of the image;
the first radian determining module is used for dividing the horizontal view angle by the width of the resolution to obtain a first radian;
the second radian determining module is used for dividing the vertical view angle by the height of the resolution to obtain a second radian; and
the radian interval determining module is used for taking the smaller of the first radian and the second radian as the preset radian interval.
Optionally, the method further comprises:
a reference rotation vector determination module configured to determine a reference rotation vector;
and the first rotation vector interval determining module is used for determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
Optionally, the reference rotation vector determining module includes:
the second rotation vector interval acquisition submodule is used for acquiring a preset second rotation vector interval, and the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
the angle adjustment sub-module is used for adjusting the pitch angle in the preset second pitch angle interval and adjusting the yaw angle in the preset second yaw angle interval;
the target angle determining sub-module is used for determining a target pitch angle and a target yaw angle when the center of the calibration plate in the image coincides with the center of the first projection point cloud;
the second rotation vector determining submodule is used for adjusting the rolling angle in the preset second rolling angle interval under the condition of the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
a reference rotation vector determination sub-module for determining a reference rotation vector from the plurality of second rotation vectors.
Optionally, the reference rotation vector determination submodule includes:
the second transformation matrix determining unit is used for determining a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the lidar and the coordinate system of the camera, respectively;
the second degree-of-overlap calculating unit is used for calculating, for one second transformation matrix, the degree of overlap between the corresponding image and the point cloud using the second transformation matrix and the internal parameters of the camera; and
the reference rotation vector determination unit is used for determining the second rotation vector corresponding to the maximum degree of overlap as the reference rotation vector.
Optionally, the image information determining unit includes:
the first calibration board point cloud determining subunit is used for extracting the calibration board point cloud positioned in the calibration board from the point cloud by adopting a point cloud clustering algorithm;
and the first point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
Optionally, the image information determining unit includes:
a reflectivity obtaining subunit, configured to obtain reflectivity of each point in the point cloud;
the second calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate by adopting points with reflectivity larger than a preset reflectivity threshold value;
and the second point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
Optionally, the image information determining unit includes:
a dimension information obtaining subunit, configured to obtain dimension information of the calibration plate;
the third calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate in the point cloud by adopting the size information of the calibration plate;
and the third point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
The embodiment of the application also discloses a calibration device, applied to an unmanned vehicle, the unmanned vehicle comprising at least one camera and at least one laser radar, the at least one camera and the at least one laser radar each having their own coordinate system, the device comprising:
A reference coordinate system determining module, configured to select a target camera from the at least one camera, and use a coordinate system of the target camera as a reference coordinate system;
the first calibration module is used for determining a first laser radar associated with the target camera in the at least one laser radar and calibrating a coordinate system of the first laser radar to the reference coordinate system;
the second calibration module is used for determining, among the cameras other than the target camera, a first camera corresponding to the first laser radar, and calibrating the coordinate system of the first camera to the coordinate system of the corresponding first laser radar;
the non-association determination module is used for determining a second laser radar that is not associated with the target camera, and determining a second camera corresponding to the second laser radar; and
the third calibration module is used for calibrating the coordinate system of the second camera to the coordinate system of the associated first laser radar, and calibrating the coordinate system of the second laser radar to the coordinate system of the second camera.
Optionally, the at least one camera comprises at least one industrial camera and at least one looking-around camera; the reference coordinate system determination module includes:
the target camera selecting sub-module is used for selecting one of the at least one industrial camera as the target camera.
Optionally, the second calibration module includes:
and the first looking-around camera determining submodule is used for determining a first looking-around camera corresponding to the first laser radar in the at least one looking-around camera.
Optionally, the disassociation determination module includes:
and the second looking-around camera determining submodule is used for determining a second looking-around camera corresponding to the second laser radar.
The embodiment of the application also discloses a device, which comprises:
one or more processors; and
one or more machine-readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform one or more methods as described above.
The embodiment of the application also discloses one or more machine-readable media having instructions stored thereon that, when executed by one or more processors, cause the processors to perform one or more methods as described above.
Embodiments of the present application include the following advantages:
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector giving the highest degree of overlap between the image collected by the camera and the point cloud collected by the laser radar is determined within a preset first rotation vector interval, and the first rotation vector corresponding to the maximum degree of overlap is used as the rotation vector that finally calibrates the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, the calibration accuracy required by an unmanned vehicle can be met even when a medium- or low-accuracy laser radar is calibrated to the camera.
Drawings
FIG. 1 is a flowchart illustrating steps of a first embodiment of a method for calibrating a laser radar and a camera according to the present application;
FIG. 2 is a flow chart of steps of a second embodiment of a method for calibrating between a lidar and a camera according to the present application;
FIG. 3 is a schematic view of projecting a calibration plate point cloud onto an image in an embodiment of the present application;
FIG. 4 is a schematic view of another embodiment of the present application for projecting a calibration plate point cloud onto an image;
FIG. 5 is a flow chart of steps of an embodiment of a calibration method of the present application;
FIG. 6 is a schematic diagram of an unmanned vehicle calibration scenario in an embodiment of the present application;
FIG. 7 is a block diagram of an embodiment of a calibration device between a lidar and a camera of the present application;
FIG. 8 is a block diagram of an embodiment of a calibration device of the present application.
Detailed Description
In order that the above-recited objects, features and advantages of the present application will become more readily apparent, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments that are illustrated in the appended drawings.
Existing logistics unmanned vehicles use medium- and low-end laser radars; if a calibration algorithm designed for high-end radars is used, the calibration accuracy required by a logistics unmanned vehicle cannot be met.
Calibrating a laser radar to a camera (an industrial camera or a looking-around camera) means determining the transformation matrix RT from the laser coordinate system to the camera coordinate system. The transformation matrix RT is uniquely determined by a translation vector T(x, y, z) and a rotation vector R(r, p, y). If all six variables are optimized simultaneously, the search space is huge and the algorithm very easily converges to a local optimum.
Considering that the relative position of the camera and the laser radar is fixed once they are mounted, a very accurate value of the translation vector T can be obtained by measurement. The embodiments of the present application therefore fix the translation vector and traverse the rotation vector solution space to find the optimal rotation vector, thereby obtaining the optimal transformation matrix. The specific implementation is described in detail below.
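In standard pinhole-camera notation (the symbols below, in particular the intrinsic matrix K and the pixel coordinates (u, v), are an assumed notation not used in the original), the projection underlying this search can be written as:

\[
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
= K \left[\, R(r, p, y) \;\middle|\; T \,\right]
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
\]

where (X, Y, Z) is a laser point in the lidar coordinate system, (u, v) is its pixel position in the image, and s is the projective depth. Only the rotation R(r, p, y) is searched over; T and K stay fixed.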
Referring to FIG. 1, a flowchart illustrating the steps of a first embodiment of a calibration method between a lidar and a camera according to the present application is shown. The method may specifically include the following steps:
Step 101, acquiring an image captured by the camera of a calibration plate and a point cloud captured by the laser radar of the calibration plate;
the calibration method provided by the embodiment of the application is aimed at the middle-low-end laser radar, and is suitable for the middle-low-end laser radar and the high-end laser radar.
In an unmanned vehicle, there may be multiple cameras and multiple laser radars, and each camera and each laser radar can be calibrated to one another using the method of the embodiments of the present application. The cameras may include industrial cameras, looking-around cameras and the like applied to the unmanned vehicle.
Both the camera and the laser radar collect data of a calibration plate: the camera collects an image that contains the calibration plate, and the laser radar collects a point cloud that contains the laser points striking and reflected by the calibration plate. The emitter of the laser radar emits a laser beam; after the beam hits an object, it returns to the laser receiver through diffuse reflection, yielding a laser point.
In the embodiments of the present application, the number and color of the calibration plates are not limited; calibration plates of any color and any number can be used. For example, three red Schiff boards with a size of 80 cm × 80 cm may be used as calibration plates.
Step 102, determining a plurality of first rotation vectors within a preset first rotation vector interval;
A rotation vector is written (r, p, y), where r is the rolling angle (roll), p is the pitch angle (pitch) and y is the yaw angle (yaw).
After the relative positions of the camera and the laser radar are determined, the translation vector T between the camera and the laser radar can be accurately measured, so that the optimal transformation matrix can be obtained by searching the optimal rotation vector in a preset first rotation vector interval.
Step 103, calculating, for each first rotation vector, the degree of overlap between the corresponding image and the point cloud;
The image collected by the camera contains an object whose position is determined in the image; the point cloud is produced by the laser radar from the laser light reflected by the object, and the coordinate positions of the point cloud reflect the position of the object. The degree of overlap is a parameter describing how well the coordinate positions of the point cloud coincide with the position of the object in the image.
Under different rotation vectors, the relative positions of the image and the point cloud change, and the degree of overlap changes accordingly.
Step 104, determining the first rotation vector corresponding to the maximum degree of overlap as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
The larger the degree of overlap, the more accurate the calibration result. The first rotation vector at which the degree of overlap is maximized can therefore be used as the rotation vector that finally calibrates the coordinate system of the laser radar to the coordinate system of the camera.
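The search over rotation vectors can be summarized in a minimal Python sketch; compute_overlap is a hypothetical callback standing in for step 103 (projection and counting, detailed in the second embodiment below):

```python
def calibrate_rotation(image, cloud, tvec, K, candidates, compute_overlap):
    """Steps 102-104 as a grid search: the translation vector tvec stays
    fixed, every candidate first rotation vector is scored by its degree
    of overlap, and the arg-max is returned as the calibrated rotation."""
    best_rvec, best_score = None, float("-inf")
    for rvec in candidates:          # first rotation vectors (r, p, y)
        score = compute_overlap(image, cloud, rvec, tvec, K)
        if score > best_score:
            best_rvec, best_score = rvec, score
    return best_rvec
```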
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector giving the highest degree of overlap between the image collected by the camera and the point cloud collected by the laser radar is determined within a preset first rotation vector interval, and the first rotation vector corresponding to the maximum degree of overlap is used as the rotation vector that finally calibrates the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, the calibration accuracy required by an unmanned vehicle can be met even when a medium- or low-accuracy laser radar is calibrated to the camera.
When calibrating the cameras and laser radars of an unmanned vehicle, a reference coordinate system may first be determined; for example, the coordinate system of one camera is selected as the reference coordinate system. With the method described above, the coordinate system of every other laser radar or camera can be calibrated to the reference coordinate system, thereby completing the calibration of the unmanned vehicle.
The calibration method also enables automatic calibration. In actual operation, sensors are inevitably replaced after an unmanned vehicle has been calibrated at the factory, which means the vehicle must recalibrate the replaced sensors, and it cannot be put back into operation before that calibration work is complete. The calibration method of the present application achieves the goal of replacing a sensor, calibrating it and returning it to operation on the spot.
Referring to FIG. 2, a flowchart illustrating the steps of a second embodiment of a calibration method between a lidar and a camera according to the present application is shown. The method may specifically include the following steps:
Step 201, acquiring an image captured by the camera of a calibration plate and a point cloud captured by the laser radar of the calibration plate;
Step 202, determining a plurality of first rotation vectors in a preset first rotation vector interval;
In an embodiment of the present application, step 202 may include: determining a plurality of first rotation vectors at a preset radian interval within the preset first rotation vector interval.
In an implementation, the preset radian interval may be taken as a step length, and the whole preset first rotation vector interval may be traversed to determine a plurality of first rotation vectors.
Specifically, the preset first rotation vector interval consists of a preset first rolling angle interval, a preset first pitch angle interval and a preset first yaw angle interval. A plurality of rolling angles can be determined at the preset radian interval within the preset first rolling angle interval; a plurality of pitch angles at the preset radian interval within the preset first pitch angle interval; and a plurality of yaw angles at the preset radian interval within the preset first yaw angle interval. One rolling angle, one pitch angle and one yaw angle are then selected and combined to obtain a plurality of first rotation vectors.
For example, suppose the preset first rotation vector interval is [(r1, p1, y1), (r2, p2, y2)]. The preset first rolling angle interval is [r1, r2], from which n1 rolling angles are determined at the preset radian interval; the preset first pitch angle interval is [p1, p2], from which n2 pitch angles are determined; and the preset first yaw angle interval is [y1, y2], from which n3 yaw angles are determined. Selecting one of the n1 rolling angles, one of the n2 pitch angles and one of the n3 yaw angles and combining them gives n1 × n2 × n3 first rotation vectors.
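A minimal sketch of this enumeration (Python with numpy; function and variable names are illustrative):

```python
import numpy as np
from itertools import product

def candidate_rotation_vectors(roll_iv, pitch_iv, yaw_iv, step):
    """Enumerate first rotation vectors (r, p, y) on a grid whose stride
    in each interval is the preset radian interval `step`."""
    rolls   = np.arange(roll_iv[0],  roll_iv[1]  + step, step)  # n1 values
    pitches = np.arange(pitch_iv[0], pitch_iv[1] + step, step)  # n2 values
    yaws    = np.arange(yaw_iv[0],   yaw_iv[1]   + step, step)  # n3 values
    return list(product(rolls, pitches, yaws))  # n1 * n2 * n3 combinations
```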
In the embodiment of the application, the preset radian interval may be determined by the following steps:
acquiring the horizontal view angle α and the vertical view angle β of the camera and the resolution w × h of the image; dividing the horizontal view angle α by the width w of the resolution to obtain a first radian; dividing the vertical view angle β by the height h of the resolution to obtain a second radian; and taking the smaller of the first radian and the second radian as the preset radian interval.
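A sketch of this step-size rule (the example camera parameters are illustrative): the step is roughly the angle subtended by one pixel, so the grid is never coarser than the image can resolve.

```python
import math

def preset_radian_interval(h_fov, v_fov, width, height):
    """s = min(horizontal FOV / image width, vertical FOV / image height)."""
    return min(h_fov / width, v_fov / height)

# e.g. a hypothetical 90 x 60 degree camera at 1920 x 1080
s = preset_radian_interval(math.radians(90), math.radians(60), 1920, 1080)
```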
In the embodiment of the present application, the preset first rotation vector section may be determined by the following steps: determining a reference rotation vector; and determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
Specifically, assume the reference rotation vector is (r0, p0, y0), where r0 is the reference rolling angle, p0 is the reference pitch angle, and y0 is the reference yaw angle.
The lower limit of the rolling angle interval, r0 − M × s, is obtained by subtracting from the reference rolling angle r0 the product of a preset first reference value M and the preset radian interval s; the upper limit, r0 + M × s, is obtained by adding that product to r0; the preset first rolling angle interval is therefore [r0 − M × s, r0 + M × s].
In the same way, the preset first pitch angle interval [p0 − M × s, p0 + M × s] is determined from the reference pitch angle p0, and the preset first yaw angle interval [y0 − M × s, y0 + M × s] is determined from the reference yaw angle y0.
The first reference value M is a positive integer, and M is usually set relatively large, for example M = 200, in order to ensure a globally optimal solution.
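A sketch of the interval construction (the reference values below are illustrative placeholders):

```python
def search_interval(x0, M, s):
    """Preset interval [x0 - M*s, x0 + M*s]: M grid steps of size s on
    each side of a reference angle x0."""
    return (x0 - M * s, x0 + M * s)

r0, p0, y0 = 0.02, -0.01, 1.57   # hypothetical reference rotation vector
M, s = 200, 0.001                # first reference value and radian interval
roll_iv, pitch_iv, yaw_iv = (search_interval(x, M, s) for x in (r0, p0, y0))
```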
In fact, considering the camera resolution, the angle of view and the angular resolution of the laser radar, the preset radian interval is usually set very small to obtain high calibration accuracy, for example 0.001 rad, while the reasonable variation interval of (r, p, y) is usually very large relative to it, for example [−0.1, 0.1] rad. The number of traversals of the whole solution space is then (0.2/0.001) × (0.2/0.001) × (0.2/0.001) = 8,000,000. Even assuming each traversal of the program takes only 1 ms (the actual value is much larger, about 3-4 ms), the time required to calibrate one set of parameters is 8,000,000/1000/3600 ≈ 2.2 hours; this is only one set of parameters, and an actual scene may require several sets, so such a running time is clearly unacceptable. How to narrow the preset first rotation vector interval is therefore particularly critical to reducing the running time of the program.
Therefore, in the embodiments of the present application, pitch and yaw are first adjusted directionally so that the center of the first projection point cloud, i.e. the calibration plate point cloud projected into the image, coincides with the center of the calibration plate in the image. This generally converges after only 50-100 iterations, yielding the reference values p0 and y0.
Then, with p0 and y0 fixed, roll is adjusted within the original roll interval to find the roll value for which the most points of the first projection point cloud fall into the calibration plate region of the image; this value is denoted r0. This step requires about 200 iterations.
In this way the scheme finds a reference solution (r0, p0, y0). Taking it as the center, the embodiments of the present application search for the optimal solution within the small interval [−0.015, 0.015] around it, and experimental tests show that this solution is also the global optimum.
In practice, roll cannot be adjusted before p0 and y0 are determined; it is not feasible to determine r0 and p0 first and then adjust yaw, nor to determine r0 and y0 first and then adjust pitch.
The number of loop iterations required by the optimized scheme is 100 + 200 + (0.03/0.001) × (0.03/0.001) × (0.03/0.001) = 27,300, i.e. about 27,300/1000 ≈ 27 s at 1 ms per iteration. Accelerating with multi-core multithreading through OpenMP then reduces the time to about a quarter, so calibrating one set of parameters takes only about 6-8 seconds; the method of the embodiments of the present application can therefore complete calibration on the spot.
In an embodiment of the present application, the step of determining the reference rotation vector may include:
acquiring a preset second rotation vector interval, wherein the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
adjusting a pitch angle in the preset second pitch angle interval and adjusting a yaw angle in the preset second yaw angle interval;
determining a target pitch angle and a target yaw angle when the center of the calibration plate of the image is coincident with the center of the first projection point cloud;
adjusting the rolling angle in the preset second rolling angle interval under the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
from the plurality of second rotation vectors, a reference rotation vector is determined.
Wherein the step of determining a reference rotation vector from the plurality of second rotation vectors may include:
determining a plurality of second transformation matrices using the plurality of second rotation vectors and the translation vector between the coordinate system of the laser radar and the coordinate system of the camera, respectively; calculating, for one second transformation matrix, the degree of overlap between the corresponding image and the point cloud using the second transformation matrix and the internal parameters of the camera; and determining the second rotation vector corresponding to the maximum degree of overlap as the reference rotation vector.
The step of calculating the degree of overlap between the corresponding image and the point cloud using the second transformation matrix and the internal parameters of the camera may include:
projecting the calibration plate point cloud into the image using the second transformation matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud, to obtain a second projection point cloud; determining, in the second projection point cloud, the number of second target projection points falling within the outline of the calibration plate in the image; and determining the degree of overlap between the image and the point cloud using the number of second target projection points.
In one example, the number of second target projection points may be taken as the degree of overlap between the image and the point cloud: the greater the number of second target projection points, the higher the degree of overlap.
In another example, the proportion of second target projection points in the calibration plate point cloud may be used to determine the degree of overlap. Specifically, for a calibration plate, the proportion of the number of its second target projection points to the number of points in its calibration plate point cloud can be calculated, and the degree of overlap between the image and the point cloud determined using the second target projection point proportion.
Step 203, acquiring a translation vector between the coordinate system of the laser radar and the coordinate system of the camera, and acquiring the internal parameters of the camera;
The internal parameters (intrinsics) describe the characteristics of the camera. Since the camera coordinate system is measured in millimetres while the image plane is measured in pixels, the internal parameters provide the linear mapping between the two coordinate systems. The internal parameters of the camera may be obtained with a camera calibration tool.
Step 204, determining a plurality of first transformation matrices by using the plurality of first rotation vectors and the translation vector, respectively;
in the embodiment of the application, the translation vector between the camera and the lidar is fixed, and each first transformation matrix consists of a first rotation vector and a fixed translation vector.
Step 205, calculating, for one first transformation matrix, the degree of overlap between the corresponding image and the point cloud using the first transformation matrix and the internal parameters of the camera;
Under different transformation matrices, the relative positions of the image and the point cloud change, and the degree of overlap changes accordingly.
In an embodiment of the present application, the step 205 may include the following sub-steps:
Sub-step S11, acquiring the camera coordinate system of the camera;
Sub-step S12, determining the outline of the calibration plate in the image, and determining the three-dimensional coordinates of the calibration plate point cloud, i.e. the points of the point cloud located on the calibration plate;
The point cloud data collected by the lidar is three-dimensional and is represented in a Cartesian coordinate system (X, Y, Z).
In one example, a point cloud clustering algorithm may be employed to determine the three-dimensional coordinates of the calibration plate point cloud. Specifically, a point cloud clustering algorithm may be adopted to extract a calibration plate point cloud located in the calibration plate from the point cloud; and determining the three-dimensional coordinates of the calibration plate point cloud.
In another example, the reflectivity of the calibration plate to the laser may be used as prior information to determine the three-dimensional coordinates of the calibration plate point cloud. Because objects of different materials reflect the laser to different degrees, a calibration plate with high reflectivity can be chosen. In the collected laser point cloud data, by setting a suitable reflectivity threshold, the laser points whose reflectivity is larger than the threshold can be identified as the points where the laser strikes the calibration plate.
Specifically, the reflectivity of each point in the point cloud can be obtained; the points whose reflectivity is larger than a preset reflectivity threshold are used to determine the calibration plate point cloud located on the calibration plate; and the three-dimensional coordinates of the calibration plate point cloud are determined.
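A minimal sketch of this reflectivity-based extraction (numpy assumed; the threshold value is illustrative):

```python
import numpy as np

def board_points_by_reflectivity(points, reflectivity, threshold=0.8):
    """Keep the three-dimensional coordinates of points whose reflectivity
    exceeds the preset threshold; with a high-reflectivity board these are
    taken as the calibration plate point cloud."""
    points = np.asarray(points)              # (N, 3) lidar coordinates
    reflectivity = np.asarray(reflectivity)  # (N,) per-point reflectivity
    return points[reflectivity > threshold]
```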
In yet another example, the size information of the calibration plate may be used as prior information to determine the three-dimensional coordinates of the calibration plate point cloud. Specifically, the size information of the calibration plate can be obtained; the calibration plate point cloud located on the calibration plate is determined in the point cloud using the size information; and the three-dimensional coordinates of the calibration plate point cloud are determined.
Sub-step S13, projecting the calibration plate point cloud to the image using the first transformation matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud, to obtain a first projection point cloud;
In practice, once the transformation matrix and the camera internal parameters are known, a dedicated software interface may be invoked to perform the projection; for example, the OpenCV projection function projectPoints can project the three-dimensional coordinates into the two-dimensional image.
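A sketch of sub-steps S13-S14 using the OpenCV interface mentioned above. Note that cv2.projectPoints expects an axis-angle rvec, so the (roll, pitch, yaw) rotation vector is first converted; the Z·Y·X rotation order is an assumption, not stated in the original.

```python
import cv2
import numpy as np

def euler_to_rodrigues(r, p, y):
    """Convert the rotation vector (roll r, pitch p, yaw y) into the
    axis-angle vector expected by cv2.projectPoints (Z*Y*X order assumed)."""
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    rvec, _ = cv2.Rodrigues(Rz @ Ry @ Rx)
    return rvec

def count_inside(board_cloud, rpy, tvec, K, dist, board_contour):
    """Project the calibration plate point cloud into the image and count
    the first target projection points falling inside the plate outline."""
    img_pts, _ = cv2.projectPoints(
        np.asarray(board_cloud, dtype=np.float64),  # (N, 3) lidar points
        euler_to_rodrigues(*rpy),                   # candidate rotation
        np.asarray(tvec, dtype=np.float64),         # measured translation
        K, dist)                                    # intrinsics, distortion
    return sum(
        cv2.pointPolygonTest(board_contour, (float(u), float(v)), False) >= 0
        for u, v in img_pts.reshape(-1, 2))
```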
Referring to FIG. 3, a schematic diagram of projecting the calibration plate point cloud onto the image in an embodiment of the present application is shown. As shown in FIG. 3, the degree of overlap between the projection point cloud and the calibration plate in the image is low. Under different transformation matrices, the position of the projection point cloud in the image changes.
Sub-step S14, determining, in the first projection point cloud, the number of first target projection points falling within the outline of the calibration plate in the image;
Sub-step S15, determining the degree of overlap between the image and the point cloud using the number of first target projection points.
In one example, the number of first target projection points may be taken as the degree of overlap between the image and the point cloud: the greater the number of first target projection points, the higher the degree of overlap.
For example, suppose two calibration plates are used and the laser emitted by the laser radar strikes them at 120 and 100 points respectively. Under a certain first transformation matrix, the numbers of first target projection points projected within the outlines of the two calibration plates in the image are 90 and 80 respectively; if the total number of first target projection points over all calibration plates is taken as the degree of overlap, the degree of overlap is 170.
In another example, the proportion of first target projection points in the calibration plate point cloud may be used to determine the degree of overlap. Specifically, sub-step S15 may include: calculating, for a calibration plate, the proportion of the number of its first target projection points to the number of points in its calibration plate point cloud; and determining the degree of overlap between the image and the point cloud using the first target projection point proportions.
For example, continuing the example above, the first target projection point proportions of the two calibration plates are 90/120 = 0.75 and 80/100 = 0.8; if the sum of the per-plate proportions is taken as the degree of overlap, the degree of overlap is 1.55.
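The two overlap definitions above reduce to a few lines (a sketch reproducing the worked numbers):

```python
def overlap_by_count(inside_counts):
    """Degree of overlap as the total number of target projection points."""
    return sum(inside_counts)

def overlap_by_ratio(inside_counts, cloud_sizes):
    """Degree of overlap as the sum of per-plate inside/total proportions."""
    return sum(k / n for k, n in zip(inside_counts, cloud_sizes))

overlap_by_count([90, 80])              # 170
overlap_by_ratio([90, 80], [120, 100])  # 0.75 + 0.8 = 1.55
```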
Step 206, determining the first rotation vector corresponding to the maximum degree of overlap as the rotation vector that calibrates the coordinate system of the laser radar to the coordinate system of the camera.
Referring to FIG. 4, another schematic diagram of projecting the calibration plate point cloud onto the image in an embodiment of the present application is shown. In FIG. 4, at the highest degree of overlap, the projection point cloud of each calibration plate coincides completely with the calibration plate in the image, and the whole image corresponds exactly to the point cloud.
In the embodiments of the present application, with the translation vector between the camera and the laser radar fixed, the first rotation vector giving the highest degree of overlap between the image collected by the camera and the point cloud collected by the laser radar is determined within a preset first rotation vector interval, and the first rotation vector corresponding to the maximum degree of overlap is used as the rotation vector that finally calibrates the coordinate system of the laser radar to the coordinate system of the camera. With the calibration method of the embodiments of the present application, calibrating a medium- or low-accuracy laser radar to the camera can meet the calibration accuracy required by an unmanned vehicle, and automatic calibration can be realized.
Referring to FIG. 5, a flowchart illustrating the steps of an embodiment of a calibration method of the present application is shown. The method is applied to an unmanned vehicle comprising at least one industrial camera, at least one looking-around camera and at least one lidar, each camera and each lidar having its own coordinate system. The method may specifically include the following steps:
Step 501, selecting a target camera from the at least one camera, and taking the coordinate system of the target camera as a reference coordinate system;
the unmanned vehicle may be provided with a plurality of cameras, which may include at least one industrial camera and at least one looking-around camera.
Industrial cameras offer high image stability, high transmission capability, and strong anti-interference capability, and are generally arranged at the front of the unmanned vehicle to collect images of the space ahead.
The looking-around camera has a large field of view; the unmanned vehicle is provided with several looking-around cameras that together cover the full 360-degree area around it, so that the blind zones of the field of view while the vehicle is driving are kept as small as possible.
Choosing the coordinate system of a different camera as the reference coordinate system leads to a different calibration procedure of different complexity. In practice, the target camera may be chosen from among the industrial cameras and looking-around cameras according to the relative positions of the industrial cameras, looking-around cameras, and lidars on the unmanned vehicle.
Referring to fig. 6, a schematic diagram of an unmanned vehicle calibration scenario in an embodiment of the present application is shown. Cameras or lidars can be arranged at the front, rear, left, and right of the unmanned vehicle, and a calibration plate can be placed in the corresponding direction for each camera and lidar that needs calibration. The camera collects images of the calibration plate, and the lidar collects point clouds of the calibration plate.
In one example of an embodiment of the present application, the industrial cameras may include a left industrial camera disposed at the front left and a right industrial camera disposed at the front right, the two industrial cameras together forming a binocular camera.
The lidars may include a front lidar, a rear lidar, a left lidar, and a right lidar, disposed at the front, rear, left, and right of the vehicle respectively.
The looking-around cameras may likewise include a front looking-around camera, a rear looking-around camera, a left looking-around camera, and a right looking-around camera.
For simplicity, the target camera may be selected from among the at least one industrial camera.
In the above example, the left industrial camera may be selected as the target camera, and its coordinate system taken as the reference coordinate system. The coordinate system of the right industrial camera can then be calibrated directly to the reference coordinate system of the left industrial camera.
Step 502, determining, among the at least one lidar, a first lidar associated with the target camera, and calibrating the coordinate system of the first lidar to the reference coordinate system;
The association between a camera and a lidar refers to an association between their shooting spaces: the two can be calibrated directly only if they share a common shooting space. If they have no common shooting space, they are not associated and cannot be calibrated directly. For example, a lidar arranged at the rear of the unmanned vehicle collects a rear point cloud, while an industrial camera arranged at the front collects images of the space ahead; the two have no common shooting space, so they cannot be calibrated directly against each other.
In the above example, the front lidar, the left lidar, and the right lidar each share a common shooting space with the left industrial camera and are therefore associated with it. The coordinate system of a first lidar associated with the target camera can be calibrated directly to the reference coordinate system.
Step 503, determining, among the cameras other than the target camera, a first camera corresponding to the first lidar, and calibrating the coordinate system of the first camera to the coordinate system of the corresponding first lidar;
the correspondence referred to here is a correspondence of orientation. Specifically, a first looking-around camera corresponding to the first lidar may be determined.
In the above example, each looking-around camera is used together with the lidar of the same orientation: the front lidar corresponds to the front looking-around camera, the rear lidar to the rear looking-around camera, the left lidar to the left looking-around camera, and the right lidar to the right looking-around camera.
The coordinate system of the front looking-around camera can be calibrated directly to the coordinate system of the front lidar and is thereby calibrated indirectly to the reference coordinate system; likewise, the coordinate system of the left looking-around camera can be calibrated directly to the left lidar, and the coordinate system of the right looking-around camera directly to the right lidar, each thereby reaching the reference coordinate system indirectly.
Step 504, determining a second lidar not associated with the target camera, and determining a second camera corresponding to the second lidar;
for a second lidar that is not associated with the target camera, the coordinate system of the second lidar cannot be calibrated directly to the reference coordinate system; it can, however, be calibrated indirectly through the second camera corresponding to the second lidar. The second camera corresponding to the second lidar may specifically be a corresponding second looking-around camera.
For example, the rear lidar and the left industrial camera have no common shooting space and are therefore not associated with each other. The rear looking-around camera corresponding to the rear lidar may then be determined.
Step 505, calibrating the coordinate system of the second camera to the coordinate system of the associated first lidar, and calibrating the coordinate system of the second lidar to the coordinate system of the second camera.
In the embodiment of the application, indirect calibration can be achieved by way of the already-calibrated coordinate system of a first lidar.
A first lidar associated with the second camera is determined, the coordinate system of the second camera is calibrated to the coordinate system of that first lidar, and the coordinate system of the second lidar is then calibrated to the coordinate system of the second camera; in this way the coordinate system of the second lidar is calibrated indirectly to the reference coordinate system.
For example, the first lidars associated with the rear looking-around camera may be the left lidar and the right lidar. The coordinate system of the rear looking-around camera may be calibrated to the coordinate system of the left lidar, and the coordinate system of the rear lidar may then be calibrated to the coordinate system of the rear looking-around camera, as the sketch below makes concrete.
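To make the chaining concrete, the pairwise results can be packed into 4x4 homogeneous transforms and multiplied. The sketch below is illustrative only; every numeric extrinsic value in it is a placeholder, not a calibration result, and the variable names are hypothetical.

```python
import numpy as np
import cv2

def rt_to_T(rvec, tvec):
    """Pack a Rodrigues rotation vector and a translation vector into a 4x4
    homogeneous transform mapping source-frame points to target-frame points."""
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(np.asarray(rvec, dtype=np.float64))
    T[:3, 3] = tvec
    return T

# Placeholder extrinsics standing in for the pairwise calibration results:
T_ref_from_left_lidar = rt_to_T([0.01, -1.56, 0.00], [0.30, 0.00, -0.10])      # left lidar -> reference
T_left_lidar_from_rear_cam = rt_to_T([0.00, 3.10, 0.02], [-1.50, 0.00, 0.20])  # rear looking-around camera -> left lidar
T_rear_cam_from_rear_lidar = rt_to_T([0.02, 0.00, 0.01], [0.00, 0.10, 0.00])   # rear lidar -> rear looking-around camera

# Indirect calibration: rear-lidar points expressed in the reference frame.
T_ref_from_rear_lidar = (T_ref_from_left_lidar
                         @ T_left_lidar_from_rear_cam
                         @ T_rear_cam_from_rear_lidar)
```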
In the embodiment of the application, both the calibration between an industrial camera and a lidar and the calibration between a looking-around camera and a lidar can be carried out using the embodiment of the calibration method between the lidar and the camera described above.
The calibration method of this embodiment suits a multi-sensor unmanned vehicle: the industrial cameras, looking-around cameras, and lidars on the vehicle can all be calibrated, directly or indirectly, to a single reference coordinate system, with high calibration accuracy and automatic operation. Other sensors can also be calibrated through the reference coordinate system; for example, the reference coordinate system may be calibrated to an inertial measurement unit (IMU).
It should be noted that, for simplicity of description, the method embodiments are shown as a series of acts, but it should be understood by those skilled in the art that the embodiments are not limited by the order of acts described, as some steps may occur in other orders or concurrently in accordance with the embodiments. Further, those skilled in the art will appreciate that the embodiments described in the specification are all preferred embodiments and that the acts referred to are not necessarily required by the embodiments of the present application.
Referring to fig. 7, a block diagram of an embodiment of a calibration device between a lidar and a camera according to the present application is shown, and may specifically include the following modules:
an image acquisition module 701, configured to acquire an image acquired by the camera for a calibration board and a point cloud acquired by the laser radar for the calibration board;
a first rotation vector determining module 702, configured to determine a plurality of first rotation vectors within a preset first rotation vector interval;
a first contact ratio calculating module 703, configured to calculate a contact ratio between the corresponding image and the point cloud according to each first rotation vector;
and the rotation vector calibration module 704 is used for determining a first rotation vector corresponding to the maximum contact ratio as a rotation vector for calibrating the coordinate system of the laser radar to the coordinate system of the camera.
In the embodiment of the present application, the first contact ratio calculating module 703 may include:
the parameter acquisition sub-module is used for acquiring a translation vector between the coordinate system of the laser radar and the coordinate system of the camera and acquiring an internal reference of the camera;
a first transformation matrix determining sub-module for determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector, respectively;
the first coincidence degree calculating sub-module is used for calculating the coincidence degree between the corresponding image and the point cloud by adopting the first transformation matrix and the internal parameters of the camera aiming at one first transformation matrix.
In an embodiment of the present application, the first contact ratio calculating sub-module may include:
a camera coordinate system acquisition unit configured to acquire a camera coordinate system of the camera;
the image information determining unit is used for determining the outline of the calibration plate in the image, and for determining the three-dimensional coordinates of the calibration plate point cloud, that is, the points of the point cloud located within the calibration plate;
the projection unit is used for projecting the calibration plate point cloud to the image by adopting the first conversion matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud to obtain a first projection point cloud;
The target projection point determining unit is used for determining the number of first target projection points falling into the outline of the calibration plate in the image in the first projection point cloud;
and the first contact ratio determining unit is used for determining the contact ratio of the image and the point cloud by adopting the number of the first target projection points.
In an embodiment of the present application, the first contact ratio determining unit may include:
the projection proportion calculating subunit is used for calculating, for one calibration plate, the ratio of the number of first target projection points corresponding to the calibration plate to the number of points in the calibration plate point cloud of that calibration plate;
and the first contact ratio determining subunit is used for determining the contact ratio of the image and the point cloud by adopting the first target projection point proportion.
In an embodiment of the present application, the first rotation vector determining module 702 may include:
the first rotation vector determining sub-module is used for determining a plurality of first rotation vectors according to a preset radian interval in a preset first rotation vector interval.
In this embodiment of the present application, the preset first rotation vector section includes a preset first roll angle section, a preset first pitch angle section, and a preset first yaw angle section; the first rotation vector determination submodule may include:
The rolling angle determining unit is used for determining a plurality of rolling angles according to a preset radian interval in the preset first rolling angle interval;
the pitch angle determining unit is used for determining a plurality of pitch angles according to the preset radian intervals in the preset first pitch angle interval;
a yaw angle determining unit configured to determine a plurality of yaw angles at the preset radian intervals within the preset first yaw angle section;
and the first rotation vector determining unit is used for selecting one rolling angle from the rolling angles, selecting one pitch angle from the pitch angles and selecting one yaw angle from the yaw angles to be combined to obtain a plurality of first rotation vectors.
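As an illustrative sketch of the combination performed by the first rotation vector determining unit (interval bounds and the radian step are assumed to be given in radians; names are hypothetical):

```python
import numpy as np

def first_rotation_candidates(roll_rng, pitch_rng, yaw_rng, step):
    """Enumerate every (roll, pitch, yaw) combination over the preset first
    rotation vector interval at the preset radian interval `step`."""
    rolls = np.arange(roll_rng[0], roll_rng[1] + step, step)
    pitches = np.arange(pitch_rng[0], pitch_rng[1] + step, step)
    yaws = np.arange(yaw_rng[0], yaw_rng[1] + step, step)
    return [(r, p, y) for r in rolls for p in pitches for y in yaws]
```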
In an embodiment of the present application, the apparatus may further include:
a camera parameter acquisition module for acquiring a horizontal view angle and a vertical view angle of the camera and resolution of the image;
the first radian determining module is used for dividing the horizontal view angle by the width of the resolution to obtain a first radian;
the second radian determining module is used for dividing the vertical field angle by the height of the resolution ratio to obtain a second radian;
And the radian interval determining module is used for taking the smaller radian of the first radian and the second radian as the preset radian interval.
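The intent of this choice of step is that one image pixel subtends roughly fov/resolution radians, so neighbouring rotation candidates move the projected point cloud by about one pixel and the scan cannot step over the optimum. A minimal sketch, assuming both field angles are supplied in radians:

```python
import math

def preset_radian_interval(h_fov_rad, v_fov_rad, width_px, height_px):
    """Take the smaller of the horizontal and vertical per-pixel angles as
    the search step."""
    return min(h_fov_rad / width_px, v_fov_rad / height_px)

# e.g. a 90-by-60-degree camera at 1920x1080 resolution:
step = preset_radian_interval(math.radians(90), math.radians(60), 1920, 1080)
# step is about 0.00082 rad, roughly one pixel of angular resolution
```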
In an embodiment of the present application, the apparatus may further include:
a reference rotation vector determination module configured to determine a reference rotation vector;
and the first rotation vector interval determining module is used for determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
In an embodiment of the present application, the reference rotation vector determining module may include:
the second rotation vector interval acquisition submodule is used for acquiring a preset second rotation vector interval, and the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
the angle adjustment sub-module is used for adjusting the pitch angle in the preset second pitch angle interval and adjusting the yaw angle in the preset second yaw angle interval;
the target angle determining submodule is used for determining a target pitch angle and a target yaw angle when the center of the calibration plate of the image is overlapped with the center of the first projection point cloud;
the second rotation vector determining submodule is used for adjusting the rolling angle in the preset second rolling angle interval under the condition of the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
A reference rotation vector determination sub-module for determining a reference rotation vector from the plurality of second rotation vectors.
In an embodiment of the present application, the reference rotation vector determination submodule may include:
a second conversion matrix determining unit configured to determine a plurality of second conversion matrices using the plurality of second rotation vectors and a translation vector between the coordinate system of the lidar and the coordinate system of the camera, respectively;
a second coincidence degree calculating unit, configured to calculate, for one of the second transformation matrices, a coincidence degree between the corresponding image and the point cloud using the second transformation matrix and an internal reference of the camera;
and a reference rotation vector determination unit configured to determine the second rotation vector corresponding to the maximum coincidence degree as the reference rotation vector.
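Read as pseudocode, the coarse stage carried out by these sub-modules might look like the sketch below; `center_error` (the distance between the calibration plate centre in the image and the centre of the first projection point cloud) and `score` (a coincidence measure at a given roll, pitch, and yaw) are hypothetical callables assumed to be bound to the fixed data.

```python
def find_reference_rotation(pitches, yaws, rolls, center_error, score):
    """Coarse two-stage search: first the (pitch, yaw) pair that best aligns
    the plate centre in the image with the centre of the first projection
    point cloud, then the best roll at that pitch and yaw."""
    t_pitch, t_yaw = min(((p, y) for p in pitches for y in yaws),
                         key=lambda py: center_error(*py))
    best_roll = max(rolls, key=lambda r: score(r, t_pitch, t_yaw))
    return best_roll, t_pitch, t_yaw
```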
In an embodiment of the present application, the image information determining unit may include:
the first calibration board point cloud determining subunit is used for extracting the calibration board point cloud positioned in the calibration board from the point cloud by adopting a point cloud clustering algorithm;
and the first point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
In an embodiment of the present application, the image information determining unit may include:
A reflectivity obtaining subunit, configured to obtain reflectivity of each point in the point cloud;
the second calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate by adopting points with reflectivity larger than a preset reflectivity threshold value;
and the second point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
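A minimal sketch of this reflectivity-based extraction, assuming the lidar returns carry a per-point intensity channel and that the calibration plate reflects noticeably more strongly than its surroundings (array names are hypothetical):

```python
import numpy as np

def plate_cloud_by_reflectivity(points_xyz, intensities, threshold):
    """Keep only the returns whose reflectivity exceeds the preset threshold;
    the surviving points (with their 3D coordinates) are taken as the
    calibration plate point cloud."""
    mask = np.asarray(intensities) > threshold
    return np.asarray(points_xyz)[mask]
```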
In an embodiment of the present application, the image information determining unit may include:
a dimension information obtaining subunit, configured to obtain dimension information of the calibration plate;
the third calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate in the point cloud by adopting the size information of the calibration plate;
and the third point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
Referring to fig. 8, there is shown a block diagram of an embodiment of a calibration device of the present application, the calibration device being applied to an unmanned vehicle, the unmanned vehicle including at least one camera and at least one lidar, the at least one camera and the at least one lidar each having their own coordinate systems, the device may specifically include the following modules:
A reference coordinate system determining module 801, configured to select a target camera from the at least one camera, and use a coordinate system of the target camera as a reference coordinate system;
a first calibration module 802, configured to determine, among the at least one lidar, a first lidar associated with the target camera, and calibrate a coordinate system of the first lidar to the reference coordinate system;
a second calibration module 803, configured to determine, among cameras other than the target camera, a first camera corresponding to the first lidar, and calibrate a coordinate system of the first camera to a coordinate system of the corresponding first lidar.
An unassociated determining module 804, configured to determine a second lidar unassociated with the target camera, and determine a second camera corresponding to the second lidar;
a third calibration module 805 configured to calibrate the coordinate system of the second camera to the coordinate system of the associated first lidar and to calibrate the coordinate system of the second lidar to the coordinate system of the second camera.
In an embodiment of the present application, the at least one camera may include: at least one industrial camera and at least one looking-around camera; the reference coordinate system determination module 801 may include:
And the target camera selecting sub-module is used for selecting one from the at least one industrial camera as a target camera.
In an embodiment of the present application, the second calibration module 803 may include:
and the first looking-around camera determining submodule is used for determining a first looking-around camera corresponding to the first laser radar in the at least one looking-around camera.
In an embodiment of the present application, the unassociated determining module 804 may include:
and the second looking-around camera determining submodule is used for determining a second looking-around camera corresponding to the second laser radar.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
The embodiment of the application also provides a device, which comprises:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the methods described in embodiments of the present application.
One or more machine-readable media are also provided, having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the methods described in the embodiments of the present application.
In this specification, each embodiment is described in a progressive manner, each embodiment focusing on its differences from the others; for identical or similar parts between the embodiments, reference may be made from one to another.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present application may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
Embodiments of the present application are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal device to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal device, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present application have been described, additional variations and modifications of those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the appended claims be interpreted as covering the preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the present application.
Finally, it is further noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or terminal that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or terminal. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article or terminal device comprising the element.
The calibration method between a laser radar and a camera and the calibration device provided by the present application have been described above in detail. Specific examples have been used herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is intended only to help in understanding the method and its core ideas. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in accordance with the ideas of the present application. In view of the above, the contents of this specification should not be construed as limiting the present application.

Claims (32)

1. A method for calibrating between a lidar and a camera, comprising:
acquiring an image acquired by the camera aiming at a calibration plate and a point cloud acquired by the laser radar aiming at the calibration plate;
determining a plurality of first rotation vectors in a preset first rotation vector interval;
calculating the coincidence ratio between the corresponding image and the point cloud according to each first rotation vector;
determining the first rotation vector corresponding to the maximum coincidence ratio as the rotation vector calibrating the coordinate system of the laser radar to the coordinate system of the camera;
the preset first rotation vector interval comprises a preset first rolling angle interval, a preset first pitch angle interval and a preset first yaw angle interval; the determining a plurality of first rotation vectors in a preset first rotation vector interval includes:
determining a plurality of rolling angles according to a preset radian interval in the preset first rolling angle interval;
determining a plurality of pitch angles according to the preset radian intervals in the preset first pitch angle interval;
determining a plurality of yaw angles according to the preset radian intervals in the preset first yaw angle interval;
and selecting one rolling angle from the rolling angles, selecting one pitch angle from the pitch angles, and selecting one yaw angle from the yaw angles for combination to obtain a plurality of first rotation vectors.
2. The method according to claim 1, wherein calculating the coincidence ratio between the corresponding image and the point cloud according to each first rotation vector includes:
acquiring a translation vector between a coordinate system of the laser radar and a coordinate system of a camera, and acquiring an internal reference of the camera;
determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector, respectively;
and aiming at one first conversion matrix, calculating the coincidence ratio between the corresponding image and the point cloud by adopting the first conversion matrix and the internal parameters of the camera.
3. The method of claim 2, wherein said calculating a degree of coincidence between the corresponding image and the point cloud using the first transformation matrix and internal parameters of the camera comprises:
acquiring a camera coordinate system of the camera;
determining the outline of the calibration plate in the image, and determining the three-dimensional coordinates of the calibration plate point cloud, that is, the points of the point cloud located within the calibration plate;
projecting the calibration plate point cloud to the image by adopting the first conversion matrix, the internal reference of the camera and the three-dimensional coordinates of the calibration plate point cloud to obtain a first projection point cloud;
Determining the number of first target projection points falling into the outline of the calibration plate in the image in the first projection point cloud;
and determining the coincidence degree of the image and the point cloud by adopting the number of the first target projection points.
4. A method according to claim 3, wherein said determining the degree of coincidence of said image with said point cloud using the number of said first target projection points comprises:
calculating, for one calibration plate, the ratio of the number of first target projection points corresponding to the calibration plate to the number of points in the calibration plate point cloud of the calibration plate;
and determining the coincidence ratio of the image and the point cloud by adopting the first target projection point proportion.
5. The method as recited in claim 1, further comprising:
acquiring a horizontal view angle and a vertical view angle of the camera and resolution of the image;
dividing the horizontal view angle by the width of the resolution to obtain a first radian;
dividing the vertical field angle by the height of the resolution to obtain a second radian;
and taking the smaller of the first radian and the second radian as the preset radian interval.
6. The method as recited in claim 1, further comprising:
Determining a reference rotation vector;
and determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
7. The method of claim 6, wherein the determining a reference rotation vector comprises:
acquiring a preset second rotation vector interval, wherein the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
adjusting a pitch angle in the preset second pitch angle interval and adjusting a yaw angle in the preset second yaw angle interval;
determining a target pitch angle and a target yaw angle when the center of the calibration plate in the image coincides with the center of the first projection point cloud; the first projection point cloud is obtained by projecting the calibration plate point cloud located within the calibration plate onto the image;
adjusting the rolling angle in the preset second rolling angle interval under the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
from the plurality of second rotation vectors, a reference rotation vector is determined.
8. The method of claim 7, wherein said determining a reference rotation vector from said plurality of second rotation vectors comprises:
Determining a plurality of second transformation matrices by using the plurality of second rotation vectors and translation vectors between the coordinate system of the laser radar and the coordinate system of the camera respectively;
calculating the coincidence ratio between the corresponding image and the point cloud by adopting the second transformation matrix and the internal reference of the camera aiming at one second transformation matrix;
and determining the second rotation vector corresponding to the maximum coincidence ratio as the reference rotation vector.
9. A method according to claim 3, wherein said determining three-dimensional coordinates of a calibration plate point cloud of said point clouds located within said calibration plate comprises:
extracting a calibration plate point cloud positioned in the calibration plate from the point cloud by adopting a point cloud clustering algorithm;
and determining the three-dimensional coordinates of the calibration plate point cloud.
10. A method according to claim 3, wherein said determining three-dimensional coordinates of a calibration plate point cloud of said point clouds located within said calibration plate comprises:
acquiring the reflectivity of each point in the point cloud;
determining a calibration plate point cloud positioned in the calibration plate by adopting a point with reflectivity larger than a preset reflectivity threshold value;
and determining the three-dimensional coordinates of the calibration plate point cloud.
11. A method according to claim 3, wherein said determining three-dimensional coordinates of a calibration plate point cloud of said point clouds located within said calibration plate comprises:
acquiring the size information of the calibration plate;
determining a calibration plate point cloud positioned in the calibration plate in the point cloud by adopting the size information of the calibration plate;
and determining the three-dimensional coordinates of the calibration plate point cloud.
12. A calibration method, characterized by being applied to an unmanned vehicle, the unmanned vehicle comprising at least one camera and at least one lidar, the at least one camera and the at least one lidar each having their own coordinate systems, the method comprising:
selecting a target camera from the at least one camera, and taking a coordinate system of the target camera as a reference coordinate system;
determining, among the at least one lidar, a first lidar associated with the target camera and calibrating a coordinate system of the first lidar to the reference coordinate system using the method of any of claims 1-11;
determining a first camera corresponding to the first laser radar in cameras except the target camera, and calibrating a coordinate system of the first camera to a coordinate system of the corresponding first laser radar;
Determining a second lidar not associated with the target camera, and determining a second camera corresponding to the second lidar;
calibrating the coordinate system of the second camera to the coordinate system of the associated first lidar, and calibrating the coordinate system of the second lidar to the coordinate system of the second camera.
13. The method of claim 12, wherein the at least one camera comprises: at least one industrial camera, at least one looking-around camera; said selecting a target camera from said at least one camera comprises:
selecting one from the at least one industrial camera as a target camera.
14. The method of claim 13, wherein the determining, among the cameras other than the target camera, a first camera corresponding to the first lidar comprises:
in the at least one looking-around camera, a first looking-around camera corresponding to the first lidar is determined.
15. The method of claim 13, wherein the determining a second camera corresponding to the second lidar comprises:
and determining a second looking-around camera corresponding to the second laser radar.
16. A calibration device between a lidar and a camera, comprising:
the image acquisition module is used for acquiring an image acquired by the camera aiming at the calibration plate and a point cloud acquired by the laser radar aiming at the calibration plate;
the first rotation vector determining module is used for determining a plurality of first rotation vectors in a preset first rotation vector interval;
the first contact ratio calculation module is used for calculating the contact ratio between the corresponding image and the point cloud according to each first rotation vector;
the rotation vector calibration module is used for determining a first rotation vector corresponding to the maximum contact ratio as a rotation vector of the coordinate system of the laser radar calibrated to the coordinate system of the camera;
the preset first rotation vector interval comprises a preset first rolling angle interval, a preset first pitch angle interval and a preset first yaw angle interval; the first rotation vector determination module includes:
the rolling angle determining unit is used for determining a plurality of rolling angles according to a preset radian interval in the preset first rolling angle interval;
the pitch angle determining unit is used for determining a plurality of pitch angles according to the preset radian intervals in the preset first pitch angle interval;
A yaw angle determining unit configured to determine a plurality of yaw angles at the preset radian intervals within the preset first yaw angle section;
and the first rotation vector determining unit is used for selecting one rolling angle from the rolling angles, selecting one pitch angle from the pitch angles and selecting one yaw angle from the yaw angles to be combined to obtain a plurality of first rotation vectors.
17. The apparatus of claim 16, wherein the first overlap ratio calculation module comprises:
the parameter acquisition sub-module is used for acquiring a translation vector between the coordinate system of the laser radar and the coordinate system of the camera and acquiring an internal reference of the camera;
a first transformation matrix determining sub-module for determining a plurality of first transformation matrices using the plurality of first rotation vectors and the translation vector, respectively;
the first coincidence degree calculating sub-module is used for calculating the coincidence degree between the corresponding image and the point cloud by adopting the first transformation matrix and the internal parameters of the camera aiming at one first transformation matrix.
18. The apparatus of claim 17, wherein the first overlap ratio calculation submodule comprises:
A camera coordinate system acquisition unit configured to acquire a camera coordinate system of the camera;
the image information determining unit is used for determining the outline of the calibration plate in the image, and for determining the three-dimensional coordinates of the calibration plate point cloud, that is, the points of the point cloud located within the calibration plate;
the projection unit is used for projecting the calibration plate point cloud to the image by adopting the first conversion matrix, the internal parameters of the camera and the three-dimensional coordinates of the calibration plate point cloud to obtain a first projection point cloud;
the target projection point determining unit is used for determining the number of first target projection points falling into the outline of the calibration plate in the image in the first projection point cloud;
and the first contact ratio determining unit is used for determining the contact ratio of the image and the point cloud by adopting the number of the first target projection points.
19. The apparatus according to claim 18, wherein the first overlap ratio determining unit includes:
the projection proportion calculating subunit is used for calculating, for one calibration plate, the ratio of the number of first target projection points corresponding to the calibration plate to the number of points in the calibration plate point cloud of that calibration plate;
and the first contact ratio determining subunit is used for determining the contact ratio of the image and the point cloud by adopting the first target projection point proportion.
20. The apparatus as recited in claim 16, further comprising:
a camera parameter acquisition module for acquiring a horizontal view angle and a vertical view angle of the camera and resolution of the image;
the first radian determining module is used for dividing the horizontal view angle by the width of the resolution to obtain a first radian;
the second radian determining module is used for dividing the vertical field angle by the height of the resolution ratio to obtain a second radian;
and the radian interval determining module is used for taking the smaller radian of the first radian and the second radian as the preset radian interval.
21. The apparatus as recited in claim 16, further comprising:
a reference rotation vector determination module configured to determine a reference rotation vector;
and the first rotation vector interval determining module is used for determining the preset first rotation vector interval by adopting the reference rotation vector and the preset radian interval.
22. The apparatus of claim 21, wherein the reference rotation vector determination module comprises:
the second rotation vector interval acquisition submodule is used for acquiring a preset second rotation vector interval, and the preset second rotation vector interval comprises a preset second rolling angle interval, a preset second pitch angle interval and a preset second yaw angle interval;
The angle adjustment sub-module is used for adjusting the pitch angle in the preset second pitch angle interval and adjusting the yaw angle in the preset second yaw angle interval;
the target angle determining submodule is used for determining a target pitch angle and a target yaw angle when the center of the calibration plate of the image coincides with the center of the first projection point cloud; the first projection point cloud is obtained by projecting the calibration plate point cloud located within the calibration plate onto the image;
the second rotation vector determining submodule is used for adjusting the rolling angle in the preset second rolling angle interval under the condition of the target pitch angle and the target yaw angle to obtain a plurality of second rotation vectors;
a reference rotation vector determination sub-module for determining a reference rotation vector from the plurality of second rotation vectors.
23. The apparatus of claim 22, wherein the reference rotation vector determination submodule comprises:
a second conversion matrix determining unit configured to determine a plurality of second conversion matrices using the plurality of second rotation vectors and a translation vector between the coordinate system of the lidar and the coordinate system of the camera, respectively;
a second coincidence degree calculating unit, configured to calculate, for one of the second transformation matrices, a coincidence degree between the corresponding image and the point cloud using the second transformation matrix and an internal reference of the camera;
And a reference rotation vector determination unit configured to determine the second rotation vector corresponding to the maximum coincidence degree as the reference rotation vector.
24. The apparatus according to claim 18, wherein the image information determining unit includes:
the first calibration board point cloud determining subunit is used for extracting the calibration board point cloud positioned in the calibration board from the point cloud by adopting a point cloud clustering algorithm;
and the first point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
25. The apparatus according to claim 18, wherein the image information determining unit includes:
a reflectivity obtaining subunit, configured to obtain reflectivity of each point in the point cloud;
the second calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate by adopting points with reflectivity larger than a preset reflectivity threshold value;
and the second point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
26. The apparatus according to claim 18, wherein the image information determining unit includes:
a dimension information obtaining subunit, configured to obtain dimension information of the calibration plate;
The third calibration plate point cloud determining subunit is used for determining the calibration plate point cloud positioned in the calibration plate in the point cloud by adopting the size information of the calibration plate;
and the third point cloud coordinate determining subunit is used for determining the three-dimensional coordinates of the point cloud of the calibration plate.
27. A calibration device, characterized in that it is applied to an unmanned vehicle, said unmanned vehicle comprising at least one camera and at least one lidar, said at least one camera and said at least one lidar each having their own coordinate systems, said device comprising:
a reference coordinate system determining module, configured to select a target camera from the at least one camera, and use a coordinate system of the target camera as a reference coordinate system;
a first calibration module for determining, among the at least one lidar, a first lidar associated with the target camera, and calibrating a coordinate system of the first lidar to the reference coordinate system using the method of any of claims 1-11;
the second calibration module is used for determining a first camera corresponding to the first laser radar in cameras except the target camera and calibrating a coordinate system of the first camera to a coordinate system of the corresponding first laser radar;
A non-association determination module for determining a second lidar that is not associated with the target camera, and determining a second camera that corresponds to the second lidar;
and the third calibration module is used for calibrating the coordinate system of the second camera to the coordinate system of the associated first laser radar and calibrating the coordinate system of the second laser radar to the coordinate system of the second camera.
28. The apparatus of claim 27, wherein the at least one camera comprises: at least one industrial camera, at least one looking-around camera; the reference coordinate system determination module includes:
and the target camera selecting sub-module is used for selecting one from the at least one industrial camera as a target camera.
29. The apparatus of claim 28, wherein the second calibration module comprises:
and the first looking-around camera determining submodule is used for determining a first looking-around camera corresponding to the first laser radar in the at least one looking-around camera.
30. The apparatus of claim 28, wherein the non-association determination module comprises:
and the second looking-around camera determining submodule is used for determining a second looking-around camera corresponding to the second laser radar.
31. A calibration device, comprising:
one or more processors; and
one or more machine readable media having instructions stored thereon, which when executed by the one or more processors, cause the apparatus to perform the method of any of claims 1-15.
32. A machine readable medium having instructions stored thereon, which when executed by one or more processors, cause the processors to perform the method of any of claims 1-15.
CN201910425720.5A 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera Active CN110221275B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910425720.5A CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera
PCT/CN2020/089722 WO2020233443A1 (en) 2019-05-21 2020-05-12 Method and device for performing calibration between lidar and camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910425720.5A CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera

Publications (2)

Publication Number Publication Date
CN110221275A CN110221275A (en) 2019-09-10
CN110221275B true CN110221275B (en) 2023-06-23

Family

ID=67821629

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910425720.5A Active CN110221275B (en) 2019-05-21 2019-05-21 Calibration method and device between laser radar and camera

Country Status (2)

Country Link
CN (1) CN110221275B (en)
WO (1) WO2020233443A1 (en)

Families Citing this family (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110221275B (en) * 2019-05-21 2023-06-23 菜鸟智能物流控股有限公司 Calibration method and device between laser radar and camera
CN112823294B (en) * 2019-09-18 2024-02-02 北京航迹科技有限公司 System and method for calibrating cameras and multi-line lidar
CN112578396B (en) * 2019-09-30 2022-04-19 上海禾赛科技有限公司 Method and device for coordinate transformation between radars and computer-readable storage medium
CN112669388B (en) * 2019-09-30 2022-06-21 上海禾赛科技有限公司 Calibration method and device for laser radar and camera device and readable storage medium
CN110596683B (en) * 2019-10-25 2021-03-26 中山大学 Multi-group laser radar external parameter calibration system and method thereof
CN110988801A (en) * 2019-10-25 2020-04-10 东软睿驰汽车技术(沈阳)有限公司 Radar installation angle adjusting method and device
CN110853101B (en) * 2019-11-06 2022-08-23 深圳市巨力方视觉技术有限公司 Camera position calibration method and device and computer readable storage medium
CN112785649A (en) * 2019-11-11 2021-05-11 北京京邦达贸易有限公司 Laser radar and camera calibration method and device, electronic equipment and medium
CN111179358B (en) * 2019-12-30 2024-01-05 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN113077517B (en) * 2020-01-03 2022-06-24 湖南科天健光电技术有限公司 Spatial light measurement system calibration device and method based on light beam straight line characteristics
CN111122128B (en) * 2020-01-03 2022-04-19 浙江大华技术股份有限公司 Calibration method and device of spherical camera
CN113866779A (en) * 2020-06-30 2021-12-31 上海商汤智能科技有限公司 Point cloud data fusion method and device, electronic equipment and storage medium
CN111918203B (en) * 2020-07-03 2022-10-28 武汉万集信息技术有限公司 Target transport vehicle positioning method and device, storage medium and electronic equipment
CN112017250B (en) * 2020-08-31 2023-07-25 杭州海康威视数字技术股份有限公司 Calibration parameter determination method and device, radar equipment and Lei Qiu relay system
CN112017251B (en) * 2020-10-19 2021-02-26 杭州飞步科技有限公司 Calibration method and device, road side equipment and computer readable storage medium
CN112233188B (en) * 2020-10-26 2024-03-12 南昌智能新能源汽车研究院 Calibration method of data fusion system of laser radar and panoramic camera
CN112180348B (en) * 2020-11-27 2021-03-02 深兰人工智能(深圳)有限公司 Attitude calibration method and device for vehicle-mounted multi-line laser radar
CN112363130B (en) * 2020-11-30 2023-11-14 东风汽车有限公司 Vehicle-mounted sensor calibration method, storage medium and system
CN112446927B (en) * 2020-12-18 2024-08-30 广东电网有限责任公司 Laser radar and camera combined calibration method, device, equipment and storage medium
CN112881999B (en) * 2021-01-25 2024-02-02 上海西虹桥导航技术有限公司 Semi-automatic calibration method for multi-line laser radar and vision sensor
US11418771B1 (en) 2021-01-31 2022-08-16 Techman Robot Inc. Method for calibrating 3D camera by employing calibrated 2D camera
EP4040391B1 (en) * 2021-02-09 2024-05-29 Techman Robot Inc. Method for calibrating 3d camera by employing calibrated 2d camera
CN113009456B (en) * 2021-02-22 2023-12-05 中国铁道科学研究院集团有限公司 Vehicle-mounted laser radar data calibration method, device and system
CN113156407B (en) * 2021-02-24 2023-09-05 长沙行深智能科技有限公司 Vehicle-mounted laser radar external parameter joint calibration method, system, medium and device
CN112946591A (en) * 2021-02-26 2021-06-11 商汤集团有限公司 External parameter calibration method and device, electronic equipment and storage medium
CN112819861B (en) * 2021-02-26 2024-06-04 广州小马慧行科技有限公司 Point cloud motion compensation method, device and computer readable storage medium
CN112946612B (en) * 2021-03-29 2024-05-17 上海商汤临港智能科技有限公司 External parameter calibration method and device, electronic equipment and storage medium
CN113188569B (en) * 2021-04-07 2024-07-23 东软睿驰汽车技术(沈阳)有限公司 Coordinate system calibration method, equipment and storage medium for vehicle and laser radar
CN113177988B (en) * 2021-04-30 2023-12-05 中德(珠海)人工智能研究院有限公司 Spherical screen camera and laser calibration method, device, equipment and storage medium
CN113436278A (en) * 2021-07-22 2021-09-24 深圳市道通智能汽车有限公司 Calibration method, calibration device, distance measurement system and computer readable storage medium
CN113790738A (en) * 2021-08-13 2021-12-14 上海智能网联汽车技术中心有限公司 Data compensation method based on intelligent cradle head IMU
CN113744344B (en) * 2021-08-18 2023-09-08 富联裕展科技(深圳)有限公司 Calibration method, device, equipment and storage medium of laser equipment
CN113643382B (en) * 2021-08-22 2023-10-10 浙江大学 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera
CN113838141B (en) * 2021-09-02 2023-07-25 中南大学 External parameter calibration method and system for single-line laser radar and visible light camera
CN113884278B (en) * 2021-09-16 2023-10-27 杭州海康机器人股份有限公司 System calibration method and device for line laser equipment
CN114035187B (en) * 2021-10-26 2024-08-23 北京国家新能源汽车技术创新中心有限公司 Perception fusion method of automatic driving system
CN114022566A (en) * 2021-11-04 2022-02-08 安徽省爱夫卡电子科技有限公司 Combined calibration method for single line laser radar and camera
CN113740829A (en) * 2021-11-05 2021-12-03 新石器慧通(北京)科技有限公司 External parameter monitoring method and device for environment sensing equipment, medium and running device
CN114022569B (en) * 2021-11-18 2024-06-07 湖北中烟工业有限责任公司 Method and device for measuring square accuracy of box body based on vision
CN114152935B (en) * 2021-11-19 2023-02-03 苏州一径科技有限公司 Method, device and equipment for evaluating radar external parameter calibration precision
CN114167393A (en) * 2021-12-02 2022-03-11 新境智能交通技术(南京)研究院有限公司 Position calibration method and device for traffic radar, storage medium and electronic equipment
CN114549651B (en) * 2021-12-03 2024-08-02 聚好看科技股份有限公司 Calibration method and device for multiple 3D cameras based on polyhedral geometric constraint
CN114371472B (en) * 2021-12-15 2024-07-12 中电海康集团有限公司 Automatic combined calibration device and method for laser radar and camera
CN114494806A (en) * 2021-12-17 2022-05-13 湖南国天电子科技有限公司 Target identification method, system, device and medium based on multivariate information fusion
CN114460552A (en) * 2022-01-21 2022-05-10 苏州皓宇云联科技有限公司 Road-end multi-sensor combined calibration method based on high-precision map
CN114779188B (en) * 2022-01-24 2023-11-03 南京慧尔视智能科技有限公司 Method, device, equipment and medium for evaluating calibration effect
CN116897300A (en) * 2022-02-10 2023-10-17 华为技术有限公司 Calibration method and device
CN114755662B (en) * 2022-03-21 2024-04-30 北京航空航天大学 Road-vehicle fusion perception laser radar and GPS calibration method and device
CN114723715B (en) * 2022-04-12 2023-09-19 小米汽车科技有限公司 Vehicle target detection method, device, equipment, vehicle and medium
CN115166701B (en) * 2022-06-17 2024-04-09 清华大学 System calibration method and device for RGB-D camera and laser radar
CN115856849B (en) * 2023-02-28 2023-05-05 季华实验室 Depth camera and 2D laser radar calibration method and related equipment
CN116540219B (en) * 2023-07-04 2023-09-22 北醒(北京)光子科技有限公司 Method and device for correcting radar emergent light angle, storage medium and electronic equipment
CN116630444B (en) * 2023-07-24 2023-09-29 中国矿业大学 Optimization method for fusion calibration of camera and laser radar
CN116740197B (en) * 2023-08-11 2023-11-21 之江实验室 External parameter calibration method and device, storage medium and electronic equipment
CN117073581B (en) * 2023-09-12 2024-01-26 梅卡曼德(北京)机器人科技有限公司 Calibration method and device of line laser profilometer system and electronic equipment
CN117607829B (en) * 2023-12-01 2024-06-18 哈尔滨工业大学(深圳)(哈尔滨工业大学深圳科技创新研究院) Ordered reconstruction method of laser radar point cloud and computer readable storage medium
CN117630892B (en) * 2024-01-25 2024-03-29 北京科技大学 Combined calibration method and system for visible light camera, infrared camera and laser radar

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7437226B2 (en) * 2003-08-20 2008-10-14 Samsung Electronics Co., Ltd. Method of constructing artificial mark for autonomous driving, apparatus and method of determining position of intelligent system using artificial mark and intelligent system employing the same
CN107167790B (en) * 2017-05-24 2019-08-09 北京控制工程研究所 A two-step laser radar calibration method based on a calibration field
CN109118542B (en) * 2017-06-22 2021-11-23 阿波罗智能技术(北京)有限公司 Calibration method, device, equipment and storage medium between laser radar and camera
CN109521403B (en) * 2017-09-19 2020-11-20 百度在线网络技术(北京)有限公司 Parameter calibration method, device and equipment of multi-line laser radar and readable medium
CN109029284B (en) * 2018-06-14 2019-10-22 大连理工大学 A geometric-constraint-based calibration method for a three-dimensional laser scanner and camera
CN109215063B (en) * 2018-07-05 2021-12-17 中山大学 Registration method for an event-triggered camera and three-dimensional laser radar
CN110221275B (en) * 2019-05-21 2023-06-23 菜鸟智能物流控股有限公司 Calibration method and device between laser radar and camera

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018195999A1 (en) * 2017-04-28 2018-11-01 SZ DJI Technology Co., Ltd. Calibration of laser and vision sensors
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 Method, apparatus and computer-readable recording medium for determining calibration parameters
CN109151439A (en) * 2018-09-28 2019-01-04 上海爱观视觉科技有限公司 A vision-based automatic tracking camera system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Camera and laser radar calibration method based on a trapezoidal checkerboard; Jia Ziyong; Ren Guoquan; Li Dongwei; Cheng Ziyang; Journal of Computer Applications (07); full text *

Also Published As

Publication number Publication date
CN110221275A (en) 2019-09-10
WO2020233443A1 (en) 2020-11-26

Similar Documents

Publication Publication Date Title
CN110221275B (en) Calibration method and device between laser radar and camera
CN110244282B (en) Combined multi-camera and laser radar system and joint calibration method thereof
US8427472B2 (en) Multidimensional evidence grids and system and methods for applying same
US20220276339A1 (en) Calibration method and apparatus for sensor, and calibration system
CN112907676A (en) Calibration method, device and system of sensor, vehicle, equipment and storage medium
CN112816949B (en) Sensor calibration method and device, storage medium and calibration system
CN112949478B (en) Target detection method based on a pan-tilt camera
CN110910459B (en) Camera device calibration method and device and calibration equipment
CN111815716A (en) Parameter calibration method and related device
CN111739104A (en) Calibration method and device of laser calibration system and laser calibration system
CN111862180B (en) Camera set pose acquisition method and device, storage medium and electronic equipment
CN207766424U (en) A photographing apparatus and imaging device
CN111383279A (en) External parameter calibration method and device and electronic equipment
KR101342393B1 (en) Georeferencing Method of Indoor Omni-Directional Images Acquired by Rotating Line Camera
CN113034612B (en) Calibration device, method and depth camera
CN111220126A (en) Space object pose measurement method based on point features and monocular camera
CN114820725A (en) Target display method and device, electronic equipment and storage medium
KR20170020629A (en) Apparatus for registration of point clouds
CN117250956A (en) Mobile robot obstacle avoidance method and obstacle avoidance device with multiple observation sources fused
CN115239816A (en) Camera calibration method, system, electronic device and storage medium
CN112669388B (en) Calibration method and device for laser radar and camera device and readable storage medium
CN112556596B (en) Three-dimensional deformation measurement system, method, device and storage medium
CN212163540U (en) Omnidirectional stereoscopic vision camera configuration system
CN113421300A (en) Method and device for determining actual position of object in fisheye camera image
CN112804515A (en) Omnidirectional stereoscopic vision camera configuration system and camera configuration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant