CN111505606B - Method and device for calibrating relative pose of multi-camera and laser radar system - Google Patents

Method and device for calibrating relative pose of multi-camera and laser radar system

Info

Publication number
CN111505606B
CN111505606B (application CN202010291167.3A)
Authority
CN
China
Prior art keywords
camera
coordinate system
laser
laser radar
calibration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010291167.3A
Other languages
Chinese (zh)
Other versions
CN111505606A (en)
Inventor
黄玉春
张博
陈玉曦
陈宇飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN202010291167.3A
Publication of CN111505606A
Application granted
Publication of CN111505606B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497 Means for monitoring or calibrating
    • G01S7/4972 Alignment of sensor
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

A method and device for calibrating the relative pose of a multi-camera and laser radar system. The method comprises: establishing a world coordinate system in a calibration field and calibrating the extrinsic parameters of each sensor in the multi-camera and laser radar system relative to that world coordinate system; converting the camera coordinate system into the laser coordinate system and solving the corresponding transformation matrix; generating a three-dimensional model from the laser data, mapping images captured by the multiple cameras at a given moment onto the corresponding positions of the three-dimensional model according to the relative pose between the cameras and the laser coordinate system, and attaching gray information to the point cloud; and optimizing the camera extrinsic parameters according to the degree of feature matching between overlapping areas of the colored point clouds in the geometric stitching result, finally obtaining more accurate extrinsic parameters of the multiple cameras in the laser coordinate system and thereby calibrating the relative pose of the multi-camera and laser radar system. The invention optimizes the relative pose relationship between the cameras, plays a key role in image stitching and measurement, and offers high stability and sufficient precision.

Description

Method and device for calibrating relative pose of multi-camera and laser radar system
Technical Field
The invention belongs to the field of sensor calibration and multi-sensor fusion, and particularly relates to a method and device for calibrating the relative pose of a multi-camera and laser radar system.
Background
Sensor calibration is an indispensable link in a vehicle-mounted multi-sensor integrated system. In particular, the calibrated relative pose of camera and laser radar is the basis for fusing image data with point cloud data, and is of great significance for practical engineering measurement. For example, when measuring a tunnel, linear array cameras are used to acquire high-resolution image data of the tunnel; these images are geometrically stitched with the point cloud data acquired by the laser radar to obtain a panoramic image of the whole tunnel, which greatly improves the efficiency of tunnel measurement and inspection.
Therefore, how to accurately and efficiently calibrate the relative pose between camera and laser radar is a technical problem that urgently needs to be solved.
Disclosure of Invention
Aiming at the defects or improvement requirements of the prior art, the invention provides a method and a device for calibrating the relative pose of a multi-camera and laser radar system, thereby solving the technical problem of obtaining the pose transformation relationship between multiple cameras and a laser radar with high precision and high efficiency.
In order to achieve the above object, the present invention provides a method for calibrating the relative pose of a multi-camera and laser radar system, comprising the following steps:
step 1, establishing a world coordinate system in a calibration field;
step 2, calibrating the extrinsic parameters of each sensor in the multi-camera and laser radar system relative to the world coordinate system;
step 3, converting the camera coordinate system to the laser coordinate system, and solving the transformation matrix of this conversion;
step 4, generating a three-dimensional model from the laser data, mapping a group of images with a certain degree of overlap, captured by the multiple cameras at a given moment, onto the corresponding positions of the three-dimensional model according to the relative pose between the camera coordinate system and the laser coordinate system, and attaching gray information to the point cloud to generate a colored point cloud;
step 5, optimizing the extrinsic parameters of the cameras according to the degree of feature matching between overlapping areas of the colored point clouds in the geometric stitching result, finally obtaining more accurate extrinsic parameters of the multiple cameras in the laser coordinate system, thereby calibrating the relative pose of the multi-camera and laser radar system.
Moreover, step 1 is realized by selecting an origin position and three coordinate axis directions in the calibration field and establishing a three-dimensional space coordinate system, such that the point coordinates on the calibration frame conform to actual spatial cognition and are easy to represent and calculate in this coordinate system; it is recorded as the world coordinate system O_w-X_wY_wZ_w.
Furthermore, step 2 comprises the following sub-steps:
step 2.1, arranging the multi-camera and laser radar system to be calibrated, which consists of multiple cameras and a laser radar; the cameras are arranged radially along their main optical axis directions, and the laser radar is mounted on one side of the plane in which the optical centers of the cameras lie;
step 2.2, setting up two types of calibration markers, one for the cameras and one for the laser radar, and pasting or welding the markers onto an arc-surface calibration frame, so that the multi-camera and laser radar system can simultaneously acquire data of both types of calibration marker areas when working in a certain pose;
step 2.3, collecting multiple groups of calibration marker data using the laser radar and multi-camera system carried on a mobile platform;
step 2.4, measuring the coordinates of the calibration marker feature points in the world coordinate system, where the feature points are pattern edge intersection points or spatial plane edge intersection points on the calibration markers;
step 2.5, obtaining the coordinates of the calibration marker feature points in the camera and laser coordinate systems respectively, and calculating the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system, where R_lw and T_lw denote the rotation matrix and translation vector from the laser radar to the world coordinate system, and R_cw and T_cw denote the rotation matrix and translation vector from the camera to the world coordinate system.
Furthermore, in step 3, the camera coordinate system O_c-X_cY_cZ_c is converted to the laser coordinate system O_l-X_lY_lZ_l, and the transformation matrix [R_cl | T_cl] of this conversion is obtained as follows.

Combining the expressions of the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system,

[X_w Y_w Z_w]^T = R_lw [X_l Y_l Z_l]^T + T_lw
[X_w Y_w Z_w]^T = R_cw [X_c Y_c Z_c]^T + T_cw

the transformation matrix [R_cl | T_cl] from the camera coordinate system to the laser coordinate system satisfies

[X_l Y_l Z_l]^T = R_cl [X_c Y_c Z_c]^T + T_cl

with

R_cl = R_lw^T R_cw
T_cl = R_lw^T (T_cw - T_lw)

where R_cl and T_cl denote the rotation matrix and translation vector from the camera coordinate system to the laser coordinate system.
Furthermore, step 4 comprises the following sub-steps:
step 4.1, generating a three-dimensional model from the laser radar raw data: according to the data generation modes of different types of laser radar, recovering the poses of multiple groups of laser scans, dividing the raw data into multiple groups of data at multiple moments, and stitching them onto the same coordinate reference, thereby recovering a three-dimensional scene model from the raw data;
step 4.2, substituting the coordinates of the three-dimensional model points and the extrinsic parameters of the cameras in the laser coordinate system into the imaging model of each camera to obtain the coordinates of the model points in each camera's image plane coordinate system, thereby determining the mapping relationship between the images and the point cloud model points; according to this relationship, each three-dimensional model point that maps into a camera's imaging range is assigned the gray value of the corresponding image position, so that image features are embodied on the three-dimensional model.
Furthermore, step 5 is implemented as follows:
based on the mapping relationship generated in step 4.2, a group of images with a certain degree of overlap captured by the multiple cameras at a given moment is mapped onto the three-dimensional model, each camera yielding one piece of colored point cloud; the group of colored point clouds is placed in the same space, the degree of matching of the gray features in the overlapping areas is compared, and the transformation matrix [R_cl | T_cl] is adjusted according to the matching condition of the overlapping areas and the properties of orthogonal transformations, thereby optimizing the calibration result.
Moreover, the method is used for tunnel measurement.
The invention also provides a device for calibrating the relative pose of a multi-camera and laser radar system, which is used for executing the above method.
In general, compared with the prior art, the above technical solution contemplated by the present invention can achieve the following beneficial effects:
the invention is based on a device composed of multiple cameras and a laser radar, obtains the relative pose relationship between the laser radar and the cameras, optimizes the problem of inaccurate relative pose relationship between the cameras, and plays a key role in image splicing and measurement.
Drawings
FIG. 1 is a schematic diagram of a multi-camera and lidar system according to an embodiment of the invention;
FIG. 2 is a schematic flow chart of a method provided by an embodiment of the present invention;
FIG. 3 is a schematic diagram of a calibration marker and its layout according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a coordinate system provided by an embodiment of the present invention;
FIG. 5 is a schematic diagram of adjusting the relative pose between cameras based on the matching condition of the colored point clouds, provided by an embodiment of the invention.
Detailed Description
The technical solutions of the present invention are described below with reference to the accompanying drawings and examples so that those skilled in the art can better understand the present invention.
Referring to fig. 2, an embodiment of the present invention provides a method for calibrating the relative pose of multiple cameras and a laser radar, in which the multiple sensors are unified into one coordinate reference, and the extrinsic parameters of the multiple cameras, obtained by independent calibration, are jointly optimized by means of the laser point cloud. The specific implementation includes the following steps:
(1) Referring to FIG. 4, a world coordinate system O_w-X_wY_wZ_w is established in the calibration field;
Preferably, step (1) comprises:
selecting proper origin position in calibration fieldAnd three coordinate axis directions, establishing a three-dimensional space coordinate system, enabling the point position coordinate values on the calibration frame to conform to actual space cognition and be easy to represent and calculate under the coordinate system, and defining the coordinate system as a world coordinate system O w -X w Y w Z w
(2) Calibrating the extrinsic parameters of each sensor in the multi-camera and laser radar system relative to the world coordinate system;
Preferably, step (2) comprises:
(2.1) Referring to fig. 1, the sensor system to be calibrated in the present invention consists of multiple cameras and a laser radar, and is therefore referred to as a multi-camera and laser radar system. To acquire a wider range of images during data acquisition, the cameras are arranged radially along their main optical axis directions, with the optical centers of the cameras as shown in the figure. To facilitate the subsequent mapping from images to the point cloud, the laser radar is mounted on one side of, and close to, the plane in which the optical centers of the cameras lie. Limited by machining precision, the actual pose relationships among the sensors deviate somewhat from the design drawings, so the sensors must be calibrated individually and then jointly optimized;
(2.2) Referring to fig. 3, two types of calibration markers are set up, one for the cameras and one for the laser radar. To simulate a tunnel environment, the markers are pasted or welded onto an arc-surface calibration frame, so that the multi-camera and laser radar system can simultaneously acquire data of both types of calibration marker areas when working in a certain pose;
(2.3) Collecting multiple groups of calibration marker data using the laser radar and multi-camera system of (2.1) carried on a mobile platform;
(2.4) Measuring the coordinates of the calibration marker feature points in the world coordinate system by means such as total station measurement or laser scanner scanning and modeling; the feature points are pattern edge intersection points or spatial plane edge intersection points on the calibration markers whose exact positions are easy to select;
(2.5) Obtaining the coordinates of the calibration marker feature points in the camera and laser coordinate systems respectively, using a semi-automatic or automatic method, and calculating with a calibration algorithm the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system, where R_lw and T_lw denote the rotation matrix and translation vector from the laser radar to the world coordinate system, and R_cw and T_cw denote the rotation matrix and translation vector from the camera to the world coordinate system. In specific implementation, the calibration algorithm may adopt the prior art, which the present invention does not describe in detail.
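As an illustration only (the patent does not prescribe a particular algorithm for sub-step (2.5)), the camera-to-world extrinsics could be estimated from the 3D-2D feature point correspondences with a standard PnP solver. The sketch below uses OpenCV's solvePnP and assumes the camera intrinsic matrix K and distortion coefficients dist are already known; all names are illustrative.

```python
# Hypothetical sketch of sub-step (2.5): estimate one camera's extrinsics
# relative to the world (calibration-field) frame from marker feature points.
import cv2
import numpy as np

def camera_to_world_extrinsics(world_pts, image_pts, K, dist):
    """world_pts: (N, 3) marker feature points in the world frame.
    image_pts: (N, 2) pixel coordinates of the same points in the image.
    Returns (R_cw, T_cw) such that X_world = R_cw @ X_cam + T_cw."""
    ok, rvec, tvec = cv2.solvePnP(world_pts.astype(np.float64),
                                  image_pts.astype(np.float64),
                                  K, dist)
    if not ok:
        raise RuntimeError("PnP failed")
    R_wc, _ = cv2.Rodrigues(rvec)   # rotation taking world -> camera
    R_cw = R_wc.T                   # invert the rigid transform
    T_cw = -R_wc.T @ tvec.reshape(3)
    return R_cw, T_cw
```

The laser-to-world extrinsics [R_lw | T_lw] would come from a corresponding point cloud registration step, which the patent likewise leaves to existing calibration algorithms.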
(3) Referring to fig. 4, the camera coordinate system O_c-X_cY_cZ_c is converted to the laser coordinate system O_l-X_lY_lZ_l, and the transformation matrix [R_cl | T_cl] of this conversion is solved, where R_cl and T_cl denote the rotation matrix and translation vector from the camera coordinate system to the laser coordinate system;
Preferably, step (3) comprises:
Let [X_w Y_w Z_w]^T, [X_l Y_l Z_l]^T and [X_c Y_c Z_c]^T be the coordinates of a point in space in the world, laser radar and camera coordinate systems, respectively. Combining the expressions of the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system,

[X_w Y_w Z_w]^T = R_lw [X_l Y_l Z_l]^T + T_lw
[X_w Y_w Z_w]^T = R_cw [X_c Y_c Z_c]^T + T_cw

the transformation matrix [R_cl | T_cl] from the camera coordinate system to the laser coordinate system satisfies

[X_l Y_l Z_l]^T = R_cl [X_c Y_c Z_c]^T + T_cl

with

R_cl = R_lw^T R_cw
T_cl = R_lw^T (T_cw - T_lw)

where R_cl and T_cl denote the rotation matrix and translation vector from the camera coordinate system to the laser coordinate system.
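A minimal numerical check of these two closed-form expressions (illustrative code, not part of the patent; the inputs are random stand-ins for calibrated extrinsics):

```python
import numpy as np

def camera_to_laser(R_lw, T_lw, R_cw, T_cw):
    """Compose the camera->laser transform from the two world-frame
    extrinsics, per R_cl = R_lw^T R_cw and T_cl = R_lw^T (T_cw - T_lw)."""
    R_cl = R_lw.T @ R_cw
    T_cl = R_lw.T @ (T_cw - T_lw)
    return R_cl, T_cl

rng = np.random.default_rng(0)
def random_rotation():
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    return q * np.sign(np.linalg.det(q))   # force det = +1

R_lw, T_lw = random_rotation(), rng.normal(size=3)
R_cw, T_cw = random_rotation(), rng.normal(size=3)
R_cl, T_cl = camera_to_laser(R_lw, T_lw, R_cw, T_cw)

# Mapping camera -> world -> laser must agree with the direct transform.
Xc = rng.normal(size=3)                 # point in the camera frame
Xw = R_cw @ Xc + T_cw                   # camera -> world
Xl_via_world = R_lw.T @ (Xw - T_lw)     # world -> laser
assert np.allclose(R_cl @ Xc + T_cl, Xl_via_world)
```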
(4) Generating a three-dimensional model from the laser data; mapping a group of images with a certain degree of overlap, captured by the multiple cameras at a given moment, onto the corresponding positions of the three-dimensional model through the camera imaging model, according to the relative pose between the camera coordinate system and the laser coordinate system; and attaching gray information to the point cloud to generate a colored point cloud;
Preferably, step (4) comprises:
(4.1) Generating a three-dimensional model from the laser radar raw data. Limited by the data generation mode, the raw data of many laser radars on a mobile platform often cannot directly reflect detailed three-dimensional spatial information. In such cases, according to the data generation modes of the different types of laser radar, the poses of multiple groups of laser scans can be recovered using methods such as a POS (position and orientation system), the raw data divided into multiple groups of data at multiple moments, and these groups stitched onto the same coordinate reference, thereby recovering a three-dimensional scene model from the raw data;
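As a sketch of the stitching in (4.1), assuming the per-group poses into the common reference have already been recovered (e.g. from POS data); the function name and data layout are illustrative assumptions:

```python
import numpy as np

def stitch_scans(scan_groups, poses):
    """scan_groups: list of (N_i, 3) arrays, each in its own scanner frame.
    poses: list of (R_i, T_i) pairs mapping each group into the common
    coordinate reference. Returns one merged (sum N_i, 3) point cloud."""
    merged = [pts @ R.T + T for pts, (R, T) in zip(scan_groups, poses)]
    return np.vstack(merged)
```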
(4.2) Substituting the coordinates of the three-dimensional model points and the extrinsic parameters of the cameras in the laser coordinate system into the imaging model of each camera yields the coordinates of the model points in each camera's image plane coordinate system, which gives the mapping relationship between the images and the point cloud model points. According to this relationship, each three-dimensional model point that maps into a camera's imaging range is assigned the gray value of the corresponding image position; that is, gray information is attached to three-dimensional model data that originally recorded only spatial position information (X, Y, Z), so that image features are embodied on the three-dimensional model. Compared with the original images, in which different object-space regions lie at different distances from the camera's principal point and thus have non-uniform resolution, the colored point cloud is generated from the true spatial positions of the objects in the images through a rigorous camera imaging model, eliminating the influence of non-uniform resolution.
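For concreteness, a simplified sketch of the coloring in (4.2) under an ideal pinhole imaging model with no lens distortion (an assumption; a rigorous imaging model would include distortion terms). All names and the nearest-pixel gray lookup are illustrative:

```python
import numpy as np

def color_point_cloud(points_l, R_cl, T_cl, K, gray_img):
    """points_l: (N, 3) model points in the laser frame.
    [R_cl | T_cl]: camera->laser transform from step (3), inverted here
    to bring laser points into the camera frame.
    K: 3x3 intrinsic matrix; gray_img: HxW grayscale image.
    Returns (M, 4) rows [X, Y, Z, gray] for points that map in-frame."""
    pts_c = (points_l - T_cl) @ R_cl          # laser -> camera: R_cl^T (X_l - T_cl)
    in_front = pts_c[:, 2] > 0                # drop points behind the camera
    pts_c = pts_c[in_front]
    uvw = pts_c @ K.T                         # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]             # perspective division
    u = np.round(uv[:, 0]).astype(int)
    v = np.round(uv[:, 1]).astype(int)
    h, w = gray_img.shape
    ok = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    return np.column_stack([points_l[in_front][ok], gray_img[v[ok], u[ok]]])
```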
(5) Optimizing the extrinsic parameters of the cameras with a suitable strategy, according to the degree of feature matching between overlapping areas of the colored point clouds in the geometric stitching result, to finally obtain more accurate extrinsic parameters of the multiple cameras in the laser coordinate system, thereby calibrating the relative pose of the multi-camera and laser radar system.
Preferably, step (5) comprises:
Referring to fig. 5, based on the mapping relationship generated in (4.2), a group of images with a certain degree of overlap captured by the multiple cameras at a given moment is mapped onto the three-dimensional model, each camera yielding one piece of colored point cloud. The group of colored point clouds is placed in the same space and the degree of matching of the gray features in the overlapping areas is compared; the transformation matrix [R_cl | T_cl] is then adjusted according to the matching condition of the overlapping areas and the properties of orthogonal transformations, which optimizes the calibration result. For example, let the transformation matrices from the coordinate systems of two adjacent cameras to the laser radar coordinate system be [R_cl1 | T_cl1] and [R_cl2 | T_cl2], where R_cl1, T_cl1 and R_cl2, T_cl2 denote the rotation matrices and translation vectors of the two cameras under conversion to the laser coordinate system. The degree of gray feature matching of the colored point clouds is obtained by mapping the two cameras onto the three-dimensional model; R_cl1 and R_cl2 are then left-multiplied by the rotation matrix generated from the required rotation adjustment, or T_cl1 and T_cl2 are changed according to the required translation adjustment, so that the corresponding gray features of the colored point clouds of adjacent cameras match best. Adjusting the transformation matrices in this way indirectly optimizes the relative pose among the multiple cameras; the values of the required rotation and translation adjustments can be obtained by semi-automatic or automatic matching.
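A minimal sketch of one such adjustment step (illustrative assumptions: an axis-angle parameterization of the corrective rotation and a manually chosen nudge; the patent leaves finding the required rotation and translation to semi-automatic or automatic matching):

```python
import numpy as np

def rotation_from_axis_angle(axis, angle_rad):
    """Rodrigues' formula: corrective rotation about a unit axis."""
    k = np.asarray(axis, dtype=float)
    k /= np.linalg.norm(k)
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(angle_rad) * K + (1 - np.cos(angle_rad)) * K @ K

def adjust_extrinsics(R_cl, T_cl, d_axis, d_angle, d_translation):
    """Left-multiply the corrective rotation and shift the translation,
    as in step (5); the result remains a valid rigid transform."""
    dR = rotation_from_axis_angle(d_axis, d_angle)
    return dR @ R_cl, T_cl + np.asarray(d_translation, dtype=float)

# Example: nudge camera 1 by 0.1 degrees about Z and 2 mm along X, then
# re-evaluate the gray-feature matching between the overlapping clouds
# (the matching itself is the semi-/automatic step of the method):
# R_cl1, T_cl1 = adjust_extrinsics(R_cl1, T_cl1,
#                                  d_axis=(0, 0, 1),
#                                  d_angle=np.deg2rad(0.1),
#                                  d_translation=(0.002, 0, 0))
```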
While the invention has been described with respect to specific embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.
In specific implementation, the above process can be automated with computer software technology, and a device running the method should also fall within the protection scope of the present invention.
The specific examples described herein are merely illustrative of the spirit of the invention. Those skilled in the art may make various modifications, additions or similar substitutions to the specific embodiments described without departing from the spirit of the invention or exceeding the scope defined in the appended claims.

Claims (7)

1. A method for calibrating the relative pose of a multi-camera and laser radar system, characterized by comprising the following steps:
step 1, establishing a world coordinate system in a calibration field;
step 2, calibrating the extrinsic parameters of each sensor in the multi-camera and laser radar system relative to the world coordinate system;
step 3, converting the camera coordinate system to the laser coordinate system, and solving the transformation matrix of this conversion;
step 4, generating a three-dimensional model from the laser data, mapping a group of images with a certain degree of overlap, captured by the multiple cameras at a given moment, onto the corresponding positions of the three-dimensional model through the camera imaging model, according to the relative pose between the camera coordinate system and the laser coordinate system, and attaching gray information to the point cloud to generate a colored point cloud; step 4 comprises the following sub-steps:
step 4.1, generating a three-dimensional model from the laser radar raw data: according to the data generation modes of different types of laser radar, recovering the poses of multiple groups of laser scans, dividing the raw data into multiple groups of data at multiple moments, and stitching them onto the same coordinate reference, thereby recovering a three-dimensional scene model from the raw data;
step 4.2, substituting the coordinates of the three-dimensional model points and the extrinsic parameters of the cameras in the laser coordinate system into the imaging model of each camera to obtain the coordinates of the model points in each camera's image plane coordinate system, thereby determining the mapping relationship between the images and the point cloud model points; according to this relationship, each three-dimensional model point that maps into a camera's imaging range is assigned the gray value of the corresponding image position, so that image features are embodied on the three-dimensional model;
step 5, optimizing the extrinsic parameters of the cameras according to the degree of feature matching between overlapping areas of the colored point clouds in the geometric stitching result, finally obtaining more accurate extrinsic parameters of the multiple cameras in the laser coordinate system, thereby calibrating the relative pose of the multi-camera and laser radar system.
2. The method for calibrating the relative pose of a multi-camera and laser radar system according to claim 1, characterized in that: step 1 is realized by selecting an origin position and three coordinate axis directions in the calibration field and establishing a three-dimensional space coordinate system, such that the point coordinates on the calibration frame conform to actual spatial cognition and are easy to represent and calculate in this coordinate system; it is recorded as the world coordinate system O_w-X_wY_wZ_w.
3. The method for calibrating the relative pose of a multi-camera and laser radar system according to claim 2, characterized in that step 2 comprises the following sub-steps:
step 2.1, arranging the multi-camera and laser radar system to be calibrated, which consists of multiple cameras and a laser radar; the cameras are arranged radially along their main optical axis directions, and the laser radar is mounted on one side of the plane in which the optical centers of the cameras lie;
step 2.2, setting up two types of calibration markers, one for the cameras and one for the laser radar, and pasting or welding the markers onto an arc-surface calibration frame, so that the multi-camera and laser radar system can simultaneously acquire data of both types of calibration marker areas when working in a certain pose;
step 2.3, collecting multiple groups of calibration marker data using the laser radar and multi-camera system carried on a mobile platform;
step 2.4, measuring the coordinates of the calibration marker feature points in the world coordinate system, where the feature points are pattern edge intersection points or spatial plane edge intersection points on the calibration markers;
step 2.5, obtaining the coordinates of the calibration marker feature points in the camera and laser coordinate systems respectively, and calculating the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system, where R_lw and T_lw denote the rotation matrix and translation vector from the laser radar to the world coordinate system, and R_cw and T_cw denote the rotation matrix and translation vector from the camera to the world coordinate system.
4. The method for calibrating the relative pose of a multi-camera and laser radar system according to claim 3, characterized in that: in step 3, the camera coordinate system O_c-X_cY_cZ_c is converted to the laser coordinate system O_l-X_lY_lZ_l, and the transformation matrix [R_cl | T_cl] of this conversion is obtained as follows:

combining the expressions of the transformation matrices [R_lw | T_lw] and [R_cw | T_cw] from the laser radar and the camera to the world coordinate system,

[X_w Y_w Z_w]^T = R_lw [X_l Y_l Z_l]^T + T_lw
[X_w Y_w Z_w]^T = R_cw [X_c Y_c Z_c]^T + T_cw

the transformation matrix [R_cl | T_cl] from the camera coordinate system to the laser coordinate system satisfies

[X_l Y_l Z_l]^T = R_cl [X_c Y_c Z_c]^T + T_cl

with

R_cl = R_lw^T R_cw
T_cl = R_lw^T (T_cw - T_lw)

where R_cl and T_cl denote the rotation matrix and translation vector from the camera coordinate system to the laser coordinate system.
5. The method for calibrating the relative pose of a multi-camera and laser radar system according to claim 4, characterized in that step 5 is implemented as follows:
based on the mapping relationship generated in step 4.2, a group of images with a certain degree of overlap captured by the multiple cameras at a given moment is mapped onto the three-dimensional model, each camera yielding one piece of colored point cloud; the group of colored point clouds is placed in the same space, the degree of matching of the gray features in the overlapping areas is compared, and the transformation matrix [R_cl | T_cl] is adjusted according to the matching condition of the overlapping areas and the properties of orthogonal transformations, thereby optimizing the calibration result.
6. The method for calibrating the relative pose of a multi-camera and laser radar system according to any one of claims 1 to 5, characterized in that: the method is used for tunnel measurement.
7. A device for calibrating the relative pose of a multi-camera and laser radar system, characterized in that: it is used for performing the method for calibrating the relative pose of a multi-camera and laser radar system according to any one of claims 1 to 6.
CN202010291167.3A 2020-04-14 2020-04-14 Method and device for calibrating relative pose of multi-camera and laser radar system Active CN111505606B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010291167.3A CN111505606B (en) 2020-04-14 2020-04-14 Method and device for calibrating relative pose of multi-camera and laser radar system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010291167.3A CN111505606B (en) 2020-04-14 2020-04-14 Method and device for calibrating relative pose of multi-camera and laser radar system

Publications (2)

Publication Number Publication Date
CN111505606A CN111505606A (en) 2020-08-07
CN111505606B 2022-08-05

Family

ID=71864125

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010291167.3A Active CN111505606B (en) 2020-04-14 2020-04-14 Method and device for calibrating relative pose of multi-camera and laser radar system

Country Status (1)

Country Link
CN (1) CN111505606B (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112254670B (en) * 2020-10-15 2022-08-12 天目爱视(北京)科技有限公司 3D information acquisition equipment based on optical scanning and intelligent vision integration
CN112489110A (en) * 2020-11-25 2021-03-12 西北工业大学青岛研究院 Optical hybrid three-dimensional imaging method for underwater dynamic scene
CN112362054B (en) * 2020-11-30 2022-12-16 上海商汤临港智能科技有限公司 Calibration method, calibration device, electronic equipment and storage medium
CN112561841A (en) * 2020-12-04 2021-03-26 深兰人工智能(深圳)有限公司 Point cloud data fusion method and device for laser radar and camera
CN112710235B (en) * 2020-12-21 2022-08-26 阿波罗智联(北京)科技有限公司 Calibration method and device of structured light measuring sensor
CN112581451B (en) * 2020-12-21 2024-02-23 北京科技大学 Warehouse coil position detection system and method based on laser radar
CN112598750B (en) * 2020-12-22 2024-05-10 阿波罗智联(北京)科技有限公司 Road side camera calibration method and device, electronic equipment and storage medium
CN112907727B (en) * 2021-01-25 2023-09-01 中国科学院空天信息创新研究院 Calibration method, device and system of relative transformation matrix
CN113075683B (en) * 2021-03-05 2022-08-23 上海交通大学 Environment three-dimensional reconstruction method, device and system
CN113109791B (en) * 2021-03-15 2022-11-22 上海交通大学 Vehicle-mounted laser radar calibration result evaluation system and method
CN113570671A (en) * 2021-06-30 2021-10-29 惠州市德赛西威智能交通技术研究院有限公司 Fisheye camera calibration method, storage medium and electronic device
CN113643382B (en) * 2021-08-22 2023-10-10 浙江大学 Method and device for acquiring dense colored point cloud based on rotary laser fusion camera
CN114157808B (en) * 2021-12-13 2022-11-29 北京国泰星云科技有限公司 Efficient container gate image acquisition system and method
CN114578328B (en) * 2022-02-24 2023-03-17 苏州驾驶宝智能科技有限公司 Automatic calibration method for spatial positions of multiple laser radars and multiple camera sensors
CN115082570B (en) * 2022-07-01 2024-03-19 中国科学院宁波材料技术与工程研究所 Calibration method for laser radar and panoramic camera
CN114994648B (en) * 2022-08-05 2022-11-08 聚时科技(深圳)有限公司 External parameter calibration method for 2D laser radar on linear motion mechanism
CN115222826B (en) * 2022-09-15 2022-12-27 深圳大学 Three-dimensional reconstruction method and device with changeable relative poses of structured light and camera
CN115220012A (en) * 2022-09-20 2022-10-21 成都睿芯行科技有限公司 Positioning method based on reflecting plate
CN115343299B (en) * 2022-10-18 2023-03-21 山东大学 Lightweight highway tunnel integrated detection system and method
CN117315018B (en) * 2023-08-31 2024-04-26 上海理工大学 User plane pose detection method, equipment and medium based on improved PnP
CN116973895B (en) * 2023-09-22 2023-12-12 山东科技大学 Real-time pose correction method for laser point cloud matching

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101192825B1 (en) * 2011-06-30 2012-10-18 서울시립대학교 산학협력단 Apparatus and method for lidar georeferencing based on integration of gps, ins and image at
CN104880215B (en) * 2015-06-05 2017-07-28 立得空间信息技术股份有限公司 The method of multisensor platform camera laser integration calibration
CN105758426B (en) * 2016-02-19 2019-07-26 深圳市杉川机器人有限公司 The combined calibrating method of the multisensor of mobile robot
CN109949372B (en) * 2019-03-18 2021-12-10 北京智行者科技有限公司 Laser radar and vision combined calibration method
CN110827199B (en) * 2019-10-29 2021-04-27 武汉大学 Tunnel image splicing method and device based on guidance of laser range finder

Also Published As

Publication number Publication date
CN111505606A (en) 2020-08-07

Similar Documents

Publication Publication Date Title
CN111505606B (en) Method and device for calibrating relative pose of multi-camera and laser radar system
Luhmann Close range photogrammetry for industrial applications
JP6282098B2 (en) Calibration apparatus and method
CN112733428B (en) Scanning attitude and coverage path planning method for optical measurement
CN108362205B (en) Space distance measuring method based on fringe projection
CN112991467B (en) Camera-based laser projection identification automatic guiding positioning and real-time correction method
CN112815843B (en) On-line monitoring method for printing deviation of workpiece surface in 3D printing process
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
CN111561868A (en) Method for realizing non-contact measurement of antenna profile by utilizing optical tracking structure optical scanner
US20230105991A1 (en) Method of imaging a wind turbine rotor blade
Mi et al. A vision-based displacement measurement system for foundation pit
CN110044266B (en) Photogrammetry system based on speckle projection
WO2018168757A1 (en) Image processing device, system, image processing method, article manufacturing method, and program
CN115187612A (en) Plane area measuring method, device and system based on machine vision
CN113375594A (en) Aircraft skin profile digital detection method
CN112665523B (en) Combined measurement method for complex profile
CN116592766A (en) Precise three-dimensional measurement method and device based on fusion of laser and monocular vision
CN113658270B (en) Method, device, medium and system for multi-vision calibration based on workpiece hole center
KR20240056516A (en) Method and system for generating camera model for camera calibration
CN112935562A (en) Laser precision machining method based on paraxial offline measurement
JP2012013593A (en) Calibration method for three-dimensional shape measuring machine, and three-dimensional shape measuring machine
CN114279326B (en) Global positioning method of three-dimensional scanning equipment
US20230333028A1 (en) Method and system for inspecting a surface with artifical intelligence assist
Yang et al. Beam orientation of EAST visible optical diagnostic using a robot-camera system
CN116593490B (en) Nondestructive testing method and system for surface defects of soft rubber mold of wing wallboard

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant