CN111325801B - Combined calibration method for laser radar and camera - Google Patents
- Publication number: CN111325801B (application CN202010076301.8A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- camera
- laser radar
- target
- cubic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G: Physics; G06: Computing, calculating or counting; G06T: Image data processing or generation, in general; G06T7/00: Image analysis)
- G01S7/497: Means for monitoring or calibrating (G: Physics; G01: Measuring, testing; G01S: Radio direction-finding, radio navigation, and analogous arrangements using other waves; G01S7/48: Details of systems according to group G01S17/00)
Abstract
The invention relates to a combined calibration method for a laser radar (lidar) and a camera. The sensors shoot a cubic target a single time. From the laser radar shot, a three-dimensional point cloud of the cubic target is obtained, giving the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system of the cubic target. From the camera shot, an image of the cubic target is obtained, giving the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the same world coordinate system. Finally, a coordinate-system transformation using these two pairs of rotation and translation matrices yields the rotation matrix R_LC and translation matrix T_LC between the laser radar and the camera.
Description
Technical Field
The invention belongs to the technical field of sensor fusion calibration, and particularly relates to the calibration of the external (extrinsic) parameters between a laser radar and a camera.
Background
A laser radar (lidar) works on the time-of-flight (ToF) principle: it continuously emits laser pulses toward a target, receives the light reflected by the object with a sensor, and measures distance from the round-trip flight time between emission and reception. Vision measurement uses a camera to capture a measurement image and exploits the precise mapping between image information and the geometric information of the object space. In a fused lidar-camera scheme, the laser radar accurately acquires the three-dimensional structure of an object while the camera captures its rich texture and color, so combining their advantages extracts the most information about the object.
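The ToF distance relation described above can be illustrated with a tiny sketch (the constant and function names are ours, not the patent's):

```python
# Time-of-flight ranging: the lidar measures the round-trip time of a
# laser pulse; distance = speed_of_light * round_trip_time / 2.
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance (m) to the target from the measured round-trip time (s)."""
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse returning after 200 ns corresponds to roughly 30 m.
print(tof_distance(200e-9))
```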
Joint calibration of the laser radar and the vision sensor is the basis for accurate measurement. Extrinsic calibration of the two sensors divides into online and offline methods: online calibration calibrates the sensors while the system is in use, whereas offline calibration is performed before use. Online methods suit vehicles or robots that cannot conveniently be calibrated offline; where offline calibration is feasible, however, it generally provides more accurate results.
Offline calibration of a laser radar and a vision sensor usually adopts Zhang's calibration method: a planar checkerboard target is printed, placed at different positions, and shot by the laser radar and the camera in turn; since the relative pose between the laser radar and the camera is fixed, this provides geometric constraints from which the extrinsic parameters between them are solved. However, this method requires the target to be moved repeatedly and measured at many positions to obtain an accurate result, which is cumbersome and inefficient.
In view of these shortcomings of the traditional calibration method, the invention provides a three-dimensional calibration method between a laser radar and a camera: the extrinsic parameter matrix between them is obtained from a single measurement, so the procedure is simple and efficient.
Disclosure of Invention
The invention aims to solve the low efficiency of traditional laser radar and camera calibration and provides a novel method for calibrating the two sensors together. The technical scheme is as follows:
A combined calibration method for a laser radar and a camera: the sensors are aimed at a cubic target and shoot it once. The laser radar shot yields a three-dimensional point cloud of the cubic target, from which the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system of the cubic target are obtained. The camera shot yields an image of the cubic target, from which the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the same world coordinate system are obtained. Finally, a coordinate-system transformation using these two pairs of matrices yields the rotation matrix R_LC and translation matrix T_LC between the laser radar and the camera. The method comprises the following steps:
(1) make a cubic box as the calibration target; on three adjacent faces π1, π2, π3 of the box, arrange identical black-and-white checkerboard targets, where the side length of each checker square and the number of squares are known;
(2) fix the laser radar and the camera, place the cubic target at the center of their fields of view, and turn the three checkerboard faces toward the laser radar and the camera;
(3) shoot the target with the laser radar to obtain a depth map of the target;
(4) convert the depth map into a three-dimensional point cloud in the laser radar coordinate system according to the pinhole imaging principle;
(5) fit the equations of the three planes π1, π2, π3 in the laser radar coordinate system with the random sample consensus (RANSAC) algorithm:

a1·x + b1·y + c1·z + d1 = 0
a2·x + b2·y + c2·z + d2 = 0
a3·x + b3·y + c3·z + d3 = 0

From these equations obtain the unit normal vectors n1^L, n2^L, n3^L of π1, π2, π3, choosing their directions so that they conform to a right-handed coordinate system:

n_i^L = (a_i, b_i, c_i) / sqrt(a_i^2 + b_i^2 + c_i^2), i = 1, 2, 3

(6) compute the coordinates (x_O^L, y_O^L, z_O^L) of the intersection point O of the planes π1, π2, π3 by solving the three plane equations simultaneously;

(7) compute the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system:

R_LW = [n1^L, n2^L, n3^L]^(-1)
T_LW = -(x_O^L, y_O^L, z_O^L)^T
(8) shoot with the camera to obtain an image of the cubic target containing the three checkerboards, and split it into three parts, each containing exactly one checkerboard image;
(9) using Zhang's calibration method, compute the rotation matrices R1, R2, R3 and translation matrices T1, T2, T3 from the planes π1, π2, π3 to the camera coordinate system;
(10) compute the normal vectors n1^C, n2^C, n3^C of the planes π1, π2, π3 in the camera coordinate system:

n1^C = R1·(0, 0, 1)^T
n2^C = R2·(0, 0, 1)^T
n3^C = R3·(0, 0, 1)^T
(11) compute the equations of the planes π1, π2, π3 in the camera coordinate system:

([x, y, z] - T1^T)·n1^C = 0
([x, y, z] - T2^T)·n2^C = 0
([x, y, z] - T3^T)·n3^C = 0
(12) compute the coordinates (x_O^C, y_O^C, z_O^C) of the intersection point O in the camera coordinate system:

(x_O^C, y_O^C, z_O^C)^T = [n1^C, n2^C, n3^C]^(-1)·(T1^T·n1^C, T2^T·n2^C, T3^T·n3^C)^T
(13) compute the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the world coordinate system:

R_CW = [n1^C, n2^C, n3^C]^(-1)
T_CW = -(x_O^C, y_O^C, z_O^C)^T
(14) compute the rotation matrix R_LC and translation matrix T_LC from the laser radar coordinate system to the camera coordinate system.
Drawings
FIG. 1 is a flow chart of the joint calibration of the extrinsic parameters of the laser radar and the camera adopted by the invention.
FIG. 2 is a schematic diagram of a cubic calibration target used in the present invention.
FIG. 3 is a schematic diagram of coordinate transformation from a laser radar coordinate system to a world coordinate system in a calibration process of the present invention.
FIG. 4 is a schematic diagram of coordinate system transformation from a camera coordinate system to a world coordinate system in the calibration process of the present invention.
Detailed Description
The invention provides a combined calibration method for a laser radar and a camera: the laser radar and the camera each shoot the cubic target provided by the invention once, and the extrinsic parameter matrix between them is obtained. The target is a cube whose side length need not be known; three identical checkerboard targets are arranged on three adjacent faces of the cube, and the side length and number of the checker squares are known. The entire target is shown in FIG. 2.
Specifically, the calibration target used in the invention is a cubic target on whose three adjacent faces π1, π2, π3 identical black-and-white checkerboard targets are arranged; the side length and number of the checker squares are known, while the remaining dimensions of the cube need not be specified.
During shooting with the laser radar and the camera, the calibration target should be approximately at the center of both fields of view, and the camera must be able to see all three checkerboard faces, so that good images are obtained.
The world coordinate system of the cubic target is the right-handed rectangular coordinate system O_W-X_W Y_W Z_W whose coordinate planes X_W O_W Y_W, Y_W O_W Z_W and Z_W O_W X_W are the three cube faces π1, π2 and π3, respectively.
The laser radar shoots the cubic target and obtains its depth map, which can be converted into a three-dimensional point cloud of the target using the laser radar's internal parameters and the pinhole imaging principle. In this point cloud, the equations of the three target faces in the laser radar coordinate system are fitted with the random sample consensus (RANSAC) algorithm, and the rotation matrix R_LW and translation matrix T_LW between the laser radar coordinate system and the world coordinate system are solved by coordinate-system transformation.
The camera shoots the cubic target and obtains an image containing the checkerboards of all three faces π1, π2, π3; the image is divided into three parts, each containing exactly one checkerboard. With Zhang's calibration method, the equations of the three planes π1, π2, π3 in the camera coordinate system are obtained, and the rotation matrix R_CW and translation matrix T_CW between the camera coordinate system and the world coordinate system are solved by coordinate-system transformation.
Finally, coordinate transformation yields the rotation matrix R_LC and translation matrix T_LC between the laser radar coordinate system and the camera coordinate system.
The specific process of calibration is as follows:
1. Fix the laser radar and the camera; neither may be moved during the calibration process.
2. Place the cubic target at the center of the fields of view of the laser radar and the camera so that both can shoot it clearly, with the three checkerboard faces π1, π2, π3 facing the lenses.
3. Switch on the laser radar and the camera and shoot the target once each, obtaining stable data: the laser radar yields a depth map of the target, and the camera yields a color image of the target.
4. Using the internal parameters of the laser radar, convert the depth map it captured into a three-dimensional point cloud of the target according to the pinhole imaging principle:

x_P^L = (m - u0)·dx·z / f
y_P^L = (n - v0)·dy·z / f
z_P^L = z

where (x_P^L, y_P^L, z_P^L) are the coordinates of a point of the three-dimensional point cloud in the laser radar coordinate system, z is the depth value of the corresponding depth-map pixel, m and n are the column and row positions of that pixel in the depth map, f is the focal length of the laser radar, dx and dy are the physical sizes of a pixel of the laser radar's photosensitive chip, and (u0, v0) is the position of the optical center of the laser radar.
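Step 4 can be sketched in NumPy as follows (an illustrative implementation of the pinhole back-projection described above; the pairing of the column index with u0 and the row index with v0 is our assumption, since the original equation image is not reproduced in the text):

```python
import numpy as np

def depth_to_pointcloud(depth, f, dx, dy, u0, v0):
    """Back-project a depth map into a 3-D point cloud in the lidar frame.

    depth is a (rows, cols) array of depth values; f is the focal length,
    dx/dy the physical pixel sizes, (u0, v0) the optical centre.
    """
    rows, cols = depth.shape
    # m varies along columns, n along rows (default 'xy' meshgrid indexing)
    m, n = np.meshgrid(np.arange(cols), np.arange(rows))
    z = depth
    x = (m - u0) * dx * z / f
    y = (n - v0) * dy * z / f
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

# A flat 3x3 depth map at 1 m with the optical centre at pixel (1, 1):
# the centre ray back-projects to (0, 0, 1).
cloud = depth_to_pointcloud(np.ones((3, 3)), f=1.0, dx=1.0, dy=1.0, u0=1.0, v0=1.0)
print(cloud[4])
```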
5. In the three-dimensional point cloud of the target, fit the three target planes π1, π2, π3 with the random sample consensus (RANSAC) algorithm to obtain their equations in the laser radar coordinate system:

a1·x + b1·y + c1·z + d1 = 0
a2·x + b2·y + c2·z + d2 = 0
a3·x + b3·y + c3·z + d3 = 0
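The plane fitting in step 5 can be sketched with a minimal RANSAC loop (our own illustrative implementation, not the patent's code; in practice the three faces are fitted in turn, removing each fitted plane's inliers before the next fit):

```python
import numpy as np

def ransac_plane(points, thresh=0.01, iters=500, seed=0):
    """Fit a plane a*x + b*y + c*z + d = 0 to a point cloud with RANSAC.
    Returns ((a, b, c, d), inlier_mask), with (a, b, c) a unit normal."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(iters):
        # Hypothesise a plane from three random points.
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p1 - p0, p2 - p0)
        norm = np.linalg.norm(normal)
        if norm < 1e-12:              # degenerate (collinear) sample
            continue
        normal = normal / norm
        d = -normal @ p0
        # Score by the number of points within `thresh` of the plane.
        mask = np.abs(points @ normal + d) < thresh
        if mask.sum() > best_mask.sum():
            best_mask, best_plane = mask, (*normal, d)
    return best_plane, best_mask
```

Running this three times, masking out each fitted plane's inliers, yields the three face equations used in the following steps.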
6. Compute the unit normal vectors n1^L, n2^L, n3^L of the planes π1, π2, π3, choosing their directions to conform to a right-handed coordinate system, as shown in FIG. 3.
7. Compute the coordinates (x_O^L, y_O^L, z_O^L) of the intersection point O of the planes π1, π2, π3.
8. Compute the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system:

R_LW = [n1^L, n2^L, n3^L]^(-1)
T_LW = -(x_O^L, y_O^L, z_O^L)^T
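Steps 6 through 8 amount to a small linear-algebra computation; a sketch follows (function names are ours), using the formulas above, R_LW = [n1, n2, n3]^(-1) and T_LW = -O:

```python
import numpy as np

def plane_intersection(planes):
    """Common point O of three planes given as (a, b, c, d) with
    a*x + b*y + c*z + d = 0: solve the 3x3 linear system."""
    A = np.array([p[:3] for p in planes], dtype=float)
    d = np.array([p[3] for p in planes], dtype=float)
    return np.linalg.solve(A, -d)

def lidar_to_world(n1, n2, n3, O):
    """R_LW and T_LW from the three right-handed unit face normals
    (stacked as the columns of [n1, n2, n3]) and their intersection O."""
    R_LW = np.linalg.inv(np.column_stack([n1, n2, n3]))
    T_LW = -np.asarray(O, dtype=float)
    return R_LW, T_LW

# Toy check: the planes x=1, y=2, z=3 meet at O=(1, 2, 3); with axis-aligned
# normals, R_LW is the identity and T_LW = -O.
O = plane_intersection([(1, 0, 0, -1), (0, 1, 0, -2), (0, 0, 1, -3)])
R_LW, T_LW = lidar_to_world(np.array([1., 0., 0.]), np.array([0., 1., 0.]),
                            np.array([0., 0., 1.]), O)
print(O, T_LW)
```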
9. The camera shoots an image of the cubic target containing the three checkerboards; the approximate region of each checkerboard is cut out, giving three checkerboard target images.
10. Detect the pixel coordinates of all corner points in the three checkerboard images with the Harris corner detection method.
11. Input the size and number of the checker squares, the camera's internal parameters, and the pixel coordinates of all corner points of each checkerboard into Zhang's calibration method to compute the rotation matrices R1, R2, R3 and translation matrices T1, T2, T3 from the planes π1, π2, π3 to the camera coordinate system.
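The pose-extraction core of Zhang's method used in step 11 (recovering R_i, T_i for one checkerboard plane from its homography H and the camera intrinsics K) can be sketched as follows. This is an illustrative NumPy version under our own naming; in practice one would typically use a library routine such as OpenCV's calibration functions:

```python
import numpy as np

def pose_from_homography(H, K):
    """Recover the plane-to-camera pose (R, t) from a checkerboard
    homography H and intrinsics K (the core step of Zhang's method):
    K^-1 H is proportional to [r1, r2, t], and r3 = r1 x r2."""
    M = np.linalg.inv(K) @ H
    lam = 1.0 / np.linalg.norm(M[:, 0])   # remove the homography's free scale
    r1, r2 = lam * M[:, 0], lam * M[:, 1]
    r3 = np.cross(r1, r2)
    t = lam * M[:, 2]
    R = np.column_stack([r1, r2, r3])
    # With noisy corners, R is only approximately a rotation; project it
    # back onto SO(3) with an SVD.
    U, _, Vt = np.linalg.svd(R)
    return U @ Vt, t

# Round trip on a synthetic pose: build H = K [r1, r2, t] and recover it.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
th = 0.2
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([0.1, 0.2, 2.0])
H = K @ np.column_stack([R_true[:, 0], R_true[:, 1], t_true])
R, t = pose_from_homography(H, K)
```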
12. In Zhang's method, each of the planes π1, π2, π3 has the plane equation Z = 0 in its own checkerboard target coordinate system, so its unit normal vector there is n_Z = (0, 0, 1)^T. Compute the representations n1^C, n2^C, n3^C of the normals of π1, π2, π3 in the camera coordinate system:

n1^C = R1·(0, 0, 1)^T
n2^C = R2·(0, 0, 1)^T
n3^C = R3·(0, 0, 1)^T
13. Compute the equations of the planes π1, π2, π3 in the camera coordinate system:

([x, y, z] - T1^T)·n1^C = 0
([x, y, z] - T2^T)·n2^C = 0
([x, y, z] - T3^T)·n3^C = 0

14. Solve the three equations simultaneously to obtain the coordinates of the intersection point O of π1, π2, π3 in the camera coordinate system:

(x_O^C, y_O^C, z_O^C)^T = [n1^C, n2^C, n3^C]^(-1)·(T1^T·n1^C, T2^T·n2^C, T3^T·n3^C)^T
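Steps 13 and 14 reduce to a 3x3 linear system: each plane contributes the constraint n_i · X = n_i · T_i. A sketch (our naming):

```python
import numpy as np

def corner_in_camera_frame(normals, translations):
    """Intersection O of the three checkerboard planes in the camera frame:
    stack the constraints n_i . X = n_i . T_i and solve for X."""
    N = np.vstack(normals)                       # rows n1, n2, n3
    b = np.array([n @ t for n, t in zip(normals, translations)])
    return np.linalg.solve(N, b)

# Toy check with axis-aligned normals: the planes x=1, y=2, z=3 meet at (1, 2, 3).
O = corner_in_camera_frame(
    [np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([0., 0., 1.])],
    [np.array([1., 5., 5.]), np.array([5., 2., 5.]), np.array([5., 5., 3.])])
print(O)
```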
15. Compute the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the world coordinate system:

R_CW = [n1^C, n2^C, n3^C]^(-1)
T_CW = -(x_O^C, y_O^C, z_O^C)^T
16. Compute the rotation matrix R_LC and translation matrix T_LC from the laser radar coordinate system to the camera coordinate system.
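The final formula of step 16 is not reproduced in the text. Under the convention that X_W = R_LW·X_L + T_LW and X_W = R_CW·X_C + T_CW (our assumption about the patent's convention), chaining the two transforms gives R_LC = R_CW^(-1)·R_LW and T_LC = R_CW^(-1)·(T_LW - T_CW); a sketch:

```python
import numpy as np

def compose_extrinsics(R_LW, T_LW, R_CW, T_CW):
    """Lidar->camera extrinsics from the lidar->world and camera->world
    pairs, assuming X_W = R @ X + T for both (our convention)."""
    R_WC = np.linalg.inv(R_CW)
    R_LC = R_WC @ R_LW
    T_LC = R_WC @ (T_LW - T_CW)
    return R_LC, T_LC

# Consistency check: mapping a lidar point via (R_LC, T_LC) must agree with
# going lidar -> world -> camera.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90 deg about z
R_LW, T_LW = Rz, np.array([1.0, 0.0, 0.0])
R_CW, T_CW = np.eye(3), np.array([0.0, 1.0, 0.0])
R_LC, T_LC = compose_extrinsics(R_LW, T_LW, R_CW, T_CW)
X_L = np.array([1.0, 2.0, 3.0])
X_W = R_LW @ X_L + T_LW
X_C_expected = np.linalg.inv(R_CW) @ (X_W - T_CW)
print(R_LC @ X_L + T_LC, X_C_expected)
```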
Claims (1)
1. A combined calibration method for a laser radar and a camera, in which the sensors are aimed at a cubic target and shoot it once; the laser radar shot yields a three-dimensional point cloud of the cubic target, from which the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system of the cubic target are obtained; the camera shot yields an image of the cubic target, from which the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the same world coordinate system are obtained; finally, a coordinate-system transformation using these two pairs of rotation and translation matrices yields the rotation matrix R_LC and translation matrix T_LC between the laser radar and the camera; the method comprises the following steps:
(1) make a cubic box as the cubic target; on three adjacent faces π1, π2, π3 of the box, arrange identical black-and-white checkerboard targets, where the side length of each checker square and the number of squares are known;
(2) fix the laser radar and the camera, place the cubic target at the center of their fields of view, and turn the three checkerboard faces toward the laser radar and the camera;
(3) shoot the target with the laser radar to obtain a depth map of the target;
(4) convert the depth map into a three-dimensional point cloud in the laser radar coordinate system according to the pinhole imaging principle;
(5) fit the equations of the three planes π1, π2, π3 in the laser radar coordinate system, and from these equations obtain the unit normal vectors n1^L, n2^L, n3^L, choosing their directions to conform to a right-handed coordinate system;
(6) compute the coordinates (x_O^L, y_O^L, z_O^L) of the intersection point O of the planes π1, π2, π3;
(7) compute the rotation matrix R_LW and translation matrix T_LW from the laser radar coordinate system to the world coordinate system:

R_LW = [n1^L, n2^L, n3^L]^(-1)
T_LW = -(x_O^L, y_O^L, z_O^L)^T
(8) shoot with the camera to obtain an image of the cubic target containing the three checkerboards, and split it into three parts, each containing exactly one checkerboard image;
(9) using Zhang's calibration method, compute the rotation matrices R1, R2, R3 and translation matrices T1, T2, T3 from the planes π1, π2, π3 to the camera coordinate system;
(10) compute the normal vectors n1^C, n2^C, n3^C of the planes π1, π2, π3 in the camera coordinate system:

n1^C = R1·(0, 0, 1)^T
n2^C = R2·(0, 0, 1)^T
n3^C = R3·(0, 0, 1)^T

(11) compute the equations of the planes π1, π2, π3 in the camera coordinate system;
(12) compute the coordinates (x_O^C, y_O^C, z_O^C) of the intersection point O of the planes π1, π2, π3 in the camera coordinate system;
(13) compute the rotation matrix R_CW and translation matrix T_CW from the camera coordinate system to the world coordinate system:

R_CW = [n1^C, n2^C, n3^C]^(-1)
T_CW = -(x_O^C, y_O^C, z_O^C)^T

(14) compute the rotation matrix R_LC and translation matrix T_LC from the laser radar coordinate system to the camera coordinate system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010076301.8A CN111325801B (en) | 2020-01-23 | 2020-01-23 | Combined calibration method for laser radar and camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111325801A CN111325801A (en) | 2020-06-23 |
CN111325801B true CN111325801B (en) | 2022-03-15 |
Family
ID=71172847
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010076301.8A Expired - Fee Related CN111325801B (en) | 2020-01-23 | 2020-01-23 | Combined calibration method for laser radar and camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111325801B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111815716A (en) * | 2020-07-13 | 2020-10-23 | 北京爱笔科技有限公司 | Parameter calibration method and related device |
CN112365545B (en) * | 2020-10-15 | 2023-02-24 | 中国科学院宁波材料技术与工程研究所 | Calibration method of laser radar and visible light camera based on large-plane composite target |
CN112379352B (en) * | 2020-11-04 | 2022-01-14 | 广州文远知行科技有限公司 | Laser radar calibration method, device, equipment and storage medium |
CN112684250B (en) * | 2020-12-03 | 2022-03-18 | 华中科技大学 | Calibration method for high-power millimeter wave intensity measurement system |
CN112767492A (en) * | 2020-12-25 | 2021-05-07 | 江苏集萃智能光电系统研究所有限公司 | Railway wheel set size detection device and calibration method thereof |
CN112684424B (en) * | 2020-12-30 | 2022-08-16 | 同济大学 | Automatic calibration method for millimeter wave radar and camera |
CN112819903B (en) * | 2021-03-02 | 2024-02-20 | 福州视驰科技有限公司 | L-shaped calibration plate-based camera and laser radar combined calibration method |
CN113592957B (en) * | 2021-08-06 | 2023-07-11 | 北京易航远智科技有限公司 | Multi-laser radar and multi-camera combined calibration method and system |
CN113848541B (en) * | 2021-09-22 | 2022-08-26 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN116091610B (en) * | 2021-11-08 | 2023-11-10 | 南京理工大学 | Combined calibration method of radar and camera based on three-dimensional tower type checkerboard |
CN116051659B (en) * | 2023-03-30 | 2023-06-13 | 山东科技大学 | Linear array camera and 2D laser scanner combined calibration method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103049912A (en) * | 2012-12-21 | 2013-04-17 | 浙江大学 | Random trihedron-based radar-camera system external parameter calibration method |
CN103837869A (en) * | 2014-02-26 | 2014-06-04 | 北京工业大学 | Vector-relation-based method for calibrating single-line laser radar and CCD camera |
CN104567727A (en) * | 2014-12-24 | 2015-04-29 | 天津大学 | Three-dimensional target and global unified calibration method for linear structured light profile sensor |
CN104897142A (en) * | 2015-06-11 | 2015-09-09 | 湖北工业大学 | Three-dimensional target for binocular or multi-view vision dimension measuring |
CN107883870A (en) * | 2017-10-24 | 2018-04-06 | 四川雷得兴业信息科技有限公司 | Overall calibration method based on binocular vision system and laser tracker measuring system |
CN108010085A (en) * | 2017-11-30 | 2018-05-08 | 西南科技大学 | Target identification method based on binocular Visible Light Camera Yu thermal infrared camera |
CN108020826A (en) * | 2017-10-26 | 2018-05-11 | 厦门大学 | Multi-line laser radar and multichannel camera mixed calibration method |
CN108648232A (en) * | 2018-05-04 | 2018-10-12 | 北京航空航天大学 | A kind of binocular stereo visual sensor integral type scaling method based on accurate two-axis platcform |
CN109242915A (en) * | 2018-09-29 | 2019-01-18 | 合肥工业大学 | Multicamera system scaling method based on multi-face solid target |
CN110487213A (en) * | 2019-08-19 | 2019-11-22 | 杭州电子科技大学 | Full view line laser structured light three-dimensional image forming apparatus and method based on spatial offset |
Non-Patent Citations (7)
Title |
---|
"Automatic Joint Calibration of Odometry and Sensor Parameters"; Lavish Arora et al.; Proceedings of the Advances in Robotics 2019; 2019-07-31; pp. 1-6 * |
"Extrinsic Calibration and Odometry for Camera-LiDAR Systems"; Chenghao Shi et al.; IEEE Access; 2019-08-27; vol. 7; pp. 120106-120116 * |
"Extrinsic Calibration of 2-D Lidars Using Two Orthogonal Planes"; Dong-Geol Choi et al.; IEEE Transactions on Robotics; 2016-02-28; vol. 32, no. 1; pp. 83-98 * |
"Research on Fast Calibration Algorithms for Multi-Camera Systems" (in Chinese); Yang Ruihua; China Master's Theses Full-text Database, Information Science and Technology; 2018-03-15 (no. 03); I138-1789 * |
"Research on Multi-Camera Calibration Methods Using Multi-Face Stereo Targets" (in Chinese); Yu Huan; China Master's Theses Full-text Database, Information Science and Technology; 2019-01-15 (no. 01); I138-2782 * |
"Joint Calibration Method for Stereo Vision and 3D Laser Systems" (in Chinese); Dong Fangxin et al.; Chinese Journal of Scientific Instrument (仪器仪表学报); 2017-10-15 (no. 10) * |
"Calibration Method for the Extrinsic Parameters of Vehicle-Mounted Lidar" (in Chinese); Cheng Jinlong et al.; Opto-Electronic Engineering (光电工程); 2013-12-15 (no. 12) * |
Also Published As
Publication number | Publication date |
---|---|
CN111325801A (en) | 2020-06-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111325801B (en) | Combined calibration method for laser radar and camera | |
CN110021046B (en) | External parameter calibration method and system for camera and laser radar combined sensor | |
US9443308B2 (en) | Position and orientation determination in 6-DOF | |
CN111735479A (en) | Multi-sensor combined calibration device and method | |
CN111369630A (en) | Method for calibrating multi-line laser radar and camera | |
CN108594245A (en) | A kind of object movement monitoring system and method | |
CN109559355B (en) | Multi-camera global calibration device and method without public view field based on camera set | |
CN112396664A (en) | Monocular camera and three-dimensional laser radar combined calibration and online optimization method | |
García-Moreno et al. | LIDAR and panoramic camera extrinsic calibration approach using a pattern plane | |
Chai et al. | A novel method for lidar camera calibration by plane fitting | |
CN113724337B (en) | Camera dynamic external parameter calibration method and device without depending on tripod head angle | |
CN208350997U (en) | A kind of object movement monitoring system | |
Liu et al. | Research on 3D reconstruction method based on laser rotation scanning | |
CN117115272A (en) | Telecentric camera calibration and three-dimensional reconstruction method for precipitation particle multi-angle imaging | |
Nguyen et al. | Calibbd: Extrinsic calibration of the lidar and camera using a bidirectional neural network | |
Jutzi et al. | Improved UAV-borne 3D mapping by fusing optical and laserscanner data | |
Chen et al. | A structured-light-based panoramic depth camera | |
Xu et al. | A real-time ranging method based on parallel binocular vision | |
Singandhupe et al. | Single frame lidar and stereo camera calibration using registration of 3d planes | |
Hamrouni et al. | A new method for stereo-cameras self-calibration in Scheimpflug condition | |
Liu et al. | Joint calibration of a camera and a laser range finder from grid calibration target | |
Abbasi et al. | A Step-By-Step Approach for Camera and Low-Resolution-3D-LiDAR Calibration | |
Coudron et al. | Multi-view tof fusion for object detection in industrial applications | |
Sergiyenko et al. | Multi-view 3D data fusion and patching to reduce Shannon entropy in Robotic Vision | |
Feng et al. | A general model and calibration method for spherical stereoscopic vision |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| GR01 | Patent grant | |
| CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220315 |