CN111678534A - Combined calibration platform and method combining RGBD binocular depth camera, IMU and multi-line laser radar - Google Patents
- Publication number: CN111678534A (application CN201910246795.7A)
- Authority: CN (China)
- Prior art keywords: sensor, IMU, laser radar, RGBD, depth camera
- Prior art date: 2019-03-11
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
Abstract
The invention discloses a joint calibration platform and method combining an RGBD binocular depth camera, an IMU, and a multi-line laser radar. The multi-sensor calibration platform comprises a main body bracket, the RGBD binocular depth camera, the IMU, and a cushioning pad; the bracket is built from ABS material in a trapezoidal structure, and the laser radar and the depth camera are mounted on the upper and lower faces of the bracket's top surface, respectively. The joint calibration method calibrates the multi-sensor fusion platform according to this scheme. The platform has a simple structure, and the accompanying method can calibrate and fuse raw sensor data; it is particularly suitable for small robots, low in cost, and easy to operate.
Description
Technical Field
The invention relates to the field of unmanned driving, and in particular to an extrinsic-parameter calibration device and method based on fusing data from an RGBD binocular depth camera, an IMU (inertial measurement unit), and a multi-line laser radar.
Background
In research on unmanned-driving technology, building high-precision maps and localizing the unmanned vehicle are key links in bringing the technology to maturity. A high-precision map must not only provide centimeter-level positioning accuracy but also contain rich environmental information. At present, automatic-driving schemes at home and abroad mainly rely on sensors such as cameras, IMUs (inertial measurement units), and laser radars to perceive the surrounding environment and the vehicle's own attitude. To guarantee mapping and positioning accuracy, strict requirements are imposed on the fusion of data from the various sensors, and the sensors' error parameters, including camera distortion and IMU zero offset (bias) and noise, must be taken into account. Joint calibration not only offsets sensor errors but also facilitates data fusion, and is of great significance for unmanned vehicle decision-making.
At present, unmanned vehicle solutions at home and abroad are mostly developed on the basis of full-size automobiles, with expensive sensor equipment distributed over the whole vehicle, and they mainly operate in environments such as motorways. With the development of automatic driving, demand for unmanned vehicles in specific scenarios is gradually emerging, for example small unmanned logistics delivery vehicles for a campus. The small size of such vehicles places greater constraints on the size of the sensor solution and requires adapting to lower-cost sensors, a requirement that must be met through data processing and fusion. Because the data of these sensors serve as references yet carry certain errors, calibration of the sensors is necessary before their data are fused.
Disclosure of Invention
The invention provides a joint calibration platform and method combining an RGBD binocular depth camera, an IMU, and a multi-line laser radar. The main application scenario of the scheme is the small unmanned vehicle; it is suitable for various road environments, offers good adaptability as a fusion scheme, and can be conveniently ported to various unmanned vehicle platforms.
To solve the above technical problems, the invention adopts the following technical scheme:
A multi-sensor fusion platform for an unmanned vehicle comprises a monolithic support, an RGBD binocular camera, an inertial measurement unit (IMU), and a multi-line laser radar. The support is trapezoidal in layout, made of ABS engineering plastic, and formed in one piece on a 3D printer. Mounting holes at the top of the support fix the laser radar and the binocular camera; the IMU is fixed at the bottom of the support on a damping pad made of silicone rubber. The support and the mounting materials have a certain damping capacity, which reduces erratic fluctuations in sensor data caused by vibration when the unmanned vehicle drives on rough roads.
Preferably, the height of the support is kept within 200 mm, the angle between the inclined face and the bottom face is less than 60 degrees, and the upper end face of the support is slightly larger than the laser radar's footprint, to ensure the stability of the whole structure.
Preferably, a silicone-rubber cushion pad is placed under the laser radar to reduce the influence of the radar's internal mechanical vibration on the other sensors.
Preferably, the mounting centers of the IMU, the camera, and the laser radar lie in the same vertical plane, so that the sensors' data fuse better during attitude changes.
Preferably, the IMU core is a nine-axis unit, i.e. comprising an accelerometer, a gyroscope, and a magnetometer; a multi-axis IMU can estimate the sensor attitude more accurately and output an absolute heading from the magnetometer, and more accurate noise and zero-offset estimates can be obtained during IMU correction.
Preferably, the camera is an RGBD binocular depth camera whose output images contain both color and depth information, enabling construction of a three-dimensional map.
Preferably, the radar is a 16-line laser radar, which provides rich environmental point clouds; fused with the camera, it achieves higher resolution, makes feature points in the environment easier to analyze, and further improves mapping and positioning accuracy.
Drawings
The invention is further illustrated by the following figures and examples. The drawings in the following description are only some embodiments of the invention, and other drawings can be obtained by those skilled in the art from the contents of the embodiments of the invention and the drawings without inventive effort.
Fig. 1 is a schematic structural diagram of a multi-sensor combined calibration platform provided by the invention.
In the figure: 1. multi-line laser radar; 2. RGBD binocular depth camera; 3. main body support; 4. cushion pad; 5. IMU.
Detailed Description
As shown in FIG. 1, the multi-sensor joint calibration platform can be carried on any mobile vehicle. The platform includes: a main body support, forming the equipment mounting positions; an RGBD binocular depth camera, for acquiring a dense point cloud with color information; an IMU, for predicting the motion trajectory of the system; and a laser radar, for acquiring a more accurate sparse point cloud. The calibration proceeds as follows:
A. First, correct the IMU errors: leave the IMU stationary for 120 minutes while collecting accelerometer, gyroscope, and magnetometer data; compute the Allan variance to obtain the corresponding zero-offset (bias) errors, then fit the Allan variance curve and obtain the white-noise parameters from the slope of the curve.
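The Allan-variance step in A can be sketched as follows: a minimal numpy implementation of the overlapping Allan deviation, run here on simulated static gyroscope samples (the function names, sample rate, and noise level are illustrative assumptions, not values from the patent):

```python
import numpy as np

def allan_deviation(x, fs, taus):
    """Overlapping Allan deviation of rate samples x recorded at fs Hz.

    For a white-noise-dominated sensor the log-log ADEV curve has slope -1/2,
    and the value read at tau = 1 s gives the white-noise (random walk) density.
    """
    theta = np.cumsum(x) / fs              # integrate rate -> angle (or velocity)
    n = len(theta)
    adev = []
    for tau in taus:
        m = int(round(tau * fs))           # samples per averaging cluster
        if m < 1 or 2 * m >= n:
            adev.append(np.nan)
            continue
        # overlapping Allan variance: second difference of the integrated signal
        d = theta[2 * m:] - 2.0 * theta[m:n - m] + theta[:n - 2 * m]
        avar = np.sum(d ** 2) / (2.0 * tau ** 2 * (n - 2 * m))
        adev.append(np.sqrt(avar))
    return np.array(adev)

# Example on simulated static gyro data (pure white noise, sigma per sample):
rng = np.random.default_rng(0)
fs = 100.0                                 # sample rate in Hz (assumed)
x = rng.normal(0.0, 0.1, 200_000)          # ~33 min of fake static data
taus = np.array([0.1, 1.0, 10.0])
adev = allan_deviation(x, fs, taus)
# For white noise, ADEV(tau) ~ N / sqrt(tau) with N = sigma / sqrt(fs) = 0.01
```

In practice one would evaluate many tau values, fit the slope -1/2 region of the log-log curve for the white-noise coefficient, and read the bias instability near the curve's minimum, as the patent's step A describes.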
B. Collect and calibrate data from the IMU and the camera, as follows:
b1: Prepare a standard checkerboard calibration board for camera intrinsic calibration and motion attitude estimation.
b2: Set the depth camera to a low-frame-rate mode (10 Hz) to prevent frames from breaking up during motion, and acquire camera and IMU data synchronously.
b3: Move the support, collecting views of the calibration board from multiple angles, and first calibrate the camera intrinsics. After correcting the camera data with these intrinsics, estimate the camera pose in each frame and compare it with the trajectory estimated by the IMU to obtain the camera-IMU calibration parameters.
C. Collect and calibrate data from the laser radar and the camera, as follows:
c1: Collect calibration-board data with the laser radar and the camera from multiple angles and scales, correct the camera data, and remove outlier points from the laser point cloud.
c2: Select the points lying on the calibration board from the series of laser point clouds, and compare them with the board detected by the camera to obtain calibration parameters.
c3: Repeat c2, iteratively updating the calibration parameters until they converge, which yields the camera-laser radar calibration parameters.
D. Combining the two sets of calibration parameters yields the calibration parameters of the whole sensor platform, and the internal errors are corrected.
E. The data of the three sensors are fused to complete accurate registration of the sensors in three-dimensional space, yielding the point cloud, images, and the platform's position and attitude in space.
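With the extrinsics from step D, the fusion in step E can, for example, colorize laser points by projecting them through the camera intrinsics. A hedged sketch assuming a pinhole model without lens distortion; K, R, and t here are placeholders standing in for the calibrated values, not quantities given in the patent:

```python
import numpy as np

def project_to_image(pts_lidar, K, R, t):
    """Project lidar points into pixel coordinates: u ~ K (R p + t).

    Returns (uv, mask) where mask marks the points in front of the camera.
    """
    cam = pts_lidar @ R.T + t            # lidar frame -> camera frame
    mask = cam[:, 2] > 1e-6              # keep only points with positive depth
    uvw = cam[mask] @ K.T                # homogeneous pixel coordinates
    uv = uvw[:, :2] / uvw[:, 2:3]        # perspective divide
    return uv, mask

# Example with identity extrinsics and a toy intrinsic matrix.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 2.0],         # straight ahead -> principal point
                [1.0, 0.0, 2.0],
                [0.0, 0.0, -1.0]])       # behind the camera -> filtered out
uv, mask = project_to_image(pts, K, np.eye(3), np.zeros(3))
# uv[0] == (320, 240); the third point is masked out
```

Each projected laser point can then be assigned the color of the pixel it lands on, producing the colored point cloud the platform description mentions.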
Furthermore, the fusion platform can scan the environment model and output the data in real time, providing a more accurate basis for tasks such as three-dimensional map construction and localization.
Claims (8)
1. A combined calibration platform and method combining an RGBD binocular depth camera, an IMU, and a multi-line laser radar, the platform comprising: a multi-line laser radar (1), an RGBD binocular depth camera (2), a main body support (3), silicone-rubber cushion pads (4), and an IMU (inertial measurement unit) (5); the main body support is built from ABS material and has a trapezoidal structure, with mounting holes for the radar and the camera at its upper end, and silicone-rubber cushion pads placed between the main body support and the sensors.
2. The multi-sensor joint calibration platform of claim 1, wherein: the height of the main body support (3) is controlled within 200mm, and the included angle between the inclined plane and the bottom surface is less than 60 degrees.
3. The multi-sensor joint calibration platform according to claim 1 or 2, wherein: the upper end face of the main body support (3) is slightly larger than the size of the laser radar.
4. The multi-sensor joint calibration platform of claim 1, wherein: the bottom of the multi-line laser radar (1) is padded with a silicon rubber cushion pad (4).
5. The multi-sensor joint calibration platform of claim 1, wherein: and a silicon rubber buffer pad (4) is padded at the bottom of the IMU (5).
6. The multi-sensor joint calibration platform of claim 1, wherein: the installation centers of the multi-line laser radar (1), the RGBD binocular depth camera (2) and the IMU (5) are placed in the same vertical plane.
7. A multi-sensor calibration method, characterized in that the multi-sensor platform according to any one of claims 1 to 6 is used for sensor error correction and joint calibration.
8. A multi-sensor fused data acquisition method, characterized in that the multi-sensor platform according to any one of claims 1 to 6 is used for map data acquisition and unmanned vehicle positioning.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN201910246795.7A | 2019-03-11 | 2019-03-11 | Combined calibration platform and method combining RGBD binocular depth camera, IMU and multi-line laser radar |
Publications (1)
| Publication Number | Publication Date |
| --- | --- |
| CN111678534A | 2020-09-18 |
Family
ID=72433195
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| CN201910246795.7A (published as CN111678534A, pending) | Combined calibration platform and method combining RGBD binocular depth camera, IMU and multi-line laser radar | 2019-03-11 | 2019-03-11 |

Country Status (1)

| Country | Link |
| --- | --- |
| CN | CN111678534A (en) |
Cited By (4)
| Publication number | Priority date | Publication date | Title |
| --- | --- | --- | --- |
| CN112529965A * | 2020-12-08 | 2021-03-19 | Calibration method and device for laser radar and monocular camera |
| CN112577517A * | 2020-11-13 | 2021-03-30 | Multi-element positioning sensor combined calibration method and system |
| CN112598757A * | 2021-03-03 | 2021-04-02 | Multi-sensor time-space calibration method and device |
| CN117437290A * | 2023-12-20 | 2024-01-23 | Multi-sensor fusion type three-dimensional space positioning method for unmanned aerial vehicle in natural protection area |
Citations (7)
| Publication number | Priority date | Publication date | Title |
| --- | --- | --- | --- |
| CN108073167A * | 2016-11-10 | 2018-05-25 | Positioning and navigation method based on a depth camera and laser radar |
| CN207408593U * | 2017-09-11 | 2018-05-25 | Hand-held simultaneous localization and mapping (SLAM) device |
| CN109029433A * | 2018-06-28 | 2018-12-18 | Extrinsic calibration and timing method for vision and inertial-navigation fusion SLAM on a mobile platform |
| CN109188458A * | 2018-07-25 | 2019-01-11 | Mobile measurement system based on dual laser radar sensors |
| CN109270534A * | 2018-05-07 | 2019-01-25 | Online calibration method for an intelligent vehicle's laser sensor and camera |
| CN109345596A * | 2018-09-19 | 2019-02-15 | Multi-sensor calibration method, device, computer equipment, medium and vehicle |
| CN109341706A * | 2018-10-17 | 2019-02-15 | Production method of a multi-feature fusion map for driverless cars |
- 2019-03-11: Application filed in China (application CN201910246795.7A; publication CN111678534A, status active, pending)
Legal Events

| Date | Code | Title | Description |
| --- | --- | --- | --- |
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 2020-09-18 |