CN104574406B - Joint calibration method between 360-degree panoramic laser and multiple vision systems - Google Patents

Info

Publication number: CN104574406B
Application number: CN201510021270.5A
Authority: CN (China)
Legal status: Active
Other versions: CN104574406A (application publication)
Other languages: Chinese (zh)
Inventors: 闫飞, 庄严, 金鑫彤, 王伟
Assignee: Dalian University of Technology
Application filed by Dalian University of Technology
Priority to CN201510021270.5A
Publication of application CN104574406A, followed by grant of CN104574406B

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85Stereo camera calibration


Abstract

The invention provides a joint calibration method between a 360-degree panoramic laser and multiple vision systems, belonging to the technical field of autonomous environment perception for mobile robots. The key innovation is the use of simple black cardboard as the calibration target, which enables fast simultaneous calibration of multiple vision systems with the panoramic laser. Because laser beams irradiating the surface of black cardboard have a characteristically low reflectivity, the method extracts the feature points belonging to the calibration device from the collected three-dimensional laser data through reflection-value map generation, range filtering, binarization, and point clustering, and filters noise objects in the environment with a plane-extraction method. The three-dimensional feature points from the laser data and the two-dimensional feature points from the image data are then matched by an iterative optimization method to obtain the rotation and translation matrices between the sensors. The invention lays a foundation for multi-sensor information fusion and can be used in fields such as mobile-robot scene reconstruction.

Description

Joint calibration method between 360-degree panoramic laser and multiple vision systems
Technical Field
The invention belongs to the technical field of environmental perception, relates to data fusion between a three-dimensional laser ranging system and a plurality of vision systems, and particularly relates to a combined calibration method between the three-dimensional laser ranging system and the plurality of vision systems.
Background
In a complex scene, a single sensor cannot meet task requirements such as environment perception and scene understanding, so data matching and fusion among multiple sensors are necessary means of improving the performance of both, and joint calibration among the sensors is the key step. Currently there are many extrinsic calibration methods for three-dimensional laser and monocular vision; the most common uses a black-and-white calibration plate to calibrate the external parameters between the three-dimensional laser and the vision system (Joung J H, An K H, Kang J W, et al. 3D environment reconstruction using modified color ICP algorithm by fusion of a camera and a 3D laser range finder [C], IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2009: 3082-3088). To make it easier for the three-dimensional laser system to acquire calibration features, the literature (Garcia-Moreno A I, Gonzalez-Barbosa J J, Ornelas-Rodriguez F J, et al. LIDAR and Panoramic Camera Extrinsic Calibration Approach Using a Pattern Plane [M], Pattern Recognition, Springer Berlin Heidelberg, 2013: 104-113) uses the edge corners of a plurality of diamond-shaped holes as features to calibrate the parameters of a three-dimensional laser radar and a panoramic camera; however, due to the laser-spot edge effect, the acquired features are not particularly accurate and the stability of the calibration corners cannot be guaranteed. Furthermore, the literature (Thomas J Osgood, Yingping Huang. Calibration of laser scanner and camera fusion system for intelligent vehicles using Nelder-Mead optimization [J], Measurement Science & Technology, 2013, Vol. 24, No. 3, pp: 1-10)
uses a circular white card suspended in the air as the calibration object for extrinsic calibration. While this makes it easy to determine the position of the feature object in both the three-dimensional laser data and the images, it requires an open experimental environment, and because the calibration object is relatively small, it amplifies the influence of the laser-spot edge effect on the three-dimensional coordinates of the calibration object, affecting calibration accuracy. A patent (Zhuang Yan, Wang Wei, Chen Dong, Yan Shenpeng, Dalian University of Technology: A method for automatic calibration between three-dimensional laser and monocular vision, patent No. 200910187344) proposes the use of a black-and-white calibration plate with holes for automatic calibration between three-dimensional laser and monocular vision; the plate carries two kinds of information: color information and hole information. The color information is acquired by a camera, and the intersection points (i.e., feature points) of the black and white squares are detected with a corner-detection algorithm provided in OpenCV; the hole information is obtained by the laser range finder, the intersection points of the black and white squares are found in the laser data with a correlation algorithm, and the feature points in the two views are then matched to complete scene registration. However, this method has certain limitations: it requires the distance between the camera and the laser range finder to be small, is not suitable when they are far apart, and can only calibrate the laser against a single vision system.
Disclosure of Invention
The invention aims to realize automatic joint calibration between a three-dimensional laser ranging system and a plurality of vision systems, so that the external parameters of all systems can be computed conveniently from a single data acquisition; to remove the limitation that the laser ranging system and the vision systems cannot be placed far apart; to reduce the requirements of the calibration method on the calibration object and the calibration environment; to reduce the influence of the laser-spot edge effect on the calibration result; and to enhance the practicality and accuracy of the calibration method.
The technical scheme of the invention is as follows:
1. design of three-dimensional panoramic laser characteristic analysis and calibration device
The three-dimensional panoramic laser system used by the invention is composed of a two-dimensional laser sensor and a rotating pan-tilt head driven by a stepping motor. The head rotates in the horizontal plane, and the fan-shaped scanning plane of the two-dimensional laser sensor is perpendicular to the rotation plane; each group of laser data contains both ranging data and reflection-value data, and the two correspond one to one. Because the laser sensor has a certain measurement error, and laser data at object edges is influenced by the edge effect, when foreground and background objects are relatively close it is difficult to distinguish their data accurately from the ranging data alone. The reflection-value data, by contrast, depends on object attributes such as material and color, is not limited by the distance between objects, and therefore allows the data of different objects to be separated conveniently.
Aiming at these characteristics of the three-dimensional panoramic laser, a joint calibration device for the three-dimensional panoramic laser and multiple vision systems is designed. The selected material is black cardboard, cut into a uniform shape and specification (as shown in Fig. 1). Experiments prove that with black cardboard the laser beams irradiating its surface have low reflectivity; giving the calibration device a uniform shape makes subsequent feature-point extraction convenient and ensures the robustness of the calibration algorithm and the accuracy of the calibration result. For the best calibration effect, the size of the calibration black paper can be adjusted appropriately according to the visible range of the cameras and the density of the laser data.
2. Extraction of characteristic points of calibrated required laser data
The characteristic point of laser data to be extracted in calibration is the central point of the calibration black paper, and firstly, a three-dimensional laser spot projected on the calibration black paper needs to be extracted. The extraction of the calibration black paper data is divided into two links of pre-detection and denoising, wherein the pre-detection link searches the serial number of the laser spot on the calibration black paper from the two-dimensional reflection value graph by using the reflection value characteristic of the black paperboard, and determines the three-dimensional coordinate of the laser spot. And the denoising link analyzes and processes the acquired three-dimensional laser data, and removes the three-dimensional laser data which is detected by mistake so as to ensure the calibration precision.
The calibration black paper pre-detection method based on laser data comprises the following steps:
1) Generating a binary reflection value map. A two-dimensional picture is generated by taking the total number of data groups and the number of laser points in each group as the pixel counts in the x and y directions, so that one laser point corresponds to one image pixel. The gray value of each pixel is obtained from the reflection value of its laser point by formula (1), yielding a two-dimensional reflection-value image corresponding to the three-dimensional laser point cloud (as shown in Fig. 5(a)):

    g_i = 255 × (d_i − d_min) / (d_max − d_min)    (1)

where d_i and g_i are respectively the reflection value and the gray value of laser point i, and d_max and d_min are the maximum and minimum reflection values over all laser points.
To facilitate processing and avoid excessive interference, only laser data within a certain range of the three-dimensional panoramic laser, where the calibration black paper is located, is selected; this range can be adjusted appropriately according to the placement of the calibration black paper (as shown in Fig. 5(b)). The reflection value map is then binarized by formula (2):

    g_i' = 0 if g_i ≤ k_g · ḡ, otherwise g_i' = 255    (2)

where g_i' is the pixel gray value after binarization, ḡ is the mean gray value of all pixels, and k_g is a gray-scale adjustment threshold that influences the binarization effect. After binarization the gray values of the black-paper data are all 0 and only black and white remain in the image, making the black-paper region clearer (as shown in Fig. 5(c)).
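As an illustrative sketch only (not the patent's implementation), formulas (1) and (2) can be written as follows; the function names, the array layout with one column per scan group, and the default value of k_g are assumptions for the example:

```python
import numpy as np

def reflectance_map(reflect, n_groups, pts_per_group):
    """Arrange per-point reflection values into a 2-D image, one column
    per scan group and one row per laser point, then normalise to 8-bit
    gray levels as in formula (1)."""
    d = np.asarray(reflect, dtype=float).reshape(n_groups, pts_per_group).T
    d_min, d_max = d.min(), d.max()
    g = 255.0 * (d - d_min) / (d_max - d_min)
    return g.astype(np.uint8)

def binarize(gray, k_g=0.6):
    """Formula (2): threshold at k_g times the mean gray value; dark
    (low-reflectance) pixels become 0, everything else 255."""
    thresh = k_g * gray.mean()
    return np.where(gray <= thresh, 0, 255).astype(np.uint8)
```

After this step the calibration black paper appears as solid black regions in an otherwise white image, ready for the clustering of step 2).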
2) Clustering the black pixel points.

The generated binary image still contains some isolated black noise points. To determine the laser points belonging to each piece of calibration black paper and eliminate interference, a neighborhood-search method is used to cluster the black pixels in the binary gray image: starting from an unvisited black pixel, all black pixels reachable through neighboring pixels are collected into one cluster, and the search repeats until every black pixel has been assigned.

Through this clustering, the black pixels are grouped into several pixel clusters. Because the calibration black paper occupies a large area in the binary image, whether a cluster corresponds to calibration black paper can be decided from its size: if the number of pixels in a cluster is below a certain threshold, it is judged to be a noise cluster and the gray values of all its points are set to 255. After this process only the calibration black paper remains in the reflection value map (as shown in Fig. 5(d)).
Because the laser ranging data and the reflection-value data correspond one to one in order, for each pixel with image index (x_i, y_i) in a clustered pixel cluster the serial number of the corresponding ranging datum is obtained by the formula m × x_i + y_i, where m is the number of laser points in each group of laser data; thus each pixel cluster maps to a corresponding laser point cluster.
The above completes the initial detection of the laser points on the calibration black paper. The advantages of this detection method are: the laser reflection value is used as the detection basis, so the calibration black paper does not need to be far from the background environment and the requirements on the calibration environment are reduced; and clustering is performed on the two-dimensional reflection value map, which is one dimension easier than clustering the three-dimensional laser points directly.
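The pre-detection steps above can be sketched as follows. This is a plausible reading of the patent's neighborhood search, not its actual code: the BFS over 8-connected neighbors, the min_size threshold, and the function names are assumptions:

```python
from collections import deque
import numpy as np

def cluster_black_pixels(binary, min_size=30):
    """Group black (0) pixels into 8-connected clusters by breadth-first
    neighborhood search; clusters smaller than min_size are treated as
    noise and repainted white (255)."""
    h, w = binary.shape
    img = binary.copy()
    seen = np.zeros((h, w), dtype=bool)
    clusters = []
    for y in range(h):
        for x in range(w):
            if img[y, x] != 0 or seen[y, x]:
                continue
            queue, members = deque([(y, x)]), []
            seen[y, x] = True
            while queue:
                cy, cx = queue.popleft()
                members.append((cy, cx))
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny, nx] and img[ny, nx] == 0):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
            if len(members) < min_size:
                for my, mx in members:   # noise cluster -> repaint white
                    img[my, mx] = 255
            else:
                clusters.append(members)
    return img, clusters

def pixel_to_scan_index(x, y, m):
    """Map image index (x, y) to the serial number of the corresponding
    ranging datum: m * x + y, with m laser points per scan group."""
    return m * x + y
```

Each surviving cluster can then be converted, pixel by pixel, into a laser point cluster through pixel_to_scan_index.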
3) Denoising algorithm for three-dimensional laser characteristic points
Due to the influence of interference objects in the environment, the detected laser point clusters do not all correspond to the calibration black paper, so that the laser point clusters obtained by primary detection need to be verified by the following given algorithm, the influence of environmental noise is removed, and the calibration precision is ensured.
Because the laser reflection value is the detection basis, the main interference comes from objects in the environment whose color and material resemble the calibration black paper. For this interference, a flatness evaluation distinguishes calibration objects from non-calibration objects: the laser points belonging to a calibration object lie entirely in one plane, so non-planar objects can be rejected by judging how flat the surface formed by the laser points is. Let a laser point cluster contain N laser points P = [p_1, p_2, ..., p_N]; the coordinate covariance matrix of these points is

    C = (1/N) Σ_{i=1}^{N} (p_i − p̄)(p_i − p̄)^T    (3)

where p̄ is the mean point of the cluster. The three eigenvalues λ_0 ≥ λ_1 ≥ λ_2 of the covariance matrix are computed; when the smallest eigenvalue satisfies λ_2 << λ_1 ≈ λ_0, the laser points of the cluster are approximately planar. In this case the flatness of the formed plane is evaluated by the ratio of the two eigenvalues λ_2 and λ_0, and when the ratio is below a given threshold the cluster is considered to belong to a calibration object:

    λ_2 / λ_0 < k_λ    (4)

where k_λ is the flatness evaluation threshold.
In addition, when the laser beam is projected onto a glass or mirror surface, the combined action of reflection and refraction makes the returned reflection value small; such points are then difficult to distinguish from calibration black paper in the gray image, and this interference is handled from the characteristics of the laser sensor itself. After passing through the glass surface, the beam travels on until it hits an obstacle and is reflected back, so the measured distance is far larger than the distance between the glass surface and the laser sensor. Therefore a laser point cluster is accepted as a calibration object only when the distance of its center point is smaller than a given threshold:

    √(x̄² + ȳ² + z̄²) < k_d    (5)

where x̄, ȳ and z̄ are the three-dimensional coordinates of the cluster center point p̄ and k_d is the distance threshold.
3. Iterative computation of transformation relationships between three-dimensional laser data to two-dimensional visual data
Because the visual range of a vision sensor is limited and far smaller than the measurement range of the three-dimensional laser sensor, a background with a color difference from the calibration black paper can be arranged when calibrating. The pixels of the black cardboard are then extracted from each image using pixel gray information, the pixel points are segmented with the same method used for clustering the black pixel points, and the average image coordinates of all pixels in each pixel cluster are taken as the image feature point.
From the obtained matching pairs of laser-data feature center points and visual-data feature points, the projective transformation from three-dimensional space to two-dimensional space is solved by iterative optimization, with the external parameters optimized by the Gauss-Newton method. For ease of calculation, both image and laser feature points are represented in homogeneous coordinates. Define q_i = [u_i, v_i, 1]^T as the two-dimensional homogeneous coordinate vector of an image feature point, p_i = [x_i, y_i, z_i, 1]^T as the corresponding laser point, and h(θ, p_i) as the coordinate vector of the pixel to which the laser point is transformed in the image. The purpose of calibration is to compute a set of transformation parameters θ = [r_1, r_2, r_3, t_x, t_y, t_z]^T such that

    θ* = argmin_θ Σ_i || q_i − h(θ, p_i) ||²    (6)

where h projects p_i through the rotation, the translation, and the pinhole model: f_x and f_y are the focal lengths of the vision sensor in the x and y directions, and (u_x, u_y) is the offset of the photosensitive element center relative to the image center. f_x, f_y, u_x, u_y are internal parameters of the vision system, obtainable by conventional intrinsic calibration methods, and are therefore known quantities here. r_1, r_2, r_3 are the rotation angles about the x, y, z axes, and t_x, t_y, t_z are the translations along the three coordinate axes.
Given the transformation parameters θ_k at iteration k of the Gauss-Newton method, let f(θ) denote the stacked residual vector with components q_i − h(θ, p_i). In the vicinity of θ_k,

    f(θ) ≈ f(θ_k) + J(θ_k)(θ − θ_k)

where J(θ_k) = ∂f/∂θ evaluated at θ_k is the Jacobian matrix. The next iterate θ_{k+1} is chosen to minimize the norm of this linearized residual; setting its derivative to zero yields the normalized equation

    J(θ_k)^T J(θ_k)(θ_{k+1} − θ_k) = −J(θ_k)^T f(θ_k)

and substituting gives the iterative format of the Gauss-Newton method:

    θ_{k+1} = θ_k − (J(θ_k)^T J(θ_k))^{-1} J(θ_k)^T f(θ_k)

Solving for the transformation parameters satisfying formula (6) with this iterative format yields the calibration result.
The invention has the advantages that it effectively reduces the interference of noise, distance, incidence angle and the like on the calibration between the three-dimensional laser and the vision systems, gives a more stable calibration effect, and can calibrate the three-dimensional panoramic laser against several vision systems simultaneously, effectively improving practical applicability. The calibration device is simple in design, light, and easy to carry, and fast, correct calibration of the three-dimensional panoramic laser and the vision systems can be achieved both indoors and in complex outdoor environments. This enables fusion of three-dimensional laser data with visual data and scene reconstruction of complex environments, laying a solid foundation for the development of machine intelligence based on multi-sensor data fusion.
Drawings
Fig. 1 shows a black cardboard for calibration.
Fig. 2 is the device layout diagram, including four vision systems and a panoramic laser.
Fig. 3 shows the pictures, including the calibration black paper, collected by the four vision systems.
Fig. 4 shows the three-dimensional laser data collected by the panoramic laser sensor.
Fig. 5 is the process of extracting the calibration black paper from the laser data: (a) the two-dimensional reflection value image corresponding to the three-dimensional laser point cloud; (b) the reflection value image within the selected range; (c) the reflection value image after binarization; (d) the reflection value image after clustering.
Fig. 6 is a diagram of extracting a three-dimensional laser spot projected onto a human body from three-dimensional laser data.
Fig. 7 is an effect diagram of the laser points in fig. 5 being back projected into four pictures by using the calibration result.
Detailed Description
To verify the effectiveness of the method, the calibration method was tested with a sensor system constructed as shown in Fig. 2. The panoramic laser sensor consists of a Hokuyo UTM-30LX laser sensor and a rotating pan-tilt head; the laser sensor's planar scanning angle is 0-270 degrees, and the applicable frequency range of the head's stepping motor is 500-2500 Hz. The motor drives the laser sensor to obtain three-dimensional laser ranging data of the scene. The four vision systems use common ANC FULL HD 1080P network cameras with USB 2.0 interfaces, a 60-degree field of view, and a resolution of 1280 x 960. The calibration device uses nine pieces of 300 mm x 100 mm black paper placed at different positions in the scene.
Pictures of the calibration devices in the scene are acquired from the four vision systems (as shown in Fig. 3), and the calibration devices can be extracted from the pictures with the corresponding image-processing methods. The panoramic laser sensor acquires three-dimensional point cloud data of the whole space (as shown in Fig. 4) together with the reflection value of each laser point, and the laser points of the calibration devices (as shown in Fig. 5) can be extracted from the laser data after reflection-value map generation, range filtering, binarization, and point clustering. Iteratively matching the feature points of the two-dimensional calibration devices in the pictures with the feature points of the three-dimensional calibration devices in the laser data yields the rotation matrix and translation matrix.
The obtained external parameters of the three-dimensional panoramic laser ranging system and the four vision systems are as follows:
and analyzing from a qualitative angle, and visually verifying the calibration result in a mode of projecting the laser point to the picture. The human body is taken as a reference object, corresponding pictures and three-dimensional laser data are collected again, laser points (shown in figure 6) belonging to a body part are extracted from the three-dimensional laser points, laser points are projected onto the four pictures by utilizing a rotation matrix and a translation matrix obtained by calibration, the calibration effect is checked, white points on the human body in the four pictures in figure 7 are projected laser points, and the fact that the laser points can be accurately projected onto corresponding areas of the pictures can be seen.

Claims (1)

1. A joint calibration method between a 360-degree panoramic laser and a plurality of vision systems, characterized in that: simple black cardboard is adopted as the calibration device; the low-reflectivity characteristic of laser beams irradiating the black cardboard surface is used to extract the laser points belonging to the black cardboard from a two-dimensional reflection value image; the center points of these laser points are taken as feature points and matched with the corresponding feature points in the images of the different vision systems; and the transformation relation between the systems is computed iteratively to complete the joint calibration of the three-dimensional panoramic laser and the multiple vision systems; the specific steps are as follows:
a) first, a plurality of rectangular black cardboard pieces are cut, with length not less than 30 cm and width not less than 10 cm, and placed randomly on different planes in an indoor environment;
b) environmental three-dimensional data and reflection value data are collected with the panoramic laser; the picture is gray-processed by g_i = 255 × (d_i − d_min)/(d_max − d_min) to obtain a two-dimensional reflection value image corresponding to the three-dimensional laser point cloud, where d_i and g_i are respectively the reflection value and gray value of laser point i, and d_max and d_min are the maximum and minimum reflection values of all laser points; laser data within a certain range of the three-dimensional panoramic laser where the calibration black paper is located is selected, and the corresponding pixels are binarized so that only black and white remain in the image and the black-paper region is clearer; the binarized black pixels are clustered by a neighborhood search method and interference is eliminated;
c) a flatness evaluation distinguishes calibration objects from non-calibration interference objects: let a laser point cluster contain N laser points P = [p_1, p_2, ..., p_N]; the coordinate covariance matrix of the laser points is C = (1/N) Σ_{i=1}^{N} (p_i − p̄)(p_i − p̄)^T, where p̄ is the mean point of the cluster; the three eigenvalues λ_0 ≥ λ_1 ≥ λ_2 of the covariance matrix are computed; when λ_2 << λ_1 ≈ λ_0 and the ratio satisfies λ_2/λ_0 < k_λ, where k_λ is the flatness evaluation threshold, the laser point cluster is considered a calibration object;
d) define q_i = [u_i, v_i, 1]^T as the image feature point and p_i = [x_i, y_i, z_i, 1]^T as the three-dimensional laser point, whose transformed pixel coordinate vector in the image is h(θ, p_i); the purpose of calibration is to compute a set of transformation parameters θ = [r_1, r_2, r_3, t_x, t_y, t_z]^T satisfying θ* = argmin_θ Σ_i ||q_i − h(θ, p_i)||², where f_x and f_y are the focal lengths of the vision sensor in the x and y directions and (u_x, u_y) is the offset of the photosensitive element center relative to the image center, all known quantities; r_1, r_2, r_3 are the rotation angles about the x, y, z axes and t_x, t_y, t_z the translations along the three coordinate axes; given the parameters θ_k at iteration k, let f(θ) be the stacked residual vector with components q_i − h(θ, p_i); near θ_k, f(θ) ≈ f(θ_k) + J(θ_k)(θ − θ_k), where J = ∂f/∂θ is the Jacobian matrix; the next iterate is found from the normalized equation J^T J(θ_{k+1} − θ_k) = −J^T f(θ_k), giving θ_{k+1} = θ_k − (J^T J)^{-1} J^T f(θ_k); solving with this iterative format for the transformation parameters yields the calibration result.
CN201510021270.5A 2015-01-16 2015-01-16 A kind of combined calibrating method between 360 degree of panorama laser and multiple vision systems Active CN104574406B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510021270.5A CN104574406B (en) 2015-01-16 2015-01-16 A kind of combined calibrating method between 360 degree of panorama laser and multiple vision systems


Publications (2)

Publication Number Publication Date
CN104574406A CN104574406A (en) 2015-04-29
CN104574406B true CN104574406B (en) 2017-06-23

Family

ID=53090378


Country Status (1)

Country Link
CN (1) CN104574406B (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105067023B (en) * 2015-08-31 2017-11-14 中国科学院沈阳自动化研究所 A kind of panorama three-dimensional laser sensing data calibration method and device
CN107024687B (en) * 2016-02-01 2020-07-24 北京自动化控制设备研究所 Method for quickly calibrating installation error of POS/laser radar in offline manner
CN105798909B (en) * 2016-04-29 2018-08-03 上海交通大学 Robot Zero positioning System and method for based on laser and vision
CN105844658B (en) * 2016-06-06 2018-08-17 南昌航空大学 The visible light and laser sensor extrinsic calibration method detected automatically based on straight line
CN106097348B (en) * 2016-06-13 2019-03-05 大连理工大学 A kind of fusion method of three-dimensional laser point cloud and two dimensional image
CN106446378B (en) * 2016-09-13 2019-12-03 中国科学院计算技术研究所 A kind of room shape geometric feature description method and system
CN108120447B (en) * 2016-11-28 2021-08-31 沈阳新松机器人自动化股份有限公司 Multi-laser equipment data fusion method
CN106679671B (en) * 2017-01-05 2019-10-11 大连理工大学 A kind of navigation identification figure recognition methods based on laser data
CN109589179B (en) * 2017-10-02 2023-01-17 吕孝宇 Mixed reality system and method for determining spatial coordinates of a dental instrument
CN109993801A (en) * 2019-03-22 2019-07-09 上海交通大学 A kind of caliberating device and scaling method for two-dimensional camera and three-dimension sensor
CN110189381B (en) * 2019-05-30 2021-12-03 北京眸视科技有限公司 External parameter calibration system, method, terminal and readable storage medium
CN110823252B (en) * 2019-11-06 2022-11-18 大连理工大学 Automatic calibration method for multi-line laser radar and monocular vision
CN111462251B (en) * 2020-04-07 2021-05-11 深圳金三立视频科技股份有限公司 Camera calibration method and terminal
CN112258517A (en) * 2020-09-30 2021-01-22 无锡太机脑智能科技有限公司 Automatic map repairing method and device for laser radar grid map
CN113759346B (en) * 2020-10-10 2024-06-18 北京京东乾石科技有限公司 Laser radar calibration method and device, electronic equipment and storage medium
CN117953082B (en) * 2024-03-26 2024-07-19 深圳市其域创新科技有限公司 Laser radar and camera combined calibration method, system and electronic equipment

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080091891A (en) * 2007-04-10 2008-10-15 Samsung Heavy Industries Co., Ltd. Automatic calibration method in a robot-based multi-laser vision system
CN101493318A (en) * 2008-09-16 2009-07-29 北京航空航天大学 Rudder deflection angle synchronization dynamic measurement system and implementing method thereof
CN101698303A (en) * 2009-09-11 2010-04-28 大连理工大学 Automatic calibration method between three-dimensional laser and monocular vision
CN101799271A (en) * 2010-04-01 2010-08-11 哈尔滨工业大学 Method for obtaining camera calibration point under large viewing field condition

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhuang Yan, et al., "Automatic extrinsic self-calibration for fusing data from monocular vision and 3-D laser scanner," IEEE Transactions on Instrumentation and Measurement, vol. 63, no. 7, pp. 1874-1876, Jul. 2014 *

Also Published As

Publication number Publication date
CN104574406A (en) 2015-04-29

Similar Documents

Publication Publication Date Title
CN104574406B (en) A combined calibration method between a 360-degree panoramic laser and multiple vision systems
Kang et al. Automatic targetless camera–lidar calibration by aligning edge with gaussian mixture model
Pusztai et al. Accurate calibration of LiDAR-camera systems using ordinary boxes
AU2018212700B2 (en) Apparatus, method, and system for alignment of 3D datasets
US10060739B2 (en) Method for determining a position and orientation offset of a geodetic surveying device and such a surveying device
CN109598765B (en) Monocular camera and millimeter wave radar external parameter combined calibration method based on spherical calibration object
CN105358937B (en) Geodetic surveying instrument, method for determining position data of geodetic surveying instrument, and storage medium
Scaramuzza et al. Extrinsic self calibration of a camera and a 3d laser range finder from natural scenes
Geiger et al. Automatic camera and range sensor calibration using a single shot
Paya et al. A state‐of‐the‐art review on mapping and localization of mobile robots using omnidirectional vision sensors
CN110349221A (en) A calibration method fusing a three-dimensional lidar with a binocular visible-light sensor
US9443308B2 (en) Position and orientation determination in 6-DOF
Yan et al. Joint camera intrinsic and lidar-camera extrinsic calibration
JP2012533222A (en) Image-based surface tracking
García-Moreno et al. LIDAR and panoramic camera extrinsic calibration approach using a pattern plane
CN107025663A (en) Clutter scoring system and method for 3D point cloud matching in a vision system
CN113096183B (en) Barrier detection and measurement method based on laser radar and monocular camera
Zhao et al. Reconstruction of textured urban 3D model by fusing ground-based laser range and CCD images
CN113050074B (en) Camera and laser radar calibration system and calibration method in unmanned environment perception
CN114137564A (en) Automatic indoor object identification and positioning method and device
CN115359130B (en) Radar and camera combined calibration method and device, electronic equipment and storage medium
CN114140539A (en) Method and device for acquiring position of indoor object
Özdemir et al. A multi-purpose benchmark for photogrammetric urban 3D reconstruction in a controlled environment
Hasheminasab et al. Linear Feature-based image/LiDAR integration for a stockpile monitoring and reporting technology
Deng et al. Joint calibration of dual lidars and camera using a circular chessboard

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant