CN108020826A - Multi-line laser radar and multichannel camera mixed calibration method - Google Patents


Info

Publication number
CN108020826A
CN108020826A (Application CN201711012232.9A)
Authority
CN
China
Prior art keywords
camera
point cloud
laser radar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201711012232.9A
Other languages
Chinese (zh)
Other versions
CN108020826B (en)
Inventor
温程璐
王程
夏彦
张正
宫正
李军
Current Assignee
Xiamen University
Original Assignee
Xiamen University
Priority date
Filing date
Publication date
Application filed by Xiamen University filed Critical Xiamen University
Priority to CN201711012232.9A priority Critical patent/CN108020826B/en
Publication of CN108020826A publication Critical patent/CN108020826A/en
Application granted granted Critical
Publication of CN108020826B publication Critical patent/CN108020826B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48 Details of systems according to group G01S 17/00
    • G01S 7/497 Means for monitoring or calibrating
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/10 Protocols in which an application is distributed across nodes in the network
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Multimedia (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a hybrid calibration method for a multi-line laser radar (lidar) and multiple cameras, comprising the following steps: S1, collecting the raw image data of the cameras, the multi-line lidar point cloud data, and the static lidar point cloud data; S2, solving the intrinsic model of each camera; S3, undistorting the images collected by each camera to obtain corrected images; S4, registering the static lidar point cloud into the multi-line lidar point cloud coordinate system; S5, obtaining from the registered point cloud of S4 the position (Xs, Ys, Zs) of each camera in the multi-line lidar coordinate system; S6, for each camera, selecting the pixel coordinates (u, v) of at least 4 targets in the corrected image and the corresponding three-dimensional coordinates (Xp, Yp, Zp) of those targets in the point cloud whose origin is the multi-line lidar; S7, establishing the collinearity equations from each camera's intrinsic model, the camera position (Xs, Ys, Zs), and the pixel coordinates (u, v) and three-dimensional coordinates (Xp, Yp, Zp) of its targets, and solving for each camera's attitude angle elements and nine direction cosines, which completes the calibration.

Description

Hybrid calibration method for a multi-line laser radar and multiple cameras

Technical field

The invention relates to the technical field of calibration, in particular to a hybrid calibration method for a multi-line laser radar and multiple cameras.

Background

A lidar detects the position of objects by emitting laser pulses and receiving their reflections. To extend the detection range and accuracy of lidar, multi-line lidar was developed on the basis of single-line lidar. A multi-line lidar emits and receives multiple laser beams simultaneously, so a single scan yields multiple concentric scan lines.

In 3D-reconstruction applications based on multi-line lidar scanning, fusing the data of a multi-line lidar with that of multiple cameras captures more three-dimensional detail of the environment and provides richer spatial data for further processing. In such systems, however, the multi-line lidar and each camera have their own local coordinate systems, which must be calibrated to find the three-dimensional coordinate transformation between them. At present, most algorithms for calibrating a multi-line lidar against multiple cameras require each camera to be calibrated with the lidar separately. Because the point cloud collected by the lidar is sparse and the intrinsic parameters of the cameras are unknown, calibration is difficult, and no existing technique solves this type of joint multi-line lidar and multi-camera calibration in one pass.

Summary of the invention

The purpose of the present invention is to provide a hybrid calibration method for a multi-line laser radar and multiple cameras, used to realize the calibration between the multi-line lidar and the cameras.

To achieve the above object, the present invention adopts the following technical solution:

The hybrid calibration method for a multi-line laser radar and multiple cameras comprises the following steps:

S1. Collect the raw image data of the cameras, the multi-line lidar point cloud data, and the static lidar point cloud data;

S2. Solve the intrinsic model of each camera;

S3. Undistort the images collected by each camera to obtain corrected images;

S4. Register the static lidar point cloud data into the multi-line lidar point cloud coordinate system;

S5. Obtain from the point cloud registered in S4 the position (Xs, Ys, Zs) of each camera in the multi-line lidar coordinate system;

S6. In each camera's corrected image, select the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (Xp, Yp, Zp) of those targets in the scene point cloud whose origin is the multi-line lidar;

S7. From each camera's intrinsic model, the camera position (Xs, Ys, Zs), and the pixel coordinates (u, v) and three-dimensional coordinates (Xp, Yp, Zp) of its targets, establish the collinearity equations and solve for the camera's attitude angle elements and nine direction cosines, completing the calibration.

Further, step S1 specifically comprises:

S11. Collection of the camera image data:

Park the vehicle, place several targets evenly in the field of view of each camera in turn, and capture the raw image data of all cameras;

S12. Collection of the multi-line lidar point cloud data:

Power on the multi-line lidar on the roof and scan, obtaining a point cloud whose three-dimensional coordinate origin is the position of the multi-line lidar;

S13. Collection of the static lidar point cloud data:

Scan the entire scene with a ground-based static lidar, obtaining the static lidar point cloud and the position of each camera within it.

Further, in step S2, the camera intrinsic model is expressed as

K = [ fx  0  cx ]
    [  0 fy  cy ]
    [  0  0   1 ]

where fx and fy are the focal lengths of the camera and (cx, cy) is the principal point. Zhang Zhengyou's checkerboard calibration method is used to solve the intrinsic parameters and distortion factors, yielding the intrinsic model of each camera.

Further, in step S7, the collinearity equations are:

u - cx = -f * [a1(Xp - Xs) + b1(Yp - Ys) + c1(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

v - cy = -f * [a2(Xp - Xs) + b2(Yp - Ys) + c2(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

where f is the perpendicular distance from the lens center to the image plane, and a1, a2, a3, b1, b2, b3, c1, c2, c3 are the nine direction cosines of each camera. The direction cosines are related to the image attitude angles φ, ω and γ as follows:

a1 = cosφcosγ - sinφsinωsinγ;
a2 = -cosφsinγ - sinφsinωcosγ;
a3 = -sinφcosω;
b1 = cosωsinγ;
b2 = cosωcosγ;
b3 = -sinω;
c1 = sinφcosγ + cosφsinωsinγ;
c2 = -sinφsinγ + cosφsinωcosγ;
c3 = cosφcosω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, X axis and Z axis of the coordinate system whose origin is the multi-line lidar. The attitude angle elements and nine direction cosines of each camera are estimated, completing the calibration.
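The nine direction cosines are simply the entries of the rotation matrix built from the three attitude angles. As an illustrative numpy check (not part of the patent; the Y-X-Z rotation order φ, ω, γ and the layout R = [[a1, a2, a3], [b1, b2, b3], [c1, c2, c3]] follow the standard photogrammetric convention assumed here):

```python
import numpy as np

def direction_cosines(phi, omega, gamma):
    """Rotation matrix for the Y (phi) -> X (omega) -> Z (gamma) sequence.

    Returns the 3x3 matrix whose rows hold the direction cosines
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) of the collinearity equations.
    """
    c, s = np.cos, np.sin
    Ry = np.array([[c(phi), 0, -s(phi)],
                   [0,      1,       0],
                   [s(phi), 0,  c(phi)]])
    Rx = np.array([[1,        0,         0],
                   [0, c(omega), -s(omega)],
                   [0, s(omega),  c(omega)]])
    Rz = np.array([[c(gamma), -s(gamma), 0],
                   [s(gamma),  c(gamma), 0],
                   [0,         0,        1]])
    return Ry @ Rx @ Rz
```

For example, the entry R[1, 2] reproduces b3 = -sinω for any choice of the other two angles.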

Compared with the prior art, the present invention has the following advantages:

The invention proposes a new calibration method that realizes the simultaneous hybrid calibration of a multi-line lidar and multiple cameras, that is, one multi-line lidar and several cameras can be calibrated at the same time, filling a gap in the related art. The setup is simple to install and the calibration algorithm is easy to implement. With the intrinsic parameters of the cameras unknown and the lidar point cloud sparse, it solves the difficulty of calibrating a multi-line lidar and multiple cameras simultaneously, promoting the development of autonomous-driving technology toward low cost, universality and accessibility.

Brief description of the drawings

Fig. 1 is a schematic diagram of an example installation of the multi-line lidar and the cameras on a vehicle according to the present invention.

Fig. 2 is a flowchart of the present invention.

Detailed description

To make the objects, technical solutions and advantages of the present invention clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and are not intended to limit it.

It should be noted that terms such as "upper", "lower", "left", "right", "vertical", "horizontal", "inner" and "outer" refer to orientations or positional relationships as shown in the drawings; they are used only for convenience and simplicity of description and do not indicate or imply that a device or element of the invention must have a particular orientation, so they should not be construed as limiting the invention.

Embodiment

The invention discloses a hybrid calibration method for a multi-line lidar and multiple cameras. Fig. 1 shows an example installation on a vehicle: one multi-line lidar 5 is mounted on the roof, and four cameras 1, 2, 3 and 4 are mounted at the front, the rear, and the left and right sides of the vehicle, respectively. In practice, the number and placement of the multi-line lidars and cameras can be chosen according to actual needs without affecting the position calibration between the multi-line lidar and the cameras provided by the present invention.

Fig. 2 shows the flowchart of the present invention, which comprises the following steps:

S1. Collection of the raw image data of the cameras, the multi-line lidar point cloud data and the static lidar point cloud data. Specifically, step S1 comprises:

S11. Collection of the camera image data:

First, find a spacious site where targets can be hung on all sides, place 5 or more targets evenly in the field of view of each camera in turn, and capture the raw image data of all cameras.

S12. Collection of the multi-line lidar point cloud data:

Power on the multi-line lidar on the roof and scan, obtaining a point cloud whose three-dimensional coordinate origin is the position of the multi-line lidar, i.e. the lidar itself sits at (x, y, z) = (0, 0, 0).

S13. Collection of the static lidar point cloud data:

Scan the entire calibration scene with a high-precision ground-based static lidar (accuracy within 5 mm), obtaining the static lidar point cloud and the position of each camera within it. The static lidar point cloud obtained here takes the position of the static lidar as its three-dimensional coordinate origin, i.e. the coordinates (X, Y, Z) of every point are relative to the static lidar's position.

S2. Solving the intrinsic model of each camera:

The camera intrinsic model is expressed as

K = [ fx  0  cx ]
    [  0 fy  cy ]
    [  0  0   1 ]

where fx and fy are the focal lengths of the camera, (cx, cy) is the principal point, and the distortion factors are denoted (k1, k2, k3, k4). Zhang Zhengyou's checkerboard calibration method is used to solve the intrinsic parameters and distortion factors, yielding the intrinsic model of each camera.
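As an illustrative sketch (not from the patent): once fx, fy, cx, cy and the distortion factors are known, a point given in the camera frame can be projected to pixel coordinates. The even-order radial polynomial used for the four factors (k1..k4) below is our own assumption purely for illustration; the patent does not spell out the distortion model.

```python
# Illustrative pinhole projection with the intrinsic model
# K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]].
def project(point_cam, fx, fy, cx, cy, k=(0.0, 0.0, 0.0, 0.0)):
    """Project a 3-D camera-frame point to pixel coordinates (u, v).

    k holds the four distortion factors (k1..k4); an even-order radial
    polynomial in r^2 is assumed here for illustration only.
    """
    X, Y, Z = point_cam
    x, y = X / Z, Y / Z                      # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3 + k[3] * r2**4
    return fx * x * d + cx, fy * y * d + cy
```

With all distortion factors zero this reduces to the plain pinhole model, e.g. a point on the optical axis lands exactly on the principal point (cx, cy).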

S3. Using the intrinsic parameters and distortion factors obtained in S2, undistort the images collected by each camera to obtain corrected images.
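Undistortion inverts the distortion model, which has no closed-form inverse; a common approach is fixed-point iteration. A minimal sketch under the same illustrative radial model assumed above (not the patent's prescribed procedure):

```python
def undistort_normalized(xd, yd, k, iters=20):
    """Invert a 4-coefficient radial distortion model by fixed-point iteration.

    (xd, yd) are distorted normalized image coordinates; returns the
    undistorted coordinates.  Converges for mild distortion.
    """
    x, y = xd, yd                            # initial guess: no distortion
    for _ in range(iters):
        r2 = x * x + y * y
        d = 1.0 + k[0] * r2 + k[1] * r2**2 + k[2] * r2**3 + k[3] * r2**4
        x, y = xd / d, yd / d                # fixed-point update
    return x, y
```

Applying this per pixel (then remapping through K) is what produces the corrected image used in the later steps.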

S4. Register the static lidar point cloud into the multi-line lidar point cloud coordinate system: manually register the static lidar point cloud and the multi-line lidar point cloud obtained in S1, transforming the point cloud from the coordinate system whose origin is the static lidar's position to the one whose origin is the multi-line lidar's position. This yields a point cloud of the entire calibration scene (including the vehicle, the cameras and the targets); this step is done with professional software such as RiPROCESS.
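The manual registration amounts to estimating a rigid transform between the two clouds. As a sketch of the underlying least-squares step (the Kabsch/Umeyama solution; our own naming, not the software's internal algorithm), given a few corresponding points picked in both clouds:

```python
import numpy as np

def rigid_align(src, dst):
    """Least-squares rigid transform (R, t) with dst ~= R @ src + t.

    src, dst: (N, 3) arrays of corresponding points, e.g. targets picked
    in the static-lidar cloud and in the multi-line-lidar cloud.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                       # proper rotation (det = +1)
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

Applying (R, t) to every point of the static cloud expresses the whole scene in the multi-line lidar coordinate system.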

S5. Obtain the position (Xs, Ys, Zs) of each camera in the multi-line lidar coordinate system from the point cloud registered in S4.

S6. In each camera's corrected image, select the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (Xp, Yp, Zp) of those targets in the scene point cloud whose origin is the multi-line lidar.

S7. From each camera's intrinsic model, the camera position (Xs, Ys, Zs), and the pixel coordinates (u, v) and three-dimensional coordinates (Xp, Yp, Zp) of its targets, establish the collinearity equations and solve for the camera's attitude angle elements and nine direction cosines, completing the calibration.

The collinearity equations are:

u - cx = -f * [a1(Xp - Xs) + b1(Yp - Ys) + c1(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

v - cy = -f * [a2(Xp - Xs) + b2(Yp - Ys) + c2(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

where f is the perpendicular distance from the lens center to the image plane (i.e. the focal length) and, to simplify the problem, a1, a2, a3, b1, b2, b3, c1, c2 and c3 denote the nine direction cosines of each camera.

The direction cosines are related to the image attitude angles φ, ω and γ as follows:

a1 = cosφcosγ - sinφsinωsinγ;
a2 = -cosφsinγ - sinφsinωcosγ;
a3 = -sinφcosω;
b1 = cosωsinγ;
b2 = cosωcosγ;
b3 = -sinω;
c1 = sinφcosγ + cosφsinωsinγ;
c2 = -sinφsinγ + cosφsinωcosγ;
c3 = cosφcosω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, X axis and Z axis of the coordinate system whose origin is the multi-line lidar (i.e. first a rotation by φ about the Y axis, then by ω about the X axis, and finally by γ about the Z axis; a principal axis here is a fixed axis whose direction in space does not change during the rotation). The collinearity condition equations are solved by least squares to estimate the attitude angle elements and nine direction cosines of each camera, completing the calibration.
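The least-squares solution above can be sketched in numpy. This is an illustration under our own naming, not the patent's prescribed solver: with the camera position, focal length, principal point and at least 4 pixel/point correspondences known, Gauss-Newton iteration with a numeric Jacobian recovers the three attitude angles, from which the nine direction cosines follow.

```python
import numpy as np

def rot(phi, omega, gamma):
    # Y -> X -> Z rotation; rows are the direction cosines (a), (b), (c).
    c, s = np.cos, np.sin
    Ry = np.array([[c(phi), 0, -s(phi)], [0, 1, 0], [s(phi), 0, c(phi)]])
    Rx = np.array([[1, 0, 0], [0, c(omega), -s(omega)], [0, s(omega), c(omega)]])
    Rz = np.array([[c(gamma), -s(gamma), 0], [s(gamma), c(gamma), 0], [0, 0, 1]])
    return Ry @ Rx @ Rz

def residuals(angles, pts3d, pix, cam_pos, f, cx, cy):
    """Collinearity residuals for all correspondences, stacked (u then v)."""
    R = rot(*angles)
    d = pts3d - cam_pos                      # (N, 3) offsets (Xp-Xs, Yp-Ys, Zp-Zs)
    num_u = d @ R[:, 0]                      # a1*dX + b1*dY + c1*dZ
    num_v = d @ R[:, 1]                      # a2*dX + b2*dY + c2*dZ
    den = d @ R[:, 2]                        # a3*dX + b3*dY + c3*dZ
    res_u = (pix[:, 0] - cx) + f * num_u / den
    res_v = (pix[:, 1] - cy) + f * num_v / den
    return np.concatenate([res_u, res_v])

def solve_angles(pts3d, pix, cam_pos, f, cx, cy, iters=50):
    """Gauss-Newton with a numeric Jacobian; needs >= 4 correspondences."""
    x = np.zeros(3)                          # initial guess: zero attitude
    eps = 1e-6
    for _ in range(iters):
        r = residuals(x, pts3d, pix, cam_pos, f, cx, cy)
        J = np.empty((r.size, 3))
        for j in range(3):
            dx = np.zeros(3)
            dx[j] = eps
            J[:, j] = (residuals(x + dx, pts3d, pix, cam_pos, f, cx, cy) - r) / eps
        x -= np.linalg.lstsq(J, r, rcond=None)[0]
    return x
```

On noise-free synthetic correspondences this recovers the generating angles; with real picked targets the residual norm indicates the calibration quality.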

The above is only a preferred embodiment of the present invention, but the scope of protection of the present invention is not limited thereto. Any change or replacement that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall fall within the scope of protection of the present invention. Therefore, the protection scope of the present invention shall be determined by the protection scope of the claims.

Claims (4)

1. A hybrid calibration method for a multi-line laser radar and multiple cameras, characterized by comprising the following steps:

S1. Collect the raw image data of the cameras, the multi-line lidar point cloud data and the static lidar point cloud data;

S2. Solve the intrinsic model of each camera;

S3. Undistort the images collected by each camera to obtain corrected images;

S4. Register the static lidar point cloud data into the multi-line lidar point cloud coordinate system;

S5. Obtain the position (Xs, Ys, Zs) of each camera in the multi-line lidar coordinate system from the point cloud registered in S4;

S6. In each camera's corrected image, select the pixel coordinates (u, v) of at least 4 targets and the corresponding three-dimensional coordinates (Xp, Yp, Zp) of those targets in the scene point cloud whose origin is the multi-line lidar;

S7. From each camera's intrinsic model, the camera position (Xs, Ys, Zs), and the pixel coordinates (u, v) and three-dimensional coordinates (Xp, Yp, Zp) of its targets, establish the collinearity equations and solve for the camera's attitude angle elements and nine direction cosines, completing the calibration.

2. The hybrid calibration method of claim 1, characterized in that step S1 specifically comprises:

S11. Collection of the camera image data: park the vehicle, place several targets evenly in the field of view of each camera in turn, and capture the raw image data of all cameras;

S12. Collection of the multi-line lidar point cloud data: power on the multi-line lidar on the roof and scan, obtaining a point cloud whose three-dimensional coordinate origin is the position of the multi-line lidar;

S13. Collection of the static lidar point cloud data: scan the entire scene with a ground-based static lidar, obtaining the static lidar point cloud and the position of each camera within it.

3. The hybrid calibration method of claim 1, characterized in that in step S2 the camera intrinsic model is expressed as

K = [ fx  0  cx ]
    [  0 fy  cy ]
    [  0  0   1 ]

where fx and fy are the focal lengths of the camera and (cx, cy) is the principal point; Zhang Zhengyou's checkerboard calibration method is used to solve the intrinsic parameters and distortion factors, yielding the intrinsic model.

4. The hybrid calibration method of claim 3, characterized in that in step S7 the collinearity equations are:

u - cx = -f * [a1(Xp - Xs) + b1(Yp - Ys) + c1(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

v - cy = -f * [a2(Xp - Xs) + b2(Yp - Ys) + c2(Zp - Zs)] / [a3(Xp - Xs) + b3(Yp - Ys) + c3(Zp - Zs)];

where f is the perpendicular distance from the lens center to the image plane, and a1, a2, a3, b1, b2, b3, c1, c2 and c3 are the nine direction cosines of each camera, related to the image attitude angles φ, ω and γ by:

a1 = cosφcosγ - sinφsinωsinγ;
a2 = -cosφsinγ - sinφsinωcosγ;
a3 = -sinφcosω;
b1 = cosωsinγ;
b2 = cosωcosγ;
b3 = -sinω;
c1 = sinφcosγ + cosφsinωsinγ;
c2 = -sinφsinγ + cosφsinωcosγ;
c3 = cosφcosω;

where φ, ω and γ are the rotation angles about, respectively, the Y axis, X axis and Z axis of the coordinate system whose origin is the multi-line lidar; the attitude angle elements and nine direction cosines of each camera are estimated, completing the calibration.
CN201711012232.9A 2017-10-26 2017-10-26 Hybrid calibration method of multi-line laser radar and multi-channel camera Active CN108020826B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711012232.9A CN108020826B (en) 2017-10-26 2017-10-26 Hybrid calibration method of multi-line laser radar and multi-channel camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711012232.9A CN108020826B (en) 2017-10-26 2017-10-26 Hybrid calibration method of multi-line laser radar and multi-channel camera

Publications (2)

Publication Number Publication Date
CN108020826A true CN108020826A (en) 2018-05-11
CN108020826B CN108020826B (en) 2019-11-19

Family

ID=62080304

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711012232.9A Active CN108020826B (en) 2017-10-26 2017-10-26 Hybrid calibration method of multi-line laser radar and multi-channel camera

Country Status (1)

Country Link
CN (1) CN108020826B (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109300162A (en) * 2018-08-17 2019-02-01 浙江工业大学 A joint calibration method of multi-line lidar and camera based on refined radar scanning edge points
CN109345596A (en) * 2018-09-19 2019-02-15 百度在线网络技术(北京)有限公司 Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN109343061A (en) * 2018-09-19 2019-02-15 百度在线网络技术(北京)有限公司 Sensor calibration method, device, computer equipment, medium and vehicle
CN109584183A (en) * 2018-12-05 2019-04-05 吉林大学 Laser radar point cloud distortion removal method and system
CN109633612A (en) * 2018-10-18 2019-04-16 浙江大学 Extrinsic calibration method for a single-line laser radar and camera without common observations
CN109900205A (en) * 2019-02-21 2019-06-18 武汉大学 High-precision single-line laser and optical camera rapid calibration method
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 Calibration method for laser radar and camera data
CN110149463A (en) * 2019-04-22 2019-08-20 上海大学 Hand-held line-structured-light camera carrying a station-transfer measurement target
CN110200552A (en) * 2019-06-20 2019-09-06 小狗电器互联网科技(北京)股份有限公司 Method for removing distortion from laser radar measurement data and sweeping robot
CN110244282A (en) * 2019-06-10 2019-09-17 于兴虎 Multi-camera system and laser radar combined system and joint calibration method thereof
CN110353577A (en) * 2019-08-09 2019-10-22 小狗电器互联网科技(北京)股份有限公司 Laser radar point cloud data distortion removal method and sweeping device
CN110609268A (en) * 2018-11-01 2019-12-24 驭势科技(北京)有限公司 Laser radar calibration method, device and system and storage medium
CN110765894A (en) * 2019-09-30 2020-02-07 杭州飞步科技有限公司 Target detection method, device, equipment and computer readable storage medium
CN111311742A (en) * 2020-03-27 2020-06-19 北京百度网讯科技有限公司 Three-dimensional reconstruction method, three-dimensional reconstruction device and electronic equipment
CN111325801A (en) * 2020-01-23 2020-06-23 天津大学 Combined calibration method for laser radar and camera
CN111413689A (en) * 2020-05-07 2020-07-14 沃行科技(南京)有限公司 Efficient static calibration method for realizing multi-laser radar point cloud alignment based on rviz
CN111538008A (en) * 2019-01-18 2020-08-14 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN112230241A (en) * 2020-10-23 2021-01-15 湖北亿咖通科技有限公司 Calibration method based on random scanning type radar
CN112241989A (en) * 2019-08-22 2021-01-19 北京新能源汽车技术创新中心有限公司 External parameter calibration method and device, computer equipment and storage medium
CN112396663A (en) * 2020-11-17 2021-02-23 广东电科院能源技术有限责任公司 Visual calibration method, device, equipment and medium for multi-depth camera
CN112823294A (en) * 2019-09-18 2021-05-18 北京嘀嘀无限科技发展有限公司 System and method for calibrating camera and multiline lidar
CN113129590A (en) * 2021-04-12 2021-07-16 武汉理工大学 Traffic facility information intelligent analysis method based on vehicle-mounted radar and graphic measurement
CN113534110A (en) * 2021-06-24 2021-10-22 香港理工大学深圳研究院 Static calibration method for multi-laser radar system
CN113625288A (en) * 2021-06-15 2021-11-09 中国科学院自动化研究所 Camera and laser radar pose calibration method and device based on point cloud registration
CN114047487A (en) * 2021-11-05 2022-02-15 深圳市镭神智能系统有限公司 Radar and vehicle body external parameter calibration method and device, electronic equipment and storage medium
WO2022256976A1 (en) * 2021-06-07 2022-12-15 深圳市大疆创新科技有限公司 Method and system for constructing dense point cloud truth value data and electronic device
CN116499364A (en) * 2023-06-30 2023-07-28 济南作为科技有限公司 Method and system for cloud adjustment distortion of three-dimensional laser point of coal-coiling instrument

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103500338A (en) * 2013-10-16 2014-01-08 厦门大学 Road zebra crossing automatic extraction method based on vehicle-mounted laser scanning point cloud
CN104142157A (en) * 2013-05-06 2014-11-12 北京四维图新科技股份有限公司 Calibration method, device and equipment
CN105678783A (en) * 2016-01-25 2016-06-15 西安科技大学 Data fusion calibration method of catadioptric panorama camera and laser radar


Cited By (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109300162A (en) * 2018-08-17 2019-02-01 浙江工业大学 A joint calibration method of multi-line lidar and camera based on refined radar scanning edge points
CN109300162B (en) * 2018-08-17 2021-08-03 浙江工业大学 A joint calibration method of multi-line lidar and camera based on refined radar scanning edge points
CN109345596B (en) * 2018-09-19 2024-07-12 阿波罗智能技术(北京)有限公司 Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN109345596A (en) * 2018-09-19 2019-02-15 百度在线网络技术(北京)有限公司 Multi-sensor calibration method, device, computer equipment, medium and vehicle
CN109343061A (en) * 2018-09-19 2019-02-15 百度在线网络技术(北京)有限公司 Sensor calibration method, device, computer equipment, medium and vehicle
US11002840B2 (en) 2018-09-19 2021-05-11 Baidu Online Network Technology (Beijing) Co., Ltd. Multi-sensor calibration method, multi-sensor calibration device, computer device, medium and vehicle
US11042762B2 (en) 2018-09-19 2021-06-22 Baidu Online Network Technology (Beijing) Co., Ltd. Sensor calibration method and device, computer device, medium, and vehicle
CN109633612A (en) * 2018-10-18 2019-04-16 浙江大学 Extrinsic calibration method for a single-line laser radar and camera without common observations
CN110609268B (en) * 2018-11-01 2022-04-29 驭势科技(北京)有限公司 Laser radar calibration method, device and system and storage medium
CN110609268A (en) * 2018-11-01 2019-12-24 驭势科技(北京)有限公司 Laser radar calibration method, device and system and storage medium
CN109584183A (en) * 2018-12-05 2019-04-05 吉林大学 Laser radar point cloud distortion removal method and system
CN111538008A (en) * 2019-01-18 2020-08-14 杭州海康威视数字技术股份有限公司 Transformation matrix determining method, system and device
CN109900205A (en) * 2019-02-21 2019-06-18 武汉大学 High-precision single-line laser and optical camera rapid calibration method
CN109900205B (en) * 2019-02-21 2020-04-24 武汉大学 High-precision single-line laser and optical camera rapid calibration method
CN109949371A (en) * 2019-03-18 2019-06-28 北京智行者科技有限公司 Calibration method for laser radar and camera data
CN110149463A (en) * 2019-04-22 2019-08-20 上海大学 Hand-held line-structured-light camera carrying a station-transfer measurement target
CN110244282A (en) * 2019-06-10 2019-09-17 于兴虎 Multi-camera system and laser radar combined system and joint calibration method thereof
CN110200552A (en) * 2019-06-20 2019-09-06 小狗电器互联网科技(北京)股份有限公司 Method for removing distortion from laser radar measurement data and sweeping robot
CN110353577A (en) * 2019-08-09 2019-10-22 小狗电器互联网科技(北京)股份有限公司 Laser radar point cloud data distortion removal method and sweeping device
CN110353577B (en) * 2019-08-09 2020-12-08 小狗电器互联网科技(北京)股份有限公司 Laser radar point cloud data distortion removal method and sweeping device
CN112241989A (en) * 2019-08-22 2021-01-19 北京新能源汽车技术创新中心有限公司 External parameter calibration method and device, computer equipment and storage medium
CN112823294A (en) * 2019-09-18 2021-05-18 北京嘀嘀无限科技发展有限公司 System and method for calibrating camera and multiline lidar
CN112823294B (en) * 2019-09-18 2024-02-02 北京航迹科技有限公司 System and method for calibrating cameras and multi-line lidar
CN110765894A (en) * 2019-09-30 2020-02-07 杭州飞步科技有限公司 Target detection method, device, equipment and computer readable storage medium
CN110765894B (en) * 2019-09-30 2022-07-08 杭州飞步科技有限公司 Target detection method, device, equipment and computer readable storage medium
CN111325801B (en) * 2020-01-23 2022-03-15 天津大学 A joint calibration method of lidar and camera
CN111325801A (en) * 2020-01-23 2020-06-23 天津大学 Combined calibration method for laser radar and camera
CN111311742A (en) * 2020-03-27 2020-06-19 北京百度网讯科技有限公司 Three-dimensional reconstruction method, three-dimensional reconstruction device and electronic equipment
CN111413689B (en) * 2020-05-07 2023-04-07 沃行科技(南京)有限公司 Efficient static calibration method for realizing multi-laser radar point cloud alignment based on rviz
CN111413689A (en) * 2020-05-07 2020-07-14 沃行科技(南京)有限公司 Efficient static calibration method for realizing multi-laser radar point cloud alignment based on rviz
CN112230241A (en) * 2020-10-23 2021-01-15 湖北亿咖通科技有限公司 Calibration method based on random scanning type radar
CN112396663A (en) * 2020-11-17 2021-02-23 广东电科院能源技术有限责任公司 Visual calibration method, device, equipment and medium for multi-depth camera
CN113129590A (en) * 2021-04-12 2021-07-16 武汉理工大学 Traffic facility information intelligent analysis method based on vehicle-mounted radar and graphic measurement
WO2022256976A1 (en) * 2021-06-07 2022-12-15 深圳市大疆创新科技有限公司 Method and system for constructing dense point cloud truth value data and electronic device
CN113625288A (en) * 2021-06-15 2021-11-09 中国科学院自动化研究所 Camera and laser radar pose calibration method and device based on point cloud registration
CN113534110B (en) * 2021-06-24 2023-11-24 香港理工大学深圳研究院 A static calibration method for multi-lidar systems
CN113534110A (en) * 2021-06-24 2021-10-22 香港理工大学深圳研究院 Static calibration method for multi-laser radar system
CN114047487B (en) * 2021-11-05 2022-07-26 深圳市镭神智能系统有限公司 Radar and vehicle body external parameter calibration method and device, electronic equipment and storage medium
CN114047487A (en) * 2021-11-05 2022-02-15 深圳市镭神智能系统有限公司 Radar and vehicle body external parameter calibration method and device, electronic equipment and storage medium
CN116499364A (en) * 2023-06-30 2023-07-28 济南作为科技有限公司 Method and system for cloud adjustment distortion of three-dimensional laser point of coal-coiling instrument
CN116499364B (en) * 2023-06-30 2023-09-12 济南作为科技有限公司 Method and system for cloud adjustment distortion of three-dimensional laser point of coal-coiling instrument

Also Published As

Publication number Publication date
CN108020826B (en) 2019-11-19

Similar Documents

Publication Publication Date Title
CN108020826B (en) Hybrid calibration method of multi-line laser radar and multi-channel camera
CN110378965B (en) Method, device and equipment for determining coordinate system conversion parameters of road side imaging equipment
CN106895851B (en) Sensor calibration method for unified processing of the multiple CCD cameras of an optical remote-sensing satellite
CN110842940A (en) Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN109855603B (en) Focus measurement method and terminal
CN105225241A (en) The acquisition methods of unmanned plane depth image and unmanned plane
CN108613628A (en) Overhead transmission line sag measurement method based on binocular vision
CN108629756B (en) A Kinectv2 Depth Image Invalid Point Repair Method
CN103852060A (en) Visible light image distance measuring method based on monocular vision
CN105931200A (en) Quick geometric precise correction method for small area array spaceborne TDI CCD camera
CN113793270A (en) Aerial image geometric correction method based on unmanned aerial vehicle attitude information
WO2019144269A1 (en) Multi-camera photographing system, terminal device, and robot
CN111307046B (en) Tree height measuring method based on hemispherical image
CN202111802U (en) Calibration device for monitoring apparatus with multiple image sensors
CN105043252A (en) Image processing based size measuring method without reference object
CN104268884B (en) Calibration system and method for lane departure warning based on the Internet of Vehicles
KR101035538B1 (en) Real-time lane location information acquisition device and method
JPH1019562A (en) Surveying device and surveying method
CN105136123B (en) A ground image pair photogrammetry method based on a point in the image of an ordinary fixed-focus digital camera as the camera station
CN102095368B (en) Method for quickly acquiring camera parameters in wide-range vision coordinate measurement
RU2697822C2 (en) Method of determining coordinates of objects based on their digital images
CN103791919B (en) Vertical accuracy evaluation method based on a digital base-height ratio model
CN115641380A (en) Camera and radar multi-angle combined external reference calibration method and system under rotation condition
CN115100290A (en) Monocular vision positioning method, monocular vision positioning device, monocular vision positioning equipment and monocular vision positioning storage medium in traffic scene
CN104567812A (en) Method and device for measuring spatial position

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant