CN111815717B - Multi-sensor fusion external parameter combination semi-autonomous calibration method - Google Patents


Info

Publication number
CN111815717B
CN111815717B (application CN202010677982.3A)
Authority
CN
China
Prior art keywords
coordinate system
camera
wave radar
calibration object
millimeter wave
Prior art date
Legal status
Active
Application number
CN202010677982.3A
Other languages
Chinese (zh)
Other versions
CN111815717A (en)
Inventor
黄攀峰
余航
张帆
张夷斋
孟中杰
董刚奇
Current Assignee
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202010677982.3A priority Critical patent/CN111815717B/en
Publication of CN111815717A publication Critical patent/CN111815717A/en
Application granted granted Critical
Publication of CN111815717B publication Critical patent/CN111815717B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20068 Projection on vertical or horizontal image axis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Abstract

The invention relates to a semi-autonomous method for jointly calibrating the extrinsic parameters of fused multi-sensor systems, comprising the following steps: establishing the millimeter wave radar and camera coordinate systems; selecting a calibration object and an environment; acquiring millimeter wave radar and camera data; projecting the point cloud data into the camera coordinate system; and dragging slider bars to adjust the extrinsic parameters. Coordinate conversion is performed directly between the camera coordinate system and the millimeter wave radar coordinate system, which removes the dependence on a world coordinate system and the need to capture multiple frames of motion data to compute a world-coordinate transformation. The parameters are optimized by dragging sliders while manually observing the projection result, which avoids the errors introduced by image-processing algorithms, reduces the computational load, and relaxes the selection of the calibration object: no specific color or shape is required.

Description

Multi-sensor fusion external parameter combination semi-autonomous calibration method
Technical Field
The invention belongs to the field of extrinsic parameter calibration for multi-sensor fusion, and particularly relates to a projection-transformation-based semi-autonomous method for jointly calibrating the extrinsic parameters of a monocular camera and a millimeter wave radar.
Background
With the development of intelligent robotics, perception has gradually become a core problem in the field: it determines how a robot localizes itself and maps its environment. The environmental information acquired by a single sensor is limited; multi-sensor fusion combines the strengths of each sensor to compensate for their individual weaknesses, and is therefore of great significance for the environmental perception and mapping capability of intelligent robots. Multi-sensor fusion algorithms always involve coordinate conversion between sensors, so the problem of extrinsic calibration between sensors must be solved first.
Computer vision is the perception modality through which current intelligent robots obtain the most environmental information, and a monocular camera is light and inexpensive. Millimeter wave radar is a sensor operating in the millimeter-wave band; it is small, lightweight, accurate, and has a large measurement range, and in recent years it has received wide attention and application in intelligent robotics, autonomous driving, and unmanned aerial vehicles. The information acquired by a millimeter wave radar is mainly a three-dimensional point cloud, but the returned point cloud is sparse: unlike a lidar, it cannot reflect the outline of an object and cannot be accurately registered. Extrinsic calibration between a millimeter wave radar and a monocular camera is therefore considerably more difficult, and existing lidar-camera extrinsic calibration methods do not apply.
The extrinsic calibration result of the millimeter wave radar and the monocular camera directly affects the subsequent sensor-fusion performance. Patent 201210563695.5 obtains a transformation to a world coordinate system from the change between two frames of each sensor's data, and then calibrates the radar-camera extrinsics through that world coordinate system. Because the data must be collected while the system is moving, this method places high demands on timestamp synchronization; and since coordinate transformations recovered from motion carry large errors, it suffers from complex operation and low accuracy. Patent 201811577547.2 calibrates the extrinsics of a monocular camera and a millimeter wave radar using metal balls as calibration objects: exploiting the radar's sensitivity to metal, several metal balls are selected, their center positions in the radar and camera coordinate systems are obtained by clustering and image-processing algorithms, and accurate extrinsics are then obtained by optimizing the matching error of the calibration balls. Its drawbacks are that the calibration result depends heavily on the image-processing quality and the calibration procedure is complex.
Disclosure of Invention
Technical problem to be solved
In order to avoid the defects of the prior art, the invention provides a multi-sensor fusion external parameter combination semi-autonomous calibration method.
Technical scheme
A multi-sensor fusion external parameter combination semi-autonomous calibration method is disclosed, wherein the multi-sensor comprises a millimeter wave radar and a monocular camera, and is characterized by comprising the following steps:
Step 1: establishing the millimeter wave radar and camera coordinate systems
1.1) establishing a millimeter wave radar coordinate system: take the geometric center of the millimeter wave radar as the origin O_R of the millimeter wave radar coordinate system; the Y_R axis is perpendicular to the plane of the wave source and points forward from the sensor, the X_R axis is parallel to the long side of the radar and points right, and the Z_R axis is perpendicular to the X_R O_R Y_R plane and points up;
1.2) establishing a camera coordinate system: take the geometric center of the monocular camera as the origin O_C of the camera coordinate system; the X_C axis is parallel to the long side of the camera and points right, the Y_C axis is parallel to the short side of the camera and points down, and the Z_C axis is perpendicular to the X_C O_C Y_C plane and points forward;
1.3) establishing a pixel coordinate system: the pixel coordinate system represents the coordinates of the image's pixel points; its origin O_S is at the upper left corner of the image, the X_S axis points right and is parallel to the X_C axis, and the Y_S axis points down and is parallel to the Y_C axis;
Step 2: selecting the calibration object and environment
2.1) selection of the calibration object: select a metal cuboid calibration object of width L and height H; the calibration object is positioned in front of the sensors, its lateral distance from the sensors is not more than 5 m, and its straight-line distance is S;
2.2) selection of the environment: select an open 20 m × 20 m environment for data collection, ensuring that no other objects lie between the sensors and the calibration object;
Step 3: acquiring millimeter wave radar and camera data
3.1) acquiring millimeter wave radar data: set the radar's maximum detection distance S_m to be greater than or equal to S so that only the calibration object lies within the detection range; let the center coordinate of the calibration object returned by the radar be D_R0(x_r, y_r), and determine the coordinates of the upper-left and lower-right corner points of the calibration object from its size as D_R1(x_r1, y_r1, z_r1) and D_R2(x_r2, y_r2, z_r2), where D_R1, D_R2 and D_R0 satisfy (taking the radar return as the center of the object):

x_r1 = x_r − L/2,  y_r1 = y_r,  z_r1 = H/2
x_r2 = x_r + L/2,  y_r2 = y_r,  z_r2 = −H/2
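The corner expansion of step 3.1 can be sketched as follows. The symmetric split of the width and height about the radar's centre return is an assumption of this sketch, since the original relation is rendered as an equation image.

```python
import numpy as np

def target_corners_radar(center_xy, L, H):
    """Expand the radar's clustered centre return D_R0 = (x_r, y_r) into the
    3-D upper-left (D_R1) and lower-right (D_R2) corners of a cuboid target
    of width L and height H, assuming the return sits at the target's
    geometric centre (an assumption of this sketch)."""
    x_r, y_r = center_xy
    d_r1 = np.array([x_r - L / 2.0, y_r,  H / 2.0])  # upper-left corner
    d_r2 = np.array([x_r + L / 2.0, y_r, -H / 2.0])  # lower-right corner
    return d_r1, d_r2
```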
3.2) acquiring camera data: calibrate the intrinsic parameter matrix K of the monocular camera in advance and obtain a real-time image, where

K = [ f_x   0   c_x ]
    [  0   f_y  c_y ]
    [  0    0    1  ]

f_x and f_y denote the focal lengths of the camera, and c_x, c_y the offset of the camera's principal point from the origin of the pixel coordinate system, all in pixels;
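A minimal sketch of the intrinsic matrix K defined above; the numeric focal lengths and principal point in the usage line are placeholders, not values from the patent.

```python
import numpy as np

def intrinsic_matrix(fx, fy, cx, cy):
    """Pinhole intrinsic matrix K: fx, fy are the focal lengths and
    (cx, cy) the principal-point offset from the pixel origin, in pixels."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

K = intrinsic_matrix(600.0, 600.0, 320.0, 240.0)  # placeholder values
```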
3.3) if the unmanned aerial vehicle does not detect the visual marker, it patrols the landing area along a circular trajectory until the marker is found;
Step 4: projecting the point cloud data into the camera coordinate system
4.1) projection from the millimeter wave radar coordinate system into the camera coordinate system: let the extrinsic matrix between the millimeter wave radar and the monocular camera be R, determined by the angular offsets between the camera and the radar (pitch angle θ, roll angle φ, yaw angle ψ) and the position offset (x_t, y_t, z_t); the projected coordinates D_C1(x_c1, y_c1, z_c1), D_C2(x_c2, y_c2, z_c2) of the calibration object's upper-left and lower-right corners in the camera coordinate system are obtained from the projection matrix R, which is written out in terms of θ, φ and ψ (rendered as an equation image in the original), as follows:

(x_ci, y_ci, z_ci)^T = R · (x_ri, y_ri, z_ri)^T + (x_t, y_t, z_t)^T,  i = 1, 2
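The rigid transform of step 4.1 can be sketched as follows. Because the patent renders its rotation matrix only as an image, the Euler composition order used here, R = Rz(yaw) · Ry(pitch) · Rx(roll), is an assumption.

```python
import numpy as np

def rotation_from_euler(pitch, roll, yaw):
    """Rotation built from the three angular offsets. The composition order
    Rz(yaw) @ Ry(pitch) @ Rx(roll) is one common convention and an
    assumption here (the patent shows R only as an equation image)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def radar_to_camera(p_radar, R, t):
    """Map a radar-frame point into the camera frame: D_C = R·D_R + t."""
    return R @ p_radar + t
```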
4.2) projection from the camera coordinate system onto the pixel coordinate system: from the pre-calibrated intrinsic matrix K and the camera-frame coordinates obtained in 4.1), compute the projected coordinates D_S1(x_s1, y_s1), D_S2(x_s2, y_s2) of the calibration object's upper-left and lower-right corners in the pixel coordinate system:

x_si = f_x · x_ci / z_ci + c_x,  y_si = f_y · y_ci / z_ci + c_y,  i = 1, 2

4.3) draw a rectangular box in the image from the two vertex coordinates D_S1, D_S2;
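Step 4.2 is the standard pinhole projection; a minimal sketch (the K used in the usage values is a placeholder):

```python
import numpy as np

def project_to_pixels(p_cam, K):
    """Project a camera-frame point (x_c, y_c, z_c) onto the pixel plane:
    x_s = fx*x_c/z_c + cx,  y_s = fy*y_c/z_c + cy."""
    u, v, w = K @ p_cam
    return np.array([u / w, v / w])
```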
Step 5: dragging the slider bars to adjust the extrinsic parameters
5.1) create slider bars for the six parameters, namely the angular offsets (pitch angle θ, roll angle φ, yaw angle ψ) between the camera and the millimeter wave radar and the position offsets x_t, y_t, z_t; the sliders can be dragged over the image window to change the parameter values;
5.2) observe the position of the rectangle drawn in step 4.3) relative to the real calibration object in the image, and adjust the six sliders of step 5.1) until the rectangle exactly frames the calibration object; the corresponding pitch angle θ, roll angle φ, yaw angle ψ and position offsets x_t, y_t, z_t are then the accurate extrinsic parameters of the millimeter wave radar and the monocular camera.
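The adjustment loop of step 5 amounts to a re-projection callback: every slider drag updates one of the six parameters and the rectangle is redrawn. In a real tool the callback would be bound to GUI trackbars (for example OpenCV's cv2.createTrackbar); the Euler composition order below is an assumption, as the patent shows its rotation matrix only as an image.

```python
import numpy as np

def euler_to_R(pitch, roll, yaw):
    # assumed composition order Rz(yaw) @ Ry(pitch) @ Rx(roll)
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    return (np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
            @ np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
            @ np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]]))

def reproject_box(d_r1, d_r2, params, K):
    """Slider callback: recompute the pixel-space rectangle for the current
    six parameter values (pitch, roll, yaw, xt, yt, zt)."""
    pitch, roll, yaw, xt, yt, zt = params
    R, t = euler_to_R(pitch, roll, yaw), np.array([xt, yt, zt])
    corners = []
    for p in (d_r1, d_r2):
        u, v, w = K @ (R @ p + t)   # radar frame -> camera frame -> pixels
        corners.append((u / w, v / w))
    return corners  # [(x_s1, y_s1), (x_s2, y_s2)]
```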
Preferably, 0.5 m < L < 1 m, 1 m < H < 2 m, and 20 m < S < 30 m.
Advantageous effects
The invention provides a multi-sensor fusion external parameter combination semi-autonomous calibration method, comprising the following steps: establishing the millimeter wave radar and camera coordinate systems; selecting a calibration object and an environment; acquiring millimeter wave radar and camera data; projecting the point cloud data into the camera coordinate system; and dragging slider bars to adjust the extrinsic parameters. The beneficial effects are as follows:
(1) coordinate conversion is performed directly between the camera coordinate system and the millimeter wave radar coordinate system, which removes the dependence on a world coordinate system and the need to capture multiple frames of motion data to compute a world-coordinate transformation;
(2) the parameters are optimized by dragging sliders while manually observing the projection result, which avoids the errors introduced by image-processing algorithms, reduces the computational load, and relaxes the selection of the calibration object: no specific color or shape is required.
Drawings
FIG. 1 is a schematic diagram of a multi-sensor coordinate system
FIG. 2 is a schematic diagram of coordinate system conversion between a monocular camera and a millimeter wave radar
FIG. 3 is a schematic diagram of manually calibrating the extrinsic parameters of the monocular camera and the millimeter wave radar
Detailed Description
The invention will now be further described with reference to the following examples and drawings:
as shown in fig. 1:
1) establishing a millimeter wave radar and camera coordinate system:
1.1) establishing a millimeter wave radar coordinate system: as shown in FIG. 1, take the geometric center of the millimeter wave radar as the origin O_R of the millimeter wave radar coordinate system; the Y_R axis is perpendicular to the plane of the wave source and points forward from the sensor, the X_R axis is parallel to the long side of the radar and points right, and the Z_R axis is perpendicular to the X_R O_R Y_R plane and points up;
1.2) establishing a camera coordinate system: as shown in FIG. 1, take the geometric center of the monocular camera as the origin O_C of the camera coordinate system; the X_C axis is parallel to the long side of the camera and points right, the Y_C axis is parallel to the short side of the camera and points down, and the Z_C axis is perpendicular to the X_C O_C Y_C plane and points forward;
1.3) establishing a pixel coordinate system: as shown in FIG. 1, the pixel coordinate system represents the coordinates of the image's pixel points; its origin O_S is at the upper left corner of the image, the X_S axis points right and is parallel to the X_C axis, and the Y_S axis points down and is parallel to the Y_C axis.
2) Selecting a calibration object and an environment:
2.1) selection of the calibration object: choose a calibration object that is easy to obtain, preferably a metal cuboid, which ensures that the data point returned by the millimeter wave radar is the center of the calibration object; its width is L (0.5 m < L < 1 m) and its height is H (1 m < H < 2 m). The calibration object is positioned in front of the sensors, with a lateral distance of not more than 5 m and a straight-line distance of S (20 m < S < 30 m);
2.2) selection of the environment: select an open 20 m × 20 m environment for data collection, ensuring that no other objects lie between the sensors and the calibration object;
3) acquiring data of a millimeter wave radar and a camera:
3.1) acquiring millimeter wave radar data: millimeter wave radars currently on the market can directly output clustered data representing the centers of calibration objects. According to the distance of the calibration object in 2.1), set the radar's maximum detection distance S_m to be greater than or equal to S so that only the calibration object lies within the detection range; let the center coordinate of the calibration object returned by the radar be D_R0(x_r, y_r), and determine the coordinates of the upper-left and lower-right corner points from the size of the calibration object as D_R1(x_r1, y_r1, z_r1) and D_R2(x_r2, y_r2, z_r2), where D_R1, D_R2 and D_R0 satisfy (taking the radar return as the center of the object):

x_r1 = x_r − L/2,  y_r1 = y_r,  z_r1 = H/2
x_r2 = x_r + L/2,  y_r2 = y_r,  z_r2 = −H/2
3.2) acquiring camera data: calibrate the intrinsic parameter matrix K of the monocular camera in advance and obtain a real-time image, where

K = [ f_x   0   c_x ]
    [  0   f_y  c_y ]
    [  0    0    1  ]

f_x and f_y denote the focal lengths of the camera, and c_x, c_y the offset of the camera's principal point from the origin of the pixel coordinate system, all in pixels;
3.3) if the unmanned aerial vehicle does not detect the visual marker, it patrols the landing area along a circular trajectory until the marker is found;
4) the projection of the point cloud data onto the camera coordinate system comprises the following substeps:
4.1) projection from the millimeter wave radar coordinate system into the camera coordinate system: let the extrinsic matrix between the millimeter wave radar and the monocular camera be R, determined by the angular offsets between the camera and the radar (pitch angle θ, roll angle φ, yaw angle ψ) and the position offset (x_t, y_t, z_t); the projected coordinates D_C1(x_c1, y_c1, z_c1), D_C2(x_c2, y_c2, z_c2) of the calibration object's upper-left and lower-right corners in the camera coordinate system are obtained from the projection matrix R, which is written out in terms of θ, φ and ψ (rendered as an equation image in the original), as follows:

(x_ci, y_ci, z_ci)^T = R · (x_ri, y_ri, z_ri)^T + (x_t, y_t, z_t)^T,  i = 1, 2
4.2) projection from the camera coordinate system onto the pixel coordinate system: from the pre-calibrated intrinsic matrix K and the camera-frame coordinates obtained in 4.1), compute the projected coordinates D_S1(x_s1, y_s1), D_S2(x_s2, y_s2) of the calibration object's upper-left and lower-right corners in the pixel coordinate system:

x_si = f_x · x_ci / z_ci + c_x,  y_si = f_y · y_ci / z_ci + c_y,  i = 1, 2

4.3) as shown in FIG. 3, draw a rectangular frame in the image from the two vertex coordinates D_S1, D_S2;
5) dragging the slider bars to adjust the extrinsic parameters comprises the following substeps:
5.1) create slider bars for the six parameters, namely the angular offsets (pitch angle θ, roll angle φ, yaw angle ψ) between the camera and the millimeter wave radar and the position offsets x_t, y_t, z_t; the sliders can be dragged over the image window to change the parameter values;
5.2) as shown in FIG. 3, observe the position of the rectangle drawn in step 4.3) relative to the real calibration object in the image, and adjust the six sliders of step 5.1) until the rectangle exactly frames the calibration object; the corresponding pitch angle θ, roll angle φ, yaw angle ψ and position offsets x_t, y_t, z_t are then the accurate extrinsic parameters of the millimeter wave radar and the monocular camera.
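Putting steps 3 to 5 together, an end-to-end sketch under assumed numbers: the fixed axis swap between the radar frame (X right, Y forward, Z up) and the camera frame (X right, Y down, Z forward) is made explicit here as a nominal mounting with zero slider offsets; the patent itself folds the full alignment into the single matrix R. All numeric values are placeholders.

```python
import numpy as np

K = np.array([[600.0, 0.0, 320.0],     # placeholder intrinsics
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])

def nominal_radar_to_camera(p):
    """Nominal mounting: radar (X right, Y forward, Z up) mapped to camera
    (X right, Y down, Z forward): x -> x, z -> -y, y -> z."""
    x, y, z = p
    return np.array([x, -z, y])

def project(p_radar, K):
    u, v, w = K @ nominal_radar_to_camera(p_radar)
    return np.array([u / w, v / w])

# Target centre 25 m ahead, width 0.8 m, height 1.5 m (assumed numbers).
d_r1 = np.array([-0.4, 25.0, 0.75])   # upper-left corner
d_r2 = np.array([0.4, 25.0, -0.75])   # lower-right corner

# With zero slider offsets the projected box straddles the image centre;
# in the tool, the six sliders then correct the residual misalignment.
box = [project(d_r1, K), project(d_r2, K)]
```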

Claims (2)

1. A multi-sensor fusion external parameter combination semi-autonomous calibration method is disclosed, wherein the multi-sensor comprises a millimeter wave radar and a monocular camera, and is characterized by comprising the following steps:
step 1: establishing millimeter wave radar and camera coordinate system
1.1) establishing a millimeter wave radar coordinate system: taking the geometric center of the millimeter wave radar as the origin O_R of the millimeter wave radar coordinate system; the Y_R axis is perpendicular to the plane of the wave source and points forward from the sensor, the X_R axis is parallel to the long side of the radar and points right, and the Z_R axis is perpendicular to the X_R O_R Y_R plane and points up;
1.2) establishing a camera coordinate system: taking the geometric center of the monocular camera as the origin O_C of the camera coordinate system; the X_C axis is parallel to the long side of the camera and points right, the Y_C axis is parallel to the short side of the camera and points down, and the Z_C axis is perpendicular to the X_C O_C Y_C plane and points forward;
1.3) establishing a pixel coordinate system: the pixel coordinate system represents the coordinates of the image's pixel points; its origin O_S is at the upper left corner of the image, the X_S axis points right and is parallel to the X_C axis, and the Y_S axis points down and is parallel to the Y_C axis;
step 2: selecting a target and an environment
2.1) selection of the calibration object: selecting a metal cuboid calibration object of width L and height H; the calibration object is positioned in front of the sensors, its lateral distance from the sensors is not more than 5 m, and its straight-line distance is S;
2.2) selection of the environment: selecting an open 20 m × 20 m environment for data collection, ensuring that no other objects lie between the sensors and the calibration object;
and step 3: obtaining data of millimeter wave radar and camera
3.1) acquiring millimeter wave radar data: setting the radar's maximum detection distance S_m to be greater than or equal to S so that only the calibration object lies within the detection range; letting the center coordinate of the calibration object returned by the radar be D_R0(x_r0, y_r0), and determining the coordinates of the upper-left and lower-right corner points from the size of the calibration object as D_R1(x_r1, y_r1, z_r1) and D_R2(x_r2, y_r2, z_r2), where D_R1, D_R2 and D_R0 satisfy (taking the radar return as the center of the object):

x_r1 = x_r0 − L/2,  y_r1 = y_r0,  z_r1 = H/2
x_r2 = x_r0 + L/2,  y_r2 = y_r0,  z_r2 = −H/2
3.2) acquiring camera data: calibrating the intrinsic parameter matrix K of the monocular camera in advance and obtaining a real-time image, where

K = [ f_x   0   c_x ]
    [  0   f_y  c_y ]
    [  0    0    1  ]

f_x and f_y denote the focal lengths of the camera, and c_x, c_y the offset of the camera's principal point from the origin of the pixel coordinate system, all in pixels;
3.3) if the unmanned aerial vehicle does not detect the visual marker, patrolling the landing area along a circular trajectory until the marker is found;
and 4, step 4: projecting point cloud data to camera coordinate system
4.1) projection from the millimeter wave radar coordinate system into the camera coordinate system: letting the extrinsic matrix between the millimeter wave radar and the monocular camera be R, determined by the angular offsets between the camera and the radar (pitch angle θ, roll angle φ, yaw angle ψ) and the position offset (x_t, y_t, z_t); the projected coordinates D_C1(x_c1, y_c1, z_c1), D_C2(x_c2, y_c2, z_c2) of the calibration object's upper-left and lower-right corners in the camera coordinate system are obtained from the projection matrix R, which is written out in terms of θ, φ and ψ (rendered as an equation image in the original), as follows:

(x_ci, y_ci, z_ci)^T = R · (x_ri, y_ri, z_ri)^T + (x_t, y_t, z_t)^T,  i = 1, 2
4.2) projection from the camera coordinate system onto the pixel coordinate system: from the pre-calibrated intrinsic matrix K and the camera-frame coordinates obtained in 4.1), computing the projected coordinates D_S1(x_s1, y_s1), D_S2(x_s2, y_s2) of the calibration object's upper-left and lower-right corners in the pixel coordinate system:

x_si = f_x · x_ci / z_ci + c_x,  y_si = f_y · y_ci / z_ci + c_y,  i = 1, 2

4.3) drawing a rectangular frame in the image from the two vertex coordinates D_S1, D_S2;
and 5: drag the sliding strip to adjust the external parameter
5.1) creating slider bars for the six parameters, namely the angular offsets (pitch angle θ, roll angle φ, yaw angle ψ) between the camera and the millimeter wave radar and the position offsets x_t, y_t, z_t; the sliders can be dragged over the image window to change the parameter values;
5.2) observing the position of the rectangle drawn in step 4.3) relative to the real calibration object in the image, and adjusting the six sliders of step 5.1) until the rectangle exactly frames the calibration object; the corresponding pitch angle θ, roll angle φ, yaw angle ψ and position offsets x_t, y_t, z_t are then the accurate extrinsic parameters of the millimeter wave radar and the monocular camera.
2. The multi-sensor fusion external parameter combination semi-autonomous calibration method according to claim 1, wherein 0.5 m < L < 1 m, 1 m < H < 2 m, and 20 m < S < 30 m.
CN202010677982.3A 2020-07-15 2020-07-15 Multi-sensor fusion external parameter combination semi-autonomous calibration method Active CN111815717B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010677982.3A CN111815717B (en) 2020-07-15 2020-07-15 Multi-sensor fusion external parameter combination semi-autonomous calibration method


Publications (2)

Publication Number Publication Date
CN111815717A CN111815717A (en) 2020-10-23
CN111815717B true CN111815717B (en) 2022-05-17

Family

ID=72864810

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010677982.3A Active CN111815717B (en) 2020-07-15 2020-07-15 Multi-sensor fusion external parameter combination semi-autonomous calibration method

Country Status (1)

Country Link
CN (1) CN111815717B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112489118B (en) * 2020-12-15 2022-06-14 中国人民解放军国防科技大学 Method for quickly calibrating external parameters of airborne sensor group of unmanned aerial vehicle
CN113514803A (en) * 2021-03-25 2021-10-19 武汉光庭信息技术股份有限公司 Combined calibration method for monocular camera and millimeter wave radar
CN113446933B (en) * 2021-05-19 2023-03-28 浙江大华技术股份有限公司 External parameter calibration method, device and system for multiple three-dimensional sensors
CN113702927A (en) * 2021-08-02 2021-11-26 中汽创智科技有限公司 Vehicle sensor calibration method and device and storage medium
CN114279468B (en) * 2021-12-31 2022-06-14 北京理工大学 Dynamic calibration method for millimeter wave radar and visual camera based on statistical analysis
CN114758005B (en) * 2022-03-23 2023-03-28 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN116105662B (en) * 2023-04-17 2023-08-01 天津宜科自动化股份有限公司 Calibration method of multi-contour sensor
CN117784121A (en) * 2024-02-23 2024-03-29 四川天府新区北理工创新装备研究院 Combined calibration method and system for road side sensor and electronic equipment

Citations (4)

Publication number Priority date Publication date Assignee Title
CN110390695A (en) * 2019-06-28 2019-10-29 东南大学 The fusion calibration system and scaling method of a kind of laser radar based on ROS, camera
CN110766758A (en) * 2019-09-12 2020-02-07 浙江大华技术股份有限公司 Calibration method, device, system and storage device
CN110827358A (en) * 2019-10-15 2020-02-21 深圳数翔科技有限公司 Camera calibration method applied to automatic driving automobile
CN111383279A (en) * 2018-12-29 2020-07-07 阿里巴巴集团控股有限公司 External parameter calibration method and device and electronic equipment

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
CN109211298B (en) * 2017-07-04 2021-08-17 百度在线网络技术(北京)有限公司 Sensor calibration method and device
US11061132B2 (en) * 2018-05-21 2021-07-13 Johnson Controls Technology Company Building radar-camera surveillance system
CN109490890B (en) * 2018-11-29 2023-06-02 重庆邮电大学 Intelligent vehicle-oriented millimeter wave radar and monocular camera information fusion method
CN109636837B (en) * 2018-12-21 2023-04-28 浙江大学 Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar
CN109712189B (en) * 2019-03-26 2019-06-18 深兰人工智能芯片研究院(江苏)有限公司 A kind of method and apparatus of sensor combined calibrating
CN110135485A (en) * 2019-05-05 2019-08-16 浙江大学 The object identification and localization method and system that monocular camera is merged with millimetre-wave radar
CN110390697B (en) * 2019-07-11 2021-11-05 浙江大学 Millimeter wave radar and camera combined calibration method based on LM algorithm
CN110579764B (en) * 2019-08-08 2021-03-09 北京三快在线科技有限公司 Registration method and device for depth camera and millimeter wave radar, and electronic equipment
CN110794405B (en) * 2019-10-18 2022-06-10 北京全路通信信号研究设计院集团有限公司 Target detection method and system based on camera and radar fusion
CN111145264B (en) * 2019-11-12 2023-09-08 达闼机器人股份有限公司 Multi-sensor calibration method and device and computing equipment
CN111383285B (en) * 2019-11-25 2023-11-24 的卢技术有限公司 Sensor fusion calibration method and system based on millimeter wave radar and camera

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
CN111383279A (en) * 2018-12-29 2020-07-07 阿里巴巴集团控股有限公司 External parameter calibration method and device and electronic equipment
CN110390695A (en) * 2019-06-28 2019-10-29 东南大学 The fusion calibration system and scaling method of a kind of laser radar based on ROS, camera
CN110766758A (en) * 2019-09-12 2020-02-07 浙江大华技术股份有限公司 Calibration method, device, system and storage device
CN110827358A (en) * 2019-10-15 2020-02-21 深圳数翔科技有限公司 Camera calibration method applied to automatic driving automobile

Also Published As

Publication number Publication date
CN111815717A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
CN111815717B (en) Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN108932736B (en) Two-dimensional laser radar point cloud data processing method and dynamic robot pose calibration method
CN111583337B (en) Omnibearing obstacle detection method based on multi-sensor fusion
CN110879401B (en) Unmanned platform real-time target 3D detection method based on camera and laser radar
CN112669393B (en) Laser radar and camera combined calibration method
CN110363820B (en) Target detection method based on laser radar and pre-image fusion
Zhu et al. Online camera-lidar calibration with sensor semantic information
CN111369630A (en) Method for calibrating multi-line laser radar and camera
CN111812649A (en) Obstacle identification and positioning method based on fusion of monocular camera and millimeter wave radar
CN103065323B (en) Subsection space aligning method based on homography transformational matrix
CN106774296A (en) A kind of disorder detection method based on laser radar and ccd video camera information fusion
CN107796373B (en) Distance measurement method based on monocular vision of front vehicle driven by lane plane geometric model
CN115731268A (en) Unmanned aerial vehicle multi-target tracking method based on visual/millimeter wave radar information fusion
CN113205604A (en) Feasible region detection method based on camera and laser radar
CN114638909A (en) Substation semantic map construction method based on laser SLAM and visual fusion
CN115546741A (en) Binocular vision and laser radar unmanned ship marine environment obstacle identification method
CN114413958A (en) Monocular vision distance and speed measurement method of unmanned logistics vehicle
CN114200442B (en) Road target detection and association method based on millimeter wave radar and vision
CN113327296B (en) Laser radar and camera online combined calibration method based on depth weighting
CN111340884B (en) Dual-target positioning and identity identification method for binocular heterogeneous camera and RFID
CN117310627A (en) Combined calibration method applied to vehicle-road collaborative road side sensing system
CN116403186A (en) Automatic driving three-dimensional target detection method based on FPN Swin Transformer and Pointernet++
CN113947141B (en) Roadside beacon sensing system of urban intersection scene
CN116243329A (en) High-precision multi-target non-contact ranging method based on laser radar and camera fusion
CN114923477A (en) Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant