CN109636837A - Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar - Google Patents

Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar

Info

Publication number
CN109636837A
CN109636837A CN201811577934.6A CN201811577934A CN109636837A
Authority
CN
China
Prior art keywords
coordinate
radar
sphere
camera
metal ball
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811577934.6A
Other languages
Chinese (zh)
Other versions
CN109636837B (en)
Inventor
王滔
祝义朋
朱世强
张雲策
胡纪远
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201811577934.6A priority Critical patent/CN109636837B/en
Publication of CN109636837A publication Critical patent/CN109636837A/en
Application granted granted Critical
Publication of CN109636837B publication Critical patent/CN109636837B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Abstract

The invention discloses a method for evaluating the accuracy of the extrinsic parameter calibration result between a monocular camera and a millimeter-wave radar. Several metal spheres are arranged in the scene; after the data describing the spheres are converted between the respective coordinate systems, statistics are computed and compared to obtain four indices that describe the accuracy of the extrinsic calibration result, two of which are defined in the camera image plane and two in the radar point-cloud space.

Description

Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar
Technical field
The present invention relates to extrinsic parameter calibration in the field of multi-sensor fusion, and in particular to a method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar.
Background art
Multi-sensor information fusion is one of the key problems in robot perception today; it determines how efficiently and how precisely a robot perceives its surroundings. Because the environmental information obtained by a single sensor is limited, fusing the environmental information of multiple sensors is important for improving a robot's perception capability and enriching its map information.
Millimeter-wave radar offers high reliability and low cost and has attracted broad attention and application in intelligent robotics and autonomous driving; in particular, fusing it with a camera enables cost-effective environment perception. In camera and millimeter-wave-radar fusion, the perception information obtained by the two sensors must be transformed into the same coordinate system, after which the environment model is built through data processing and filtering. The color and texture information provided by the camera can be used to judge the categories of objects in the environment, while the point-cloud information provided by the radar can be used to analyze the positions and sizes of those objects; fusing the two effectively guarantees the accuracy, speed, and reliability of the environment-perception process.
The transformation matrix between the camera coordinate system and the millimeter-wave-radar coordinate system is called the extrinsic parameter matrix, and fusing the two sensors first requires solving for it, i.e. performing the joint extrinsic calibration of the camera and the radar. With the extrinsic matrix, data can be converted between the two frames, which lays the foundation for data fusion. At present, however, there is no effective method for evaluating the accuracy of the extrinsic calibration between a camera and a millimeter-wave radar, nor any other reference reports; a general and well-descriptive evaluation method is therefore needed to assess how accurately the extrinsic parameters of the camera and the millimeter-wave radar have been calibrated.
Summary of the invention
The problem to be solved by the present invention is to provide a method for describing the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar, together with an application example and a description of its characteristics; the method can also be used in similar settings such as camera and laser-radar fusion.
To solve the above problem, the invention discloses a method for evaluating the extrinsic calibration result: several metal spheres are arranged in the scene, and after the data describing the spheres are converted between the respective coordinate systems, statistics are computed and compared to obtain four indices that describe the accuracy of the extrinsic calibration result, two of which are defined in the camera image plane and the other two in the radar point-cloud space.
Using the pixel coordinate information of the metal spheres, the evaluation indices defined in the camera image plane are computed, namely the sphere matching accuracy rate and the sphere matching average error rate. The evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement;
B. data acquisition;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates;
B. process the image to obtain the circle-center coordinates;
Step 3: transform the three-dimensional point cloud to the image coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems;
B. process and visualize the results;
Step 4: compute the extrinsic-accuracy indices in the camera image plane;
A. compute the sphere matching accuracy rate;
B. compute the sphere matching average error rate;
The indices used in the present invention have the following advantages:
The sphere matching accuracy rate is easy to compute, requires little computing resource, and produces an intuitive visual result, so it can be used to judge whether the extrinsic calibration result is suitable for direct use in practical applications. The sphere matching average error rate takes sphere size into account and is therefore fairer and more balanced; the quantified error rate describes the accuracy of the calibration result approximately linearly, discriminates between different calibration results, and still reflects the degree of offset effectively when the calibration result is far off.
Using the three-dimensional point-cloud coordinate information of the metal spheres, the evaluation indices defined in the radar point-cloud space are computed, namely the sphere matching mean error distance and the sphere matching mean error ratio. The evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement;
B. data acquisition;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates;
B. process the image to obtain the circle-center coordinates;
Step 3: transform the image pixel coordinates to the radar coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems;
B. process and visualize the results;
Step 4: compute the extrinsic-accuracy indices in the radar point-cloud space;
A. compute the sphere matching mean error distance;
B. compute the sphere matching mean error ratio.
The indices used in the present invention have the following advantages:
The sphere matching mean error distance describes the deviation distance of the converted points in three-dimensional space and produces an intuitive visual result; the sphere matching mean error ratio takes sphere size into account, is consistent with the mean error distance when all metal spheres are the same size, and is fairer and more balanced when spheres of different sizes are used, so it is more widely applicable.
In addition, the scene required to compute the four indices is simple and broadly applicable, the computation is straightforward and places little demand on computing resources, and the four indices together cover a wide range of cases, are highly descriptive, and adapt to different requirements, so they can fairly and quantitatively evaluate the accuracy of the extrinsic calibration between the camera and the radar.
Description of the drawings
The evaluation method of the invention is described in further detail below with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of the arrangement of the metal spheres, the monocular camera, and the millimeter-wave radar in the present invention;
Fig. 2 is a flow diagram of transforming the radar point-cloud information into the camera image plane via the extrinsic matrix and performing the evaluation in the present invention;
Fig. 3 is a schematic diagram of the effect of transforming the radar point-cloud information into the camera image plane via the extrinsic matrix and performing the evaluation in the present invention;
Fig. 4 is a flow diagram of transforming the image pixel information into the radar point-cloud space via the extrinsic matrix and performing the evaluation in the present invention;
Fig. 5 is a schematic diagram of the effect of transforming the image pixel information into the radar point-cloud space via the extrinsic matrix and performing the evaluation in the present invention.
Detailed description of the embodiments
The present invention will be further explained below with reference to the attached drawings.
Using the pixel coordinate information, the evaluation indices defined in the camera image plane are computed, namely the sphere matching accuracy rate and the sphere matching average error rate. The evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement: several metal spheres are distributed across the common field of view of the camera and the radar. The spheres are kept at a certain distance from one another, so that the pixel boundary of each calibration sphere is separable in the image coordinate system and the point-cloud boundary of each sphere is separable in the three-dimensional coordinate system;
Fig. 1 shows the scene layout, in which 101 denotes a metal sphere and 102 denotes the monocular camera and millimeter-wave radar system;
B. data acquisition: after the radar and the camera are fixed relative to each other, they are placed statically on a plane, and a program is started to synchronously acquire radar point-cloud data and camera image data over a continuous period of time;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates: with reference to the actual size of the calibration spheres, a spatial-range threshold is set to group the point-cloud data that describe the same calibration sphere, and the mean of their three-dimensional coordinates is taken as the position of the sphere center in the radar coordinate system;
B. process the image to obtain the circle-center coordinates: the circular contour of each calibration sphere in the camera image is marked, a color threshold is applied to the whole image to obtain the outer-boundary pixel locations around the circle center of each sphere, and the two-dimensional coordinates and radius of each circle center are obtained by searching and matching radii one by one (both sub-steps are sketched in the code below);
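As an illustration of step 2, the following Python sketch clusters the radar point cloud into spheres with a simple distance threshold and extracts circle centers and radii from a color-thresholded image. It is a minimal sketch under stated assumptions rather than the patent's exact procedure: the HSV threshold values, the clustering radius cluster_eps, the minimum blob radius, and the use of cv2.minEnclosingCircle in place of the radius search-and-match step are all assumptions.

```python
import numpy as np
import cv2

def sphere_centers_from_pointcloud(points, cluster_eps=0.5):
    """Group radar points into spheres with a greedy distance threshold and
    return the mean 3D coordinate of each group as the sphere-center position.
    cluster_eps (meters) is an assumed threshold tied to the sphere size."""
    centers = []
    remaining = np.asarray(points, dtype=float)
    while len(remaining) > 0:
        seed = remaining[0]
        mask = np.linalg.norm(remaining - seed, axis=1) < cluster_eps
        centers.append(remaining[mask].mean(axis=0))   # sphere center = coordinate mean
        remaining = remaining[~mask]
    return np.array(centers)

def circle_centers_from_image(image_bgr, hsv_lo=(0, 80, 80), hsv_hi=(15, 255, 255)):
    """Color-threshold the image, then fit an enclosing circle to each blob.
    Returns a list of (cx, cy, radius) in pixels; the HSV range is assumed."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo, dtype=np.uint8), np.array(hsv_hi, dtype=np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    circles = []
    for contour in contours:
        (cx, cy), r = cv2.minEnclosingCircle(contour)
        if r > 5:  # assumed minimum radius, rejects small noise blobs
            circles.append((float(cx), float(cy), float(r)))
    return circles
```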
Step 3: transform the three-dimensional point cloud to the image coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems: according to the known intrinsic matrix K and the solved extrinsic matrix A, the point-cloud coordinates describing the metal spheres in the radar frame are converted into the corresponding image pixel coordinates by the projection formula
z_c · [u, v, 1]^T = K · A · [x, y, z, 1]^T
where the camera intrinsic matrix K and the extrinsic matrix A are known parameters, [x, y, z, 1]^T denotes a metal-sphere point-cloud coordinate in homogeneous form, [u, v, 1]^T denotes the corresponding image pixel coordinate, and z_c is the depth scale;
B. process and visualize the results: according to the image size, the converted points that fall outside the image range are filtered out, the three-dimensional points that still fall within the field of view after conversion are retained, and their corresponding z-direction distances are annotated for later discrimination (a sketch of this projection follows below);
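A minimal sketch of the conversion in step 3, assuming the extrinsic matrix A is a 3x4 matrix [R|t] that maps radar coordinates into the camera frame, K is the 3x3 intrinsic matrix, and the radar z-axis is the annotated distance axis; the image size used for filtering is a parameter.

```python
import numpy as np

def project_radar_points(points_radar, K, A, image_size):
    """Project Nx3 radar points to pixel coordinates via z_c*[u,v,1]^T = K*A*[x,y,z,1]^T,
    keep only points that land inside the image, and return the pixels together
    with the z-direction distance of each retained point in the radar frame."""
    w, h = image_size
    points_radar = np.asarray(points_radar, dtype=float)
    P = np.hstack([points_radar, np.ones((len(points_radar), 1))])  # Nx4 homogeneous
    cam = (K @ A @ P.T).T                                           # Nx3 scaled pixel coords
    in_front = cam[:, 2] > 0                                        # keep points in front of the camera
    uv = cam[in_front, :2] / cam[in_front, 2:3]                     # divide out the depth scale
    z_radar = points_radar[in_front, 2]                             # assumed: z is the annotated distance axis
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    return uv[inside], z_radar[inside]
```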
Step 4: compute the extrinsic-accuracy indices in the camera image plane;
A. compute the sphere matching accuracy rate: count the converted pixels that fall inside the circular boundary of their corresponding metal sphere, and divide by the total number of original three-dimensional points to obtain the matching success rate of the metal spheres.
B. compute the sphere matching average error rate: for each converted pixel, compute its distance to the circle center of the corresponding metal sphere and divide by the squared image radius of that sphere to obtain the matching error rate of that point; the matching error rates of all points are then averaged to obtain the overall sphere matching average error rate (both indices are sketched after the figure descriptions below);
Fig. 2 and Fig. 3 show, respectively, the flow chart and the visualization of the extrinsic-accuracy evaluation in the camera image plane.
As shown in Fig. 3, 301 indicates the pixel position obtained by converting a radar point-cloud coordinate, and 302 indicates the z-direction distance of that point in the radar coordinate system, which is used to determine the sphere to which the pixel corresponds.
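Following the definitions in step 4, the sketch below computes the two image-plane indices from the projected pixels and the detected circles. Two simplifications are assumptions: each projected point is paired with the nearest circle center (the text instead uses the z-direction distance to decide which sphere a pixel belongs to), and the per-point error rate is taken literally as distance divided by squared image radius, as stated above.

```python
import numpy as np

def image_plane_indices(projected_uv, circles, total_points):
    """projected_uv: Nx2 pixels from the projection step; circles: list of
    (cx, cy, r) from the image-processing step; total_points: number of
    original 3D points (the denominator of the accuracy rate).
    Returns (sphere matching accuracy rate, sphere matching average error rate)."""
    centers = np.array([(cx, cy) for cx, cy, _ in circles])
    radii = np.array([r for _, _, r in circles])
    hits, error_rates = 0, []
    for u, v in np.asarray(projected_uv):
        d = np.linalg.norm(centers - np.array([u, v]), axis=1)
        i = int(np.argmin(d))                     # assumed: pair with the nearest circle
        if d[i] <= radii[i]:                      # pixel falls inside the sphere's image circle
            hits += 1
        error_rates.append(d[i] / radii[i] ** 2)  # distance / squared radius, per the text
    accuracy_rate = hits / total_points
    average_error_rate = float(np.mean(error_rates))
    return accuracy_rate, average_error_rate
```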
Using the three-dimensional point-cloud coordinate information, the evaluation indices defined in the radar point-cloud space are computed, namely the sphere matching mean error distance and the sphere matching mean error ratio. The evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement: several metal spheres are distributed across the common field of view of the camera and the radar. The spheres are kept at a certain distance from one another, so that the pixel boundary of each calibration sphere is separable in the image coordinate system and the point-cloud boundary of each sphere is separable in the three-dimensional coordinate system;
B. data acquisition: after the radar and the camera are fixed relative to each other, they are placed statically on a plane, and a program is started to synchronously acquire radar point-cloud data and camera image data over a continuous period of time;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates: with reference to the actual size of the calibration spheres, a spatial-range threshold is set to group the point-cloud data that describe the same calibration sphere, and the mean of their three-dimensional coordinates is taken as the position of the sphere center in the radar coordinate system;
B. process the image to obtain the circle-center coordinates: the circular contour of each calibration sphere in the camera image is marked, a color threshold is applied to the whole image to obtain the outer-boundary pixel locations around the circle center of each sphere, and the two-dimensional coordinates and radius of each circle center are obtained by searching and matching radii one by one;
Step 3: transform the image pixel coordinates to the radar coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems: according to the known intrinsic matrix K and the solved extrinsic matrix A, the pixel coordinates of the upper, lower, left, and right boundary points of each metal sphere in the image coordinate system are converted, by inverting the projection formula above, into the corresponding three-dimensional coordinates in the radar coordinate system; because the z-coordinate (range) information cannot be recovered from the pixels, the z-coordinate of the corresponding metal sphere is substituted for it. Here the camera intrinsic matrix K and the extrinsic matrix A are known parameters, the three-dimensional coordinate denotes the metal-sphere point-cloud coordinate, and the pixel coordinate denotes the image pixel coordinate (a sketch of this back-projection follows below);
B. process and visualize the results: the three-dimensional coordinates obtained by converting the image pixels are marked in the radar coordinate system, and the sphere-center three-dimensional coordinates of the corresponding metal spheres are also marked in the same coordinate system for comparison and subsequent computation;
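A minimal sketch of the inverse conversion in step 3, under the same assumptions as before (A = [R|t] maps radar coordinates to camera coordinates): the pixel is back-projected along its camera ray, and the scale along the ray is chosen so that the resulting radar z-coordinate equals the z-coordinate of the corresponding sphere, which is the substitution described above.

```python
import numpy as np

def backproject_pixel(uv, z_sphere, K, A):
    """Convert one image pixel to a 3D point in the radar frame.
    With X_cam = R @ X_radar + t (A = [R | t]), the inverse map is
    X_radar = R.T @ (X_cam - t); X_cam lies on the ray s * K^-1 [u, v, 1]^T,
    and the scale s is chosen so that the radar z-coordinate equals z_sphere."""
    R, t = A[:, :3], A[:, 3]
    ray_cam = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])  # pixel ray in the camera frame
    r3 = R[:, 2]                 # radar z-axis expressed in camera coordinates
    # radar z of X_radar equals r3 @ (s * ray_cam - t); solve for the scale s
    s = (z_sphere + r3 @ t) / (r3 @ ray_cam)
    return R.T @ (s * ray_cam - t)
```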
Step 4: compute the extrinsic-accuracy indices in the radar point-cloud space;
A. compute the sphere matching mean error distance: for each metal sphere, compute the distances between the three-dimensional points obtained by converting its four (upper, lower, left, right) boundary pixels and the sphere's three-dimensional center coordinate, then average over all metal spheres.
B. compute the sphere matching mean error ratio: divide each sphere's matching mean error distance by the squared radius of that sphere to obtain its mean error ratio, then average over all metal spheres (both indices are sketched after the figure descriptions below);
Fig. 4 and Fig. 5 show, respectively, the flow chart and the visualization of the extrinsic-accuracy evaluation in the radar point-cloud space.
As shown in Fig. 5, 501 indicates a sphere-center coordinate in the radar coordinate system, and 502 indicates a three-dimensional coordinate obtained by converting a metal-sphere boundary pixel.
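The two radar-space indices of step 4 can then be computed roughly as below, reusing the backproject_pixel helper sketched earlier. The per-sphere dictionary layout and the literal "divide by squared radius" ratio follow the definitions above; both are assumptions about data layout rather than the patent's exact implementation.

```python
import numpy as np

def radar_space_indices(spheres, K, A):
    """spheres: list of dicts with keys
       'center'      - 3D sphere center from the point-cloud clustering,
       'radius'      - metal-sphere radius in meters,
       'boundary_uv' - the four (top/bottom/left/right) boundary pixels.
    Returns (sphere matching mean error distance, sphere matching mean error ratio)."""
    distances, ratios = [], []
    for sphere in spheres:
        center = np.asarray(sphere['center'], dtype=float)
        # back-project each boundary pixel, substituting the sphere's z-coordinate
        points = [backproject_pixel(uv, center[2], K, A) for uv in sphere['boundary_uv']]
        err = float(np.mean([np.linalg.norm(p - center) for p in points]))
        distances.append(err)
        ratios.append(err / sphere['radius'] ** 2)   # divided by squared radius, per the text
    return float(np.mean(distances)), float(np.mean(ratios))
```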
Using the four indices in the two kinds of space above as the evaluation criteria for extrinsic calibration accuracy, a camera and radar whose extrinsic parameters have been calibrated can be evaluated reasonably and effectively. The sphere matching accuracy rate and the sphere matching mean error distance are computationally efficient but somewhat lacking in discriminative power; the sphere matching average error rate and the sphere matching mean error ratio are slightly more complex to compute but provide a fairer and more complete evaluation. It can be seen that the four indices cover a wide range of cases, are highly descriptive, adapt to different requirements, and can fairly and quantitatively evaluate the accuracy of the extrinsic calibration between the camera and the radar.
The above is only a preferred evaluation criterion of the invention, and the scope of protection of the present invention is not limited to the above embodiments; all evaluation schemes within the spirit of the present invention fall within its scope of protection. It should be noted that, for those of ordinary skill in the art, improvements and modifications made without departing from the principles of the present invention shall also be regarded as falling within the scope of protection of the present invention.

Claims (10)

1. A method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar, characterized in that: several metal spheres are arranged in the scene, and after the data describing the metal spheres are converted between the respective coordinate systems, statistics are computed and compared to obtain four indices that describe the accuracy of the extrinsic calibration result, two of which are defined in the camera image plane and two in the radar point-cloud space.
2. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 1, characterized in that: using the pixel coordinate information of the metal spheres, the evaluation indices defined in the camera image plane are computed, namely the sphere matching accuracy rate and the sphere matching average error rate, and the evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement;
B. data acquisition;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates;
B. process the image to obtain the circle-center coordinates;
Step 3: transform the three-dimensional point cloud to the image coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems;
B. process and visualize the results;
Step 4: compute the extrinsic-accuracy indices in the camera image plane;
A. compute the sphere matching accuracy rate;
B. compute the sphere matching average error rate.
3. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 2, characterized in that step 1 is specifically:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement: several metal spheres are distributed across the common field of view of the camera and the radar.
B. data acquisition: after the radar and the camera are fixed relative to each other, they are placed statically on a plane, and a program is started to synchronously acquire radar point-cloud data and camera image data over a continuous period of time.
4. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 2, characterized in that step 2 is specifically:
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates: with reference to the actual size of the calibration spheres, a spatial-range threshold is set to group the point-cloud data that describe the same calibration sphere, and the mean of their three-dimensional coordinates is taken as the position of the sphere center in the radar coordinate system;
B. process the image to obtain the circle-center coordinates: the circular contour of each calibration sphere in the camera image is marked, a color threshold is applied to the whole image to obtain the outer-boundary pixel locations around the circle center of each sphere, and the two-dimensional coordinates and radius of each circle center are obtained by searching and matching radii one by one.
5. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 2, characterized in that step 3 is specifically:
Step 3: transform the three-dimensional point cloud to the image coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems: according to the known intrinsic matrix K and the solved extrinsic matrix A, the point-cloud coordinates describing the metal spheres in the radar frame are converted by the projection formula into the corresponding image pixel coordinates, where the camera intrinsic matrix K and the extrinsic matrix A are known parameters, the point-cloud coordinate denotes a metal-sphere point in the radar frame, and the pixel coordinate denotes the corresponding image point;
B. process and visualize the results: according to the image size, the converted points that fall outside the image range are filtered out, the three-dimensional points that still fall within the field of view after conversion are retained, and their corresponding z-direction distances are annotated for later discrimination.
6. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 2, characterized in that step 4 is specifically:
Step 4: compute the extrinsic-accuracy indices in the camera image plane;
A. compute the sphere matching accuracy rate: count the converted pixels that fall inside the circular boundary of their corresponding metal sphere, and divide by the total number of original three-dimensional points to obtain the matching success rate of the metal spheres;
B. compute the sphere matching average error rate: for each converted pixel, compute its distance to the circle center of the corresponding metal sphere and divide by the squared image radius of that sphere to obtain the matching error rate of that point; the matching error rates of all points are then averaged to obtain the overall sphere matching average error rate.
7. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 1, characterized in that: using the three-dimensional point-cloud coordinate information of the metal spheres, the evaluation indices defined in the radar point-cloud space are computed, namely the sphere matching mean error distance and the sphere matching mean error ratio, and the evaluation comprises the following steps:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement;
B. data acquisition;
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates;
B. process the image to obtain the circle-center coordinates;
Step 3: transform the image pixel coordinates to the radar coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems;
B. process and visualize the results;
Step 4: compute the extrinsic-accuracy indices in the radar point-cloud space;
A. compute the sphere matching mean error distance;
B. compute the sphere matching mean error ratio.
8. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 7, characterized in that step 1 is specifically:
Step 1: arrange the metal spheres and the scene;
A. scene arrangement: several metal spheres are distributed across the common field of view of the camera and the radar;
B. data acquisition: after the radar and the camera are fixed relative to each other, they are placed statically on a plane, and a program is started to synchronously acquire radar point-cloud data and camera image data over a continuous period of time;
and step 2 is specifically:
Step 2: prepare the camera and radar data;
A. cluster the point cloud to obtain the sphere-center coordinates: with reference to the actual size of the calibration spheres, a spatial-range threshold is set to group the point-cloud data that describe the same calibration sphere, and the mean of their three-dimensional coordinates is taken as the position of the sphere center in the radar coordinate system;
B. process the image to obtain the circle-center coordinates: the circular contour of each calibration sphere in the camera image is marked, a color threshold is applied to the whole image to obtain the outer-boundary pixel locations around the circle center of each sphere, and the two-dimensional coordinates and radius of each circle center are obtained by searching and matching radii one by one.
9. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 7, characterized in that step 3 is specifically:
Step 3: transform the image pixel coordinates to the radar coordinate system via the extrinsic matrix;
A. convert coordinates between the coordinate systems: according to the known intrinsic matrix K and the solved extrinsic matrix A, the pixel coordinates of the upper, lower, left, and right boundary points of each metal sphere in the image coordinate system are converted by formula into the corresponding three-dimensional coordinates in the radar coordinate system; because the z-coordinate (range) information cannot be recovered, the z-coordinate of the corresponding metal sphere is substituted for it, where the camera intrinsic matrix K and the extrinsic matrix A are known parameters, the three-dimensional coordinate denotes the metal-sphere point-cloud coordinate, and the pixel coordinate denotes the image pixel coordinate;
B. process and visualize the results: the three-dimensional coordinates obtained by converting the image pixels are marked in the radar coordinate system, and the sphere-center three-dimensional coordinates of the corresponding metal spheres are also marked in the same coordinate system for comparison and subsequent computation.
10. The method for evaluating the accuracy of the extrinsic calibration result between a monocular camera and a millimeter-wave radar according to claim 7, characterized in that step 4 is specifically:
Step 4: compute the extrinsic-accuracy indices in the radar point-cloud space;
A. compute the sphere matching mean error distance: for each metal sphere, compute the distances between the three-dimensional points obtained by converting its four (upper, lower, left, right) boundary pixels and the sphere's three-dimensional center coordinate, then average over all metal spheres;
B. compute the sphere matching mean error ratio: divide each sphere's matching mean error distance by the squared radius of that sphere to obtain its mean error ratio, then average over all metal spheres.
CN201811577934.6A 2018-12-21 2018-12-21 Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar Active CN109636837B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811577934.6A CN109636837B (en) 2018-12-21 2018-12-21 Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811577934.6A CN109636837B (en) 2018-12-21 2018-12-21 Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar

Publications (2)

Publication Number Publication Date
CN109636837A true CN109636837A (en) 2019-04-16
CN109636837B CN109636837B (en) 2023-04-28

Family

ID=66076637

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811577934.6A Active CN109636837B (en) 2018-12-21 2018-12-21 Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar

Country Status (1)

Country Link
CN (1) CN109636837B (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135485A (en) * 2019-05-05 2019-08-16 浙江大学 The object identification and localization method and system that monocular camera is merged with millimetre-wave radar
CN110260786A (en) * 2019-06-26 2019-09-20 华中科技大学 A kind of robot vision measuring system and its scaling method based on external trace
CN110298891A (en) * 2019-06-25 2019-10-01 北京智行者科技有限公司 The method and device that Camera extrinsic precision is assessed automatically
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111311689A (en) * 2020-02-10 2020-06-19 清华大学 Method and system for calibrating relative external parameters of laser radar and camera
CN111815717A (en) * 2020-07-15 2020-10-23 西北工业大学 Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN112991454A (en) * 2019-12-18 2021-06-18 动态Ad有限责任公司 Calibration and verification of camera to LiDAR
CN113376617A (en) * 2020-02-25 2021-09-10 北京京东乾石科技有限公司 Method, device, storage medium and system for evaluating accuracy of radar calibration result
CN114779188A (en) * 2022-01-24 2022-07-22 南京慧尔视智能科技有限公司 Method, device, equipment and medium for evaluating calibration effect
CN117830438A (en) * 2024-03-04 2024-04-05 数据堂(北京)科技股份有限公司 Laser radar and camera combined calibration method based on specific marker

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008304344A (en) * 2007-06-08 2008-12-18 Kumamoto Univ Target detector
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
CN108037505A (en) * 2017-12-08 2018-05-15 吉林大学 A kind of night front vehicles detection method and system
CN108693532A (en) * 2018-03-29 2018-10-23 浙江大学 Wearable barrier-avoiding method and device based on enhanced binocular camera Yu 3D millimetre-wave radars

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008304344A (en) * 2007-06-08 2008-12-18 Kumamoto Univ Target detector
CN107564069A (en) * 2017-09-04 2018-01-09 北京京东尚科信息技术有限公司 The determination method, apparatus and computer-readable recording medium of calibrating parameters
CN108037505A (en) * 2017-12-08 2018-05-15 吉林大学 A kind of night front vehicles detection method and system
CN108693532A (en) * 2018-03-29 2018-10-23 浙江大学 Wearable barrier-avoiding method and device based on enhanced binocular camera Yu 3D millimetre-wave radars

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DEZHI G. et al.: "A method of spatial calibration for camera and radar", 2010 8th World Congress on Intelligent Control and Automation *
张灵飞 et al.: "Camera calibration based on a one-dimensional calibration object and an improved evolution strategy", 《光学学报》 (Acta Optica Sinica) *
韩正勇 et al.: "An extrinsic calibration method for a pinhole camera and a 3D lidar", 《传感器与微系统》 (Transducer and Microsystem Technologies) *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110135485A (en) * 2019-05-05 2019-08-16 浙江大学 The object identification and localization method and system that monocular camera is merged with millimetre-wave radar
CN110298891A (en) * 2019-06-25 2019-10-01 北京智行者科技有限公司 The method and device that Camera extrinsic precision is assessed automatically
CN110260786A (en) * 2019-06-26 2019-09-20 华中科技大学 A kind of robot vision measuring system and its scaling method based on external trace
CN110260786B (en) * 2019-06-26 2020-07-10 华中科技大学 Robot vision measurement system based on external tracking and calibration method thereof
CN112991454A (en) * 2019-12-18 2021-06-18 动态Ad有限责任公司 Calibration and verification of camera to LiDAR
US11940539B2 (en) 2019-12-18 2024-03-26 Motional Ad Llc Camera-to-LiDAR calibration and validation
CN111179358B (en) * 2019-12-30 2024-01-05 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111311689B (en) * 2020-02-10 2020-10-30 清华大学 Method and system for calibrating relative external parameters of laser radar and camera
CN111311689A (en) * 2020-02-10 2020-06-19 清华大学 Method and system for calibrating relative external parameters of laser radar and camera
CN113376617A (en) * 2020-02-25 2021-09-10 北京京东乾石科技有限公司 Method, device, storage medium and system for evaluating accuracy of radar calibration result
CN113376617B (en) * 2020-02-25 2024-04-05 北京京东乾石科技有限公司 Method, device, storage medium and system for evaluating accuracy of radar calibration result
CN111815717A (en) * 2020-07-15 2020-10-23 西北工业大学 Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN114779188A (en) * 2022-01-24 2022-07-22 南京慧尔视智能科技有限公司 Method, device, equipment and medium for evaluating calibration effect
CN114779188B (en) * 2022-01-24 2023-11-03 南京慧尔视智能科技有限公司 Method, device, equipment and medium for evaluating calibration effect
CN117830438A (en) * 2024-03-04 2024-04-05 数据堂(北京)科技股份有限公司 Laser radar and camera combined calibration method based on specific marker

Also Published As

Publication number Publication date
CN109636837B (en) 2023-04-28

Similar Documents

Publication Publication Date Title
CN109636837A Method for evaluating calibration accuracy of external parameters of monocular camera and millimeter wave radar
CN109598765B (en) Monocular camera and millimeter wave radar external parameter combined calibration method based on spherical calibration object
US9646212B2 (en) Methods, devices and systems for detecting objects in a video
US20170337701A1 (en) Method and system for 3d capture based on structure from motion with simplified pose detection
CN110073362A (en) System and method for lane markings detection
US11416719B2 (en) Localization method and helmet and computer readable storage medium using the same
CN102141398A (en) Monocular vision-based method for measuring positions and postures of multiple robots
CN106019264A (en) Binocular vision based UAV (Unmanned Aerial Vehicle) danger vehicle distance identifying system and method
CN107560592B (en) Precise distance measurement method for photoelectric tracker linkage target
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN113192182A (en) Multi-sensor-based live-action reconstruction method and system
CN112562005A (en) Space calibration method and system
Du et al. Recognition of mobile robot navigation path based on K-means algorithm
CN112488022A (en) Panoramic monitoring method, device and system
Abdullah et al. Camera calibration performance on different non-metric cameras.
US20220276046A1 (en) System and method for providing improved geocoded reference data to a 3d map representation
CN110322518A (en) Evaluation method, evaluation system and the test equipment of Stereo Matching Algorithm
CN102542563A (en) Modeling method of forward direction monocular vision of mobile robot
CN115407338A (en) Vehicle environment information sensing method and system
Short 3-D Point Cloud Generation from Rigid and Flexible Stereo Vision Systems
CN114092564A (en) External parameter calibration method, system, terminal and medium of non-overlapping view field multi-camera system
CN113792645A (en) AI eyeball fusing image and laser radar
CN113095324A (en) Classification and distance measurement method and system for cone barrel
Su Vanishing points in road recognition: A review
Geng et al. Detection algorithm of video image distance based on rectangular pattern

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant