CN100586200C - Camera calibration method based on laser radar - Google Patents


Info

Publication number
CN100586200C
Authority
CN
China
Prior art keywords
laser radar
cylindrical bar
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN200810042152A
Other languages
Chinese (zh)
Other versions
CN101345890A (en)
Inventor
李颢
杨明
夏庭凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Jiaotong University
Original Assignee
Shanghai Jiaotong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Jiaotong University filed Critical Shanghai Jiaotong University
Priority to CN200810042152A priority Critical patent/CN100586200C/en
Publication of CN101345890A publication Critical patent/CN101345890A/en
Application granted granted Critical
Publication of CN100586200C publication Critical patent/CN100586200C/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention relates to a laser-radar-based camera calibration method in the field of intelligent vehicles. A laser radar is mounted at the front of the vehicle as a range-measuring instrument, and a cylindrical rod is placed at several positions in the calibration field. At each position, the centers of the rod's bottom face and top face are taken as calibration sampling points. The image coordinates of the sampling points are obtained with the camera, and their world coordinates are obtained with the laser radar. Once the image coordinates and world coordinates of the sampling points at every position have been collected, the perspective transformation parameters of the camera are obtained by solving a system of linear equations. The method is convenient and effective, requires no laborious preparation of a calibration field, and gives technical personnel a helpful way to calibrate the camera of an intelligent vehicle's vision system.

Description

Camera calibration method based on laser radar
Technical field
The present invention relates to a camera calibration method in the field of vehicle technology, and specifically to a camera calibration method based on laser radar.
Background technology
The purpose of intelligent-vehicle camera calibration is to determine the parameters of the camera imaging model that describe the geometric relationship between the image coordinate system and the world coordinate system. The imaging model can be described by a simple perspective transformation, so camera calibration amounts to determining the camera's perspective transformation parameters. In the calibration process, especially for far-range vision applications (where the camera's effective field of view may extend as far as thirty to forty meters), a fundamental yet stubborn problem is how to determine the image coordinates and world coordinates of the calibration sampling points. The difficulty lies in the lack of a convenient and effective calibration target, and in the inconvenience of measuring the sampling points.
A literature search shows that earlier work has addressed this problem in several ways. Tiberiu et al., in "Camera calibration method for far range stereovision sensors used in vehicles" (IEEE Intelligent Vehicles Symposium, 2006), proposed calibrating in a dedicated field prepared in advance on the ground, using objects painted with 'X'-shaped patterns. Broggi et al., in "Self calibration of a stereo vision system for automotive application" (IEEE International Conference on Robotics and Automation, 2001), proposed painting a large grid on the ground to serve as the calibration scene. The common shortcoming of these methods is that they require time-consuming and laborious preparation and so cannot be carried out conveniently and effectively. A convenient and effective camera calibration method, especially one suited to far-range vision, is therefore urgently needed.
Summary of the invention
The object of the present invention is to address the above deficiencies of the prior art by proposing a camera calibration method based on laser radar that can determine the camera's perspective transformation parameters conveniently and effectively.
The present invention is achieved through the following technical solution, comprising the steps below:
Step 1: make a cylindrical rod and measure its height.
Step 2: select several different positions (at least six) in the camera's field of view; these are called calibration positions. Place the cylindrical rod at one of the calibration positions to form a calibration scene.
Step 3: take two particular points on the cylindrical rod as calibration sampling points: the center of the rod's bottom face is the lower calibration sampling point, and the center of its top face is the upper calibration sampling point. Capture one frame of the current calibration scene with the camera, and measure the image coordinates of the lower and upper calibration sampling points by hand.
Step 4: use a laser radar, mounted horizontally at the front of the vehicle, as the range-measuring instrument. Acquire one frame of laser radar data of the current calibration scene. Manually pick out the sampling points corresponding to the cylindrical rod in the data, and convert their coordinates from values in the laser radar polar coordinate system to values in the laser radar rectangular coordinate system.
Step 5: for the rod sampling points of step 4, compute their mean abscissa and mean ordinate. The world coordinate of the lower calibration sampling point is then (mean abscissa, mean ordinate, 0), and that of the upper calibration sampling point is (mean abscissa, mean ordinate, rod height).
Step 6: place the cylindrical rod at another calibration position to form a new calibration scene, and repeat steps 3 to 6 until the rod has been placed at every calibration position.
Step 7: solve the following system of linear equations:
$$
\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -X_1 u_1 & -Y_1 u_1 & -Z_1 u_1 & -u_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -X_1 v_1 & -Y_1 v_1 & -Z_1 v_1 & -v_1 \\
\vdots & & & & & & & & & & & \vdots \\
X_N & Y_N & Z_N & 1 & 0 & 0 & 0 & 0 & -X_N u_N & -Y_N u_N & -Z_N u_N & -u_N \\
0 & 0 & 0 & 0 & X_N & Y_N & Z_N & 1 & -X_N v_N & -Y_N v_N & -Z_N v_N & -v_N
\end{bmatrix}
\begin{bmatrix}
m_{11} \\ \vdots \\ m_{14} \\ m_{21} \\ \vdots \\ m_{24} \\ m_{31} \\ \vdots \\ m_{34}
\end{bmatrix}
= 0
$$
where $N$ is the total number of calibration sampling points, $(X_i, Y_i, Z_i)$ is the world coordinate of the $i$-th calibration sampling point, $(u_i, v_i)$ is its image coordinate, and $m_{11}, \ldots, m_{14}, m_{21}, \ldots, m_{24}, m_{31}, \ldots, m_{34}$ are the perspective transformation parameters.
Solving for the perspective transformation parameters yields the perspective transformation matrix:
$$
M = \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34}
\end{bmatrix}
$$
thereby completing the calibration of the camera.
The present invention is convenient and effective and requires no laborious preparation of a calibration scene: any site with level ground, such as a road or a square, suffices. Although the method requires a laser radar, a laser radar is normally indispensable equipment on an intelligent vehicle, so no additional cost is incurred to acquire one. The invention thus provides technical personnel with an efficient way to calibrate the camera of an intelligent vehicle's vision system.
Description of drawings
Fig. 1 is a schematic diagram of the laser radar installation position;
Fig. 2 is a schematic diagram of the cylindrical rod and the upper and lower calibration sampling points;
Fig. 3 is a schematic diagram of the camera calibration results.
Embodiment
An embodiment of the invention is described in detail below with reference to the drawings. The embodiment is implemented on the premise of the technical solution of the present invention and gives a detailed implementation and a concrete operating procedure, but the scope of protection of the invention is not limited to this embodiment.
As shown in Fig. 1, this embodiment uses a laser radar, mounted horizontally at the front of the vehicle, as the range-measuring instrument. A stretch of level ground, such as a road or a square, is selected as the calibration field. A cylindrical rod is placed at several positions in the field. At each position, two particular points on the rod are taken as calibration sampling points: the center of the rod's bottom face and the center of its top face. The camera is used to obtain the image coordinates of the sampling points, and the laser radar to obtain their world coordinates. Once the image and world coordinates of the sampling points at every position have been obtained, the camera's perspective transformation parameters are found by solving a system of linear equations, completing the calibration.
The concrete implementation steps are as follows:
Step 1: make a cylindrical rod and measure its height, denoted H.
Step 2: select several different positions (at least six) in the camera's field of view; these are called calibration positions. Place the cylindrical rod at one of the calibration positions to form a calibration scene.
The distribution of the calibration positions is not strictly constrained, but for the best calibration results they should be spread evenly over the camera's entire effective field of view. The more calibration positions, the more accurate the calibration; beyond a certain point, however, additional positions bring little further improvement while needlessly increasing the calibration workload.
As a trade-off, in practice 9 to 13 calibration positions is suitable; this specific choice is given only as an example and is not a limitation of the invention.
Step 3: take two particular points on the rod as calibration sampling points: the center of the rod's bottom face (the lower calibration sampling point) and the center of its top face (the upper calibration sampling point), as shown in Fig. 2. Capture one frame of the current calibration scene with the camera, and measure the image coordinates of the lower and upper calibration sampling points by hand.
It is worth explaining how the upper and lower calibration sampling points are located in the image. The edge of the rod's image consists of four parts: a straight segment on each of the left and right sides, and a flattened elliptical arc at each of the top and bottom, as shown in Fig. 2. The midpoint of the line joining the two endpoints of the upper arc is the upper calibration sampling point; the midpoint of the line joining the two endpoints of the lower arc is the lower calibration sampling point.
Step 4: acquire one frame of laser radar data of the current calibration scene with the laser radar. Manually pick out the sampling points corresponding to the rod in the data, and convert their coordinates from values in the laser radar polar coordinate system to values in the laser radar rectangular coordinate system, namely:
$$
x_i = \rho_i \cos\theta_i, \qquad y_i = \rho_i \sin\theta_i, \qquad i = 1, 2, \ldots, m
$$
where $m$ is the number of rod sampling points in the current calibration scene; $(\rho_1, \theta_1), \ldots, (\rho_m, \theta_m)$ are the coordinates of the rod sampling points in the laser radar polar coordinate system, i.e. the values output directly by the laser radar; and $(x_1, y_1), \ldots, (x_m, y_m)$ are their coordinates in the laser radar rectangular coordinate system, used in subsequent processing and analysis.
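The polar-to-rectangular conversion of step 4 can be sketched in Python; the function name and the sample scan values below are illustrative, not taken from the patent:

```python
import math

def polar_to_cartesian(scan):
    """Convert laser radar returns (rho_i, theta_i) to rectangular
    coordinates via x = rho*cos(theta), y = rho*sin(theta)."""
    return [(rho * math.cos(theta), rho * math.sin(theta)) for rho, theta in scan]

# Hypothetical returns from the rod about 5 m ahead, near theta = 90 degrees
scan = [(5.02, math.radians(88.0)),
        (5.00, math.radians(90.0)),
        (5.03, math.radians(92.0))]
points = polar_to_cartesian(scan)  # middle return maps to roughly (0.0, 5.0)
```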
Step 5: for the rod sampling points of step 4, compute their mean abscissa (denoted $c_x$) and mean ordinate (denoted $c_y$):
$$
c_x = \operatorname{mean}(x) = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad
c_y = \operatorname{mean}(y) = \frac{1}{m}\sum_{i=1}^{m} y_i
$$
The world coordinate of the lower calibration sampling point is then $(c_x, c_y, 0)$, and that of the upper calibration sampling point is $(c_x, c_y, H)$.
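Step 5's averaging, and the world coordinates it yields for the two sampling points, can be sketched as follows (the function name and sample values are illustrative):

```python
def rod_world_coords(points, height):
    """Average the rod's lidar hits to get (c_x, c_y); the lower calibration
    sampling point is then (c_x, c_y, 0) and the upper one (c_x, c_y, H)."""
    m = len(points)
    c_x = sum(x for x, _ in points) / m
    c_y = sum(y for _, y in points) / m
    return (c_x, c_y, 0.0), (c_x, c_y, height)

# Three hypothetical rod hits clustered around (1.0, 5.0), rod height 1.5 m
lower, upper = rod_world_coords([(0.9, 5.0), (1.0, 5.1), (1.1, 4.9)], height=1.5)
```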
Step 6: place the cylindrical rod at another calibration position to form a new calibration scene, and repeat steps 3 to 6 until the rod has been placed at every calibration position.
Step 7: solve the following system of linear equations:
$$
\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -X_1 u_1 & -Y_1 u_1 & -Z_1 u_1 & -u_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -X_1 v_1 & -Y_1 v_1 & -Z_1 v_1 & -v_1 \\
\vdots & & & & & & & & & & & \vdots \\
X_N & Y_N & Z_N & 1 & 0 & 0 & 0 & 0 & -X_N u_N & -Y_N u_N & -Z_N u_N & -u_N \\
0 & 0 & 0 & 0 & X_N & Y_N & Z_N & 1 & -X_N v_N & -Y_N v_N & -Z_N v_N & -v_N
\end{bmatrix}
\begin{bmatrix}
m_{11} \\ \vdots \\ m_{14} \\ m_{21} \\ \vdots \\ m_{24} \\ m_{31} \\ \vdots \\ m_{34}
\end{bmatrix}
= 0
$$
where $N$ is the total number of calibration sampling points; since each calibration position contributes two sampling points, $N$ is twice the number of calibration positions. $(X_i, Y_i, Z_i)$ is the world coordinate of the $i$-th calibration sampling point, $(u_i, v_i)$ is its image coordinate, and $m_{11}, \ldots, m_{14}, m_{21}, \ldots, m_{24}, m_{31}, \ldots, m_{34}$ are the perspective transformation parameters.
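The system of step 7 is the familiar direct linear transform (DLT): the parameter vector lies in the null space of the $2N \times 12$ matrix, and with noisy measurements the right singular vector for the smallest singular value gives the least-squares solution, determined only up to scale. A minimal sketch with NumPy, under the assumption that this is the intended solution method (function names and the synthetic camera matrix are illustrative):

```python
import numpy as np

def solve_perspective(world_pts, image_pts):
    """Stack two DLT rows per correspondence, as in step 7, and take the
    right singular vector of the smallest singular value as the parameter
    vector m11..m34 (defined only up to a scale factor)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(world_pts, image_pts):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -X * u, -Y * u, -Z * u, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -X * v, -Y * v, -Z * v, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows, dtype=float))
    return Vt[-1].reshape(3, 4)  # perspective transformation matrix M

# Synthetic check with a made-up camera matrix (illustrative numbers only):
M_true = np.array([[800.0, 320.0,    0.0,   0.0],
                   [  0.0, 240.0, -800.0, 960.0],
                   [  0.0,   1.0,    0.0,   0.0]])
world = [(0, 5, 0), (0, 5, 1.5), (2, 8, 0), (2, 8, 1.5),
         (-2, 10, 0), (-2, 10, 1.5), (1, 12, 0), (-1, 6, 1.5)]
image = []
for X, Y, Z in world:
    p = M_true @ np.array([X, Y, Z, 1.0])
    image.append((p[0] / p[2], p[1] / p[2]))
M_est = solve_perspective(world, image)  # proportional to M_true
```

The homogeneous form of the equation is why the solution is only determined up to scale; the lower and upper sampling points at each position keep the world points from being coplanar, which the DLT requires.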
Solving for the perspective transformation parameters yields the perspective transformation matrix:
$$
M = \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34}
\end{bmatrix}
$$
Fig. 3 shows the results of camera calibration carried out with the above implementation steps. The figure consists of four sub-images. The two on the left are original images of roads: one of a straight road and one of a curve. The two on the right are the orthographic (bird's-eye) views obtained by applying the inverse perspective transform to the originals using the perspective transformation parameters found above. The orthographic views cover the road from just ahead of the vehicle out to 50 meters in front of it, satisfying far-range vision applications. As the views show, the original geometry of the road is recovered and the perspective distortion is completely removed, reflecting the validity of the invention. The invention therefore provides a convenient and effective camera calibration method for the intelligent vehicle field.
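The bird's-eye views of Fig. 3 come from inverse perspective mapping. For points on the road plane ($Z = 0$) the $3 \times 4$ matrix $M$ reduces to a $3 \times 3$ homography, which can be inverted to map image pixels back to ground coordinates. A sketch under that assumption (the helper name and the example matrix are illustrative):

```python
import numpy as np

def image_to_ground(M, u, v):
    """Inverse perspective mapping for the road plane Z = 0: dropping the
    third column of M leaves a 3x3 homography on (X, Y, 1); inverting it
    maps a pixel back to world ground coordinates."""
    H = M[:, [0, 1, 3]]  # columns of M that multiply X, Y, and 1
    w = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return w[0] / w[2], w[1] / w[2]

# Illustrative matrix: the ground point (1, 5, 0) projects to pixel (480, 432)
M = np.array([[800.0, 320.0,    0.0,   0.0],
              [  0.0, 240.0, -800.0, 960.0],
              [  0.0,   1.0,    0.0,   0.0]])
X, Y = image_to_ground(M, 480.0, 432.0)
```

Mapping every pixel of the original image this way, and resampling onto a regular ground grid, produces the orthographic views shown in Fig. 3.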

Claims (6)

1. A camera calibration method based on laser radar, characterized in that it comprises the following steps:
Step 1: make a cylindrical rod and measure its height;
Step 2: select at least six positions in the camera's field of view, called calibration positions, and place the cylindrical rod at one of them to form a calibration scene;
Step 3: take two particular points on the cylindrical rod as calibration sampling points: the center of the rod's bottom face is the lower calibration sampling point and the center of its top face is the upper calibration sampling point; capture one frame of the current calibration scene with the camera and measure the image coordinates of the lower and upper calibration sampling points by hand;
Step 4: use a laser radar, mounted horizontally at the front of the vehicle, as the range-measuring instrument; acquire one frame of laser radar data of the current calibration scene; manually pick out the sampling points corresponding to the cylindrical rod in the data, and convert their coordinates from values in the laser radar polar coordinate system to values in the laser radar rectangular coordinate system;
Step 5: for the rod sampling points of step 4, compute their mean abscissa and mean ordinate; the world coordinate of the lower calibration sampling point is then (mean abscissa, mean ordinate, 0), and that of the upper calibration sampling point is (mean abscissa, mean ordinate, rod height);
Step 6: place the cylindrical rod at another calibration position to form a new calibration scene, and repeat steps 3 to 6 until the rod has been placed at every calibration position;
Step 7: solve the following system of linear equations:
$$
\begin{bmatrix}
X_1 & Y_1 & Z_1 & 1 & 0 & 0 & 0 & 0 & -X_1 u_1 & -Y_1 u_1 & -Z_1 u_1 & -u_1 \\
0 & 0 & 0 & 0 & X_1 & Y_1 & Z_1 & 1 & -X_1 v_1 & -Y_1 v_1 & -Z_1 v_1 & -v_1 \\
\vdots & & & & & & & & & & & \vdots \\
X_N & Y_N & Z_N & 1 & 0 & 0 & 0 & 0 & -X_N u_N & -Y_N u_N & -Z_N u_N & -u_N \\
0 & 0 & 0 & 0 & X_N & Y_N & Z_N & 1 & -X_N v_N & -Y_N v_N & -Z_N v_N & -v_N
\end{bmatrix}
\begin{bmatrix}
m_{11} \\ \vdots \\ m_{14} \\ m_{21} \\ \vdots \\ m_{24} \\ m_{31} \\ \vdots \\ m_{34}
\end{bmatrix}
= 0
$$
where $N$ is the total number of calibration sampling points, $(X_i, Y_i, Z_i)$ is the world coordinate of the $i$-th calibration sampling point, $(u_i, v_i)$ is its image coordinate, and $m_{11}, \ldots, m_{14}, m_{21}, \ldots, m_{24}, m_{31}, \ldots, m_{34}$ are the perspective transformation parameters;
solving for the perspective transformation parameters yields the perspective transformation matrix:
$$
M = \begin{bmatrix}
m_{11} & m_{12} & m_{13} & m_{14} \\
m_{21} & m_{22} & m_{23} & m_{24} \\
m_{31} & m_{32} & m_{33} & m_{34}
\end{bmatrix}
$$
thereby completing the calibration of the camera.
2. The camera calibration method based on laser radar according to claim 1, characterized in that in step 2 the calibration positions are distributed evenly over the camera's entire effective field of view.
3. The camera calibration method based on laser radar according to claim 1 or 2, characterized in that the number of calibration positions is 9 to 13.
4. The camera calibration method based on laser radar according to claim 1, characterized in that in step 3 the image coordinates of the lower and upper calibration sampling points are determined as follows: the edge of the rod's image consists of four parts, namely a straight segment on each of the left and right sides and a flattened elliptical arc at each of the top and bottom; the midpoint of the line joining the two endpoints of the upper arc is the upper calibration sampling point, and the midpoint of the line joining the two endpoints of the lower arc is the lower calibration sampling point.
5. The camera calibration method based on laser radar according to claim 1, characterized in that in step 4 the coordinates of the rod sampling points are converted from values in the laser radar polar coordinate system to values in the laser radar rectangular coordinate system as follows:
$$
x_i = \rho_i \cos\theta_i, \qquad y_i = \rho_i \sin\theta_i, \qquad i = 1, 2, \ldots, m
$$
where $m$ is the number of rod sampling points in the current calibration scene, $(\rho_1, \theta_1), \ldots, (\rho_m, \theta_m)$ are the coordinates of the rod sampling points in the laser radar polar coordinate system (the values output directly by the laser radar), and $(x_1, y_1), \ldots, (x_m, y_m)$ are their coordinates in the laser radar rectangular coordinate system, used in subsequent processing and analysis.
6. The camera calibration method based on laser radar according to claim 5, characterized in that in step 5, for the rod sampling points of step 4, the mean abscissa $c_x$ and mean ordinate $c_y$ are computed as:
$$
c_x = \operatorname{mean}(x) = \frac{1}{m}\sum_{i=1}^{m} x_i, \qquad
c_y = \operatorname{mean}(y) = \frac{1}{m}\sum_{i=1}^{m} y_i
$$
and the world coordinate of the lower calibration sampling point is then $(c_x, c_y, 0)$, while that of the upper calibration sampling point is $(c_x, c_y, H)$, where $H$ is the rod height.
CN200810042152A 2008-08-28 2008-08-28 Camera calibration method based on laser radar Expired - Fee Related CN100586200C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN200810042152A CN100586200C (en) 2008-08-28 2008-08-28 Camera calibration method based on laser radar


Publications (2)

Publication Number Publication Date
CN101345890A CN101345890A (en) 2009-01-14
CN100586200C true CN100586200C (en) 2010-01-27

Family

ID=40247758

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200810042152A Expired - Fee Related CN100586200C (en) 2008-08-28 2008-08-28 Camera calibration method based on laser radar

Country Status (1)

Country Link
CN (1) CN100586200C (en)


Also Published As

Publication number Publication date
CN101345890A (en) 2009-01-14


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20100127

Termination date: 20120828