CN107976668A - Method for determining external parameters between a camera and a laser radar - Google Patents

Method for determining external parameters between a camera and a laser radar

Info

Publication number
CN107976668A
CN107976668A CN201610922058.0A
Authority
CN
China
Prior art keywords
camera
laser radar
corner position
calibration board
position coordinate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201610922058.0A
Other languages
Chinese (zh)
Other versions
CN107976668B (en
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fafa Automobile China Co ltd
Original Assignee
Faraday Beijing Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Faraday Beijing Network Technology Co Ltd filed Critical Faraday Beijing Network Technology Co Ltd
Priority to CN201610922058.0A priority Critical patent/CN107976668B/en
Publication of CN107976668A publication Critical patent/CN107976668A/en
Application granted granted Critical
Publication of CN107976668B publication Critical patent/CN107976668B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Abstract

An embodiment of the present invention provides a method for determining the external parameters between a camera and a laser radar. The method includes: attaching AprilTags with different IDs to the corners of a polygonal calibration board; obtaining first corner position coordinates detected by the camera, where the first corner position coordinates are obtained by the camera through detecting each AprilTag; obtaining second corner position coordinates, where the second corner position coordinates are the coordinates of the corners of the polygonal calibration board; and determining the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates. The method for determining the external parameters between a camera and a laser radar provided by the embodiments of the present invention improves the accuracy of the determined calibration result.

Description

Method for determining external parameters between a camera and a laser radar
Technical field
The present invention relates to the field of autonomous (driverless) vehicles, and in particular to a method for determining the external parameters between a camera and a laser radar.
Background technology
With the continuous advance of autonomous-vehicle technology, the number of sensors mounted on a vehicle is gradually increasing. The most common sensors used for perception and localization on an autonomous vehicle are the camera and the laser radar: the laser radar obtains the three-dimensional position of objects by scanning, while the camera captures images that provide the two-dimensional position and color of objects. Both kinds of information are very important while the vehicle is driving, which requires calibrating the relative pose of the laser radar and the camera so that the data obtained by the two sensors can be accurately fused into a single coordinate system.
At present there are mainly two methods for calibrating the external parameters between a camera and a laser radar:
The first method uses a black-and-white checkerboard. The checkerboard is placed within the range visible to both the camera and the laser radar, and the checkerboard plane detected by the two sensors is used as a constraint to compute their relative pose. The problem with this method is that the reflectivity of the laser beam differs between the black and white areas, which biases the measured plane and ultimately affects the calibration result.
The second method uses features of the natural environment for calibration. Although this method requires no special calibration board, corresponding constraints must be found in point clouds reconstructed from the same region by the camera and the laser radar, from which their relative pose is computed. The problem with this method is that the reconstructed point cloud itself contains errors that cannot be avoided; in addition, the vehicle body is in motion during calibration, which raises the problems of synchronizing the sensor data and compensating for the motion and ultimately makes the calibration result inaccurate.
It can be seen that, constrained by existing measurement techniques and other objective factors, the accuracy of the final calibration result of existing schemes for calibrating the external parameters between a camera and a laser radar is not high.
Summary of the invention
The present invention provides a method for determining the external parameters between a camera and a laser radar, to solve the problem of the prior art that, constrained by existing measurement techniques and other objective factors, the accuracy of the final calibration result of existing schemes for calibrating the external parameters between a camera and a laser radar is not high.
To solve the above problem, the invention discloses a method for determining the external parameters between a camera and a laser radar. The method includes: attaching AprilTags with different IDs to the corners of a polygonal calibration board; obtaining first corner position coordinates detected by the camera, where the first corner position coordinates are obtained by the camera through detecting each AprilTag; obtaining second corner position coordinates, where the second corner position coordinates are the coordinates of the corners of the polygonal calibration board; and determining the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
Preferably, the step of obtaining the second corner position coordinates includes: obtaining point-cloud data of the polygonal calibration board, where the point-cloud data is obtained from the laser reflected back by the polygonal calibration board and detected by the laser radar; computing, according to the point-cloud data, the position coordinates of the edge points of the polygonal calibration board; fitting, according to the edge-point position coordinates, the line equation of each side of the polygonal calibration board; and determining, according to the line equations of the sides, the second corner position coordinates of the corners of the polygonal calibration board.
Preferably, the step of determining the external parameters between the camera and the laser radar according to the first corner position coordinates and the second corner position coordinates includes: generating a first corner matrix according to the first corner position coordinates; generating a second corner matrix according to the second corner position coordinates; obtaining a scalar and calibrated intrinsic parameters; feeding the first corner matrix, the second corner matrix, the scalar and the intrinsic parameters into a constraint equation; and taking the computed output as the external parameters between the camera and the laser radar.
Preferably, the external parameters between the camera and the laser radar are computed according to the following equation: s*(u, v, 1)^T = K*[R, t]*(X, Y, Z, 1)^T, where the position of a single corner of the calibration board is expressed as (u, v, 1) in the camera image coordinate system and as (X, Y, Z, 1) in the laser-radar coordinate system, both in homogeneous coordinates; s is a scalar representing a scale factor; ^T denotes matrix transposition; K is the intrinsic matrix, a 3x3 matrix; and [R, t] is the external parameter between the camera and the laser radar, a 3x4 matrix.
Preferably, each side of the polygonal calibration board forms an angle with the scanning direction of the laser radar.
Preferably, for each AprilTag, the corner of the AprilTag points in the same direction as the corresponding corner of the calibration board.
Compared with the prior art, the present invention has the following advantages:
The method for determining the external parameters between a camera and a laser radar provided by the embodiments of the present invention attaches AprilTags with different IDs to the corners of a polygonal calibration board, and uses the corner coordinates of each AprilTag detected by the camera together with the corner coordinates of the calibration board detected by the laser radar; the positions of the same corners in the two different coordinate systems serve as constraints for determining the external parameters between the camera and the laser radar, which improves the accuracy of the resulting external-parameter calibration.
The above is only a summary of the technical solution of the present invention. In order that the technical means of the present invention can be understood more clearly and implemented according to the contents of the specification, and that the above and other objects, features and advantages of the present invention become more apparent, embodiments of the present invention are given below.
Brief description of the drawings
To describe the technical solutions of the embodiments of the present invention or of the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 is a flow chart of the steps of a method for determining the external parameters between a camera and a laser radar according to embodiment one of the present invention;
Fig. 2 is a flow chart of the steps of a method for determining the external parameters between a camera and a laser radar according to embodiment two of the present invention;
Fig. 3 is a schematic structural diagram of a device for determining the external parameters between a camera and a laser radar according to embodiment three of the present invention;
Fig. 4 is a schematic structural diagram of a device for determining the external parameters between a camera and a laser radar according to embodiment four of the present invention;
Fig. 5 is a schematic diagram of a calibration board with AprilTags according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of point-cloud data according to an embodiment of the present invention.
Detailed description of the embodiments
Exemplary embodiments of the present disclosure are described in more detail below with reference to the accompanying drawings. Although the drawings show exemplary embodiments of the disclosure, it should be understood that the disclosure may be implemented in various forms and should not be limited by the embodiments set forth here. Rather, these embodiments are provided so that the disclosure can be understood more thoroughly and its scope fully conveyed to those skilled in the art.
Embodiment one
Referring to Fig. 1, a flow chart of the steps of a method for determining the external parameters between a camera and a laser radar according to embodiment one of the present invention is shown.
The method for determining the external parameters between a camera and a laser radar of this embodiment of the present invention includes the following steps:
Step 101: attach AprilTags with different IDs to the corners of a polygonal calibration board.
The polygonal calibration board may be of any suitable shape, such as a diamond, a rectangle or a triangle; this embodiment of the present invention does not restrict the specific shape of the calibration board.
AprilTag is a kind of visual fiducial tag similar to a QR code, developed at the University of Michigan in the United States. Here an ID can be understood as the code of a particular tag pattern. In a specific implementation, the number of AprilTags is related to the number of corners of the polygon; when attaching the tags, an AprilTag may be placed at every corner or only at some of the corners.
Preferably, each side of the polygonal calibration board forms an angle with the horizontal direction; for each AprilTag, the corner of the AprilTag points in the same direction as the corresponding corner of the board; and the calibration board is pure white, which helps it reflect the laser emitted by the laser radar.
Step 102: obtain the first corner position coordinates detected by the camera.
The first corner position coordinates are two-dimensional coordinates. They are obtained by the camera through detecting each AprilTag: by detecting the AprilTags attached at the corners of the calibration board, the camera obtains the two-dimensional position of each board corner in the image, and the first corner position coordinates are derived from this two-dimensional position information.
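For illustration only, the following is a minimal sketch of this detection step in Python using the open-source pupil_apriltags package; the package choice, the tag family, the image path and the use of the tag center as the board-corner position are assumptions for the sketch, not part of the patent.

```python
import cv2
from pupil_apriltags import Detector

# Load the camera image of the calibration board; AprilTag detection
# operates on a single-channel (grayscale) image.
image = cv2.imread("board.png")                 # hypothetical image path
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

# Detect all AprilTags of the chosen family in the image.
detector = Detector(families="tag36h11")
detections = detector.detect(gray)

# Map each tag ID to a 2D image position marking the corresponding board
# corner. The tag center is used here for simplicity; depending on how the
# tags are pasted onto the board (Fig. 5), one of the tag's four corner
# points may be the better choice.
first_corners = {}
for det in detections:
    u, v = det.center                           # pixel coordinates (u, v)
    first_corners[det.tag_id] = (float(u), float(v))

print(first_corners)                            # e.g. {0: (512.3, 201.7), ...}
```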
Step 103: obtain the second corner position coordinates.
The second corner position coordinates are three-dimensional coordinates: the coordinates of the corners of the polygonal calibration board. The laser radar emits rows of laser beams toward the calibration board and obtains the point-cloud data on the board from the returned light. The laser radar does not scan from top to bottom; its scanning direction depends on how it is installed: if it is mounted horizontally, the scan sweeps one full revolution in the horizontal plane starting from an initial position. The sides of the calibration board need to form an angle with the scanning direction of the laser radar. The points on the board edges are computed from the returned point-cloud data hitting the board, the line equation of each side is then fitted from these edge points, and the intersections of the sides are the corner positions of the board.
Step 104: determine the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
In this embodiment of the present invention, the external parameters between the camera and the laser radar refer to the relative pose (position and orientation) of the camera and the laser radar. In general, these parameters establish the mapping between the three-dimensional coordinate system determined by the calibration board and the camera image coordinate system: a point in three-dimensional space can be mapped into image space through these parameters, and vice versa.
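As a small numerical sketch of this mapping (the intrinsic matrix K, rotation R and translation t below are made-up illustrative values, not calibration results from the patent), a 3D point in the laser-radar frame is projected into the image as follows:

```python
import numpy as np

# Made-up intrinsic matrix K and external parameters [R, t] for illustration.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # rotation from lidar frame to camera frame
t = np.array([[0.1], [0.0], [-0.2]])   # translation, in meters

def lidar_point_to_pixel(X, Y, Z):
    """Map a 3D point (X, Y, Z) in the lidar frame to a pixel (u, v)."""
    p = np.array([[X], [Y], [Z], [1.0]])   # homogeneous 3D point
    Rt = np.hstack([R, t])                 # 3x4 external-parameter matrix [R, t]
    s_uv1 = K @ Rt @ p                     # s * (u, v, 1)^T
    u, v = s_uv1[:2, 0] / s_uv1[2, 0]      # divide out the scale factor s
    return float(u), float(v)

print(lidar_point_to_pixel(0.5, 0.2, 5.0))
```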
When determining the external parameters between the camera and the laser radar according to the first and second corner position coordinates, a first corner matrix and a second corner matrix may be generated from the first corner position coordinates and the second corner position coordinates respectively; a scalar and the calibrated intrinsic parameters are obtained; the first corner matrix, the second corner matrix, the scalar and the intrinsic parameters are fed into a constraint equation; and the computed output is taken as the external parameters between the camera and the laser radar.
Generally speaking, camera calibration finds the mathematical transformation between objects in an image and objects in the real world and establishes their quantitative relation, so that real-world measurements can be obtained from the image.
The method for determining the external parameters between a camera and a laser radar provided by this embodiment of the present invention attaches AprilTags with different IDs to the corners of a polygonal calibration board, and uses the corner coordinates of each AprilTag detected by the camera together with the corner coordinates of the board detected by the laser radar; the positions of the same corners in the two different coordinate systems serve as constraints for determining the external parameters between the camera and the laser radar, which improves the accuracy of the resulting external-parameter calibration.
Embodiment two
Referring to Fig. 2, a flow chart of the steps of a method for determining the external parameters between a camera and a laser radar according to embodiment two of the present invention is shown.
The method for determining the external parameters between a camera and a laser radar of this embodiment of the present invention includes the following steps:
Step 201: attach AprilTags with different IDs to the corners of a polygonal calibration board.
Each side of the polygonal calibration board forms an angle with the scanning direction of the laser radar; for each AprilTag, the corner of the AprilTag points in the same direction as the corresponding corner of the board; and the calibration board is pure white, which helps it reflect the laser emitted by the laser radar.
In this embodiment of the present invention, the following description takes as an example a diamond-shaped calibration board with four AprilTags, one attached at each corner; a schematic diagram of the diamond-shaped calibration board with the AprilTags attached is shown in Fig. 5.
The size of the polygonal calibration board can be set by those skilled in the art according to actual requirements. Preferably, it is set to 1 m x 1 m.
Step 202: obtain the first corner position coordinates detected by the camera.
The first corner position coordinates are obtained by the camera through detecting each AprilTag: by detecting the AprilTags attached at the corners of the calibration board, the camera obtains the two-dimensional position of each board corner in the image, and the first corner position coordinates are derived from this two-dimensional position information.
Step 203: obtain the point-cloud data of the polygonal calibration board.
The point-cloud data is obtained from the laser reflected back by the polygonal calibration board and detected by the laser radar.
As shown in Fig. 6, the calibration board is a 1 m x 1 m pure white diamond-shaped board with AprilTags of 4 different IDs pasted neatly at its 4 corners. The board is placed upright as illustrated (corners 1 and 3 pointing up and down, corners 2 and 4 pointing left and right) within the common field of view of the camera and the laser radar. The camera can detect the AprilTags with existing methods and obtain the 4 corner positions (u_i, v_i) (i = 1, 2, 3, 4). The laser radar emits rows of laser beams toward the board and obtains the point-cloud data on the board from the returned light. The laser radar does not scan from top to bottom; its scanning direction depends on how it is installed: if it is mounted horizontally, the scan sweeps one full revolution in the horizontal plane starting from an initial position. The sides of the board need to form an angle with the scanning direction of the laser radar.
Step 204: compute, according to the point-cloud data, the position coordinates of the edge points of the polygonal calibration board.
Step 205: fit, according to the edge-point position coordinates, the line equation of each side of the polygonal calibration board.
Step 206: determine, according to the line equations of the sides, the second corner position coordinates of the corners of the polygonal calibration board.
From the returned point-cloud data hitting the board, the points on the board edges are computed, the line equations of the 4 sides are then fitted from these edge points, and the intersections of the 4 sides are the corner positions of the board (X_i, Y_i, Z_i) (i = 1, 2, 3, 4).
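The following sketch illustrates the line-fitting and intersection step for one corner. It assumes the lidar edge points have already been grouped per side and projected into a 2D coordinate system lying in the board plane; the point values and the total-least-squares fit are illustrative choices, not prescribed by the patent.

```python
import numpy as np

def fit_line(points):
    """Fit a 2D line a*x + b*y = c (with a^2 + b^2 = 1) to edge points by
    total least squares: the line direction is the principal axis of the
    points, so the normal (a, b) is the smallest singular vector."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    a, b = vt[-1]
    c = a * centroid[0] + b * centroid[1]
    return a, b, c

def intersect(l1, l2):
    """Intersection of two lines a*x + b*y = c, returned as (x, y)."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    A = np.array([[a1, b1], [a2, b2]])
    c = np.array([c1, c2])
    return np.linalg.solve(A, c)

# Made-up edge points of two adjacent sides of the board, expressed in a
# 2D frame in the board plane (the lidar points would first be projected
# onto the fitted board plane).
side_1 = [(0.00, 0.02), (0.25, 0.26), (0.50, 0.51), (0.75, 0.74)]
side_2 = [(1.00, 1.01), (1.25, 0.76), (1.50, 0.49), (1.75, 0.26)]

corner = intersect(fit_line(side_1), fit_line(side_2))
print(corner)   # one corner of the calibration board in board-plane coordinates
```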
Step 207: determine the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
A preferable way of determining the external parameters between the camera and the laser radar according to the first and second corner position coordinates is as follows:
S1: generate a first corner matrix according to the first corner position coordinates;
S2: generate a second corner matrix according to the second corner position coordinates;
S3: obtain a scalar and the calibrated intrinsic parameters;
S4: feed the first corner matrix, the second corner matrix, the scalar and the intrinsic parameters into a constraint equation;
S5: take the computed output as the external parameters between the camera and the laser radar.
The parameters that a camera needs to calibrate are generally divided into intrinsic parameters and external parameters. The external parameters determine the position and orientation of the camera in three-dimensional space, while the intrinsic parameters are internal to the camera. Camera calibration usually involves the following matrices:
External parameter matrix: describes how a point in the real world (world coordinates) is rotated and translated so as to land on another real-world point (camera coordinates).
Intrinsic matrix: describes how that real-world point then passes through the camera lens, is imaged through the pinhole model and is converted electronically into pixels.
Through the lens, an object in three-dimensional space is usually mapped to an inverted, reduced image (a microscope magnifies, of course, but an ordinary camera reduces), which is sensed by the image sensor.
Ideally, the optical axis of the lens (the straight line through the optical center perpendicular to the sensor plane) should pass through the middle of the image; in practice, however, installation accuracy always introduces an error, and this error has to be described by the intrinsic parameters. Likewise, the reduction ratios of the camera in the x and y directions should be equal, but if the lens is not perfectly round, or the pixels on the sensor are not perfectly square and tightly arranged, the reduction ratios in the two directions will differ. The intrinsic parameters therefore include two parameters describing the scaling in these two directions; they not only convert lengths measured in pixel counts into lengths measured in other units of three-dimensional space (such as meters), but also express the difference between the scale factors in the x and y directions.
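For reference, in the standard pinhole camera model (a general formulation, not additional disclosure of this patent) the intrinsic matrix K discussed above can be written as:

```latex
K =
\begin{bmatrix}
f_x & \gamma & c_x \\
0   & f_y    & c_y \\
0   & 0      & 1
\end{bmatrix}
```

Here f_x and f_y are the focal lengths expressed in pixels along the x and y directions (they differ exactly when the two scale factors described above are inconsistent), (c_x, c_y) is the principal point, i.e. where the optical axis actually meets the image, and the skew term gamma is usually close to zero.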
Distortion matrix: ideally, the lens maps a straight line in three-dimensional space to a straight line in the image (a projective transformation); in practice, lenses are not that perfect, and a straight line becomes curved after passing through the lens, so the distortion parameters of the camera are needed to describe this deformation.
A preferable way of computing the external parameters between the camera and the laser radar is as follows. For the diamond-shaped calibration board, assume the camera has calibrated intrinsic parameters K (a 3x3 matrix). The position of a single corner of the board is expressed as (u, v, 1) in the camera image coordinate system and as (X, Y, Z, 1) in the laser-radar coordinate system, both in homogeneous coordinates.
Here, (u, v, 1) is the matrix generated from the first corner coordinates determined by the camera, and (X, Y, Z, 1) is the matrix generated from the second corner coordinates determined by the laser radar; the equation for computing the external parameters [R, t] (a 3x4 matrix) between the camera and the laser radar is:
s*(u, v, 1)^T = K*[R, t]*(X, Y, Z, 1)^T
where s is a scalar representing a scale factor, and ^T denotes matrix transposition.
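Written out in full (a standard expansion of the equation above, assuming the zero-skew pinhole form of K; not additional disclosure of the patent), the constraint for each corner reads:

```latex
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
=
\underbrace{\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}}_{K}
\underbrace{\begin{bmatrix}
r_{11} & r_{12} & r_{13} & t_1 \\
r_{21} & r_{22} & r_{23} & t_2 \\
r_{31} & r_{32} & r_{33} & t_3
\end{bmatrix}}_{[R,\,t]}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}
```

After eliminating the scale factor s, each pair of corresponding corners contributes two independent equations, so the four board corners, accumulated if necessary over several placements of the board, over-determine the six degrees of freedom (three for rotation, three for translation) of [R, t].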
This equation can be solved with any third-party open-source PnP (Perspective-n-Point) solver to obtain R and t, i.e. the external parameters between the camera and the laser radar. To obtain a more accurate result, the error introduced by measurement can be reduced by sampling multiple times: within the range visible to both the camera and the laser radar, the calibration board is placed at different positions so that the constraining corner points are evenly distributed in the image and in three-dimensional space.
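As a concrete illustration of this step, the sketch below recovers [R, t] with OpenCV's cv2.solvePnP (one possible open-source PnP solver, named here as an example only). The 3D corner coordinates are made up, the 2D detections are synthesized from an assumed true pose so the example is self-checking, and zero lens distortion is assumed for simplicity.

```python
import numpy as np
import cv2

# Calibrated intrinsic matrix K (assumed already known) and zero distortion.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Made-up 3D corner positions of the board in the lidar frame, accumulated
# over two board placements so the constraints are not all coplanar.
object_points = np.array([
    [ 0.5,  0.0, 5.0],
    [ 0.0,  0.5, 5.0],
    [-0.5,  0.0, 5.0],
    [ 0.0, -0.5, 5.0],
    [ 1.0,  0.2, 6.0],
    [ 0.4,  0.8, 6.0],
], dtype=np.float64)

# For a self-checking sketch, synthesize the 2D detections from an assumed
# "true" pose; with real data, image_points come from the AprilTag detections.
rvec_true = np.array([0.02, -0.01, 0.03])
tvec_true = np.array([0.10, -0.05, 0.20])
image_points, _ = cv2.projectPoints(object_points, rvec_true, tvec_true, K, dist)

# Solve s*(u, v, 1)^T = K*[R, t]*(X, Y, Z, 1)^T for the external parameters.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
R, _ = cv2.Rodrigues(rvec)            # rotation vector -> 3x3 rotation matrix
Rt = np.hstack([R, tvec])             # 3x4 external-parameter matrix [R, t]
print(ok, Rt)                         # should reproduce rvec_true / tvec_true
```

With real data, object_points would hold the lidar-derived corners (X_i, Y_i, Z_i) and image_points the AprilTag corners (u_i, v_i) accumulated over the different board placements described above.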
In addition to the beneficial effects of the method for determining the external parameters between a camera and a laser radar of embodiment one, the method provided by this embodiment of the present invention obtains the point-cloud data of the polygonal calibration board with the laser radar, computes the position coordinates of the edge points of the board from the point-cloud data, fits the line equation of each side of the board, and thereby determines the second corner position coordinates of the corners of the board.
Embodiment three
Referring to Fig. 3, a schematic structural diagram of a device for determining the external parameters between a camera and a laser radar according to embodiment three of the present invention is shown.
The device for determining the external parameters between a camera and a laser radar of this embodiment of the present invention includes:
a calibration board module 301, configured to attach AprilTags with different IDs to the corners of a polygonal calibration board;
a first corner acquisition module 302, configured to obtain the first corner position coordinates detected by the camera, wherein the first corner position coordinates are obtained by the camera through detecting each AprilTag;
a second corner acquisition module 303, configured to obtain the second corner position coordinates, wherein the second corner position coordinates are the coordinates of the corners of the polygonal calibration board; and
an external parameter determination module 304, configured to determine the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
The device for determining the external parameters between a camera and a laser radar provided by this embodiment of the present invention attaches AprilTags with different IDs to the corners of a polygonal calibration board, and uses the corner coordinates of each AprilTag detected by the camera together with the corner coordinates of the board detected by the laser radar; the positions of the same corners in the two different coordinate systems serve as constraints for determining the external parameters between the camera and the laser radar, which improves the accuracy of the resulting external-parameter calibration.
Example IV
Referring to Fig. 4, a schematic structural diagram of a device for determining the external parameters between a camera and a laser radar according to embodiment four of the present invention is shown.
The device for determining the external parameters between a camera and a laser radar of this embodiment of the present invention includes:
a calibration board module 401, configured to attach AprilTags with different IDs to the corners of a polygonal calibration board; a first corner acquisition module 402, configured to obtain the first corner position coordinates detected by the camera, wherein the first corner position coordinates are obtained by the camera through detecting each AprilTag; a second corner acquisition module 403, configured to obtain the second corner position coordinates, wherein the second corner position coordinates are the coordinates of the corners of the polygonal calibration board; and an external parameter determination module 404, configured to determine the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
Preferably, the second corner acquisition module 403 includes: a point-cloud data submodule 4031, configured to obtain the point-cloud data of the polygonal calibration board, wherein the point-cloud data is obtained from the laser reflected back by the polygonal calibration board and detected by the laser radar; an edge point submodule 4032, configured to compute, according to the point-cloud data, the position coordinates of the edge points of the polygonal calibration board; a line fitting submodule 4033, configured to fit, according to the edge-point position coordinates, the line equation of each side of the polygonal calibration board; and a coordinate determination submodule 4034, configured to determine, according to the line equations of the sides, the second corner position coordinates of the corners of the polygonal calibration board.
Preferably, the external parameter determination module 404 includes: a first matrix submodule 4041, configured to generate a first corner matrix according to the first corner position coordinates; a second matrix submodule 4042, configured to generate a second corner matrix according to the second corner position coordinates; a parameter acquisition submodule 4043, configured to obtain a scalar and the calibrated intrinsic parameters; a constraint equation submodule 4044, configured to feed the first corner matrix, the second corner matrix, the scalar and the intrinsic parameters into a constraint equation; and a parameter determination submodule 4045, configured to take the computed output as the external parameters between the camera and the laser radar.
Preferably, the parameter determination submodule is specifically configured to compute the external parameters between the camera and the laser radar according to the following equation:
s*(u, v, 1)^T = K*[R, t]*(X, Y, Z, 1)^T
wherein the position of a single corner of the calibration board is expressed as (u, v, 1) in the camera image coordinate system and as (X, Y, Z, 1) in the laser-radar coordinate system, both in homogeneous coordinates; s is a scalar representing a scale factor; ^T denotes matrix transposition; K is the intrinsic matrix, a 3x3 matrix; and [R, t] is the external parameter between the camera and the laser radar, a 3x4 matrix.
Preferably, each side of the polygonal calibration board forms an angle with the scanning direction of the laser radar.
Preferably, for each AprilTag, the corner of the AprilTag points in the same direction as the corresponding corner of the board; and the calibration board is pure white.
The device for determining the external parameters between a camera and a laser radar of this embodiment of the present invention is used to implement the corresponding methods for determining the external parameters between a camera and a laser radar of embodiments one and two, and has the beneficial effects of the corresponding method embodiments, which are not repeated here.
The embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the identical or similar parts of the embodiments can be referred to each other. Since the device embodiments are essentially similar to the method embodiments, their description is relatively brief, and the relevant parts can be found in the description of the method embodiments.
The method for determining the external parameters between a camera and a laser radar provided by the present invention has been described in detail above. Specific examples are used herein to explain the implementation steps and implementing devices of the present invention; the description of the above embodiments is only intended to help understand the method of the present invention and its core idea. Meanwhile, for those of ordinary skill in the art, changes can be made to the specific implementation and application scope according to the idea of the present invention. In summary, the contents of this specification should not be construed as limiting the present invention. It should be noted that the above embodiments describe rather than limit the present invention, and those skilled in the art can design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference sign placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The present invention can be implemented by means of hardware comprising several distinct elements and by means of a suitably programmed computer. In a device claim enumerating several units, several of these units can be embodied by one and the same item of hardware. The use of the words first, second, etc. does not indicate any order; these words may be interpreted as names.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or replace some of the technical features with equivalents, and that such modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (6)

  1. A method for determining the external parameters between a camera and a laser radar, characterized in that the method comprises:
    attaching AprilTags with different IDs to the corners of a polygonal calibration board;
    obtaining first corner position coordinates detected by the camera, wherein the first corner position coordinates are obtained by the camera through detecting each AprilTag;
    obtaining second corner position coordinates, wherein the second corner position coordinates are the coordinates of the corners of the polygonal calibration board; and
    determining the external parameters between the camera and the laser radar according to the obtained first corner position coordinates and second corner position coordinates.
  2. The method according to claim 1, characterized in that the step of obtaining the second corner position coordinates comprises:
    obtaining point-cloud data of the polygonal calibration board, wherein the point-cloud data is obtained from the laser reflected back by the polygonal calibration board and detected by the laser radar;
    computing, according to the point-cloud data, the position coordinates of the edge points of the polygonal calibration board;
    fitting, according to the edge-point position coordinates, the line equation of each side of the polygonal calibration board; and
    determining, according to the line equations of the sides, the second corner position coordinates of the corners of the polygonal calibration board.
  3. The method according to claim 1, characterized in that the step of determining the external parameters between the camera and the laser radar according to the first corner position coordinates and the second corner position coordinates comprises:
    generating a first corner matrix according to the first corner position coordinates;
    generating a second corner matrix according to the second corner position coordinates;
    obtaining a scalar and calibrated intrinsic parameters;
    feeding the first corner matrix, the second corner matrix, the scalar and the intrinsic parameters into a constraint equation; and
    taking the computed output as the external parameters between the camera and the laser radar.
  4. The method according to claim 3, characterized in that the external parameters between the camera and the laser radar are computed according to the following equation:
    s*(u, v, 1)^T = K*[R, t]*(X, Y, Z, 1)^T;
    wherein the position of a single corner of the calibration board is expressed as (u, v, 1) in the camera image coordinate system and as (X, Y, Z, 1) in the laser-radar coordinate system, both in homogeneous coordinates; s is a scalar representing a scale factor; ^T denotes matrix transposition; K is the intrinsic matrix, a 3x3 matrix; and [R, t] is the external parameter between the camera and the laser radar, a 3x4 matrix.
  5. The method according to claim 1, characterized in that each side of the polygonal calibration board forms an angle with the scanning direction of the laser radar.
  6. The method according to claim 1, characterized in that, for each AprilTag, the corner of the AprilTag points in the same direction as the corresponding corner of the calibration board.
CN201610922058.0A 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar Active CN107976668B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610922058.0A CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610922058.0A CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Publications (2)

Publication Number Publication Date
CN107976668A true CN107976668A (en) 2018-05-01
CN107976668B CN107976668B (en) 2020-03-31

Family

ID=62004679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610922058.0A Active CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Country Status (1)

Country Link
CN (1) CN107976668B (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108627849A (en) * 2018-07-25 2018-10-09 南京富锐光电科技有限公司 A kind of range laser radar system applied to high speed camera calibration
CN109239727A (en) * 2018-09-11 2019-01-18 北京理工大学 A kind of distance measuring method of combination solid-state face battle array laser radar and double CCD cameras
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN109633612A (en) * 2018-10-18 2019-04-16 浙江大学 A kind of single line laser radar that nothing is observed jointly and Camera extrinsic scaling method
CN109658461A (en) * 2018-12-24 2019-04-19 中国电子科技集团公司第二十研究所 A kind of unmanned plane localization method of the cooperation two dimensional code based on virtual simulation environment
CN109712190A (en) * 2018-11-10 2019-05-03 浙江大学 The outer ginseng scaling method of three-dimensional laser scanner and three-dimensional laser radar
CN109946703A (en) * 2019-04-10 2019-06-28 北京小马智行科技有限公司 A kind of sensor attitude method of adjustment and device
CN110021046A (en) * 2019-03-05 2019-07-16 中国科学院计算技术研究所 The external parameters calibration method and system of camera and laser radar combination sensor
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN110361717A (en) * 2019-07-31 2019-10-22 苏州玖物互通智能科技有限公司 Laser radar-camera combined calibration target and combined calibration method
CN110687521A (en) * 2019-10-15 2020-01-14 深圳数翔科技有限公司 Vehicle-mounted laser radar calibration method
CN111123242A (en) * 2018-10-31 2020-05-08 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN111638499A (en) * 2020-05-08 2020-09-08 上海交通大学 Camera-laser radar relative external reference calibration method based on laser radar reflection intensity point characteristics
CN112162263A (en) * 2020-10-26 2021-01-01 苏州挚途科技有限公司 Combined calibration method and device for sensor and electronic equipment
CN112270713A (en) * 2020-10-14 2021-01-26 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device
CN112308928A (en) * 2020-10-27 2021-02-02 北京航空航天大学 Camera without calibration device and laser radar automatic calibration method
CN112734857A (en) * 2021-01-08 2021-04-30 香港理工大学深圳研究院 Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN112816949A (en) * 2019-11-18 2021-05-18 商汤集团有限公司 Calibration method and device of sensor, storage medium and calibration system
CN113034567A (en) * 2021-03-31 2021-06-25 奥比中光科技集团股份有限公司 Depth truth value acquisition method, device and system and depth camera
CN113281723A (en) * 2021-05-07 2021-08-20 北京航空航天大学 Calibration method for structural parameters between 3D laser radar and camera based on AR tag
CN113484830A (en) * 2021-06-22 2021-10-08 上海智能网联汽车技术中心有限公司 Composite calibration plate and calibration method
CN116148809A (en) * 2023-04-04 2023-05-23 中储粮成都储藏研究院有限公司 Automatic generation method and system for grain vehicle sampling point based on laser radar scanning and positioning

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN103473771A (en) * 2013-09-05 2013-12-25 上海理工大学 Method for calibrating camera
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
US9282326B2 (en) * 2013-10-28 2016-03-08 The Regents Of The University Of Michigan Interactive camera calibration tool
CN105931229A (en) * 2016-04-18 2016-09-07 东北大学 Wireless camera sensor network position and posture calibration method for wireless camera sensor network

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN103473771A (en) * 2013-09-05 2013-12-25 上海理工大学 Method for calibrating camera
US9282326B2 (en) * 2013-10-28 2016-03-08 The Regents Of The University Of Michigan Interactive camera calibration tool
CN103837869A (en) * 2014-02-26 2014-06-04 北京工业大学 Vector-relation-based method for calibrating single-line laser radar and CCD camera
CN105931229A (en) * 2016-04-18 2016-09-07 东北大学 Wireless camera sensor network position and posture calibration method for wireless camera sensor network

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DENGQING TANG等: "AprilTag array‑aided extrinsic calibration of camera–laser multi‑sensor system", 《TANG ET AL. ROBOT. BIOMIM》 *
YAOWEN LV等: "A new robust 2D camera calibration method using RANSAC", 《OPTIK》 *
CHENG Xiaojun et al.: "Building detection and contour extraction by fusing aerial imagery and LIDAR point clouds", Chinese Journal of Lasers *
CHEN Huiyan et al.: "Introduction to Driverless Vehicles", 31 July 2014 *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method
CN109270534B (en) * 2018-05-07 2020-10-27 西安交通大学 Intelligent vehicle laser sensor and camera online calibration method
CN108627849A (en) * 2018-07-25 2018-10-09 南京富锐光电科技有限公司 A kind of range laser radar system applied to high speed camera calibration
CN109239727B (en) * 2018-09-11 2022-08-05 北京理工大学 Distance measurement method combining solid-state area array laser radar and double CCD cameras
CN109239727A (en) * 2018-09-11 2019-01-18 北京理工大学 A kind of distance measuring method of combination solid-state face battle array laser radar and double CCD cameras
CN109633612A (en) * 2018-10-18 2019-04-16 浙江大学 A kind of single line laser radar that nothing is observed jointly and Camera extrinsic scaling method
CN111123242B (en) * 2018-10-31 2022-03-15 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN111123242A (en) * 2018-10-31 2020-05-08 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN109712190A (en) * 2018-11-10 2019-05-03 浙江大学 The outer ginseng scaling method of three-dimensional laser scanner and three-dimensional laser radar
CN109658461B (en) * 2018-12-24 2023-05-26 中国电子科技集团公司第二十研究所 Unmanned aerial vehicle positioning method based on cooperation two-dimensional code of virtual simulation environment
CN109658461A (en) * 2018-12-24 2019-04-19 中国电子科技集团公司第二十研究所 A kind of unmanned plane localization method of the cooperation two dimensional code based on virtual simulation environment
CN110021046A (en) * 2019-03-05 2019-07-16 中国科学院计算技术研究所 The external parameters calibration method and system of camera and laser radar combination sensor
CN110021046B (en) * 2019-03-05 2021-11-19 中国科学院计算技术研究所 External parameter calibration method and system for camera and laser radar combined sensor
CN109946703B (en) * 2019-04-10 2021-09-28 北京小马智行科技有限公司 Sensor attitude adjusting method and device
CN109946703A (en) * 2019-04-10 2019-06-28 北京小马智行科技有限公司 A kind of sensor attitude method of adjustment and device
CN110148180A (en) * 2019-04-22 2019-08-20 河海大学 A kind of laser radar and camera fusing device and scaling method
CN110361717A (en) * 2019-07-31 2019-10-22 苏州玖物互通智能科技有限公司 Laser radar-camera combined calibration target and combined calibration method
CN110687521A (en) * 2019-10-15 2020-01-14 深圳数翔科技有限公司 Vehicle-mounted laser radar calibration method
CN112816949B (en) * 2019-11-18 2024-04-16 商汤集团有限公司 Sensor calibration method and device, storage medium and calibration system
CN112816949A (en) * 2019-11-18 2021-05-18 商汤集团有限公司 Calibration method and device of sensor, storage medium and calibration system
WO2021098439A1 (en) * 2019-11-18 2021-05-27 商汤集团有限公司 Sensor calibration method and apparatus, and storage medium, calibration system and program product
CN111638499A (en) * 2020-05-08 2020-09-08 上海交通大学 Camera-laser radar relative external reference calibration method based on laser radar reflection intensity point characteristics
CN111638499B (en) * 2020-05-08 2024-04-09 上海交通大学 Camera-laser radar relative external parameter calibration method based on laser radar reflection intensity point characteristics
CN112270713A (en) * 2020-10-14 2021-01-26 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device
CN112162263A (en) * 2020-10-26 2021-01-01 苏州挚途科技有限公司 Combined calibration method and device for sensor and electronic equipment
CN112308928A (en) * 2020-10-27 2021-02-02 北京航空航天大学 Camera without calibration device and laser radar automatic calibration method
CN112734857A (en) * 2021-01-08 2021-04-30 香港理工大学深圳研究院 Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN113034567A (en) * 2021-03-31 2021-06-25 奥比中光科技集团股份有限公司 Depth truth value acquisition method, device and system and depth camera
CN113281723A (en) * 2021-05-07 2021-08-20 北京航空航天大学 Calibration method for structural parameters between 3D laser radar and camera based on AR tag
CN113484830A (en) * 2021-06-22 2021-10-08 上海智能网联汽车技术中心有限公司 Composite calibration plate and calibration method
CN116148809A (en) * 2023-04-04 2023-05-23 中储粮成都储藏研究院有限公司 Automatic generation method and system for grain vehicle sampling point based on laser radar scanning and positioning
CN116148809B (en) * 2023-04-04 2023-06-20 中储粮成都储藏研究院有限公司 Automatic generation method and system for grain vehicle sampling point based on laser radar scanning and positioning

Also Published As

Publication number Publication date
CN107976668B (en) 2020-03-31

Similar Documents

Publication Publication Date Title
CN107976668A (en) A kind of method of outer parameter between definite camera and laser radar
CN107976669A (en) A kind of device of outer parameter between definite camera and laser radar
CN106127745B (en) The combined calibrating method and device of structure light 3 D vision system and line-scan digital camera
CN107564069B (en) Method and device for determining calibration parameters and computer readable storage medium
CN102132125B (en) Calibration of a profile measuring system
CN111750806A (en) Multi-view three-dimensional measurement system and method
US20160025591A1 (en) Automated deflectometry system for assessing reflector quality
CN110191326A (en) A kind of optical projection system resolution extension method, apparatus and optical projection system
CN109443209A (en) A kind of line-structured light system calibrating method based on homography matrix
KR20060031685A (en) Image projector, inclination angle detection method, and projection image correction method
JP2004127239A (en) Method and system for calibrating multiple cameras using calibration object
CN110246191B (en) Camera nonparametric model calibration method and calibration precision evaluation method
Yu et al. A calibration method based on virtual large planar target for cameras with large FOV
CN111091599A (en) Multi-camera-projector system calibration method based on sphere calibration object
CN113108721B (en) High-reflectivity object three-dimensional measurement method based on multi-beam self-adaptive complementary matching
CN108429908A (en) A kind of test method of camera module, device, equipment and medium
CN112135120A (en) Virtual image information measuring method and system based on head-up display system
CN107170010A (en) System calibration method, device and three-dimensional reconstruction system
CN110360930A (en) A kind of laser displacement normal sensor and its measurement method
CN115830103A (en) Monocular color-based transparent object positioning method and device and storage medium
CN110146032B (en) Synthetic aperture camera calibration method based on light field distribution
JP2011155412A (en) Projection system and distortion correction method in the same
CN206258111U (en) A kind of image measuring device of use telecentric lens
Yu et al. High-accuracy camera calibration method based on coded concentric ring center extraction
US10643341B2 (en) Replicated dot maps for simplified depth computation using machine learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant after: Lexus Automobile (Beijing) Co.,Ltd.

Address before: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant before: FARADAY (BEIJING) NETWORK TECHNOLOGY Co.,Ltd.

CB02 Change of applicant information
TA01 Transfer of patent application right

Effective date of registration: 20180830

Address after: 511458 9, Nansha District Beach Road, Guangzhou, Guangdong, 9

Applicant after: Evergrande Faraday Future Smart Car (Guangdong) Co.,Ltd.

Address before: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant before: Lexus Automobile (Beijing) Co.,Ltd.

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20190314

Address after: 100015 Building No. 7, 74, Jiuxianqiao North Road, Chaoyang District, Beijing, 001

Applicant after: FAFA Automobile (China) Co.,Ltd.

Address before: 511458 9, Nansha District Beach Road, Guangzhou, Guangdong, 9

Applicant before: Evergrande Faraday Future Smart Car (Guangdong) Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant