CN107976668B - Method for determining external parameters between camera and laser radar


Info

Publication number
CN107976668B
CN107976668B (application CN201610922058.0A)
Authority
CN
China
Prior art keywords
corner
camera
calibration plate
coordinates
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201610922058.0A
Other languages
Chinese (zh)
Other versions
CN107976668A (en)
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fafa Automobile China Co ltd
Original Assignee
Leauto Intelligent Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Leauto Intelligent Technology Beijing Co Ltd filed Critical Leauto Intelligent Technology Beijing Co Ltd
Priority to CN201610922058.0A priority Critical patent/CN107976668B/en
Publication of CN107976668A publication Critical patent/CN107976668A/en
Application granted granted Critical
Publication of CN107976668B publication Critical patent/CN107976668B/en


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/497Means for monitoring or calibrating

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The embodiment of the invention provides a method for determining the external parameters between a camera and a laser radar, comprising the following steps: respectively arranging a plurality of AprilTags with different IDs at the corners of a polygonal calibration plate; acquiring the position coordinates of each first corner point detected by the camera, wherein the first corner position coordinates are obtained by the camera through detecting each AprilTag; acquiring the position coordinates of each second corner point, wherein the second corner position coordinates are the corner coordinates of each corner of the polygonal calibration plate; and determining the external parameters between the camera and the laser radar according to the acquired first corner position coordinates and second corner position coordinates. The method improves the accuracy of the determined calibration result.

Description

Method for determining external parameters between camera and laser radar
Technical Field
The invention relates to the field of unmanned vehicles, in particular to a method for determining external parameters between a camera and a laser radar.
Background
With the progress of unmanned vehicle technology, the variety of sensors mounted on vehicles has increased. The most common are the camera and the laser radar, used for perception and positioning of the unmanned vehicle: the laser radar obtains the three-dimensional position of an object by scanning, while a picture captured by the camera provides the object's two-dimensional position and color. Both kinds of information are essential while the unmanned vehicle is driving, and in order to accurately fuse the data obtained by the laser radar and the camera into one coordinate system, the relative position and orientation between the two sensors must be calibrated.
The current methods for calibrating the camera and the laser radar external parameter mainly comprise two methods, respectively:
The first method uses a black-and-white checkerboard for calibration. The checkerboard is placed within the range visible to both the camera and the laser radar, and the checkerboard planes detected by the two sensors are used as constraints to calculate the relative pose between them. The problem with this approach is that the reflectivity of the laser beam differs between the black and the white squares, which biases the measured plane and ultimately affects the calibration result.
The second method uses features of the natural environment for calibration. Although no special calibration plate is needed, corresponding constraints must be found between the point clouds reconstructed by the camera and by the laser radar in the same area, from which the relative pose between the two sensors is calculated. The problems with this method are that the reconstructed point cloud contains errors that cannot be avoided, and that a vehicle body in motion during calibration introduces sensor-data synchronization and motion-compensation problems, so the calibration result is inaccurate.
Therefore, the existing schemes for calibrating the external parameters between a camera and a laser radar are constrained by current measurement technology and other objective factors, and the accuracy of the final calibration result is low.
Disclosure of Invention
The invention provides a method for determining the external parameters between a camera and a laser radar, aiming to solve the problem in the prior art that existing calibration schemes, constrained by current measurement technology and other objective factors, yield a final calibration result of low accuracy.
In order to solve the above problem, the present invention discloses a method for determining the external parameters between a camera and a laser radar, the method comprising: respectively arranging a plurality of AprilTags with different IDs at the corners of a polygonal calibration plate; acquiring the position coordinates of each first corner point detected by the camera, wherein the first corner position coordinates are obtained by the camera through detecting each AprilTag; acquiring the position coordinates of each second corner point, wherein the second corner position coordinates are the corner coordinates of each corner of the polygonal calibration plate; and determining the external parameters between the camera and the laser radar according to the acquired first corner position coordinates and second corner position coordinates.
Preferably, the step of acquiring the position coordinates of each second corner point includes: acquiring point cloud data information on the polygonal calibration plate, wherein the point cloud data information is obtained by the laser radar detecting the laser reflected by the polygonal calibration plate; calculating the position coordinates of each edge point on the edges of the polygonal calibration plate according to the point cloud data information; fitting a line equation for each edge of the polygonal calibration plate according to the position coordinates of the edge points; and determining the second corner position coordinates of each corner of the polygonal calibration plate according to the line equations of the edges.
Preferably, the step of determining an external parameter between the camera and the lidar according to the first corner position coordinate and the second corner position coordinate includes: generating a first corner matrix according to the first corner position coordinate; generating a second corner matrix according to the second corner position coordinates; acquiring a scalar and calibrated internal parameters; inputting the first corner matrix, the second corner matrix, the scalar and the internal parameter into a constraint equation; and determining the calculated output value as an external parameter between the camera and the laser radar.
Preferably, the extrinsic parameter between the camera and the lidar is calculated according to the following formula: s*(u,v,1)^T = K*[R,t]*(X,Y,Z,1)^T, wherein the position of a single corner point of the calibration plate in the camera image coordinate system is denoted (u,v,1) and its position in the laser radar coordinate system is denoted (X,Y,Z,1), both coordinates being expressed in homogeneous form; s is a scalar representing the scale parameter; ^T denotes the matrix transpose; K is the internal parameter, a 3x3 matrix; and [R,t] is the external parameter between the camera and the laser radar, a 3x4 matrix.
Preferably, each side of the polygonal calibration plate has an included angle with the scanning direction of the laser radar.
Preferably, for each AprilTag, the direction of the corner point of the AprilTag coincides with the direction of the corner to which the AprilTag corresponds.
Compared with the prior art, the invention has the following advantages:
the method for determining the external parameters between the camera and the laser radar, provided by the embodiment of the invention, is characterized in that a plurality of AprilTags with different IDs are respectively arranged at each corner of a polygonal calibration plate, and the external parameters between the camera and the laser radar are determined by using the position relation of the same corner point under different coordinate systems as constraints through the corner point coordinates of each AprilTag detected by the camera and the corner point coordinates of each corner of the polygonal calibration plate detected by the laser radar, so that the accuracy of the calibration result of the external parameters between the camera and the laser radar is improved.
The foregoing description is only an overview of the technical solutions of the present invention, and the embodiments of the present invention are described below in order to make the technical means of the present invention more clearly understood and to make the above and other objects, features, and advantages of the present invention more clearly understandable.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a flowchart illustrating steps of a method for determining an extrinsic parameter between a camera and a lidar according to a first embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of a method for determining an extrinsic parameter between a camera and a lidar according to a second embodiment of the present invention;
FIG. 3 is a schematic structural diagram of an apparatus for determining an external parameter between a camera and a lidar according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram of an apparatus for determining an external parameter between a camera and a lidar according to a fourth embodiment of the present invention;
FIG. 5 is a schematic illustration of a calibration plate and an AprilTag in accordance with an embodiment of the present invention;
fig. 6 is a schematic diagram of point cloud data according to an embodiment of the present invention.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Example one
Referring to fig. 1, a flowchart illustrating steps of a method for determining an extrinsic parameter between a camera and a lidar according to a first embodiment of the present invention is shown.
The method for determining the external parameters between the camera and the laser radar comprises the following steps:
step 101: a plurality of aprilats of different IDs are respectively disposed at each corner of the polygonal calibration plate.
The polygonal calibration plate can be of any appropriate shape, such as a diamond, a rectangle, or a triangle; the embodiment of the invention does not limit the specific shape of the calibration plate.
AprilTag is a visual fiducial marker, similar to a two-dimensional barcode, developed at the University of Michigan. The ID here can be understood as the code of a particular tag pattern. In a specific implementation, the number of AprilTags is related to the number of corners the polygon contains; an AprilTag may be set for every corner or only for some corners.
Preferably, each side of the polygonal calibration plate forms an included angle with the horizontal direction; for each AprilTag, the direction of its corner points is consistent with the direction of the corner to which it corresponds; and the color of the polygonal calibration plate is pure white, so that it reflects the laser emitted by the laser radar well.
Step 102: acquiring the position coordinates of each first corner point detected by the camera.
The first corner position coordinates are two-dimensional coordinates. They are obtained by the camera detecting the AprilTags: by detecting the AprilTag attached at each corner of the calibration plate, the camera obtains the two-dimensional position of that corner in the image, from which the first corner position coordinates are derived.
Step 103: acquiring the position coordinates of each second corner point.
The second corner position coordinates are three-dimensional coordinates: the corner coordinates of each corner of the polygonal calibration plate. The laser radar emits whole rows of laser toward the calibration plate and obtains point cloud data on the plate from the returned light. The laser radar does not scan from top to bottom; its scanning direction depends on how it is installed. A horizontally mounted laser radar, for example, starts from its initial position and rotates one full circle in the horizontal plane. Each edge of the calibration plate must therefore form a certain included angle with the scanning direction of the laser radar. Points on the edges of the plate are computed from the returned point cloud data falling on the plate, a line equation is then fitted to each edge from those points, and the intersections of the edges give the corner positions of the calibration plate.
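The edge-fitting and intersection computation described above can be sketched as follows — a minimal two-dimensional illustration (in the plane of the calibration plate) using total least squares. All names and sample points are illustrative, not from the patent:

```python
import numpy as np

def fit_line(points):
    """Fit a 2-D line n . x = c to edge points by total least squares."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The edge direction is the principal axis of the centred points; the
    # line normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(pts - centroid)
    n = vt[-1]                       # unit normal of the fitted line
    return n, float(n @ centroid)    # line equation: n . x = c

def intersect(line_a, line_b):
    """Corner position: intersection of two fitted edge lines."""
    (n1, c1), (n2, c2) = line_a, line_b
    return np.linalg.solve(np.stack([n1, n2]), np.array([c1, c2]))

# Points sampled on two edges of a diamond plate that meet at corner (1, 0).
edge_a = [(1 - t, t) for t in np.linspace(0.1, 0.9, 9)]    # on x + y = 1
edge_b = [(1 - t, -t) for t in np.linspace(0.1, 0.9, 9)]   # on x - y = 1
corner = intersect(fit_line(edge_a), fit_line(edge_b))
print(corner)  # ≈ [1, 0]
```

In the patent's setting the fit is performed on three-dimensional lidar points, but the same least-squares idea applies edge by edge.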
Step 104: determining the external parameters between the camera and the laser radar according to the acquired first corner position coordinates and second corner position coordinates.
The external parameters between the camera and the laser radar in the embodiment of the invention refer to the relative position and orientation of the camera and the laser radar. In general, these parameters establish a mapping between the three-dimensional coordinate system determined by the calibration plate and the camera image coordinate system: they map points in three-dimensional space to image space, and vice versa.
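As a minimal numeric sketch of that mapping (every matrix value below is assumed purely for illustration), a lidar-frame point is carried into pixel coordinates by the external parameters [R, t] followed by the internal parameters K:

```python
import numpy as np

# Illustrative internal parameter matrix K (focal lengths, principal point).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Illustrative external parameters [R, t]: identity rotation, with the
# lidar origin offset 0.2 m along the camera's x axis.
R = np.eye(3)
t = np.array([0.2, 0.0, 0.0])
Rt = np.hstack([R, t[:, None]])              # 3x4 external parameter matrix

X_lidar = np.array([1.0, 0.5, 5.0, 1.0])     # homogeneous lidar coordinates
s_uv1 = K @ Rt @ X_lidar                     # s * (u, v, 1)^T
u, v = s_uv1[:2] / s_uv1[2]                  # divide out the scalar s
print(u, v)  # → 512.0 320.0
```

The division by the last component is exactly the scalar s in the patent's constraint equation.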
When the external parameters between the camera and the laser radar are determined according to the first corner position coordinate and the second corner position coordinate, a first corner matrix and a second corner matrix can be generated through the first corner position coordinate and the second corner position coordinate; acquiring a scalar quantity and calibrated internal parameters, and inputting the first corner matrix, the second corner matrix, the scalar quantity and the internal parameters into a constraint equation; the calculated output value is determined as an external parameter between the camera and the laser radar.
In general, the purpose of camera calibration is to find the mathematical transformation between an object's image and its position in the real world, establishing a quantitative relation between the two so that actual measurements can be taken from the image.
The method for determining the external parameters between the camera and the laser radar, provided by the embodiment of the invention, is characterized in that a plurality of AprilTags with different IDs are respectively arranged at each corner of a polygonal calibration plate, and the external parameters between the camera and the laser radar are determined by using the position relation of the same corner point under different coordinate systems as constraints through the corner point coordinates of each AprilTag detected by the camera and the corner point coordinates of each corner of the polygonal calibration plate detected by the laser radar, so that the accuracy of the calibration result of the external parameters between the camera and the laser radar is improved.
Example two
Referring to fig. 2, a flowchart illustrating steps of a method for determining an extrinsic parameter between a camera and a lidar according to a second embodiment of the present invention is shown.
The method for determining the external parameters between the camera and the laser radar comprises the following steps:
step 201: a plurality of aprilats of different IDs are respectively disposed at each corner of the polygonal calibration plate.
Each side of the polygonal calibration plate forms an included angle with the scanning direction of the laser radar; for each AprilTag, the direction of its corner points is consistent with the direction of the corner to which it corresponds; and the color of the polygonal calibration plate is pure white, so that it reflects the laser emitted by the laser radar well.
In the embodiment of the present invention, the subsequent flow is described taking as an example a diamond-shaped calibration plate with four AprilTags, one disposed at each corner; a schematic diagram of the diamond calibration plate with AprilTags is shown in fig. 5.
The size of the polygonal calibration plate can be set by a person skilled in the art according to actual requirements; preferably, it is set to 1 m × 1 m.
Step 202: acquiring the position coordinates of each first corner point detected by the camera.
The first corner position coordinates are obtained by the camera detecting the AprilTags: by detecting the AprilTag attached at each corner of the calibration plate, the camera obtains the two-dimensional position of that corner in the image, from which the first corner position coordinates are derived.
Step 203: acquiring the point cloud data information on the polygonal calibration plate.
The point cloud data information is obtained by the laser radar detecting the laser reflected by the polygonal calibration plate.
Referring to FIG. 6, the calibration plate is a 1 m × 1 m pure white diamond plate, with four AprilTags of different IDs adhered in order to its four corners. The calibration plate is placed vertically in the common field of view of the camera and the laser radar as illustrated (corner points 1 and 3 facing up and down, corner points 2 and 4 facing left and right). The camera detects the AprilTags by an existing method to obtain the four corner positions (u_i, v_i) (i = 1, 2, 3, 4). The laser radar emits whole rows of laser toward the calibration plate and obtains point cloud data on the plate from the returned light. The laser radar does not scan from top to bottom; its scanning direction depends on how it is installed: a horizontally mounted laser radar starts from its initial position and rotates one full circle in the horizontal plane. Each edge of the calibration plate must therefore form a certain included angle with the scanning direction of the laser radar.
Step 204: calculating the position coordinates of each edge point on the edges of the polygonal calibration plate according to the point cloud data information.
Step 205: fitting a line equation for each edge of the polygonal calibration plate according to the position coordinates of the edge points.
Step 206: determining the second corner position coordinates of each corner of the polygonal calibration plate according to the line equations of the edges.
Points on the edges of the calibration plate are computed from the returned point cloud data falling on the plate, a line equation is then fitted to each of the four sides from those points, and finally the intersections of the four sides give the corner positions of the calibration plate (X_i, Y_i, Z_i) (i = 1, 2, 3, 4).
Step 207: determining the external parameters between the camera and the laser radar according to the acquired first corner position coordinates and second corner position coordinates.
A preferred way of determining the extrinsic parameters between the camera and the lidar, based on the coordinates of the first corner position and the coordinates of the second corner position, is as follows:
s1: generating a first corner matrix according to the first corner position coordinate;
s2: generating a second corner matrix according to the position coordinates of the second corner;
s3: acquiring a scalar and calibrated internal parameters;
s4: inputting the first corner matrix, the second corner matrix, the scalar and the internal parameter into a constraint equation;
s5: and determining the calculated output value as an external parameter between the camera and the laser radar.
The parameters to be calibrated for a camera are generally divided into internal parameters and external parameters. The external parameters determine the position and orientation of the camera in some three-dimensional space, while the internal parameters are parameters internal to the camera itself. The matrices typically involved in camera calibration include:
The external parameter matrix: it describes how a point in the real world (world coordinates) is rotated and translated to fall onto another point (camera coordinates).
The internal parameter matrix: it describes how such points continue through the camera lens and become pixel points via pinhole imaging and electronic transformation.
Through the lens, objects in three-dimensional space are mapped into an inverted, reduced image (a microscope typically magnifies, while a camera typically reduces) that is perceived by the sensor.
Ideally, the optical axis of the lens (the line through the center of the lens perpendicular to the sensor plane) should pass through the middle of the image, but in practice there is always an error due to mounting accuracy, which the internal parameters must describe. Ideally the camera scales the x and y directions equally, but in practice the pixels on the sensor may not be exactly square, so the scaling in the two directions can differ. The internal parameters therefore include two scale factors, which both convert lengths measured in pixels into lengths measured in other units (such as meters) in three-dimensional space and capture any inconsistency between the scale transformations in the x and y directions.
The distortion matrix: ideally, the lens maps a straight line in three-dimensional space to a straight line (a projective transformation), but a real lens is not so perfect: after mapping, straight lines are bent, so distortion parameters of the camera are needed to describe this effect.
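The two internal scale factors can be seen in a simple pinhole back-projection — converting a pixel offset to meters at a known depth. All numbers below are assumed purely for illustration; note that the x and y factors need not be equal:

```python
# Illustrative internal parameters: non-square pixels give two different
# scale factors fx and fy (in pixels), plus the principal point (cx, cy).
fx, fy = 800.0, 780.0
cx, cy = 320.0, 240.0

# A pixel at (u, v) observed at depth Z metres back-projects to:
u, v, Z = 400.0, 300.0, 5.0
X = (u - cx) * Z / fx   # (400 - 320) * 5 / 800 = 0.5 m
Y = (v - cy) * Z / fy   # (300 - 240) * 5 / 780 ≈ 0.3846 m
print(X, Y)
```

The same factors, arranged with the principal point into the 3x3 matrix K, perform the forward conversion from metric to pixel coordinates.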
A preferred calculation method for determining the external parameters between the camera and the lidar is as follows. For the diamond calibration plate, assume the camera's internal parameter K (a 3x3 matrix) has already been calibrated. The position of a single corner point of the calibration plate in the camera image coordinate system is denoted (u, v, 1), and its position in the laser radar coordinate system is denoted (X, Y, Z, 1); both coordinates are expressed in homogeneous form.
Wherein (u, v, 1) is generated from the first corner coordinates determined by the camera, and (X, Y, Z, 1) from the second corner coordinates determined by the laser radar; the external parameter [R, t] (a 3x4 matrix) between the camera and the laser radar is calculated as:
s*(u,v,1)^T=K*[R,t]*(X,Y,Z,1)^T
where s is a scalar representing the scale parameter and ^T denotes the matrix transpose.
This equation can be solved for R and t, the external parameters between the camera and the laser radar, using any third-party open-source PnP solver. To obtain a more accurate result, sampling multiple times reduces measurement error: the calibration plate can be placed at different positions within the range visible to both the camera and the laser radar, ensuring that the constraining corner points are evenly distributed both on the image and in three-dimensional space.
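As a sketch of what such a solver does, the constraint equation can be solved with a plain direct linear transform (DLT) once at least six non-coplanar corner correspondences have been collected over several plate placements. This is a simplified stand-in for a robust PnP solver, verified here on synthetic data; every name and value below is illustrative:

```python
import numpy as np

def solve_extrinsics_dlt(uv, xyz, K):
    """Recover [R, t] from n >= 6 pixel/lidar correspondences via DLT.

    uv: (n, 2) pixel coordinates; xyz: (n, 3) lidar coordinates.
    Solves s*(u,v,1)^T = K*[R,t]*(X,Y,Z,1)^T for the 3x4 matrix [R,t].
    """
    n = len(uv)
    A = np.zeros((2 * n, 12))
    for i, ((u, v), X) in enumerate(zip(uv, xyz)):
        Xh = np.append(X, 1.0)
        A[2 * i, 0:4] = Xh
        A[2 * i, 8:12] = -u * Xh
        A[2 * i + 1, 4:8] = Xh
        A[2 * i + 1, 8:12] = -v * Xh
    # The null vector of A gives the projection matrix P = K*[R,t] up to scale.
    _, _, vt = np.linalg.svd(A)
    P = vt[-1].reshape(3, 4)
    Rt = np.linalg.inv(K) @ P
    # Fix the unknown scale: rows of R must be unit vectors, det(R) = +1.
    Rt /= np.linalg.norm(Rt[:, :3], axis=1).mean()
    if np.linalg.det(Rt[:, :3]) < 0:
        Rt = -Rt
    return Rt

# Synthetic check: project points with a known K and [R, t], then recover them.
K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
c, s = np.cos(0.1), np.sin(0.1)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
t_true = np.array([0.1, -0.05, 0.3])
Rt_true = np.hstack([R_true, t_true[:, None]])

xyz = np.array([[0.0, 0.0, 4.0], [1.0, 0.0, 5.0], [0.0, 1.0, 6.0],
                [1.0, 1.0, 4.5], [-1.0, 0.5, 5.5], [0.5, -1.0, 6.5]])
proj = (K @ Rt_true @ np.column_stack([xyz, np.ones(6)]).T).T
uv = proj[:, :2] / proj[:, 2:]

Rt_est = solve_extrinsics_dlt(uv, xyz, K)
print(np.allclose(Rt_est, Rt_true, atol=1e-6))  # True
```

A production pipeline would instead use an established PnP implementation (for example OpenCV's `solvePnP`) with many more correspondences and outlier handling.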
The method for determining the external parameters between the camera and the laser radar provided by this embodiment has the beneficial effects of the method of the first embodiment. In addition, point cloud data information on the polygonal calibration plate is obtained through the laser radar, the position coordinates of each edge point on the edges of the plate are calculated from the point cloud data information, a line equation is fitted for each edge of the plate, and the second corner position coordinates of each corner of the plate are thereby determined.
EXAMPLE III
Referring to fig. 3, a schematic structural diagram of an apparatus for determining an external parameter between a camera and a lidar according to a third embodiment of the present invention is shown.
The device for determining the external parameters between the camera and the laser radar comprises:
the calibration board module 301 is configured to set a plurality of aprilats with different IDs at each corner of the polygonal calibration board.
A first corner point obtaining module 302, configured to obtain coordinates of first corner points detected by the camera, where the coordinates of the first corner points are obtained by the camera by detecting aprilats.
A second corner point obtaining module 303, configured to obtain position coordinates of each second corner point, where the position coordinates of the second corner point are corner point coordinates of each corner of the polygon calibration board.
And an external parameter determining module 304, configured to determine an external parameter between the camera and the laser radar according to the acquired first corner position coordinate and the acquired second corner position coordinate.
According to the device for determining the external parameters between the camera and the laser radar, provided by the embodiment of the invention, a plurality of AprilTags with different IDs are respectively arranged at each corner of the polygonal calibration plate, and the external parameters between the camera and the laser radar are determined by using the position relation of the same corner point under different coordinate systems as constraints through the corner point coordinates of each AprilTag detected by the camera and the corner point coordinates of each corner of the polygonal calibration plate detected by the laser radar, so that the accuracy of the calibration result of the external parameters between the camera and the laser radar is improved.
Example four
Referring to fig. 4, a schematic structural diagram of an apparatus for determining an external parameter between a camera and a lidar according to a fourth embodiment of the present invention is shown.
The device for determining the external parameters between the camera and the laser radar comprises:
a calibration board module 401, configured to set a plurality of aprilats with different IDs at each corner of the polygonal calibration board respectively; a first corner point obtaining module 402, configured to obtain coordinates of first corner points detected by a camera, where the coordinates of the first corner points are obtained by the camera by detecting aprilats; a second corner point obtaining module 403, configured to obtain position coordinates of each second corner point, where the position coordinates of the second corner point are corner point coordinates of each corner of the polygon calibration board; and an external parameter determining module 404, configured to determine an external parameter between the camera and the laser radar according to the acquired first corner position coordinate and the acquired second corner position coordinate.
Preferably, the second corner point obtaining module 403 includes: the point cloud data submodule 4031 is used for acquiring point cloud data information on the polygon calibration plate, wherein the point cloud data information is obtained by detecting laser reflected by the polygon calibration plate by a laser radar; the edge point sub-module 4032 is used for calculating the position coordinates of each edge point on the edge of the polygon calibration plate according to the point cloud data information; the straight line fitting submodule 4033 is used for fitting a straight line equation of each edge of the polygonal calibration plate according to the position coordinates of the edge points; and the coordinate determination submodule 4034 is configured to determine second corner position coordinates of each corner of the polygon calibration plate according to the linear equation of each side.
Preferably, the external parameter determining module 404 includes: a first matrix submodule 4041, configured to generate a first corner matrix according to the first corner position coordinates; a second matrix submodule 4042, configured to generate a second corner matrix according to the second corner position coordinates; a parameter obtaining submodule 4043, configured to obtain the scalar and the calibrated internal parameters; a constraint equation submodule 4044, configured to input the first corner matrix, the second corner matrix, the scalar, and the internal parameters into the constraint equation; and a parameter determining submodule 4045, configured to determine the calculated output value as the external parameters between the camera and the laser radar.
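In effect, submodules 4041–4045 solve the projection constraint for the 3x4 matrix [R, t] given matched 2D and 3D corner matrices. One generic way to do this (a sketch, not necessarily the patent's exact solver) is the Direct Linear Transform: stack two linear equations per correspondence, take the null-space vector as P = K[R, t], and then peel K off. Note that the corners of a single planar plate are coplanar, which degenerates the plain DLT; in practice one would use several plate poses or a PnP solver. All camera values below are hypothetical.

```python
import numpy as np

def dlt_projection(pts3d, pts2d):
    """Direct Linear Transform: solve s*(u, v, 1)^T = P*(X, Y, Z, 1)^T for the
    3x4 matrix P from >= 6 correspondences (null space of a stacked system)."""
    rows = []
    for (X, Y, Z), (u, v) in zip(pts3d, pts2d):
        Xh = np.array([X, Y, Z, 1.0])
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    _, _, vt = np.linalg.svd(np.asarray(rows))
    return vt[-1].reshape(3, 4)  # right null-space vector, defined up to scale

def extrinsics_from_P(P, K):
    """Recover [R, t] = K^-1 * P, fixing the unknown scale and sign."""
    M = np.linalg.inv(K) @ P
    M = M / np.linalg.norm(M[2, :3])  # rows of a rotation are unit vectors
    if np.linalg.det(M[:, :3]) < 0:   # resolve the overall sign ambiguity
        M = -M
    return M

# Synthetic check against a known camera (all values hypothetical)
K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])
R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])  # 90 deg about z
t = np.array([0.2, -0.1, 0.3])
Rt = np.hstack([R, t[:, None]])

pts3d = np.array([[0, 0, 4], [1, 0, 4], [0, 1, 4],
                  [1, 1, 5], [0.5, 0.3, 6], [0.2, 0.8, 5]], dtype=float)
proj = (K @ Rt @ np.c_[pts3d, np.ones(len(pts3d))].T).T
pts2d = proj[:, :2] / proj[:, 2:3]  # divide out the per-point scale s

Rt_est = extrinsics_from_P(dlt_projection(pts3d, pts2d), K)
print(np.allclose(Rt_est, Rt, atol=1e-6))  # True
```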
Preferably, the parameter determining submodule is specifically configured to calculate the external parameters between the camera and the laser radar according to the following formula:
s * (u, v, 1)^T = K * [R, t] * (X, Y, Z, 1)^T
where (u, v, 1) is the position of a single corner point of the calibration plate in the camera image coordinate system, (X, Y, Z, 1) is the position of the same corner point in the laser radar coordinate system, and both coordinates are expressed in homogeneous form; s is a scalar representing the scale parameter; ^T denotes matrix transposition; K is the internal parameter matrix, a 3x3 matrix; and [R, t] is the external parameter matrix between the camera and the laser radar, a 3x4 matrix.
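The constraint can be checked numerically: multiplying a homogeneous laser-radar-frame point by K[R, t] yields s*(u, v, 1)^T, and dividing by the third component recovers the pixel coordinates. The internal and external parameter values below are hypothetical round numbers chosen only to illustrate the equation.

```python
import numpy as np

# Hypothetical pinhole internal parameter matrix K (3x3)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
# Hypothetical external parameters [R, t] (3x4): identity rotation,
# 10 cm lateral offset between the laser radar and camera origins
R = np.eye(3)
t = np.array([[0.1], [0.0], [0.0]])
Rt = np.hstack([R, t])

X = np.array([0.5, 0.2, 4.0, 1.0])  # homogeneous laser-radar corner (X, Y, Z, 1)
p = K @ Rt @ X                      # this product is s * (u, v, 1)^T
s = p[2]                            # the scalar s is the third component
u, v = p[:2] / s                    # divide out s to get pixel coordinates
print(s, u, v)  # 4.0 440.0 280.0
```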
Preferably, each side of the polygonal calibration plate forms a non-zero angle with the scanning direction of the laser radar.
Preferably, for each AprilTag, the orientation of the AprilTag's corner point is consistent with the orientation of the calibration plate corner corresponding to that AprilTag; and the polygonal calibration plate is pure white.
The device for determining the external parameter between the camera and the lidar in the embodiment of the invention is used for realizing the method for determining the external parameter between the camera and the lidar in the first embodiment and the second embodiment, and has the beneficial effects of the corresponding method embodiments, which are not described herein again.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. For the system embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, refer to the partial description of the method embodiment.
The method for determining the external parameters between the camera and the laser radar provided by the invention has been described in detail above. Specific examples are used herein to explain the implementation steps and the implementation apparatus of the invention, and the description of the above embodiments is intended only to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, according to the idea of the present invention, vary the specific embodiments and the scope of application; in summary, the content of this specification should not be construed as limiting the present invention. It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, etc. does not denote any order; these words may be interpreted as names.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (5)

1. A method of determining an extrinsic parameter between a camera and a lidar, the method comprising:
respectively arranging a plurality of AprilTags with different IDs at the corners of the polygonal calibration plate;
acquiring first corner position coordinates detected by a camera, wherein the first corner position coordinates are obtained by the camera by detecting the AprilTags, and the first corner position coordinates are two-dimensional coordinates;
acquiring position coordinates of each second corner point, wherein the second corner position coordinates are the corner coordinates of each corner of the polygonal calibration plate, and the second corner position coordinates are three-dimensional coordinates;
and generating a first corner matrix according to the first corner position coordinates, generating a second corner matrix according to the second corner position coordinates, acquiring a scalar and calibrated internal parameters, inputting the first corner matrix, the second corner matrix, the scalar, and the internal parameters into a constraint equation, and determining the calculated output value as the external parameters between the camera and the laser radar.
2. The method according to claim 1, wherein the step of obtaining the position coordinates of each second corner point comprises:
acquiring point cloud data on the polygonal calibration plate, wherein the point cloud data is obtained by the laser radar detecting laser light reflected from the polygonal calibration plate;
calculating the position coordinates of each edge point on the edges of the polygonal calibration plate according to the point cloud data;
fitting a linear equation of each edge of the polygonal calibration plate according to the position coordinates of the edge points;
and determining the second corner position coordinates of each corner of the polygonal calibration plate according to the linear equations of the edges.
3. The method of claim 1, wherein the external parameters between the camera and the laser radar are calculated according to the following formula:
s * (u, v, 1)^T = K * [R, t] * (X, Y, Z, 1)^T;
wherein (u, v, 1) is the position of a single corner point of the calibration plate in the camera image coordinate system, (X, Y, Z, 1) is the position of the same corner point in the laser radar coordinate system, and both coordinates are expressed in homogeneous form; s is a scalar representing the scale parameter; ^T denotes matrix transposition; K is the internal parameter matrix, a 3x3 matrix; and [R, t] is the external parameter matrix between the camera and the laser radar, a 3x4 matrix.
4. The method of claim 1, wherein each side of the polygonal calibration plate is angled with respect to a scanning direction of the lidar.
5. The method of claim 1, wherein for each AprilTag, the direction of a corner point of the AprilTag coincides with the direction of the corner to which the AprilTag corresponds.
CN201610922058.0A 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar Active CN107976668B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610922058.0A CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610922058.0A CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Publications (2)

Publication Number Publication Date
CN107976668A CN107976668A (en) 2018-05-01
CN107976668B true CN107976668B (en) 2020-03-31

Family

ID=62004679

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610922058.0A Active CN107976668B (en) 2016-10-21 2016-10-21 Method for determining external parameters between camera and laser radar

Country Status (1)

Country Link
CN (1) CN107976668B (en)

Families Citing this family (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109270534B (en) * 2018-05-07 2020-10-27 西安交通大学 Intelligent vehicle laser sensor and camera online calibration method
CN108627849B (en) * 2018-07-25 2022-02-15 南京富锐光电科技有限公司 Range finding laser radar system for high-speed camera calibration
CN109239727B (en) * 2018-09-11 2022-08-05 北京理工大学 Distance measurement method combining solid-state area array laser radar and double CCD cameras
CN109633612B (en) * 2018-10-18 2020-06-16 浙江大学 Single-line laser radar and camera external reference calibration method without common observation
CN111123242B (en) * 2018-10-31 2022-03-15 北京亚兴智数科技有限公司 Combined calibration method based on laser radar and camera and computer readable storage medium
CN109712190A (en) * 2018-11-10 2019-05-03 浙江大学 The outer ginseng scaling method of three-dimensional laser scanner and three-dimensional laser radar
CN109658461B (en) * 2018-12-24 2023-05-26 中国电子科技集团公司第二十研究所 Unmanned aerial vehicle positioning method based on cooperation two-dimensional code of virtual simulation environment
CN110021046B (en) * 2019-03-05 2021-11-19 中国科学院计算技术研究所 External parameter calibration method and system for camera and laser radar combined sensor
CN109946703B (en) * 2019-04-10 2021-09-28 北京小马智行科技有限公司 Sensor attitude adjusting method and device
CN110148180B (en) * 2019-04-22 2021-06-08 河海大学 Laser radar and camera fusion device and calibration method
CN110361717B (en) * 2019-07-31 2021-03-12 苏州玖物互通智能科技有限公司 Laser radar-camera combined calibration target and combined calibration method
CN110687521B (en) * 2019-10-15 2023-05-16 深圳数翔科技有限公司 Method for calibrating vehicle-mounted laser radar
CN112816949B (en) * 2019-11-18 2024-04-16 商汤集团有限公司 Sensor calibration method and device, storage medium and calibration system
CN111638499B (en) * 2020-05-08 2024-04-09 上海交通大学 Camera-laser radar relative external parameter calibration method based on laser radar reflection intensity point characteristics
CN112270713A (en) * 2020-10-14 2021-01-26 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device
CN112162263A (en) * 2020-10-26 2021-01-01 苏州挚途科技有限公司 Combined calibration method and device for sensor and electronic equipment
CN112308928B (en) * 2020-10-27 2022-11-15 北京航空航天大学 Camera without calibration device and laser radar automatic calibration method
CN112734857B (en) * 2021-01-08 2021-11-02 香港理工大学深圳研究院 Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN113034567A (en) * 2021-03-31 2021-06-25 奥比中光科技集团股份有限公司 Depth truth value acquisition method, device and system and depth camera
CN113281723B (en) * 2021-05-07 2022-07-22 北京航空航天大学 AR tag-based calibration method for structural parameters between 3D laser radar and camera
CN113484830A (en) * 2021-06-22 2021-10-08 上海智能网联汽车技术中心有限公司 Composite calibration plate and calibration method
CN116148809B (en) * 2023-04-04 2023-06-20 中储粮成都储藏研究院有限公司 Automatic generation method and system for grain vehicle sampling point based on laser radar scanning and positioning

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313B (en) * 2010-07-14 2011-12-21 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN103473771B (en) * 2013-09-05 2016-05-25 上海理工大学 A kind of camera scaling method
US9282326B2 (en) * 2013-10-28 2016-03-08 The Regents Of The University Of Michigan Interactive camera calibration tool
CN103837869B (en) * 2014-02-26 2016-06-01 北京工业大学 Based on single line laser radar and the CCD camera scaling method of vector relations
CN105931229B (en) * 2016-04-18 2019-02-05 东北大学 Wireless camera sensor pose scaling method towards wireless camera sensor network

Also Published As

Publication number Publication date
CN107976668A (en) 2018-05-01

Similar Documents

Publication Publication Date Title
CN107976668B (en) Method for determining external parameters between camera and laser radar
CN107976669B (en) Device for determining external parameters between camera and laser radar
EP3848901A2 (en) Method and apparatus for calibrating external parameters of image acquisition device, device and storage medium
CN112654886B (en) External parameter calibration method, device, equipment and storage medium
CN106599897B (en) Readings of pointer type meters recognition methods and device based on machine vision
Schmalz et al. Camera calibration: active versus passive targets
JP4484863B2 (en) Method and system for determining inaccurate information in an augmented reality system
CN104483664B (en) Single-linear-array laser radar equipment centering method
KR20220080011A (en) High-accuracy calibration system and method
JP5773436B2 (en) Information terminal equipment
US9661304B2 (en) Pre-calculation of sine waves for pixel values
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN110260857A (en) Calibration method, device and the storage medium of vision map
CN113281723B (en) AR tag-based calibration method for structural parameters between 3D laser radar and camera
CN111913169B (en) Laser radar internal reference and point cloud data correction method, device and storage medium
WO2024011764A1 (en) Calibration parameter determination method and apparatus, hybrid calibration board, device, and medium
JP2011155412A (en) Projection system and distortion correction method in the same
Yu et al. High-accuracy camera calibration method based on coded concentric ring center extraction
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN108776338B (en) Signal source space sensing method and device and active sensing system
CN114463436A (en) Calibration method, system, equipment and storage medium of galvanometer scanning device
CN112985258B (en) Calibration method and measurement method of three-dimensional measurement system
Song et al. Flexible line-scan camera calibration method using a coded eight trigrams pattern
CN116592766A (en) Precise three-dimensional measurement method and device based on fusion of laser and monocular vision
CN115953478A (en) Camera parameter calibration method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant after: Lexus Automobile (Beijing) Co.,Ltd.

Address before: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant before: FARADAY (BEIJING) NETWORK TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20180830

Address after: 511458 9, Nansha District Beach Road, Guangzhou, Guangdong, 9

Applicant after: Evergrande Faraday Future Smart Car (Guangdong) Co.,Ltd.

Address before: 100026 8 floor 909, 105 building 3, Yao Yuan Road, Chaoyang District, Beijing.

Applicant before: Lexus Automobile (Beijing) Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20190314

Address after: 100015 Building No. 7, 74, Jiuxianqiao North Road, Chaoyang District, Beijing, 001

Applicant after: FAFA Automobile (China) Co.,Ltd.

Address before: 511458 9, Nansha District Beach Road, Guangzhou, Guangdong, 9

Applicant before: Evergrande Faraday Future Smart Car (Guangdong) Co.,Ltd.

GR01 Patent grant