CN111123242B - Combined calibration method based on laser radar and camera and computer readable storage medium - Google Patents


Info

Publication number
CN111123242B
CN111123242B (application CN201811290090.7A)
Authority
CN
China
Prior art keywords
point
edge
foreground
edge point
laser radar
Prior art date
Legal status
Active
Application number
CN201811290090.7A
Other languages
Chinese (zh)
Other versions
CN111123242A (en)
Inventor
张剑华
巴瑛
Current Assignee
Hainan Zhibo Rui Technology Co.,Ltd.
Original Assignee
Beijing Yaxing Zhishu Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yaxing Zhishu Technology Co ltd filed Critical Beijing Yaxing Zhishu Technology Co ltd
Priority to CN201811290090.7A priority Critical patent/CN111123242B/en
Publication of CN111123242A publication Critical patent/CN111123242A/en
Application granted granted Critical
Publication of CN111123242B publication Critical patent/CN111123242B/en


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00: Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48: Details of systems according to group G01S17/00
    • G01S7/497: Means for monitoring or calibrating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The application relates to a combined calibration method based on a laser radar and a camera, and a computer-readable storage medium. The combined calibration method performs edge detection on the point cloud extracted by the laser radar and extracts the edge points of the background part; the foreground edge point corresponding to each background edge point is then found through the depth difference of the laser radar points, and foreground edge points closer to the edge are obtained by searching and comparing over many frames. The laser radar edge point cloud and the camera image at the same moment are then used to extract the center and radius of a circle in the edge image, the translation vector between the camera and the laser radar is calculated through a pinhole camera model, and calibration parameters are searched in the neighborhood space of the obtained translation vector to find the calibration result that minimizes the projection error. In this way the points where the laser radar scans the edge of an object can be extracted with high precision, avoiding the insufficient accuracy caused by the low resolution of the laser radar, and more accurate calibration results can be obtained from the accurate edge points.

Description

Combined calibration method based on laser radar and camera and computer readable storage medium
Technical Field
The present application relates to the field of calibration, and in particular, to a joint calibration method based on a laser radar and a camera and a computer-readable storage medium.
Background
The fusion of the laser radar and the camera is widely applied to three-dimensional reconstruction, autonomous navigation and positioning in robot vision, unmanned aerial vehicles, and related fields. A single sensor has limitations: the camera is easily affected by illumination and blur in the external environment, and the data points of the laser radar are sparse; fusing the two can make up their respective deficiencies.
In order to fuse the information acquired by the laser radar and the camera, joint calibration between the two is essential. Calibration determines the mutual conversion relation between the spatial coordinate systems of the two sensors, so that information obtained by different sensors can be fused into a unified coordinate system. At present, most methods for joint calibration between multi-line laser radars and cameras use calibration objects with spatial geometric characteristics: exploiting the depth discontinuity of the laser radar at edges, points with sudden distance changes are extracted as edge points and used as calibration feature points, which are then registered with the edge information extracted from the images.
Therefore, how to design a combined calibration method that enables the laser radar to obtain accurate edge points is a problem that must be solved to realize precise calibration between the laser radar and the camera.
Disclosure of Invention
To solve the above technical problem or at least partially solve the above technical problem, the present application provides a joint calibration method based on a laser radar and a camera, and a computer-readable storage medium.
In view of the above, in a first aspect, the present application provides a joint calibration method based on a laser radar and a camera, where the joint calibration method includes the following steps: step 1, performing internal reference calibration on a camera to obtain an internal reference matrix of the camera; step 2, extracting a first foreground edge point and a background edge point in the laser radar point cloud obtained from the nth frame, wherein n is a positive integer; step 3, regarding the laser radar point cloud obtained from the (n + 1) th frame, taking the point which has the sudden change of the distance closest to the edge and the minimum horizontal angle difference with the background edge point as a second foreground edge point; step 4, repeating the step 3 for the laser radar point cloud of the (n + 2) th frame to obtain a third foreground edge point, comparing the third foreground edge point with the second foreground edge point, and taking a distance mutation point closer to the edge of the third foreground edge point and the second foreground edge point as a fourth foreground edge point; step 5, sequentially acquiring a new frame according to the frame sequence, repeatedly executing the step 4 until the n + k frame to obtain a final fifth foreground edge point, and recording the time of the n + k frame, wherein k is a positive integer; step 6, extracting a first circle radius and a first circle center coordinate according to a fifth foreground edge point and the time of the (n + k) th frame, and calculating a translation vector between a camera and a laser radar coordinate system by using a camera model; and 7, searching calibration parameters in the neighborhood space of the translation vector, and selecting the calibration parameter which enables the projection error to be minimum as a calibration result.
As an embodiment of the present invention, extracting a first foreground edge point and a background edge point in a laser radar point cloud acquired from an nth frame includes: step 21, calculating the maximum value of the first distance difference from the ith point and two adjacent points i-1 and i +1 on the same scanning line to the laser radar point cloud obtained from the nth frame, wherein i is a positive integer; and step 22, extracting points of which the maximum value of the first distance difference is larger than a threshold value to obtain edge points, wherein the edge points comprise first foreground edge points and background edge points.
As an embodiment of the present invention, regarding the laser radar point cloud obtained in the (n + 1) th frame, taking a point, where a distance closest to an edge is suddenly changed and a horizontal angle difference with a background edge point is minimum, as a second foreground edge point includes: step 31, traversing all the first background edge points of the laser radar point cloud obtained from the (n + 1) th frame, and calculating a second distance difference between a point on the same scanning line as the first background edge point in the (n + 1) th frame and the background edge point; step 32, storing the corresponding point into the foreground point cloud under the condition that the second distance difference is larger than the threshold value; step 33, calculating a first horizontal angle difference between each point in the foreground point cloud and a first foreground edge point; and step 34, taking the point with the minimum first horizontal angle difference as a second foreground edge point.
As an embodiment of the present invention, regarding the laser radar point cloud obtained from the (n + 1) th frame, taking a point where the distance closest to the edge is suddenly changed and the horizontal angle difference with the background edge point is minimum as the second foreground edge point, further includes: and regarding the background edge point, if the second distance difference is not larger than the threshold value in the laser radar point cloud obtained from the (n + 1) th frame, taking the corresponding first foreground edge point in the nth frame as the second foreground edge point.
As an embodiment of the present invention, comparing the third foreground edge point with the second foreground edge point, and taking the distance mutation point closer to the edge of the two as the fourth foreground edge point includes: step 41, obtaining a second horizontal angle difference between the second foreground edge point and the background edge point, and a third horizontal angle difference between the third foreground edge point and the background edge point; and step 42, comparing the second horizontal angle difference with the third horizontal angle difference, and selecting the point with the smaller difference as the fourth foreground edge point.
As an embodiment of the present invention, extracting a first circle radius and a first circle center coordinate according to a fifth foreground edge point and a time of an n + k-th frame, and calculating a translation vector between a camera and a laser radar coordinate system using a camera model includes: step 61, acquiring a second circle radius and a second circle center coordinate according to a fifth foreground edge point and a RANSAC algorithm; step 62, acquiring a third circle radius and a third circle center coordinate according to the time of the (n + k) th frame and Hough transformation; and step 63, determining a translation vector according to the internal reference matrix, the second circle radius, the second circle center coordinate, the third circle radius and the third circle center coordinate.
As an embodiment of the present invention, step 7 further includes: and projecting the laser radar point cloud at the moment of the (n + k) th frame to an image plane according to the calibration parameters and the internal reference matrix.
In a second aspect, the present application provides a computer-readable storage medium having stored thereon an executable program, which when executed by a processor, performs the steps of the combined lidar and camera based calibration method according to any of the first aspects.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
According to the method provided by the embodiment of the application, edge detection is carried out on the point cloud extracted by the laser radar and the edge points of the background part are extracted; the foreground edge point corresponding to each background edge point is found through the depth difference of the laser radar points, and foreground edge points closer to the edge are obtained by searching and comparing over many frames. The laser radar edge point cloud and the camera image at the same moment are then used to obtain the center and radius of a circle in the edge image, the translation vector between the camera and the laser radar is calculated through a pinhole camera model, and calibration parameters are searched in the neighborhood space of the obtained translation vector to find the calibration result that minimizes the projection error. The points where the laser radar scans the edge of an object can thus be extracted with high precision, avoiding the insufficient accuracy caused by the low resolution of the laser radar, and more accurate calibration results can be obtained from the accurate edge points.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without inventive exercise.
Fig. 1 is a schematic flowchart of a joint calibration method based on a laser radar and a camera according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a calibration object with spatial geometry provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of joint calibration of a laser radar and a camera provided in an embodiment of the present application;
fig. 4 is a schematic diagram of the lidar according to the embodiment of the present disclosure when scanning to an edge.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The embodiment of the application provides a combined calibration method based on a laser radar and a camera, as shown in fig. 1, the combined calibration method may include the following steps:
and step S1, performing internal reference calibration on the camera to obtain an internal reference matrix of the camera.
In this step, an internal reference calibration is performed on the camera to obtain an internal reference matrix of the camera, where the internal reference matrix is as follows:
$$P = \begin{bmatrix} f & 0 & O_x \\ 0 & f & O_y \\ 0 & 0 & 1 \end{bmatrix}$$
where f is the focal length of the camera and [O_x, O_y] is the principal point. As shown in fig. 2, a calibration object with spatial geometric characteristics is designed: four hollow circles with the same radius r are arranged on it, with the same circle-center distance between any two adjacent circles. The calibration object is placed where the camera and the laser radar can observe it simultaneously, the positions of the laser radar and the camera are fixed relative to each other, the distance from the calibration board to the laser radar is L, and the distance between the background wall and the calibration board is d, with reference to fig. 3.
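For illustration, the following minimal Python sketch (all numeric values are placeholders, not values from this application) shows how the internal reference matrix P is assembled and used to project a point from the camera coordinate system to pixel coordinates:

import numpy as np

# Placeholder intrinsics; in practice f, Ox, Oy come from the internal
# reference calibration of step S1.
f, Ox, Oy = 800.0, 640.0, 360.0

# Internal reference matrix P as given above.
P = np.array([[f, 0.0, Ox],
              [0.0, f, Oy],
              [0.0, 0.0, 1.0]])

def project(point_cam):
    """Project a 3-D point in the camera coordinate system to pixels."""
    x, y, w = P @ point_cam
    return np.array([x / w, y / w])  # u = x/w, v = y/w

print(project(np.array([0.5, 0.2, 4.0])))  # -> [740. 400.]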
Step S2, extracting a first foreground edge point and a background edge point in the laser radar point cloud P^n obtained from the nth frame, where n is a positive integer.
In this step, extracting the first foreground edge point and the background edge point in the laser radar point cloud P^n obtained from the nth frame includes: step S21, for the laser radar point cloud P^n obtained from the nth frame, calculating for the ith point P_i^n (i a positive integer) and its two adjacent points P_{i-1}^n and P_{i+1}^n on the same scanning line the maximum value X_i of the distance difference to the laser radar, which can be expressed as:
$$X_i = \max\left(\left|r_i^n - r_{i-1}^n\right|, \left|r_i^n - r_{i+1}^n\right|\right)$$
where r_i^n represents the distance from point P_i^n of the nth-frame three-dimensional laser radar point cloud P^n to the laser radar, and, correspondingly, r_{i-1}^n and r_{i+1}^n represent the distances from points P_{i-1}^n and P_{i+1}^n to the laser radar.
Step S22, extracting the points whose maximum distance difference X_i is greater than the threshold R to obtain the edge points, where the value of the threshold R is chosen slightly smaller than the distance d between the calibration board and the background wall. The resulting edge points consist of two parts: the background edge points, which fall on the background wall closest to the edge of the calibration board, and the first foreground edge points, which fall on the edge of the calibration board. The background edge points are retained as angular references and denoted B^n, and the edge points on the foreground calibration board are denoted F^n.
As shown in fig. 4, l_2 indicates the plane of the calibration board in the top view and l_1 represents the background wall. Because of the limited resolution of the laser radar, there is a certain gap between two adjacent points P_1 and P_2 on the same scanning line, so the extracted edge point P_2 does not necessarily fall exactly on the edge of the calibration board, which introduces an error into the edge extraction. In order to extract more accurate edge points of the calibration board, the invention uses B^n as a reference frame to find foreground edge points closer to the edge of the calibration board.
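A minimal Python sketch of steps S21 and S22 follows; the per-scan-line array of distances and the classification of the two sides of a jump are illustrative assumptions, not the exact data layout of the application:

def extract_edge_points(ranges, R):
    """Step S2 on one scan line: `ranges` is assumed to hold the ordered
    distances r_i of the points on the line; R is the threshold, chosen
    slightly smaller than the board-to-wall distance d."""
    background, foreground = [], []
    for i in range(1, len(ranges) - 1):
        # Maximum distance difference X_i to the two neighbours on the line.
        d_prev = abs(ranges[i] - ranges[i - 1])
        d_next = abs(ranges[i] - ranges[i + 1])
        if max(d_prev, d_next) > R:
            # j is the neighbour across the jump; the farther point of the
            # pair lies on the background wall, the nearer one on the board.
            j = i - 1 if d_prev >= d_next else i + 1
            (background if ranges[i] > ranges[j] else foreground).append(i)
    return background, foreground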
Step S3, for the laser radar point cloud P^{n+1} obtained from the (n+1)th frame, taking the point which has a distance mutation closest to the edge and the minimum horizontal angle difference with the background edge point as the second foreground edge point.
In this step, a new frame of point cloud is obtained, and the distance-mutation point closest to the edge is searched for as the foreground edge point. Because the scanning accuracy of the laser radar is unstable, the horizontal angle difference between two adjacent points on the same scanning line is not identical, so the situation shown in fig. 4 may occur: P_1 and P_2 are the background edge point and the foreground edge point extracted from the nth scan, and in the (n+1)th scan there may be several scanning points Q_1, Q_2, Q_3 between P_1 and P_2. Therefore, for each point with a distance mutation, the horizontal angle difference between that point and the background point is compared, and the point closest to the edge is taken.
Further, for the laser radar point cloud P^{n+1} obtained from the (n+1)th frame, traverse each point b_i^n in the reference frame B^n and compute the distance difference between b_i^n and each point p_j^{n+1} on the same scanning line in the (n+1)th frame:
$$\mathrm{rangeDiff} = \left|r_j^{n+1} - r_i^n\right|$$
If rangeDiff is greater than the threshold R, the point p_j^{n+1} is stored in the foreground point cloud K^{n+1}, i.e. among the foreground points located on the calibration board. For each point q_j^{n+1} in the foreground point cloud K^{n+1}, compute the horizontal angle difference angleDiff between q_j^{n+1} and the point b_i^n:
$$\mathrm{angleDiff} = \left|\theta_j^{n+1} - \theta_i^n\right|$$
where θ_j^{n+1} indicates the horizontal angle of the point q_j^{n+1} and, correspondingly, θ_i^n indicates the horizontal angle of the point b_i^n. The point with the smallest horizontal angle difference angleDiff is selected as the second foreground edge point S_i relative to the point b_i^n.
In addition, if for the background edge point b_i^n there is no point in the (n+1)th frame whose distance difference rangeDiff is greater than the threshold R, the corresponding first foreground edge point f_i^n of the nth frame is retained as the second foreground edge point S_i (the accurate edge point S_i).
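The per-point search of step S3 could be sketched as follows; the (range, azimuth) tuple representation of lidar points is an assumption for illustration:

def refine_edge_point(bg, prev_fg, scan_line, R):
    """Step S3 for one background edge point b_i^n.

    bg, prev_fg : (range, azimuth) of the background edge point and of the
                  foreground edge point kept so far
    scan_line   : (range, azimuth) points of the new frame that lie on the
                  same scan line as bg
    """
    # Points whose range differs from the background point by more than R
    # show a distance mutation, i.e. they fall on the calibration board.
    candidates = [p for p in scan_line if abs(p[0] - bg[0]) > R]
    if not candidates:
        # No distance mutation in this frame: keep the previous edge point.
        return prev_fg
    # The candidate with the smallest horizontal angle difference to the
    # background point is the one closest to the true edge.
    return min(candidates, key=lambda p: abs(p[1] - bg[1]))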
Step S4, for the laser radar point cloud P^{n+2} of the (n+2)th frame, repeating step S3 to obtain a third foreground edge point, comparing the third foreground edge point with the second foreground edge point, and taking the distance mutation point closer to the edge of the two as the fourth foreground edge point.
In this step, the next frame of laser radar point cloud is acquired, the distance-mutation point closest to the edge is searched for, and it is compared with the foreground edge point obtained from the previous frame; the point closer to the edge is taken as the foreground edge point.
That is, for the laser radar point cloud P^{n+2} obtained from the (n+2)th frame, step S3 above is repeated: among the points p_j^{n+2} on the same scanning line as b_i^n in the (n+2)th frame whose distance difference rangeDiff is greater than the threshold R, the point with the smallest horizontal angle difference is taken as the newly found edge point s_i'. The horizontal angle differences of the accurate edge point S_i and the newly found edge point s_i' with respect to b_i^n are then compared, and the point with the smaller difference is selected as the fourth foreground edge point (the new accurate edge point S_i).
Step S5, sequentially acquiring a new frame according to the frame order, and repeating step S4 until the (n+k)th frame to obtain the final fifth foreground edge point, and recording the time of the (n+k)th frame, where k is a positive integer.
In this step, the (n+3)th frame, the (n+4)th frame, the (n+5)th frame and so on are acquired in turn, and step S4 is repeated until the (n+k)th frame, obtaining the fifth foreground edge point (the final accurate edge point S_i); the moment at which that frame is acquired is recorded as t_i, where k is the number of refinement iterations.
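Building on the previous sketch, the refinement over frames n+1 .. n+k of steps S4 and S5 might look as follows (the frame iterable and timestamps are placeholder assumptions):

def track_accurate_edge_point(bg, fg0, frames, R):
    """Steps S4 and S5: refine one edge point over k successive frames.
    `frames` is assumed to yield (timestamp, scan_line) pairs for frames
    n+1 .. n+k; fg0 is the first foreground edge point from frame n."""
    best, t_i = fg0, None
    for t, scan_line in frames:
        candidate = refine_edge_point(bg, best, scan_line, R)
        # Keep whichever point has the smaller horizontal angle difference
        # with the background edge point, i.e. is closer to the edge.
        if abs(candidate[1] - bg[1]) <= abs(best[1] - bg[1]):
            best = candidate
        t_i = t  # time of the last (n+k-th) frame, recorded for step S6
    return best, t_i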
Step S6, extracting a first circle radius and first circle center coordinates according to the fifth foreground edge point and the time of the (n+k)th frame, and calculating the translation vector between the camera and the laser radar coordinate system using the camera model.
In this embodiment, for the fifth foreground edge points obtained at time t_i, the straight lines, namely the straight edges on both sides of the calibration board, are first removed with the RANSAC algorithm; the circles are then fitted to the edge points in the laser radar point cloud, also with the RANSAC algorithm, obtaining the circle radius r_3d and the circle center coordinates [X, Y, Z]. At the same time, the circles are detected in the image acquired at time t_i by the Hough transform, and the circle radius r_2d and the circle center coordinates [u, v] are calculated. The translation vector between the camera and the laser radar coordinate system is then calculated using the camera model:
$$[x, y, w]^T = P \cdot C \cdot [X, Y, Z, 1]^T \quad (6.1)$$
$$u = x / w, \quad v = y / w \quad (6.2)$$
where [x, y, w]^T are the homogeneous coordinates of a point in the camera coordinate system, [X, Y, Z, 1]^T represents the coordinates of a point in the three-dimensional laser radar coordinate system, P is the camera internal reference matrix, and C is the required calibration matrix. Here only the translation transformation is considered, so that
$$C = \begin{bmatrix} 1 & 0 & 0 & t_x \\ 0 & 1 & 0 & t_y \\ 0 & 0 & 1 & t_z \end{bmatrix} \quad (6.3)$$
where [t_x, t_y, t_z] represents the translation vector. Assuming that the calibration object plane is vertical, the translation in the z direction is obtained from the pinhole camera model:
$$r_{2d} = \frac{f \cdot r_{3d}}{Z + t_z}, \quad \text{i.e.} \quad t_z = \frac{f \cdot r_{3d}}{r_{2d}} - Z \quad (6.4)$$
where Z is the depth coordinate of the circle center. From equations (6.1) and (6.4), t_x and t_y can be obtained:
$$t_x = \frac{(u - O_x)(Z + t_z)}{f} - X, \quad t_y = \frac{(v - O_y)(Z + t_z)}{f} - Y \quad (6.5)$$
Then the translation results calculated from the four circles are averaged to obtain the final translation vector [t_x, t_y, t_z].
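The closed-form solution of equations (6.1)-(6.5) for one circle can be sketched as below (the circle fitting itself, via RANSAC in the point cloud and the Hough transform in the image, is assumed to have been done already):

import numpy as np

def translation_from_circle(f, Ox, Oy, circle_3d, circle_2d):
    """Solve [t_x, t_y, t_z] from one circle pair.

    circle_3d = (r3d, X, Y, Z): radius and centre fitted in the lidar cloud;
    circle_2d = (r2d, u, v):    radius and centre detected in the image.
    """
    r3d, X, Y, Z = circle_3d
    r2d, u, v = circle_2d
    t_z = f * r3d / r2d - Z             # equation (6.4)
    t_x = (u - Ox) * (Z + t_z) / f - X  # equation (6.5)
    t_y = (v - Oy) * (Z + t_z) / f - Y
    return np.array([t_x, t_y, t_z])

# Final translation: mean over the four circle pairs, e.g.
# t = np.mean([translation_from_circle(f, Ox, Oy, c3, c2)
#              for c3, c2 in zip(circles_3d, circles_2d)], axis=0)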
Step S7, searching for calibration parameters in the neighborhood space of the translation vector, and selecting the calibration parameter that minimizes the projection error as the calibration result.
In this step, the translation [t_x, t_y, t_z] obtained in step S6 above is used to initialize the calibration parameter C as [t_x, t_y, t_z, 0, 0, 0], where the rotation parameters r_x, r_y, r_z are initialized with 0. According to the distance information of the laser radar point cloud, the radar point cloud is segmented into foreground and background with an Otsu threshold, and the camera image is likewise segmented into foreground and background with adaptive thresholding. Furthermore, the laser radar point cloud at time t_i is projected onto the image plane using the calibration parameters C and the internal reference matrix P. The value of the projection error P_E is calculated by dividing the number of wrongly projected points E (i.e. foreground points projected onto the background segment, or vice versa) by the total number of points |P| in the point cloud:
$$P_E = \frac{E}{|P|}$$
The neighborhood space of the calibration parameter C is densely searched, and the calibration parameter C with the minimum projection error P_E is selected as the final result.
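A simplified Python sketch of this search follows; for brevity only the translation components are varied (the full method also searches the rotation parameters r_x, r_y, r_z around 0), and the foreground/background segmentations of the point cloud and the image are assumed to be precomputed, e.g. with an Otsu and an adaptive threshold respectively:

import itertools
import numpy as np

def projection_error(C, P, cloud_xyz, cloud_is_fg, image_is_fg):
    """P_E = E / |P|: fraction of lidar points whose foreground/background
    label disagrees with the image segment they project into."""
    pts_h = np.hstack([cloud_xyz, np.ones((len(cloud_xyz), 1))])
    xyw = (P @ (C @ pts_h.T)).T                  # equation (6.1)
    uv = (xyw[:, :2] / xyw[:, 2:]).astype(int)   # equation (6.2)
    h, w = image_is_fg.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    wrong = cloud_is_fg[ok] != image_is_fg[uv[ok, 1], uv[ok, 0]]
    return wrong.sum() / len(cloud_xyz)

def search_calibration(t0, P, cloud_xyz, cloud_is_fg, image_is_fg,
                       step=0.01, span=5):
    """Densely search the neighbourhood of the initial translation t0 and
    return the calibration matrix C with the minimum projection error."""
    best_C, best_err = None, np.inf
    offsets = [i * step for i in range(-span, span + 1)]
    for dx, dy, dz in itertools.product(offsets, repeat=3):
        C = np.hstack([np.eye(3),
                       (t0 + np.array([dx, dy, dz])).reshape(3, 1)])
        err = projection_error(C, P, cloud_xyz, cloud_is_fg, image_is_fg)
        if err < best_err:
            best_C, best_err = C, err
    return best_C, best_err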
The method provided by the embodiment of the application can be used for extracting the point where the laser radar scans the edge of the object with high precision, and the problem of insufficient precision caused by low resolution of the laser radar is solved. And more accurate calibration results can be obtained through the obtained accurate edge points.
An embodiment of the present application further provides a computer-readable storage medium, on which an executable program is stored, and the executable program, when executed by a processor, implements the steps of the joint calibration method based on lidar and the camera as shown in any one of fig. 1.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (7)

1. A computer-readable storage medium having stored thereon an executable program which, when executed by a processor, implements a lidar and camera based joint calibration method, the lidar and camera based joint calibration method comprising:
step 1, performing internal reference calibration on the camera to obtain an internal reference matrix of the camera;
step 2, extracting a first foreground edge point and a background edge point in the laser radar point cloud obtained from the nth frame, wherein n is a positive integer;
step 3, regarding the laser radar point cloud obtained from the (n + 1) th frame, taking the point which has the sudden change of the distance closest to the edge and the minimum horizontal angle difference with the background edge point as a second foreground edge point;
step 4, the step 3 is repeatedly executed for the laser radar point cloud of the (n + 2) th frame to obtain a third foreground edge point, the third foreground edge point is compared with the second foreground edge point, and a distance mutation point closer to the edge is taken as a fourth foreground edge point;
step 5, sequentially acquiring a new frame according to the frame sequence, repeatedly executing the step 4 until the n + k frame to obtain a final fifth foreground edge point, and recording the time of the n + k frame, wherein k is a positive integer;
step 6, extracting a first circle radius and a first circle center coordinate according to the fifth foreground edge point and the time of the (n + k) th frame, and calculating a translation vector between a camera and a laser radar coordinate system by using a camera model;
and 7, searching calibration parameters in the neighborhood space of the translation vector, and selecting the calibration parameter which enables the projection error to be minimum as a calibration result.
2. The computer-readable storage medium of claim 1, wherein extracting the first foreground edge point and the background edge point in the lidar point cloud acquired for the nth frame comprises:
step 21, calculating the maximum value of the first distance difference from the ith point and two adjacent points i-1 and i +1 on the same scanning line to the laser radar point cloud obtained from the nth frame, wherein i is a positive integer;
and step 22, extracting points of which the maximum value of the first distance difference is larger than a threshold value to obtain edge points, wherein the edge points include the first foreground edge points and the background edge points.
3. The computer-readable storage medium according to claim 1, wherein regarding the lidar point cloud acquired for the (n + 1) th frame, regarding a point with a sudden change in distance closest to an edge and a minimum horizontal angle difference from the background edge point as a second foreground edge point comprises:
step 31, traversing all first background edge points of the laser radar point cloud obtained from the (n + 1) th frame, and calculating a second distance difference between a point of the (n + 1) th frame, which is located on the same scanning line as the first background edge point, and the background edge point;
step 32, storing the corresponding point into the foreground point cloud under the condition that the second distance difference is larger than a threshold value;
step 33, calculating a first horizontal angle difference between each point in the foreground point cloud and the first foreground edge point;
and step 34, taking the point with the minimum first horizontal angle difference as the second foreground edge point.
4. The computer-readable storage medium according to claim 3, wherein regarding the lidar point cloud acquired for the (n + 1) th frame, regarding a point with a sudden change in distance closest to an edge and a minimum horizontal angle difference from the background edge point as a second foreground edge point, further comprises:
and regarding the background edge point, if the second distance difference is not larger than the threshold value in the laser radar point cloud obtained by the n +1 th frame, taking the corresponding first foreground edge point in the n th frame as the second foreground edge point.
5. The computer-readable storage medium of claim 1, wherein comparing the third foreground edge point with the second foreground edge point, and taking the distance mutation point closer to the edge of the two as the fourth foreground edge point comprises:
step 41, obtaining a second horizontal angle difference between the second foreground edge point and the background edge point, and a third horizontal angle difference between the third foreground edge point and the background edge point;
and 42, comparing the second horizontal angle difference with the third horizontal angle difference, and selecting a smaller point as the fourth foreground edge point.
6. The computer-readable storage medium of claim 1, wherein extracting a first circle radius and first circle center coordinates from the fifth foreground edge point and the time of the n + k frame, and calculating a translation vector between a camera and a lidar coordinate system using a camera model comprises:
step 61, acquiring a second circle radius and a second circle center coordinate according to the fifth foreground edge point and a RANSAC algorithm;
step 62, acquiring a third circle radius and a third circle center coordinate according to the time of the n + k frame and Hough transformation;
and step 63, determining the translation vector according to the internal reference matrix, the second circle radius, the second circle center coordinate, the third circle radius and the third circle center coordinate.
7. The computer-readable storage medium of claim 1, wherein the step 7 further comprises:
and projecting the laser radar point cloud at the moment of the n + k frame to an image plane according to the calibration parameters and the internal reference matrix.
CN201811290090.7A 2018-10-31 2018-10-31 Combined calibration method based on laser radar and camera and computer readable storage medium Active CN111123242B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811290090.7A CN111123242B (en) 2018-10-31 2018-10-31 Combined calibration method based on laser radar and camera and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811290090.7A CN111123242B (en) 2018-10-31 2018-10-31 Combined calibration method based on laser radar and camera and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111123242A CN111123242A (en) 2020-05-08
CN111123242B (en) 2022-03-15

Family

ID=70494303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811290090.7A Active CN111123242B (en) 2018-10-31 2018-10-31 Combined calibration method based on laser radar and camera and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111123242B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111638680B (en) * 2020-06-15 2021-03-12 大连誉洋工业智能有限公司 Robot-based casting circular structure polishing path planning method
CN113759346A (en) * 2020-10-10 2021-12-07 北京京东乾石科技有限公司 Laser radar calibration method and device, electronic equipment and storage medium
CN112446927A (en) * 2020-12-18 2021-03-05 广东电网有限责任公司 Combined calibration method, device and equipment for laser radar and camera and storage medium
CN113050074B (en) * 2021-03-16 2023-08-25 成都信息工程大学 Camera and laser radar calibration system and calibration method in unmanned environment perception
CN115601451B (en) * 2022-12-14 2023-03-21 深圳思谋信息科技有限公司 External parameter data calibration method, device, computer equipment and storage medium
CN116447977B (en) * 2023-06-16 2023-08-29 北京航天计量测试技术研究所 Round hole feature measurement and parameter extraction method based on laser radar

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2728376A1 (en) * 2012-11-05 2014-05-07 The Chancellor, Masters and Scholars of the University of Oxford Extrinsic calibration of imaging sensing devices and 2D LIDARs mounted on transportable apparatus
CN107976668A (en) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 A kind of method of outer parameter between definite camera and laser radar
CN107977997A (en) * 2017-11-29 2018-05-01 北京航空航天大学 A kind of Camera Self-Calibration method of combination laser radar three dimensional point cloud
CN109300162A (en) * 2018-08-17 2019-02-01 浙江工业大学 A kind of multi-line laser radar and camera combined calibrating method based on fining radar scanning marginal point

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2728376A1 (en) * 2012-11-05 2014-05-07 The Chancellor, Masters and Scholars of the University of Oxford Extrinsic calibration of imaging sensing devices and 2D LIDARs mounted on transportable apparatus
CN107976668A (en) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 A kind of method of outer parameter between definite camera and laser radar
CN107977997A (en) * 2017-11-29 2018-05-01 北京航空航天大学 A kind of Camera Self-Calibration method of combination laser radar three dimensional point cloud
CN109300162A (en) * 2018-08-17 2019-02-01 浙江工业大学 A kind of multi-line laser radar and camera combined calibrating method based on fining radar scanning marginal point

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Calibration of RGB Camera with Velodyne Lidar; Martin Velas et al.; WSCG 2014 Conference on Computer Graphics, Visualization and Computer Vision; 2014-12-31; pp. 135-144 *

Also Published As

Publication number Publication date
CN111123242A (en) 2020-05-08

Similar Documents

Publication Publication Date Title
CN109300162B (en) Multi-line laser radar and camera combined calibration method based on refined radar scanning edge points
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
Pusztai et al. Accurate calibration of LiDAR-camera systems using ordinary boxes
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
US8792726B2 (en) Geometric feature extracting device, geometric feature extracting method, storage medium, three-dimensional measurement apparatus, and object recognition apparatus
Kang et al. Automatic targetless camera–lidar calibration by aligning edge with gaussian mixture model
CN109801333B (en) Volume measurement method, device and system and computing equipment
JP6573419B1 (en) Positioning method, robot and computer storage medium
KR20220025028A (en) Method and device for building beacon map based on visual beacon
US20110235898A1 (en) Matching process in three-dimensional registration and computer-readable storage medium storing a program thereof
Munoz-Banon et al. Targetless camera-lidar calibration in unstructured environments
CN110458952B (en) Three-dimensional reconstruction method and device based on trinocular vision
CN113096183B (en) Barrier detection and measurement method based on laser radar and monocular camera
CN113256729A (en) External parameter calibration method, device, equipment and storage medium for laser radar and camera
CN112464812A (en) Vehicle-based sunken obstacle detection method
CN113049184A (en) Method, device and storage medium for measuring mass center
CN115685160A (en) Target-based laser radar and camera calibration method, system and electronic equipment
KR102490521B1 (en) Automatic calibration through vector matching of the LiDAR coordinate system and the camera coordinate system
CN112184793B (en) Depth data processing method and device and readable storage medium
JPH07103715A (en) Method and apparatus for recognizing three-dimensional position and attitude based on visual sense
GB2569609A (en) Method and device for digital 3D reconstruction
KR100933304B1 (en) An object information estimator using the single camera, a method thereof, a multimedia device and a computer device including the estimator, and a computer-readable recording medium storing a program for performing the method.
CN115661252A (en) Real-time pose estimation method and device, electronic equipment and storage medium
CN113405532B (en) Forward intersection measuring method and system based on structural parameters of vision system
Otero et al. Local iterative DLT soft-computing vs. interval-valued stereo calibration and triangulation with uncertainty bounding in 3D reconstruction

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230822

Address after: Room 3001, 3rd Floor, Incubation Building, Hainan Ecological Software Park, Laocheng High tech Industry Demonstration Zone, Chengmai County, Sansha City, Hainan Province, 571900

Patentee after: Hainan Zhibo Rui Technology Co.,Ltd.

Address before: Room zt325, science and technology building, No. 45, Zhaitang street, Mentougou District, Beijing 102300

Patentee before: Beijing Yaxing Zhishu Technology Co.,Ltd.

TR01 Transfer of patent right