CN113256729A - External parameter calibration method, device, equipment and storage medium for laser radar and camera - Google Patents


Info

Publication number: CN113256729A (application CN202110286400.3A)
Authority: CN (China)
Prior art keywords: point, camera, key point, determining, point cloud
Legal status: Granted
Application number: CN202110286400.3A
Other languages: Chinese (zh)
Other versions: CN113256729B (en)
Inventors: 李晓欢, 覃兴胜, 唐欣, 陈倩
Current assignee: Guangxi Comprehensive Transportation Big Data Research Institute
Original assignee: Guangxi Comprehensive Transportation Big Data Research Institute
Application filed by Guangxi Comprehensive Transportation Big Data Research Institute
Priority: CN202110286400.3A
Publication of CN113256729A
Application granted; publication of CN113256729B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/48: Details of systems according to group G01S 17/00
    • G01S 7/497: Means for monitoring or calibrating
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of this application provide a method, apparatus, device, and storage medium for calibrating the extrinsic parameters of a laser radar (lidar) and a camera, in the field of image processing. The method comprises: acquiring a three-dimensional point cloud collected by the lidar and a pixel image collected by the camera at the same moment; determining first key points in the pixel image and, using the Hough transform, second key points in the three-dimensional point cloud; and determining the extrinsic parameters of the lidar and the camera from the coordinates of the first key points in the pixel image and the coordinates of the second key points in the three-dimensional point cloud. By extending the fitted lines along the calibration board's adjacent edges, the number of point-cloud points attributed to each edge increases, which raises the edges' identifiability, eases extraction of the board's adjacent edges by the Hough transform, and improves key-point extraction accuracy. Optimization conditions are then set from geometric constraints: key points are collected over several calibration-board poses, different candidate extrinsic values are computed, and the value with the minimal error under the constraints is selected, improving extrinsic calibration accuracy.

Description

External parameter calibration method, device, equipment and storage medium for laser radar and camera
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, an apparatus, a device, and a storage medium for calibrating external parameters of a laser radar and a camera.
Background
In recent years, MEMS (Micro-Electro-Mechanical System) lidar has been widely used in fields such as intelligent driving and industrial robotics because of its low cost, dense point clouds, and good three-dimensional imaging. Its output is often fused with camera data to improve the robustness and accuracy of system applications.
The extrinsic parameters of a lidar and a camera are the spatial correspondence of the point-cloud coordinate system to the image coordinate system, namely a rotation and a translation, and are the key to information fusion. However, the ranging accuracy of MEMS lidar is lower than that of traditional mechanical lidar, and its point clouds contain more noise, so the extrinsic values obtained by traditional calibration methods deviate significantly, degrading the fusion result. Extrinsic calibration also requires accurately extracting corresponding points between the point cloud and the image pixels, from several pairs of which the extrinsic values are solved. These correspondences serve as key points, but their extraction is affected by the ranging and noise characteristics of the MEMS lidar, so traditional methods incur large errors when computing extrinsics against image pixels.
Therefore, the prior-art problem that MEMS lidar ranging and noise characteristics impair key-point extraction, causing large errors when solving extrinsic parameters against image pixels, urgently needs to be addressed.
Disclosure of Invention
This application aims to resolve at least one of the technical defects above, in particular that in the prior art the ranging and noise characteristics of MEMS lidar impair key-point extraction, so that traditional methods produce oversized errors when solving the extrinsic parameters against image pixels.
According to one aspect of the application, an external reference calibration method for a laser radar and a camera is provided, and the method comprises the following steps:
acquiring a three-dimensional point cloud acquired by a laser radar and a pixel image acquired by a camera at the same time, wherein the three-dimensional point cloud and the pixel image are acquired aiming at the same calibration plate;
determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key point and the second key point are multiple, and the point on the calibration plate corresponding to the first key point is the same as the point on the calibration plate corresponding to the second key point;
determining the laser radar and camera external parameters based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
As an optional embodiment of the present application, acquiring the three-dimensional point cloud collected by the laser radar at the given moment includes:
for the same calibration board, collecting multiple frames of intermediate three-dimensional point clouds at the same pose, and fusing the intermediate three-dimensional point clouds to obtain the three-dimensional point cloud collected by the laser radar.
As an optional embodiment of the present application, the determining a second keypoint in the three-dimensional point cloud by using Hough transform includes:
fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
and determining a second key point in the optimal plane by using Hough transformation.
As an optional embodiment of the present application, the determining a second keypoint in the optimal plane by using Hough transform includes:
fitting the edge straight line of the calibration plate in the optimal plane by using Hough transformation;
and determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is greater than the actual length of the calibration plate.
As an optional embodiment of the present application, the determining the laser radar and camera external parameters based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud includes:
converting the pixel coordinate of the first key point into a camera coordinate by adopting a preset pixel coordinate-camera coordinate formula;
determining the external parameters based on a relationship of the camera coordinates of the first keypoint and the coordinates of the second keypoint in the three-dimensional point cloud.
As an optional embodiment of the present application, the method further comprises:
respectively acquiring a plurality of groups of three-dimensional point clouds acquired by a laser radar and pixel images acquired by a camera aiming at a plurality of different poses of the same calibration plate, and solving a plurality of external parameters;
and determining the optimal external parameters in the plurality of external parameters by adopting a preset algorithm.
According to another aspect of the present application, there is provided an external reference calibration apparatus for a laser radar and a camera, the apparatus including:
an image and point cloud acquisition module, configured to acquire a three-dimensional point cloud collected by a laser radar and a pixel image collected by a camera at the same moment, both captured of the same calibration board;
the key point determining module is used for determining a first key point in the pixel image and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key point and the second key point are multiple, and the point on the calibration plate corresponding to the first key point is the same as the point on the calibration plate corresponding to the second key point;
and the external parameter determining module is used for determining the external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
As an optional implementation manner of the present application, the key point determining module includes:
the straight line fitting unit is used for fitting the edge straight line of the calibration plate in the optimal plane by utilizing Hough transformation;
and the key point determining unit is used for determining the second key point according to the edge straight line intersection point and the actual length of the calibration plate, wherein the length of the edge straight line is greater than the actual length of the calibration plate.
According to another aspect of the present application, there is provided an electronic device including:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the above external reference calibration method for the laser radar and the camera.
According to an aspect of the application, a computer program product or computer program is provided, comprising computer instructions stored in a computer-readable storage medium. The processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the method provided in the various alternative implementations described above.
This application extracts the point-cloud key points with the Hough transform. Extending the fitted lines along the calibration board's adjacent edges increases the number of point-cloud points attributed to each edge, raising the edges' identifiability so they are easily extracted by the Hough transform; the point-cloud key points are then obtained from the intersection of the adjacent edges and their actual lengths. This improves key-point extraction accuracy. After the key points are accurately extracted, optimization conditions are set from geometric constraints: key points are collected over several calibration-board poses, the extrinsic values computed from different key-point pairs are compared, and the value with the minimal error under the constraints is selected, improving the accuracy of extrinsic calibration.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings used in the description of the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic flowchart of an external reference calibration method for a laser radar and a camera according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a key point of a pixel image according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a three-dimensional point cloud key point provided in the embodiment of the present application;
fig. 4 is a schematic flowchart of a plane fitting method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a method for determining a key point according to an embodiment of the present application;
fig. 6 is a schematic flowchart of a method for obtaining an optimal external parameter according to an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of an external reference calibration apparatus for a laser radar and a camera according to an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a keypoint determination module according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
The above and other features, advantages and aspects of various embodiments of the present application will become more apparent from the following detailed description when taken in conjunction with the accompanying drawings. Throughout the drawings, the same or similar reference numbers refer to the same or similar elements. It should be understood that the drawings are schematic and that elements and features are not necessarily drawn to scale.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
To make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
The extrinsic parameters of a lidar and a camera are the spatial correspondence of the point-cloud coordinate system to the image coordinate system, namely a rotation and a translation, and are the key to information fusion. However, the ranging accuracy of MEMS lidar is lower than that of traditional mechanical lidar, and its point clouds contain more noise, so the extrinsic values obtained by traditional calibration methods deviate significantly, degrading the fusion result. Extrinsic calibration also requires accurately extracting corresponding points between the point cloud and the image pixels, from several pairs of which the extrinsic values are solved. These correspondences serve as key points, but their extraction is affected by the ranging and noise characteristics of the MEMS lidar, so traditional methods incur large errors when computing extrinsics against image pixels.
The application provides a laser radar and camera external reference calibration method, device, equipment and storage medium, and aims to solve the technical problems in the prior art.
The following describes the technical solutions of the present application and how to solve the above technical problems with specific embodiments. The following several specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
The embodiment of the application provides an external reference calibration method for a laser radar and a camera, and as shown in fig. 1, the method comprises the following steps:
step S101, acquiring a three-dimensional point cloud acquired by a laser radar and a pixel image acquired by a camera at the same time, wherein the three-dimensional point cloud and the pixel image are acquired aiming at the same calibration plate;
step S102, determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transformation; the first key point and the second key point are multiple, and the point on the calibration plate corresponding to the first key point is the same as the point on the calibration plate corresponding to the second key point;
step S103, determining external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
In this embodiment, when images are acquired, a calibration board is placed against the same background and captured with both the laser radar and the camera, so the three-dimensional point cloud from the laser radar and the pixel image from the camera at the same moment show the same calibration board at the same time and in the same pose. After both are acquired, the first key points in the pixel image and the second key points in the three-dimensional point cloud are determined. The first and second key points are the three vertices of the calibration board as seen in the pixel image and in the three-dimensional point cloud, respectively. As shown in fig. 2, the image contains the calibration board 201, whose three vertices give the first key points o1, o2 and o3; fig. 3 is the corresponding schematic of key points in the three-dimensional point cloud, where the second key points are o'1, o'2 and o'3. Points o1 and o'1 correspond to the upper vertex of the calibration board, o2 and o'2 to the left vertex, and o3 and o'3 to the right vertex. The key points in the pixel image can be identified with existing techniques, while the key points of the three-dimensional point cloud are determined with the Hough transform; the specific solution is explained below.
After the first key point and the second key point are determined, determining the external parameters of the laser radar and the camera based on the coordinate of the first key point in the pixel image and the coordinate of the second key point in the three-dimensional point cloud.
For convenience of description, take one embodiment as an example: solving for the extrinsic parameters consists of extracting corresponding point-cloud and pixel coordinates from time-synchronized frames of the two sensors and then solving from the corresponding coordinate values. The camera's intrinsic parameters are obtained before calibrating the lidar-camera extrinsics; solving the extrinsics is then converted into a PnP (perspective-n-point) problem. The relationship between camera-image pixel coordinates and the camera coordinate system is shown in equation (1),
$$ Z_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X_c \\ Y_c \\ Z_c \end{bmatrix} \quad (1) $$
wherein (u, v) denotes the coordinates of the first key point in the pixel coordinate system and (X_c, Y_c, Z_c) denotes its coordinates in the camera coordinate system; the relationship of the camera coordinate system to the point-cloud coordinate system is shown in equation (2),
$$ [X_c, Y_c, Z_c]^T = [R_t \mid t_t]\,[X_l, Y_l, Z_l, 1]^T \quad (2) $$
wherein (X_l, Y_l, Z_l) denotes the coordinates of the second key point in the point-cloud coordinate system, R_t is the 3×3 rotation matrix from the point cloud in the lidar coordinate system to the image pixels in the camera coordinate system, and t_t is the three-dimensional translation vector; (R_t, t_t) are the extrinsic parameters.
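As a hedged illustration (not the patent's implementation), equations (1) and (2) can be chained in pure Python to project a lidar-frame point into pixel coordinates; the intrinsic values and the translation below are made up for the example:

```python
def project_lidar_point(K, R, t, p_l):
    """Apply eq. (2) then eq. (1): lidar frame -> camera frame -> pixel."""
    # Camera-frame coordinates: [Xc, Yc, Zc]^T = R [Xl, Yl, Zl]^T + t
    p_c = [sum(R[i][j] * p_l[j] for j in range(3)) + t[i] for i in range(3)]
    Xc, Yc, Zc = p_c
    # Pinhole projection with intrinsics K: u = fx*Xc/Zc + u0, v = fy*Yc/Zc + v0
    u = K[0][0] * Xc / Zc + K[0][2]
    v = K[1][1] * Yc / Zc + K[1][2]
    return u, v

# Illustrative intrinsics and an identity-rotation extrinsic guess
K = [[800.0, 0.0, 320.0],
     [0.0, 800.0, 240.0],
     [0.0, 0.0, 1.0]]
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = [0.1, 0.0, 0.0]  # assumed 10 cm lateral offset between the sensors
u, v = project_lidar_point(K, R, t, [0.0, 0.0, 2.0])
```

In a real pipeline the pairs of key-point coordinates would instead be fed to a PnP solver to recover (R_t, t_t).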
In this way the point-cloud key points are extracted with the Hough transform: extending the fitted lines along the board's adjacent edges increases the number of point-cloud points attributed to each edge, raising the edges' identifiability so they are easily extracted, and the point-cloud key points follow from the edge-line intersection and the edges' actual lengths. Key-point extraction accuracy thus improves; optimization conditions are then set from geometric constraints, key points are collected over several calibration-board poses, the extrinsic values computed from different key-point pairs are compared, and the value with the minimal error under the constraints is selected, improving extrinsic calibration accuracy.
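The minimum-error selection over candidate extrinsics can be sketched as below. The error metric (mean reprojection distance) and the toy candidates are illustrative assumptions; the patent's geometric constraint conditions are not spelled out here:

```python
import math

def reproj_error(K, R, t, lidar_pts, pixel_pts):
    """Mean pixel distance between projected lidar key points and image key points."""
    total = 0.0
    for p_l, (u_obs, v_obs) in zip(lidar_pts, pixel_pts):
        p_c = [sum(R[i][j] * p_l[j] for j in range(3)) + t[i] for i in range(3)]
        u = K[0][0] * p_c[0] / p_c[2] + K[0][2]
        v = K[1][1] * p_c[1] / p_c[2] + K[1][2]
        total += math.hypot(u - u_obs, v - v_obs)
    return total / len(lidar_pts)

def pick_best_extrinsic(K, candidates, lidar_pts, pixel_pts):
    """Return the (R, t) candidate with the smallest mean reprojection error."""
    return min(candidates,
               key=lambda Rt: reproj_error(K, Rt[0], Rt[1], lidar_pts, pixel_pts))

# Toy check: identity rotation, two translation hypotheses for one key point
K = [[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]]
I3 = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
candidates = [(I3, [0.0, 0.0, 0.0]), (I3, [0.5, 0.0, 0.0])]
best_R, best_t = pick_best_extrinsic(K, candidates, [[0.0, 0.0, 2.0]], [(320.0, 240.0)])
```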
The embodiment of the present application provides a possible implementation in which acquiring the three-dimensional point cloud collected by the laser radar at the given moment includes:
for the same calibration board, collecting multiple frames of intermediate three-dimensional point clouds at the same pose, and fusing them to obtain the three-dimensional point cloud collected by the laser radar.
In this embodiment, although a single MEMS-lidar frame is dense, the board's corners are blurred and many discrete stray points remain, so the point cloud is preprocessed: enough points must lie on the calibration board to recover accurate corner features and then extract the board's key points. First the points on the calibration board are extracted by rough segmentation; then the point density is increased with a temporal multi-frame fusion algorithm, superimposing several frames into one point set, the frames to be fused having been collected with the same calibration board in the same pose.
By fusing the frames collected with the calibration board in one pose, the density of points in the three-dimensional point cloud is kept high enough to ease key-point extraction.
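A minimal sketch of the rough segmentation plus multi-frame fusion step, assuming each frame is a list of (x, y, z) tuples and the segmentation is reduced to a user-supplied predicate:

```python
def fuse_frames(frames, board_filter):
    """Stack several point-cloud frames of a static board into one dense set.

    frames: list of frames, each a list of (x, y, z) tuples.
    board_filter: predicate keeping only points on/near the calibration board
                  (stands in for the rough segmentation described in the text).
    """
    fused = []
    for frame in frames:
        fused.extend(p for p in frame if board_filter(p))
    return fused

# Toy example: keep points whose |x| is under 1 m (illustrative filter)
frames = [[(0.2, 0.0, 2.0), (5.0, 0.0, 2.0)], [(0.3, 0.1, 2.0)]]
dense = fuse_frames(frames, lambda p: abs(p[0]) < 1.0)
```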
The embodiment of the present application provides a possible implementation manner, in which as shown in fig. 4, determining a second keypoint in the three-dimensional point cloud by using Hough transform includes:
step S401, fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
and step S402, determining a second key point in the optimal plane by adopting Hough transformation.
In this embodiment, after the point set P is obtained, it still contains outliers far from the plane of the calibration board. To eliminate them, a RANSAC-based algorithm first fits the optimal plane P1, with the fitting formula shown as equation (3),
$$ Ax_1 + By_1 + Cz_1 + D = 0 \quad (3) $$
wherein A, B, C, D are the plane-fitting parameters and (x_1, y_1, z_1) denotes a point on the optimal plane P1. Points far from the fitted plane are removed from the point set P, and the remaining points are projected onto P1, yielding the point set P'.
After this preprocessing, all points of P' lie on the plane P1, but discrete points at the board's edges are not completely eliminated. To avoid large errors from these discrete points during calibration, the two adjacent edges meeting at the upper vertex of the board are fitted in the point cloud with the Hough transform; the intersection of the fitted lines is the board's upper vertex, the left and right vertices are then solved from the board's actual edge lengths, and the coordinate values of the board's key points are thus determined.
In this way the outliers in the point set are removed algorithmically, eliminating their influence on key-point extraction.
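A minimal pure-Python sketch of the RANSAC plane fit and the projection step described above. The iteration count, inlier threshold, and toy data are illustrative; the patent's "preset RANSAC algorithm" is not specified in this much detail:

```python
import random

def fit_plane_3pts(p1, p2, p3):
    """Plane (A, B, C, D) with Ax + By + Cz + D = 0 through three points."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    d = -sum(n[i] * p1[i] for i in range(3))
    return n[0], n[1], n[2], d

def ransac_plane(points, iters=200, thresh=0.02, seed=0):
    """Fit the best plane by counting inliers within `thresh` of each candidate."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        A, B, C, D = fit_plane_3pts(*rng.sample(points, 3))
        norm = (A*A + B*B + C*C) ** 0.5
        if norm < 1e-12:          # degenerate (collinear) sample
            continue
        inliers = sum(1 for p in points
                      if abs(A*p[0] + B*p[1] + C*p[2] + D) / norm <= thresh)
        if inliers > best_inliers:
            best, best_inliers = (A/norm, B/norm, C/norm, D/norm), inliers
    return best

def project_to_plane(p, plane):
    """Project a point onto the plane (unit normal assumed): the set P' step."""
    A, B, C, D = plane
    dist = A*p[0] + B*p[1] + C*p[2] + D
    return (p[0] - dist*A, p[1] - dist*B, p[2] - dist*C)

# Toy data: a 5x5 grid on the plane z = 1 plus one stray point
pts = [(0.1 * i, 0.1 * j, 1.0) for i in range(5) for j in range(5)] + [(0.0, 0.0, 5.0)]
plane = ransac_plane(pts)
projected = project_to_plane((0.0, 0.0, 5.0), plane)
```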
In this embodiment, as shown in fig. 5, the determining the second keypoint in the optimal plane by using Hough transform includes:
step S501, fitting an edge straight line of the calibration plate in the optimal plane by utilizing Hough transformation;
step S502, determining the second key point according to the intersection point of the edge straight lines and the actual length of the calibration plate, wherein the length of the edge straight lines is greater than the actual length of the calibration plate.
In this embodiment, the Hough transform exploits point-line duality: a straight line in the image maps to a point in the parameter domain, and lines in the image domain are found by counting votes at the points detected in the parameter domain. Extending the board's edge lines therefore increases the number of point-cloud points voting for each edge line, further raising its identifiability and easing the line fit by the Hough transform; key points are then obtained from the intersection of the fitted edge lines and the actual lengths of the board's edges.
In the embodiment of the present application, a straight line is represented in polar coordinates: the line y = kx + b in x-y space is rewritten as ρ = x·cos θ + y·sin θ, so that the line corresponds to a parameter-space point (ρ, θ), where ρ is the distance from a point on the line to the origin and θ is the angle between the x axis and the line through that point and the origin. The step of determining key points by Hough transform is as follows. First fit the point cloud plane, then resample the point cloud so that it is uniformly distributed, and convert the three-dimensional point cloud into a two-dimensional point cloud. Determine the value range of the parameter space and discretize it into θ_i (i = 1, 2, 3, …, m) and ρ_j (j = 1, 2, 3, …, m); initialize the array A(ρ, θ) that accumulates the number of parameter-space points, the vote threshold N_ρ for the straight-line parameters, the line-merging thresholds σ_1 and σ_2, the line-segment length threshold N_L, and the empty array Line. Traverse θ_i (i = 1, 2, 3, …, m), solve for the value of ρ in the line space, and compare it with each ρ_j of the parameter space, with σ_ρ the tolerance threshold for collinear points: if |ρ − ρ_j| < σ_ρ, the array cell A(ρ_t, θ_t) is incremented by 1. When A(ρ_t, θ_t) exceeds the threshold N_ρ, (ρ_t, θ_t) is determined to be the parameters of a straight line; each point in x-y space is substituted into ρ_k = x_i·cos θ_i + y_i·sin θ_i, and the points satisfying |ρ_k − ρ_t| < σ_ρ are recorded and stored in the array Line. Straight lines that are close to each other are then merged: if two sets of line parameters (ρ_1, θ_1) and (ρ_2, θ_2) satisfy |ρ_1 − ρ_2| < σ_1 and |θ_1 − θ_2| < σ_2, they are merged into one straight line. Finally, with σ_L the error threshold of the detected point cloud line segment, the collinear points of each straight line are sorted by coordinate, and the maximum and minimum give the end points of the fitted line segment; if |L_t − L_length| < σ_L, where L_t is the fitted segment length, the segment is accepted, and it is judged to be the i-th left adjacent edge when its slope k > 0, or the j-th right adjacent edge when k < 0.
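The preprocessing described above — fitting the point cloud plane and converting the three-dimensional cloud into a two-dimensional one so that the Hough transform can run in the plane — can be sketched as follows. This is a minimal NumPy illustration, not the patent's code; the iteration count and the inlier tolerance `tol` are placeholder values.

```python
import numpy as np

def ransac_plane(points, iters=200, tol=0.02, rng=np.random.default_rng(0)):
    """Fit a plane n.x + d = 0 to an Nx3 cloud; return unit normal, d, inlier mask."""
    best_inliers, best_model = None, None
    n_pts = len(points)
    for _ in range(iters):
        sample = points[rng.choice(n_pts, 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:              # degenerate (collinear) sample, retry
            continue
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model[0], best_model[1], best_inliers

def project_to_plane_2d(points, n, d):
    """Project points onto the plane and express them in an in-plane (u, v) basis."""
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, a); u /= np.linalg.norm(u)
    v = np.cross(n, u)
    proj = points - np.outer(points @ n + d, n)   # drop the out-of-plane component
    return np.stack([proj @ u, proj @ v], axis=1)
```

The resulting two-dimensional coordinates are what the accumulator array A(ρ, θ) is built from.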
Let the number of collinear points be n; the collinear points on each straight line are fitted by the least squares method, and the slope k and the intercept b of the line are solved as shown in formulas (4) and (5):

k = (n·Σ_{i=1}^{n} x_i y_i − Σ_{i=1}^{n} x_i · Σ_{i=1}^{n} y_i) / (n·Σ_{i=1}^{n} x_i² − (Σ_{i=1}^{n} x_i)²)    (4)

b = (Σ_{i=1}^{n} y_i − k·Σ_{i=1}^{n} x_i) / n    (5)

The intersection point of the fitted left adjacent edge and right adjacent edge is the upper vertex of the calibration board. With the actual side lengths of the left adjacent edge and the right adjacent edge of the calibration board set to L_1 and L_2 respectively, the left vertex and the right vertex of the calibration board can be calculated, and the three key points of the three-dimensional point cloud are thereby determined.
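The voting, line fitting, and vertex steps above can be sketched as follows. This is a simplified illustration with placeholder thresholds (`rho_tol`, `min_votes`), not the patent's exact procedure; in particular it omits the line-merging and segment-length checks.

```python
import numpy as np

def hough_lines(pts2d, n_theta=180, rho_tol=0.01, min_votes=30):
    """Vote each 2D point into (rho, theta) cells; return peak line parameters.

    Uses the polar form rho = x*cos(theta) + y*sin(theta) from the text.
    """
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rhos = pts2d @ np.stack([np.cos(thetas), np.sin(thetas)])   # shape (N, n_theta)
    lines = []
    for j, th in enumerate(thetas):
        # quantize rho with tolerance rho_tol and count the votes per cell
        keys, counts = np.unique(np.round(rhos[:, j] / rho_tol), return_counts=True)
        for k, c in zip(keys, counts):
            if c >= min_votes:
                lines.append((k * rho_tol, th, int(c)))
    return lines

def fit_line_lsq(xs, ys):
    """Least-squares slope and intercept, formulas (4) and (5)."""
    n = len(xs)
    k = (n * np.sum(xs * ys) - np.sum(xs) * np.sum(ys)) / (n * np.sum(xs**2) - np.sum(xs)**2)
    b = (np.sum(ys) - k * np.sum(xs)) / n
    return k, b

def intersect(k1, b1, k2, b2):
    """Intersection of y = k1*x + b1 and y = k2*x + b2 (the board's upper vertex)."""
    x = (b2 - b1) / (k1 - k2)
    return x, k1 * x + b1
```

Given the upper vertex and the two fitted edge directions, stepping the known side lengths L_1 and L_2 along those directions yields the left and right vertices.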
The method and the device extract the point cloud key points by Hough transform. Extending the adjacent edges of the calibration board increases the number of point-cloud points on those edges and thus their identifiability, which makes the adjacent edges easy to extract by Hough transform; the point cloud key points are then obtained from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction.
The embodiment of the present application provides a possible implementation manner, in which the determining the external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud includes:
converting the pixel coordinate of the first key point into a camera coordinate by adopting a preset pixel coordinate-camera coordinate formula;
determining the external parameters based on a relationship of the camera coordinates of the first keypoint and the coordinates of the second keypoint in the three-dimensional point cloud.
For the embodiment of the present application, taking one embodiment as an example for convenience of description, solving the extrinsic parameters consists of extracting corresponding point cloud and pixel coordinates from the same time frame of the two sensors and then solving the extrinsic parameters from the corresponding coordinate values. The intrinsic parameters of the camera, namely its internal parameters, are obtained before the extrinsic parameters of the laser radar and the camera are calibrated; solving the extrinsic parameters is then converted into a Perspective-n-Point (PnP) problem. The relationship between the pixel coordinates of a point in the camera image and the camera coordinate system is shown in formula (1):
Z_c · [u, v, 1]^T = [f_x, 0, u_0; 0, f_y, v_0; 0, 0, 1] · [X_c, Y_c, Z_c]^T    (1)

where f_x and f_y are the focal lengths in pixels and (u_0, v_0) is the principal point of the camera intrinsic matrix,
wherein (u, v) represents the coordinates of the first key point in the pixel coordinate system and (X_c, Y_c, Z_c) represents the coordinates of the first key point in the camera coordinate system. The relationship between the camera coordinate system and the point cloud coordinate system is shown in formula (2):

[X_c, Y_c, Z_c]^T = R_t · [X_l, Y_l, Z_l]^T + t_t    (2)

wherein (X_l, Y_l, Z_l) represents the coordinates of the second key point in the point cloud coordinate system, R_t represents the 3×3 rotation matrix from the point cloud in the laser radar coordinate system to the image pixels in the camera coordinate system, and t_t represents the three-dimensional translation vector; (R_t, t_t) are the extrinsic parameters.
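Chaining formulas (2) and (1) projects a lidar point to pixel coordinates, which is the forward model that PnP inverts. A minimal NumPy sketch follows; the intrinsic values in `K` are placeholders for illustration, not values from the patent.

```python
import numpy as np

# Hypothetical intrinsic matrix; in practice K comes from prior camera calibration.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project_lidar_to_pixel(X_l, R_t, t_t, K):
    """Apply equation (2), then equation (1): lidar frame -> camera frame -> pixels."""
    X_c = R_t @ X_l + t_t                         # equation (2)
    u = K[0, 0] * X_c[0] / X_c[2] + K[0, 2]       # equation (1), after dividing by Z_c
    v = K[1, 1] * X_c[1] / X_c[2] + K[1, 2]
    return np.array([u, v])
```

Recovering (R_t, t_t) from several such pixel/point-cloud correspondences is the PnP problem, for which a solver such as OpenCV's `cv2.solvePnP` can be used.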
The method and the device extract the point cloud key points by Hough transform. Extending the adjacent edges of the calibration board increases the number of point-cloud points on those edges and thus their identifiability, which makes the adjacent edges easy to extract by Hough transform; the point cloud key points are then obtained from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction. After the key points are accurately extracted, optimization conditions are set according to geometric constraints: the key points under multiple calibration board poses are counted, the extrinsic values computed from different key point pairs are calculated, and the extrinsic value with the minimum error is selected according to the constraint conditions, improving the accuracy of extrinsic calibration.
The embodiment of the present application provides a possible implementation manner, in which, as shown in fig. 6, the method further includes:
step S601, aiming at a plurality of different poses of the same calibration plate, respectively acquiring a plurality of groups of three-dimensional point clouds acquired by a laser radar and pixel images acquired by a camera, and solving a plurality of external parameters;
step S602, determining the optimal external parameter in the plurality of external parameters by adopting a preset algorithm.
In the embodiment of the present application, the calibration board is used to find the extrinsic parameters (R_t, t_t): key points of the calibration board are obtained for multiple groups of corresponding pixels and point clouds, and extrinsic parameters are computed from them according to the parameters in (1) and (2). The calibration board is therefore moved within the field of view of the sensors to obtain N (N > 3) different calibration board poses, which yields the geometric features of N groups of point clouds and corresponding images; several groups of extrinsic parameters (R_t, t_t) are then computed from the correspondence between the key points and the normal vectors of the calibration board. According to formula (3), the normal vector n^(1) of the board plane p_1 can be obtained; transformed by the rotation matrix R_t it becomes n^(1,c) = R_t · n^(l). Ideally the dot product of n^(1,c) with the image plane vector is 0, but there is a systematic measurement error. Let the average dot product error be e_d, and let the calibration error between n^(1,c) and the image plane normal vector n^(c) be e_r, where N represents the total number of calibration board poses, i ∈ [1, N] denotes the i-th pose, and the key points on the image are obtained with the OpenCV library. R_t is computed from multiple groups of pixel coordinates and point cloud coordinates according to formulas (1) and (2); since the coordinate values differ, the resulting R_t values also differ, so R_t can be understood as a variable, while n^(c) is constant and n^(l) in n^(1,c) = R_t · n^(l) is also constant. Among the candidate R_t values there is one that makes e_d + e_r minimal, and the optimal rotation matrix R_t* is the one attaining this minimum, as expressed in formulas (7) and (8):

R_t* = argmin over the candidate R_t of (e_d + e_r)    (7), (8)

Then, according to the correspondence between the first key point of the calibration board in the pixel image and the second key point of the calibration board in the three-dimensional point cloud, the optimal translation vector t_t* is solved.
After all point cloud key points are projected onto the image, the average Euclidean distance between the projected point cloud key points and the image key points is computed as the average error ē, as shown in formula (9):

ē = (1 / (3N)) · Σ_{i=1}^{N} Σ_{j=1}^{3} || p_ij^c − p̂_ij^l ||_2    (9)

In formula (9), N is the total number of poses, p_ij^c is the j-th image key point on the i-th pose, and p̂_ij^l is the projection point of the j-th point cloud key point p_ij^l on the i-th pose.
The variance v_t of the Euclidean distances between the image key points and the projected point cloud key points is then computed as in formula (10):

v_t = (1 / (3N)) · Σ_{i=1}^{N} Σ_{j=1}^{3} ( || p_ij^c − p̂_ij^l ||_2 − ē )²    (10)

where p_ij^c is the j-th image key point on the i-th pose, p̂_ij^l is the projection of the j-th point cloud key point on the i-th pose, and ē is the average Euclidean distance of formula (9).
The candidate extrinsic values are traversed and the one with the minimum variance v_t is taken. That extrinsic value is then used to recompute the average Euclidean distance between the projections of the point cloud key points and the image key points; the key points whose Euclidean distance is larger than the average are removed, the remaining point cloud key point set is denoted O_l, and the corresponding image key point set is denoted O_c. The optimal translation vector t_t* is then found as shown in formula (11), where the mean(·) function averages by row and O_c is expressed in camera coordinates:

t_t* = mean(O_c − R_t* · O_l)    (11)

wherein (R_t*, t_t*) is the optimal extrinsic parameter.
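The selection and refinement procedure of formulas (9)-(11) can be sketched as follows. This is a simplified NumPy illustration under the assumptions that each pose contributes its key points as rows of the input arrays and that the kept key points are available in camera coordinates for the translation step; names and thresholds are illustrative, not from the patent.

```python
import numpy as np

def reproj_distances(R_t, t_t, K, pts_lidar, pts_img):
    """Pixel Euclidean distances between projected cloud key points and image key points."""
    cam = pts_lidar @ R_t.T + t_t                                    # equation (2), row-wise
    uv = cam[:, :2] / cam[:, 2:3]
    uv = uv * np.array([K[0, 0], K[1, 1]]) + np.array([K[0, 2], K[1, 2]])
    return np.linalg.norm(uv - pts_img, axis=1)

def select_extrinsic(candidates, K, pts_lidar, pts_img):
    """Keep the candidate (R_t, t_t) with minimum distance variance (formula (10)),
    then drop key points whose distance exceeds the average (formula (9))."""
    variances = [np.var(reproj_distances(R, t, K, pts_lidar, pts_img))
                 for R, t in candidates]
    R_best, t_best = candidates[int(np.argmin(variances))]
    d = reproj_distances(R_best, t_best, K, pts_lidar, pts_img)
    keep = d <= d.mean()
    return R_best, t_best, keep

def refine_translation(R_t, pts_cam, pts_lidar, keep):
    """Formula (11): row-wise mean of O_c - R_t * O_l over the kept key points,
    with O_c (pts_cam) expressed in camera coordinates."""
    return np.mean(pts_cam[keep] - pts_lidar[keep] @ R_t.T, axis=0)
```

With synthetic correspondences, the candidate matching the true extrinsics has zero reprojection variance and is selected, and the refined translation reproduces the true translation vector.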
After the key points are accurately extracted, optimization conditions are set according to geometric constraints: the key points under multiple calibration board poses are counted, the extrinsic values computed from different key point pairs are calculated, and the extrinsic value with the minimum error is selected according to the constraint conditions, improving the accuracy of extrinsic calibration.
The embodiment of the present application provides an external reference calibration apparatus for a laser radar and a camera, as shown in fig. 7, the external reference calibration apparatus 70 for a laser radar and a camera may include: an image and point cloud acquisition module 701, a keypoint determination module 702, and an outlier determination module 703, wherein,
an image and point cloud obtaining module 701, configured to obtain a three-dimensional point cloud collected by a laser radar and a pixel image collected by a camera at the same time, where the three-dimensional point cloud and the pixel image are collected for the same calibration plate;
a key point determining module 702, configured to determine a first key point in the pixel image and to determine a second key point in the three-dimensional point cloud by Hough transform, wherein there are a plurality of first key points and a plurality of second key points, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points;
an external parameter determining module 703, configured to determine external parameters of the lidar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
Further, as shown in fig. 8, the keypoint determination module 702 includes:
a straight line fitting unit 801, configured to fit an edge straight line of the calibration board in the optimal plane by using Hough transform;
a key point determining unit 802, configured to determine the second key point according to the intersection point of the edge straight lines and the actual length of the calibration board, where the length of the edge straight line is greater than the actual length of the calibration board.
The external reference calibration device for the laser radar and the camera of the embodiment of the present application can execute the external reference calibration method for the laser radar and the camera of the embodiments of the present application; the implementation principle is similar and is not repeated here.
An embodiment of the present application provides an electronic device, including a memory and a processor, with at least one program stored in the memory for execution by the processor. In contrast to the prior art, the embodiments of the present application extract the point cloud key points by Hough transform: extending the adjacent edges of the calibration board increases the number of point-cloud points on those edges and thus their identifiability, making the adjacent edges easy to extract by Hough transform, and the point cloud key points are then obtained from the intersection of the adjacent edges and their actual lengths, improving the accuracy of key point extraction. After the key points are accurately extracted, optimization conditions are set according to geometric constraints, the key points under multiple calibration board poses are counted, the extrinsic values computed from different key point pairs are calculated, and the extrinsic value with the minimum error is selected according to the constraint conditions, improving the accuracy of extrinsic calibration.
In an alternative embodiment, an electronic device is provided, as shown in fig. 9, the electronic device 4000 shown in fig. 9 comprising: a processor 4001 and a memory 4003. Processor 4001 is coupled to memory 4003, such as via bus 4002. Optionally, the electronic device 4000 may further include a transceiver 4004, and the transceiver 4004 may be used for data interaction between the electronic device and other electronic devices, such as transmission of data and/or reception of data. In addition, the transceiver 4004 is not limited to one in practical applications, and the structure of the electronic device 4000 is not limited to the embodiment of the present application.
The Processor 4001 may be a CPU (Central Processing Unit), a general-purpose Processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (field programmable Gate Array) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the disclosure. The processor 4001 may also be a combination that performs a computational function, including, for example, a combination of one or more microprocessors, a combination of a DSP and a microprocessor, or the like.
Bus 4002 may include a path that carries information between the aforementioned components. The bus 4002 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 4002 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 9, but this does not indicate only one bus or one type of bus.
The Memory 4003 may be a ROM (Read Only Memory) or other types of static storage devices that can store static information and instructions, a RAM (Random Access Memory) or other types of dynamic storage devices that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read Only Memory), a CD-ROM (Compact Disc Read Only Memory) or other optical Disc storage, optical Disc storage (including Compact Disc, laser Disc, optical Disc, digital versatile Disc, blu-ray Disc, etc.), a magnetic Disc storage medium or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to these.
The memory 4003 is used for storing application program codes (computer programs) for executing the present scheme, and is controlled by the processor 4001 to execute. Processor 4001 is configured to execute application code stored in memory 4003 to implement what is shown in the foregoing method embodiments.
The present application provides a computer-readable storage medium, on which a computer program is stored, which, when running on a computer, enables the computer to execute the corresponding content in the foregoing method embodiments.
It should be understood that, although the steps in the flowcharts of the figures are shown in an order indicated by the arrows, they are not necessarily performed in that order; unless explicitly stated herein, the order of execution is not strictly limited, and the steps may be performed in other orders. Moreover, at least a portion of the steps in the flowcharts may include multiple sub-steps or stages, which are not necessarily completed at the same moment but may be executed at different moments, and their order of execution is not necessarily sequential: they may be performed in turn or alternately with other steps, or with at least a portion of the sub-steps or stages of other steps.
The foregoing is only a partial embodiment of the present application. It should be noted that those skilled in the art can make several improvements and modifications without departing from the principle of the present application, and these improvements and modifications shall also be regarded as falling within the protection scope of the present application.

Claims (10)

1. An external reference calibration method for a laser radar and a camera is characterized by comprising the following steps:
acquiring a three-dimensional point cloud acquired by a laser radar and a pixel image acquired by a camera at the same time, wherein the three-dimensional point cloud and the pixel image are acquired aiming at the same calibration plate;
determining a first key point in the pixel image, and determining a second key point in the three-dimensional point cloud by adopting Hough transform, wherein there are a plurality of first key points and a plurality of second key points, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points; and
determining the laser radar and camera external parameters based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
2. The method for calibrating the external parameters of the lidar and the camera according to claim 1, wherein the obtaining of the three-dimensional point cloud collected by the lidar at the same time comprises:
collecting, for the same calibration plate, a plurality of frames of intermediate three-dimensional point clouds at the same pose, and fusing the intermediate three-dimensional point clouds of the plurality of frames to obtain the three-dimensional point cloud collected by the laser radar.
3. The method for calibrating external parameters of a lidar and a camera according to claim 2, wherein the determining the second key point in the three-dimensional point cloud by using Hough transform comprises:
fitting the three-dimensional point cloud by adopting a preset RANSAC algorithm to obtain an optimal plane;
and determining a second key point in the optimal plane by using Hough transformation.
4. The method of claim 3, wherein the determining the second keypoint in the optimal plane by using Hough transform comprises:
fitting the edge straight line of the calibration plate in the optimal plane by using Hough transformation;
and determining the second key point according to the intersection point of the edge straight line and the actual length of the calibration plate, wherein the length of the edge straight line is greater than the actual length of the calibration plate.
5. The lidar and camera extrinsic parameter calibration method according to claim 1, wherein the determining the lidar and camera extrinsic parameters based on the coordinates of the first keypoint in the pixel image and the coordinates of the second keypoint in the three-dimensional point cloud comprises:
converting the pixel coordinate of the first key point into a camera coordinate by adopting a preset pixel coordinate-camera coordinate formula;
determining the external parameters based on a relationship of the camera coordinates of the first keypoint and the coordinates of the second keypoint in the three-dimensional point cloud.
6. The lidar and camera external reference calibration method according to claim 4, further comprising:
respectively acquiring a plurality of groups of three-dimensional point clouds acquired by a laser radar and pixel images acquired by a camera aiming at a plurality of different poses of the same calibration plate, and solving a plurality of external parameters;
and determining the optimal external parameters in the plurality of external parameters by adopting a preset algorithm.
7. The external reference calibration device for the laser radar and the camera is characterized by comprising the following components:
an image and point cloud acquisition module, configured to acquire a three-dimensional point cloud collected by a laser radar and a pixel image collected by a camera at the same moment, wherein the three-dimensional point cloud and the pixel image are collected for the same calibration plate;
a key point determining module, configured to determine a first key point in the pixel image and to determine a second key point in the three-dimensional point cloud by adopting Hough transform, wherein there are a plurality of first key points and a plurality of second key points, and the points on the calibration plate corresponding to the first key points are the same as the points on the calibration plate corresponding to the second key points; and
and the external parameter determining module is used for determining the external parameters of the laser radar and the camera based on the coordinates of the first key point in the pixel image and the coordinates of the second key point in the three-dimensional point cloud.
8. The lidar and camera extrinsic parameter calibration apparatus of claim 7, wherein the keypoint determination module comprises:
the straight line fitting unit is used for fitting the edge straight line of the calibration plate in the optimal plane by utilizing Hough transformation;
and the key point determining unit is used for determining the second key point according to the edge straight line intersection point and the actual length of the calibration plate, wherein the length of the edge straight line is greater than the actual length of the calibration plate.
9. An electronic device, characterized in that the electronic device comprises:
one or more processors;
a memory;
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors to perform the external reference calibration method for a laser radar and a camera according to any one of claims 1 to 6.
10. A computer readable storage medium having stored thereon at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of external reference calibration for lidar and cameras according to any of claims 1 to 6.
CN202110286400.3A 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium Active CN113256729B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110286400.3A CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110286400.3A CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Publications (2)

Publication Number Publication Date
CN113256729A true CN113256729A (en) 2021-08-13
CN113256729B CN113256729B (en) 2024-06-18

Family

ID=77181467

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110286400.3A Active CN113256729B (en) 2021-03-17 2021-03-17 External parameter calibration method, device and equipment for laser radar and camera and storage medium

Country Status (1)

Country Link
CN (1) CN113256729B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838141A (en) * 2021-09-02 2021-12-24 中南大学 External parameter calibration method and system for single line laser radar and visible light camera
CN114387347A (en) * 2021-10-26 2022-04-22 浙江智慧视频安防创新中心有限公司 Method and device for determining external parameter calibration, electronic equipment and medium
CN114758005A (en) * 2022-03-23 2022-07-15 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN115267746A (en) * 2022-06-13 2022-11-01 广州文远知行科技有限公司 Positioning method for laser radar point cloud projection error and related equipment
CN115994955A (en) * 2023-03-23 2023-04-21 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle
WO2024001923A1 (en) * 2022-06-30 2024-01-04 先临三维科技股份有限公司 Mapping method and apparatus, device, and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108932736A (en) * 2018-05-30 2018-12-04 南昌大学 Two-dimensional laser radar Processing Method of Point-clouds and dynamic robot pose calibration method
CN111179358A (en) * 2019-12-30 2020-05-19 浙江商汤科技开发有限公司 Calibration method, device, equipment and storage medium
CN111192331A (en) * 2020-04-09 2020-05-22 浙江欣奕华智能科技有限公司 External parameter calibration method and device for laser radar and camera
CN111627072A (en) * 2020-04-30 2020-09-04 贝壳技术有限公司 Method and device for calibrating multiple sensors and storage medium
CN111862224A (en) * 2019-04-17 2020-10-30 杭州海康威视数字技术股份有限公司 Method and device for determining external parameters between camera and laser radar
CN111965624A (en) * 2020-08-06 2020-11-20 北京百度网讯科技有限公司 Calibration method, device and equipment for laser radar and camera and readable storage medium
CN112270713A (en) * 2020-10-14 2021-01-26 北京航空航天大学杭州创新研究院 Calibration method and device, storage medium and electronic device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
BALÁZS NAGY 等: "On-the-Fly Camera and Lidar Calibration", 《REMOTE SENSING》 *
SURABHI VERMA 等: "Automatic extrinsic calibration between a camera and a 3D Lidar using 3D point and plane correspondences", 《2019 IEEE INTELLIGENT TRANSPORTATION SYSTEMS CONFERENCE (ITSC)》 *
张云鹏: "基于Hough变换的点云数据直线特征提取研究", 《矿山测量》, vol. 47, no. 5 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113838141A (en) * 2021-09-02 2021-12-24 中南大学 External parameter calibration method and system for single line laser radar and visible light camera
CN114387347A (en) * 2021-10-26 2022-04-22 浙江智慧视频安防创新中心有限公司 Method and device for determining external parameter calibration, electronic equipment and medium
CN114387347B (en) * 2021-10-26 2023-09-19 浙江视觉智能创新中心有限公司 Method, device, electronic equipment and medium for determining external parameter calibration
CN114758005A (en) * 2022-03-23 2022-07-15 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN114758005B (en) * 2022-03-23 2023-03-28 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN115267746A (en) * 2022-06-13 2022-11-01 广州文远知行科技有限公司 Positioning method for laser radar point cloud projection error and related equipment
WO2024001923A1 (en) * 2022-06-30 2024-01-04 先临三维科技股份有限公司 Mapping method and apparatus, device, and storage medium
CN115994955A (en) * 2023-03-23 2023-04-21 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle
CN115994955B (en) * 2023-03-23 2023-07-04 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle

Also Published As

Publication number Publication date
CN113256729B (en) 2024-06-18

Similar Documents

Publication Publication Date Title
CN113256729B (en) External parameter calibration method, device and equipment for laser radar and camera and storage medium
EP2843590B1 (en) System and method for package dimensioning
CN109961468B (en) Volume measurement method and device based on binocular vision and storage medium
US10339390B2 (en) Methods and apparatus for an imaging system
US5933523A (en) Machine vision method and apparatus for determining the position of generally rectangular devices using boundary extracting features
CN111627075B (en) Camera external parameter calibration method, system, terminal and medium based on aruco code
JP4943034B2 (en) Stereo image processing device
EP3460715B1 (en) Template creation apparatus, object recognition processing apparatus, template creation method, and program
CN111123242B (en) Combined calibration method based on laser radar and camera and computer readable storage medium
CN110766758B (en) Calibration method, device, system and storage device
CN108573471B (en) Image processing apparatus, image processing method, and recording medium
JPH03260782A (en) Pattern recognition device
CN111829439B (en) High-precision translation measuring method and device
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
KR20180098945A (en) Method and apparatus for measuring speed of vehicle by using fixed single camera
CN114972531B (en) Corner detection method, equipment and readable storage medium
CN112802114B (en) Multi-vision sensor fusion device, method thereof and electronic equipment
Miksch et al. Automatic extrinsic camera self-calibration based on homography and epipolar geometry
CN115760860B (en) Multi-type workpiece dimension visual measurement method based on DXF file import
CN110458951B (en) Modeling data acquisition method and related device for power grid pole tower
CN115932877A (en) Target tracking method and system with fusion of laser radar and monocular camera
CN113689397A (en) Workpiece circular hole feature detection method and workpiece circular hole feature detection device
US20220366593A1 (en) Electronic apparatus and object detection method
CN111583317B (en) Image alignment method and device and terminal equipment
CN117115242B (en) Identification method of mark point, computer storage medium and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant