CN112509062B - Calibration plate, calibration system and calibration method - Google Patents


Info

Publication number: CN112509062B (application CN202011494870.0A; first published as CN112509062A)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 战荫伟, 蔡宇健
Original and current assignee: Guangdong University of Technology
Legal status: Active (granted)
Prior art keywords: calibration, data, calibration plate, camera, characteristic
Prosecution events: application CN202011494870.0A filed by Guangdong University of Technology; publication of CN112509062A; application granted; publication of CN112509062B


Classifications

    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration (G — Physics; G06 — Computing; calculating or counting; G06T — Image data processing or generation, in general; G06T 7/00 — Image analysis)
    • G01S 7/497 — Means for monitoring or calibrating (G01 — Measuring; testing; G01S — Radio direction-finding; radio navigation; determining distance or velocity by use of radio waves; locating or presence-detecting by use of the reflection or reradiation of radio waves; analogous arrangements using other waves; G01S 7/48 — Details of systems according to group G01S 17/00)
    • G06T 2207/30204 — Marker; G06T 2207/30208 — Marker matrix (G06T 2207/00 — Indexing scheme for image analysis or image enhancement; G06T 2207/30 — Subject of image; context of image processing)


Abstract

The application discloses a calibration plate, a calibration system and a calibration method. The calibration plate comprises a planar plate body on which a first-type characteristic pattern and a plurality of second-type characteristic patterns are arranged: the first-type characteristic pattern is a first hollow area, and the second-type characteristic patterns are distributed circumferentially around the first-type characteristic pattern, each consisting of a second hollow area and a reflective area whose color differs from that of the planar plate body. Because the calibration plate carries two kinds of characteristic patterns arranged according to a defined rule, it provides enough depth variety in the point cloud data, makes the plane estimation of the calibration object during calibration more accurate and reliable, reduces the amount of data that must be collected to complete the calibration work, and improves calibration efficiency.

Description

Calibration plate, calibration system and calibration method
Technical Field
The application relates to the technical field of calibration, in particular to a calibration plate, a calibration system and a calibration method.
Background
With the development of unmanned-driving technology, sensors are often used to acquire position information around a vehicle, so that the vehicle's behavior in automatic driving can be planned and decided according to the information the sensors detect. The camera has become one of the most common environment-sensing sensors in unmanned driving, but over the past decades the number of unmanned cars equipped with lidar has grown rapidly, since its accurate distance measurements provide information that a conventional camera cannot. Lidar, however, does not provide the high spatial resolution and valuable color information that a camera does. Many recent studies have therefore exploited the complementary advantages of the two sensors together, making sensor data fusion for unmanned driving a new research topic.
Existing laser radar and camera calibration techniques generally use an object of simple shape, or a single planar object, as the calibration target. This leads to insufficient depth variety in the calibration object's point cloud data, so the calibration object plane is estimated incorrectly in the calibration algorithm and the calibration result is inaccurate. Such calibration plates also require a large amount of data to be collected to complete laser radar and camera calibration, so calibration efficiency is low. Moreover, existing laser radar and camera calibration techniques do not optimize the calibration result, so a more accurate calibration result cannot be obtained.
Disclosure of Invention
In view of the above, a first object of the present application is to provide a calibration plate that supplies enough depth variety in the point cloud data, making plane estimation of the calibration object during calibration more accurate and reliable, reducing the amount of data that must be collected to complete calibration, and improving calibration efficiency.
A second object of the application is to provide a calibration system that likewise makes plane estimation of the calibration object more accurate and reliable, reduces the amount of data that must be collected to complete the calibration work, and improves calibration efficiency.
A third object of the application is to provide a calibration method that solves the problem of low accuracy of the calibration results obtained by existing calibration methods and improves the reliability of the calibration result.
In order to achieve the technical purpose, the application provides a calibration plate, which comprises: a planar plate body;
the plane plate body is provided with a first characteristic pattern and a plurality of second characteristic patterns;
the first characteristic pattern is a first hollow area;
a plurality of second feature patterns are circumferentially distributed around the first feature pattern and each consist of a second hollow region and a reflective region;
the color of the reflection area is different from the color of the calibration surface of the plane plate body.
Further, the plane plate body is square;
the first characteristic pattern is a circular pattern, and the center of the circular pattern coincides with the center of the plane plate body;
the second characteristic pattern is a square pattern and equally divided into four triangular areas with equal area;
two triangular areas which are distributed around the center point of the second characteristic pattern at intervals in the circumferential direction in the four triangular areas respectively form the second hollow area, and the other two triangular areas respectively form the reflecting area.
Further, there are specifically four second characteristic patterns;
the four second characteristic patterns are uniformly distributed at four corner positions of the plane plate body around the first characteristic pattern.
Further, the color of the reflection area is blue;
the color of the calibration surface of the plane plate body is white or black.
Further, the plane plate body is provided with a black frame.
The application also discloses a calibration system, which comprises a calibration device and the calibration plate;
the calibration device comprises a laser radar and a camera.
The application also discloses a calibration method, which is applied to the calibration system and comprises the following steps:
respectively acquiring point cloud data acquired by a laser radar and image data acquired by a camera in a preset calibration scene;
determining the position of the characteristic point of the calibration plate in the image according to the image data to obtain first position data;
determining the position of the characteristic point of the calibration plate in the point cloud data according to the point cloud data to obtain second position data;
calculating and determining the position of the characteristic point of the calibration plate in the image data in a world coordinate system according to the first position data, the estimated position data and the camera internal reference data to obtain third position data, wherein the estimated position data is the position of the estimated characteristic point in the calibration plate in the world coordinate system;
establishing a pose coordinate system conversion equation according to the second position data and the third position data;
solving the pose coordinate system conversion equation by using a Levenberg-Marquardt algorithm until a preset convergence condition is reached, so as to obtain laser radar and camera external parameter data;
and optimizing the laser radar and camera external parameter data according to the edge information of the calibration plate to obtain optimized laser radar and camera external parameter data.
Further, calculating and determining the position of the calibration plate feature point in the image data in the world coordinate system according to the first position data, the estimated position data and the camera internal reference data to obtain third position data specifically includes:
calculating camera external parameter data through a perspective pose algorithm according to the first position data, the estimated position data and the camera internal parameter data;
and determining the position of the characteristic point of the calibration plate in the image data in a world coordinate system according to the calculated camera external parameter data to obtain third position data.
Further, optimizing the laser radar and camera external parameter data according to the calibration plate edge information to obtain the optimized laser radar and camera external parameter data specifically includes:
acquiring calibration plate edge information in the image data and calibration plate edge information in the point cloud data;
establishing a loss function equation according to the calibration plate edge information in the image data and the calibration plate edge information in the point cloud data;
substituting the laser radar and camera external parameter data into the loss function equation, and solving the loss function equation to obtain optimized laser radar and camera external parameter data.
Further, after the laser radar and camera external parameter data are optimized according to the calibration plate edge information and the optimized laser radar and camera external parameter data are obtained, the method further includes:
and re-projecting the point cloud data into the image data, and displaying the calibration plate re-projection data in the image data.
According to the technical scheme, the present application arranges a first-type characteristic pattern and a plurality of second-type characteristic patterns on the planar plate body, wherein the first-type characteristic pattern is a first hollow area, and the second-type characteristic patterns are distributed circumferentially around the first-type characteristic pattern, each consisting of a second hollow area and a reflective area whose color differs from that of the planar plate body. Because the calibration plate carries two kinds of characteristic patterns arranged according to a defined rule, it provides enough depth variety in the point cloud data, makes the plane estimation of the calibration object during calibration more accurate and reliable, reduces the amount of data that must be collected to complete the calibration work, and improves calibration efficiency.
According to the technical scheme, the calibration system provided by the application comprises a calibration device with a laser radar and a camera, together with the improved calibration plate; with it, estimation of the calibration object plane is more accurate and reliable, the amount of data that must be collected to complete the calibration work is reduced, and calibration efficiency is improved.
According to the technical scheme, the calibration method is applied to the above calibration system and collects data from the improved calibration plate, so enough image data depth and point cloud data depth can be acquired for more accurate calibration object plane estimation. The established pose coordinate system conversion equation is solved with the Levenberg-Marquardt algorithm until a preset convergence condition is reached, yielding the laser radar and camera external parameter data; these are then optimized according to the calibration plate edge information to obtain the optimized laser radar and camera external parameter data. The method solves the problem of low accuracy of the calibration results obtained by existing calibration methods and, by introducing a parameter optimization algorithm, improves the reliability of the calibration result.
Drawings
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings described below show only some embodiments of the application, and that a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic structural view of a calibration plate according to the present application;
FIG. 2 is a schematic diagram of a calibration system according to the present application;
FIG. 3 is an overall flow chart of a calibration method provided in the present application;
FIG. 4 is a first partial flow chart of a calibration method provided in the present application;
FIG. 5 is a second partial flow chart of a calibration method provided in the present application;
in the figure: 1. a planar plate body; 2. a first feature pattern; 21. a first hollow region; 3. a second feature pattern; 31. a second hollow region; 32. a reflective region; 10. a laser radar; 20. a camera; 30. a calibration plate; 40. and (3) a bracket.
Detailed Description
The embodiments of the present application are described in detail below with reference to the accompanying drawings; the embodiments described are some, but not necessarily all, of the embodiments of the application. All other embodiments obtained by a person of ordinary skill in the art from the present disclosure without undue burden are within the scope of the embodiments of the present application.
In the description of the embodiments of the present application, it should be noted that terms such as "center," "upper," "lower," "left," "right," "vertical," "horizontal," "inner," and "outer" indicate orientations or positional relationships based on those shown in the drawings, merely to facilitate and simplify the description of the embodiments; they do not indicate or imply that the devices or elements referred to must have a specific orientation or be constructed and operated in a specific orientation, and thus should not be construed as limiting the embodiments of the present application. Furthermore, the terms "first," "second," and "third" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In describing embodiments of the present application, it should be noted that, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly: a connection may, for example, be fixed, detachable, or integral; mechanical or electrical; direct, indirect through an intermediary, or internal communication between two elements. The specific meaning of the above terms in embodiments of the present application will be understood by those of ordinary skill in the art according to the specific circumstances.
Computer vision systems typically consist of one or more optical sensors, a processor, and vision software. The target object is collected by the optical sensor(s), and the resulting data is analyzed and processed by the vision software in the processor to obtain results and trigger subsequent actions. In some applications, the vision system must also be coordinated with other motion devices (e.g., an unmanned vehicle). In vision system applications, it is often necessary to calibrate the optical sensor(s), as well as the vision system and the motion devices. The calibration plate can be used for this: it is the target of the vision system, the optical sensor can calibrate its internal and external parameters with it, and the calibrated cameras can then make measurements in real physical space.
The calibration plate is a planar object having a defined geometric shape and precise dimensions, usually carrying a special pattern on its surface. The shape and size of the calibration plate pattern are carefully designed to ensure that the optical sensor can detect the pattern clearly and that a specific algorithm can conveniently extract the pattern features. Because the calibration plate is the most fundamental reference target from which the vision system measures physical dimensions, both the dimensions of the pattern and the flatness of the plate must meet certain precision requirements, and guaranteeing the absolute precision of the pattern during manufacturing is very important.
The most fundamental function of the calibration plate is to provide a large number of known features that can be visually detected. The feature pattern is designed in advance, so its location in the physical world is determined and known. The feature pattern is designed to be as easy as possible for the vision system to detect, and so that the feature positions can be detected with high precision. The visual calibration process observes the positions of the feature points in the image, combines them with the known information about the features, and computes, through a certain algorithmic procedure, unknown parameters of the camera such as its intrinsic parameters, distortion coefficients, and position in the world coordinate system.
Common calibration plate designs include, but are not limited to, two broad categories: checkerboard calibration plates and HALCON calibration plates. The basic pattern of the checkerboard calibration plate consists of alternating black and white squares, like a chessboard, and the features it provides are corner points or line intersections. The basic pattern of the HALCON calibration plate is an array of repeated circles arranged in a matrix or honeycomb layout, and the features it provides are the circle centers. Because both common calibration plates are designed only around the working characteristics of the camera, the features they provide cannot be detected by a laser radar, so these common calibration plates cannot serve as calibration objects for laser radar and camera calibration.
Therefore, the embodiment of the application discloses the calibration plate, which has wider application range and can obtain better identification precision of the calibration plate.
Referring to fig. 1, an embodiment of a calibration plate provided in an embodiment of the present application includes:
a planar plate body 1; the planar plate body 1 is provided with a first-type characteristic pattern 2 and a plurality of second-type characteristic patterns 3; the first-type characteristic pattern 2 is a first hollow region 21; the plurality of second-type characteristic patterns 3 are distributed circumferentially around the first-type characteristic pattern 2, and each consists of a second hollow region 31 and a reflective region 32 whose color differs from that of the planar plate body 1. Because the calibration plate carries two kinds of characteristic patterns arranged according to a defined rule, it provides enough depth variety in the point cloud data, makes the plane estimation of the calibration object during calibration more accurate and reliable, reduces the amount of data that must be collected to complete the calibration work, and improves calibration efficiency. Of course, more characteristic patterns, such as a third-type and a fourth-type characteristic pattern, may be further provided to enrich the feature patterns and further increase the depth variety of the detected point cloud data; a person skilled in the art may make an appropriate selection according to actual needs, without limitation. A hollow region is an area of the planar plate body 1 that is cut out according to a predetermined pattern shape, while the reflective region 32 is an area of the planar plate body 1, planned according to a predetermined pattern shape, that is made highly reflective. In addition, the color of the reflective region 32 in the present application differs from the color of the planar plate body 1, so as to ensure that the feature points in the characteristic pattern can be correctly identified.
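The plane estimation mentioned above can be illustrated with a minimal least-squares sketch in Python/NumPy (the function name and the synthetic data are illustrative, not from the patent): the board plane's normal is the direction of least variance of the centered point cloud, obtained from an SVD.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).

    points: (N, 3) array of lidar returns on the board surface.
    The normal is the right singular vector associated with the
    smallest singular value of the centered point matrix.
    """
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return centroid, normal / np.linalg.norm(normal)

# Synthetic board points lying exactly on the plane z = 0.5x + 0.2y + 1
rng = np.random.default_rng(0)
xy = rng.uniform(-0.5, 0.5, size=(200, 2))
z = 0.5 * xy[:, 0] + 0.2 * xy[:, 1] + 1.0
pts = np.column_stack([xy, z])
c, n = fit_plane(pts)
# Normal of z = 0.5x + 0.2y + 1 is proportional to (0.5, 0.2, -1)
expected = np.array([0.5, 0.2, -1.0])
expected /= np.linalg.norm(expected)
print(abs(np.dot(n, expected)))  # close to 1
```

With more depth variety in the returns (as the hollow and reflective regions provide), this fit is conditioned better and degrades less with sparse scans.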
It should be noted that the calibration plate of the present application is applicable not only to unmanned-driving scenarios but also to robot-navigation scenarios, without particular limitation.
The foregoing is the first embodiment of the calibration plate provided in the present application; the following is the second embodiment, with specific reference to fig. 1.
Based on the first embodiment, the following:
the plane plate body 1 of the application is a plane object with definite geometric shape and accurate dimension as the conventional calibration plate, and the pattern features are also definite geometric shape, so that the specific geometric shape, dimension and geometric shape of the pattern features of the plane plate body 1 can be appropriately adjusted according to actual needs by those skilled in the art without limitation.
Specifically, the planar plate body 1 may be square, and a square with four sides equal in length is taken as an example.
The first characteristic pattern 2 may be a circular pattern whose center coincides with the center of the planar plate body 1, and the circle center may serve as the pattern's feature point. The camera finds the feature point by detecting the pattern's edge information, while the laser radar finds the circular first hollow region 21 from its ranging information, fits the circular plane, and thereby finds the feature point. Arranging the center of the circular pattern to coincide with the center of the planar plate body 1 further improves the accuracy of the calibration plate's feature points.
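To illustrate the laser radar side of this step, here is a minimal sketch (assumed details, not the patent's algorithm): after the board plane has been fitted and the returns bordering the first hollow region 21 have been projected into 2D plane coordinates, the circle center can be recovered with an algebraic (Kåsa) least-squares circle fit.

```python
import numpy as np

def fit_circle_2d(xy):
    """Algebraic (Kasa) circle fit.

    Solves x^2 + y^2 = 2ax + 2by + c in least squares; the center is
    (a, b) and, since c = r^2 - a^2 - b^2, the radius is
    sqrt(c + a^2 + b^2).
    """
    A = np.column_stack([2 * xy[:, 0], 2 * xy[:, 1], np.ones(len(xy))])
    rhs = (xy ** 2).sum(axis=1)
    (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return np.array([a, b]), float(np.sqrt(c + a * a + b * b))

# Synthetic hole boundary: circle of radius 0.1 centered at (0.3, -0.2)
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
pts2d = np.column_stack([0.3 + 0.1 * np.cos(theta),
                         -0.2 + 0.1 * np.sin(theta)])
center, r = fit_circle_2d(pts2d)
print(center, r)  # center near (0.3, -0.2), radius near 0.1
```

The fitted center, lifted back onto the board plane, gives the 3D feature point of the first characteristic pattern 2.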
To distinguish it from the first characteristic pattern 2, the second characteristic pattern 3 may be a square pattern, equally divided into four triangular regions of equal area; two of the four triangular regions, circumferentially spaced around the center point of the second characteristic pattern 3, respectively form the second hollow regions 31, and the other two respectively form the reflective regions 32. Each square pattern provides one feature point, which may be the intersection point of its diagonals; the camera can find the feature point through a corner detection algorithm, and the laser radar can find it through the regions' different reflectivities.
Further, take the case of four second characteristic patterns 3 as an example: the four second characteristic patterns 3 may be uniformly distributed around the first characteristic pattern 2 at the four corner positions of the planar plate body 1, and when the planar plate body 1 is a square with four equal sides, the margins between the four second characteristic patterns 3 and the edges of the planar plate body 1 are all equal. Of course, as shown in fig. 1, the arrangement of the second hollow regions 31 and reflective regions 32 may be the same in two diagonally opposite second characteristic patterns 3 and different in two adjacent ones, without particular limitation. Likewise, the number of second characteristic patterns 3 in the present application is not limited to four; other numbers are also possible, without particular limitation.
Further, the color of the reflective area 32 is preferably blue, but may be other colors, without limitation. The color of the calibration surface of the flat plate body 1 is preferably white or black, and of course, other colors can be used for aesthetic purposes, without limitation.
Further, the planar plate body 1 is provided with a black frame (not shown) to form a closed area enclosing all the patterns inside it. When the vision system recognizes and locates the calibration plate area, the frame speeds up recognition and reduces interference from background patterns. When several calibration plates appear in the field of view, the frame separates the different calibration plates, ensuring that their characteristic patterns do not interfere with one another.
The application also provides a calibration system, which comprises a calibration device and the calibration plate 30 of the first or second embodiment; wherein the calibration means comprise a lidar 10 and a camera 20. As shown in fig. 2, the calibration board 30 may be installed at a preset position of an application environment through the bracket 40, and the lidar 10 and the camera 20 may be installed at any position of a vehicle, and the calibration result of the lidar 10 and the camera 20 may be determined by acquiring relative position information between the lidar 10 and the camera 20 through a calibration algorithm.
Referring to fig. 3, the application further provides a calibration method applied to the calibration system, comprising the following steps:
s1, respectively acquiring point cloud data acquired by a laser radar and image data acquired by a camera in a preset calibration scene.
S2, determining the position of the characteristic point of the calibration plate in the image according to the image data to obtain first position data. The position of the calibration plate in the image data collected by the camera can be determined according to the image data, then the characteristic points of the calibration plate in the image data are determined according to the image characteristic point detection algorithm, and then the position of the characteristic points of the calibration plate in the image is determined.
And S3, determining the position of the characteristic point of the calibration plate in the point cloud data according to the point cloud data, and obtaining second position data. It should be noted that, according to the point cloud data, the position of the calibration plate in the point cloud data acquired by the laser radar may be determined first, and then, according to the point cloud feature point detection algorithm, the position of the calibration plate feature point in the point cloud data may be determined.
S4, calculating and determining the position of the characteristic point of the calibration plate in the image data in the world coordinate system according to the first position data, the estimated position data and the camera internal reference data to obtain third position data, wherein the estimated position data is the position of the estimated characteristic point in the calibration plate in the world coordinate system.
And S5, establishing a pose coordinate system conversion equation according to the second position data and the third position data. It should be noted that the pose coordinate system conversion equation, that is, the equation for the conversion relation between the laser radar and camera coordinate systems, can be expressed as:

P_C = K · (R · P_L + t)    (1)

wherein P_C is the (homogeneous, up-to-scale) coordinate of a feature point in the camera image data, P_L is the feature point coordinate in the laser radar point cloud data, K is the 3×3 camera intrinsic matrix, and R and t are the 3×3 rotation matrix and 3×1 translation vector representing the conversion relation between the camera and the laser radar.
And S6, solving the pose coordinate system conversion equation (1) by using the Levenberg-Marquardt algorithm until a preset convergence condition is reached, obtaining the laser radar and camera external parameter data. It should be noted that the Levenberg-Marquardt algorithm, also known as the damped least-squares method, provides numerical solutions for nonlinear minimization (local minimization). By adaptively adjusting a damping parameter, the algorithm combines the advantages of the Gauss-Newton algorithm and the gradient descent method while mitigating the shortcomings of both. In addition, the laser radar and camera external parameters are the relative pose between the laser radar and the camera, comprising the relative orientation R and the relative position t, which can be written together as the external parameter matrix [R | t].
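As a synthetic sketch of this step (the patent names no library; `scipy.optimize.least_squares` with `method='lm'` is one common Levenberg-Marquardt implementation, and all numbers here are illustrative), the extrinsics can be recovered by minimizing the reprojection residuals of the conversion relation P_C = K(R·P_L + t):

```python
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    """Rotation vector -> rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    Kx = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * Kx + (1 - np.cos(theta)) * (Kx @ Kx)

def project(K, R, t, P_L):
    """Pixel coordinates of lidar points P_L (N,3) under P_C ~ K(R P_L + t)."""
    cam = (R @ P_L.T).T + t
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]

# Ground-truth intrinsics and extrinsics for a synthetic scene
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
rvec_true = np.array([0.05, -0.1, 0.02])
t_true = np.array([0.1, -0.05, 0.3])
rng = np.random.default_rng(1)
P_L = rng.uniform([-1, -1, 3], [1, 1, 6], size=(30, 3))  # points 3-6 m ahead
obs = project(K, rodrigues(rvec_true), t_true, P_L)       # observed pixels

def residuals(x):
    # x[:3] is the rotation vector, x[3:] the translation
    return (project(K, rodrigues(x[:3]), x[3:], P_L) - obs).ravel()

sol = least_squares(residuals, x0=np.zeros(6), method='lm')
print(sol.x[:3], sol.x[3:])  # recovered rotation vector and translation
```

On noiseless synthetic correspondences such as these, the solver converges to the ground-truth extrinsics; with real feature detections, the preset convergence condition bounds the residual instead.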
and S7, optimizing the laser radar and the camera external parameter data according to the edge information of the calibration plate to obtain optimized laser radar and camera external parameter data.
According to the technical scheme, the established pose coordinate system conversion equation is solved by utilizing the Levenberg-Marquardt algorithm until the preset convergence condition is reached, so that laser radar and camera external parameter data are obtained; and optimizing the laser radar and camera external parameter data according to the edge information of the calibration plate to obtain optimized laser radar and camera external parameter data. The method solves the problem of low accuracy of the calibration result obtained by the existing calibration method, and improves the reliability of the calibration result by introducing a parameter optimization algorithm.
Further, as shown in fig. 4, the step S4 may specifically include the steps of:
s41, calculating camera external reference data through a perspective pose algorithm according to the first position data, the estimated position data and the camera internal reference data. It should be noted that the camera external parameter data is a rotation matrix and a translation vector of the camera in the world coordinate system.
S42, determining the position of the characteristic point of the calibration plate in the image data in the world coordinate system according to the calculated camera external parameter data, and obtaining third position data.
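A minimal sketch of the coordinate conversion behind S41/S42, assuming the perspective pose (PnP) step has already produced extrinsics R and t with the usual convention X_cam = R·X_world + t (in practice a routine such as OpenCV's `solvePnP` would supply them). The direction shown (camera frame to world frame) is one common form of the conversion; the helper name is illustrative.

```python
import numpy as np

def camera_to_world(R, t, points_cam):
    # The camera extrinsics map world coordinates to camera coordinates:
    #   X_cam = R @ X_world + t
    # so a point known in the camera frame is placed in the world frame by
    #   X_world = R.T @ (X_cam - t)
    # (row-vector form: (points_cam - t) @ R)
    return (points_cam - t) @ R
```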
Further, as shown in fig. 5, the step S7 may specifically include the steps of:
S71, acquiring the calibration plate edge information in the image data and the calibration plate edge information in the point cloud data.
S72, establishing a loss function equation according to the edge information of the calibration plate in the image data and the edge information of the calibration plate in the point cloud data.
S73, substituting the laser radar and camera external parameter data into a loss function equation, and solving the loss function equation to obtain optimized laser radar and camera external parameter data.
Wherein S_C is the loss value, and the two edge terms denote the calibration plate edge information extracted from the laser radar point cloud data and from the camera image data respectively. The formula measures whether the laser radar point cloud can be accurately projected into the confidence region around the calibration plate edges in the image, and this measure is used as the loss value for converging and optimizing the laser radar and camera external parameter data.
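A hedged NumPy sketch of such an edge-alignment loss: project the calibration-plate edge points from the laser radar cloud into the image with the current extrinsics (R, t) and intrinsics K, and count how many projections miss a fixed-radius confidence band around the detected image edge pixels. The patent does not disclose the exact loss; `radius`, the counting rule and all names are assumptions.

```python
import numpy as np

def project(points, R, t, K):
    # lidar frame -> camera frame -> pixel coordinates (pinhole model)
    pc = points @ R.T + t
    uv = pc @ K.T
    return uv[:, :2] / uv[:, 2:3]

def edge_loss(lidar_edge_pts, image_edge_px, R, t, K, radius=3.0):
    # S_C: number of projected lidar edge points falling outside the
    # `radius`-pixel confidence band around the image edge pixels
    proj = project(lidar_edge_pts, R, t, K)
    d = np.linalg.norm(proj[:, None, :] - image_edge_px[None, :, :], axis=2)
    return float(np.sum(d.min(axis=1) > radius))
```

Minimizing this count over (R, t) pulls the projected point-cloud edges onto the image edges, which is the convergence criterion described above.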
Further, the above step S7 may specifically further include the steps of:
S74, re-projecting the point cloud data into the image data, and displaying the calibration plate re-projection data in the image data. It should be noted that displaying the re-projection data in the image data visualizes the calibration result, so that the converged and optimized laser radar and camera external parameter data can be checked to determine whether the relative pose between the laser radar and the camera is accurate.
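A sketch of the re-projection in S74, assuming pinhole intrinsics K and extrinsics (R, t); points behind the camera or outside the frame are dropped. The returned pixel coordinates could then be drawn over the camera image to visually check the extrinsics. Names and conventions are illustrative.

```python
import numpy as np

def reproject_cloud(points, R, t, K, img_shape):
    # Project lidar points into the image and return the integer pixel
    # coordinates of points that land inside the frame.
    pc = points @ R.T + t
    pc = pc[pc[:, 2] > 0]                  # keep points in front of the camera
    uv = pc @ K.T
    uv = uv[:, :2] / uv[:, 2:3]
    px = np.round(uv).astype(int)
    h, w = img_shape
    ok = (px[:, 0] >= 0) & (px[:, 0] < w) & (px[:, 1] >= 0) & (px[:, 1] < h)
    return px[ok]
```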
The calibration plate, the calibration system and the calibration method provided by the application have been described in detail above. Those skilled in the art may, following the idea of the embodiments of the application, make changes to the specific implementations and the scope of application; accordingly, this disclosure should not be construed as limiting the application.

Claims (9)

1. A calibration plate, comprising: a planar plate body;
the plane plate body is provided with a first characteristic pattern and a plurality of second characteristic patterns;
the first characteristic pattern is a first hollow area;
a plurality of second characteristic patterns are circumferentially distributed around the first characteristic pattern, and each consists of a second hollow area and a reflective area;
the color of the reflective area is different from the color of the calibration surface of the plane plate body;
the plane plate body is square;
the first characteristic pattern is a circular pattern, and the center of the circular pattern coincides with the center of the plane plate body;
the second characteristic pattern is a square pattern divided into four triangular areas of equal area;
of the four triangular areas, the two areas that are circumferentially spaced apart around the center point of the second characteristic pattern respectively form the second hollow areas, and the other two triangular areas respectively form the reflective areas.
2. A calibration plate according to claim 1, wherein there are specifically four second characteristic patterns;
the four second characteristic patterns are uniformly distributed at four corner positions of the plane plate body around the first characteristic pattern.
3. A calibration plate according to claim 1, wherein the reflective area is blue in colour;
the color of the calibration surface of the plane plate body is white or black.
4. A calibration plate according to claim 1, wherein the planar plate body is provided with a black border.
5. A calibration system comprising calibration means and a calibration plate according to any one of claims 1 to 4;
the calibration device comprises a laser radar and a camera.
6. A calibration method applied to the calibration system of claim 5, comprising:
respectively acquiring point cloud data acquired by a laser radar and image data acquired by a camera in a preset calibration scene;
determining the position of the characteristic point of the calibration plate in the image according to the image data to obtain first position data;
determining the position of the characteristic point of the calibration plate in the point cloud data according to the point cloud data to obtain second position data;
calculating and determining the position of the characteristic point of the calibration plate in the image data in a world coordinate system according to the first position data, the estimated position data and the camera internal reference data to obtain third position data, wherein the estimated position data is the position of the estimated characteristic point in the calibration plate in the world coordinate system;
establishing a pose coordinate system conversion equation according to the second position data and the third position data;
solving the pose coordinate system conversion equation by using a Levenberg-Marquardt algorithm until a preset convergence condition is reached, so as to obtain laser radar and camera external parameter data;
and optimizing the laser radar and camera external parameter data according to the edge information of the calibration plate to obtain optimized laser radar and camera external parameter data.
7. The method of claim 6, wherein calculating and determining the position of the calibration plate characteristic point in the image data in the world coordinate system according to the first position data, the estimated position data and the camera internal reference data to obtain the third position data specifically includes:
calculating camera external parameter data through a perspective pose algorithm according to the first position data, the estimated position data and the camera internal parameter data;
and determining the position of the characteristic point of the calibration plate in the image data in a world coordinate system according to the calculated camera external parameter data to obtain third position data.
8. The method according to claim 6, wherein optimizing the laser radar and the camera external parameter data according to the calibration plate edge information, the obtaining the optimized laser radar and camera external parameter data specifically includes:
acquiring calibration plate edge information in the image data and calibration plate edge information in the point cloud data;
establishing a loss function equation according to the calibration plate edge information in the image data and the calibration plate edge information in the point cloud data;
substituting the laser radar and camera external parameter data into the loss function equation, and solving the loss function equation to obtain optimized laser radar and camera external parameter data.
9. The method according to claim 8, wherein optimizing the lidar and camera extrinsic data according to calibration plate edge information, the obtaining optimized lidar and camera extrinsic data further comprises:
and re-projecting the point cloud data into the image data, and displaying the calibration plate re-projection data in the image data.
CN202011494870.0A 2020-12-17 2020-12-17 Calibration plate, calibration system and calibration method Active CN112509062B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011494870.0A CN112509062B (en) 2020-12-17 2020-12-17 Calibration plate, calibration system and calibration method

Publications (2)

Publication Number Publication Date
CN112509062A CN112509062A (en) 2021-03-16
CN112509062B true CN112509062B (en) 2023-09-12

Family

ID=74922162

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011494870.0A Active CN112509062B (en) 2020-12-17 2020-12-17 Calibration plate, calibration system and calibration method

Country Status (1)

Country Link
CN (1) CN112509062B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109839624A (en) * 2017-11-27 2019-06-04 北京万集科技股份有限公司 A kind of multilasered optical radar position calibration method and device
CN111123912A (en) * 2019-11-29 2020-05-08 苏州智加科技有限公司 Calibration method and device for travelling crane positioning coordinates
CN111369630A (en) * 2020-02-27 2020-07-03 河海大学常州校区 Method for calibrating multi-line laser radar and camera
CN111965624A (en) * 2020-08-06 2020-11-20 北京百度网讯科技有限公司 Calibration method, device and equipment for laser radar and camera and readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant