CN111862224B - Method and device for determining external parameters between camera and laser radar - Google Patents

Method and device for determining external parameters between camera and laser radar

Info

Publication number
CN111862224B
CN111862224B
Authority
CN
China
Prior art keywords
coordinate system
calibration
camera
calibration plate
group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910308370.4A
Other languages
Chinese (zh)
Other versions
CN111862224A (en)
Inventor
步青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201910308370.4A priority Critical patent/CN111862224B/en
Publication of CN111862224A publication Critical patent/CN111862224A/en
Application granted granted Critical
Publication of CN111862224B publication Critical patent/CN111862224B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/10044 Radar image

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a method and a device for determining external parameters between a camera and a laser radar, and belongs to the technical field of computer vision. The method comprises the following steps: obtaining a plurality of groups of calibration images; determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images; determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images; establishing a nonlinear relation of external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration plate under the imaging coordinate system and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images; and determining the external parameters between the camera and the laser radar based on the nonlinear relationship. By adopting the method and the device, the external parameters between the camera and the laser radar can be accurately determined.

Description

Method and device for determining external parameters between camera and laser radar
Technical Field
The application relates to the technical field of computer vision, in particular to a method and a device for determining external parameters between a camera and a laser radar.
Background
The camera may acquire a two-dimensional RGB (Red Green Blue) image of a target, where the two-dimensional RGB image includes color information of the target, and the lidar may acquire a three-dimensional point cloud image of the target, where the three-dimensional point cloud image includes depth information of the target. In practical applications the two are often combined; for example, in three-dimensional reconstruction, a camera is used to obtain an RGB image carrying the scene color information, and a lidar is used to obtain a three-dimensional point cloud image containing the scene depth information. The depth information and the color information of the scene can then be combined through the external parameters between the camera and the lidar to obtain a three-dimensional image of the scene with color information. The external parameters are a rotation matrix and a translation vector; based on them, the conversion of a point from the lidar coordinate system to the camera coordinate system can be completed, and vice versa. However, before the depth information and the color information of the scene are combined, the external parameters between the camera and the lidar need to be calibrated in advance.
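As a minimal illustration of how such external parameters are applied, the following Python sketch converts points between the two coordinate systems; the function names and the row-vector point layout are assumptions made only for this example.

```python
import numpy as np

def lidar_to_camera(points_lidar: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Transform Nx3 lidar-frame points into the camera frame: P_C = R @ P_L + t."""
    return points_lidar @ R.T + t.reshape(1, 3)

def camera_to_lidar(points_cam: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Inverse transform: P_L = R^T @ (P_C - t)."""
    return (points_cam - t.reshape(1, 3)) @ R
```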
With the existing method for calibrating the external parameters between a camera and a laser radar, feature points of two frames of images acquired by the laser radar have to be matched in order to determine the relative external parameters of the laser radar. Because the points in laser radar imaging are sparse, the relative external parameters of the laser radar obtained in this way are inaccurate, and the external parameters between the camera and the laser radar further obtained based on those relative external parameters are therefore also inaccurate.
Disclosure of Invention
In order to solve the problems of the related art, the embodiment of the application provides a method and a device for determining external parameters between a camera and a laser radar. The technical scheme is as follows:
in a first aspect, there is provided a method of determining an external parameter between a camera and a lidar, the method comprising:
acquiring a plurality of groups of calibration images, wherein each group of calibration images comprises a two-dimensional image and a three-dimensional laser point cloud image;
determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images;
determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images;
establishing a nonlinear relation of external parameters between the camera and the laser radar based on linear characteristics of each edge of the calibration plate under the imaging coordinate system and three-dimensional coordinates of an edge point of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images;
and determining the external parameters between the camera and the laser radar based on the nonlinear relationship.
Optionally, the establishing a nonlinear relation of external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration board under the imaging coordinate system corresponding to each group of calibration images and the three-dimensional coordinates of the edge point of the calibration board under the laser radar coordinate system includes:
Determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system;
determining two-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the two-dimensional images in each group of calibration images;
determining a direction vector corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system;
determining a direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images based on the three-dimensional coordinates of the edge point of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images;
and establishing an external parameter nonlinear relation between the camera and the laser radar based on the direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, the direction vectors corresponding to the edges of the calibration plate under the laser radar coordinate system, the linear characteristics of the edges of the calibration plate under the imaging coordinate system and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
Optionally, the establishing the nonlinear relationship of the external parameters between the camera and the laser radar based on the direction vector corresponding to each edge of the calibration board under the camera coordinate system, the direction vector corresponding to each edge of the calibration board under the laser radar coordinate system, each edge straight line feature of the calibration board under the imaging coordinate system, and the three-dimensional coordinates of the edge point of the calibration board under the laser radar coordinate system includes:
determining a normal vector of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images;
determining the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system;
and establishing a nonlinear relation of external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector corresponding to each edge of the calibration plate under the camera coordinate system, the direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system and the three-dimensional coordinate of the edge point of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
Optionally, the establishing the nonlinear relation of the external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration board under the camera coordinate system, the normal vector of the plane of the calibration board under the laser radar coordinate system, the direction vector of each edge of the calibration board under the camera coordinate system, the direction vector of each edge of the calibration board under the laser radar coordinate system, each edge straight line feature of the calibration board under the imaging coordinate system, and the three-dimensional coordinates of the edge point of the calibration board under the laser radar coordinate system, includes:
establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration plate under the camera coordinate system, a normal vector of a plane of the calibration plate under the laser radar coordinate system, a direction vector of each edge of the calibration plate under the camera coordinate system, a direction vector of each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images;
The determining the external parameters between the camera and the laser radar based on the nonlinear relation comprises the following steps:
and optimizing the cost function, and determining the external parameters between the camera and the laser radar.
Optionally, the establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration board under the camera coordinate system, a normal vector of a plane of the calibration board under the laser radar coordinate system, a direction vector of each edge of the calibration board under the camera coordinate system, a direction vector of each edge of the calibration board under the laser radar coordinate system, each edge straight line feature of the calibration board under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration board under the laser radar coordinate system, where the cost function includes:
determining a normal vector expression of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the normal vector of the plane of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images, wherein unknown parameters in the normal vector expression comprise external parameters between the camera and the laser radar;
Calculating a difference value of a normal vector expression of the calibration plate under the camera coordinate system and a normal vector of the plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images to obtain a normal vector difference value expression;
determining a direction vector expression corresponding to each edge of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on a direction vector corresponding to each edge of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, wherein unknown parameters in the direction vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a direction vector expression corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images and a direction vector corresponding to each edge of the calibration plate under the camera coordinate system to obtain a direction vector difference value expression;
determining an edge point expression of the calibration plate corresponding to each group of calibration images under the imaging coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the internal parameters of the camera, wherein unknown parameters in the edge point expression comprise external parameters between the camera and the laser radar;
Calculating Euclidean distance for the edge point expression of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and each edge linear characteristic of the calibration plate under the imaging coordinate system to obtain a distance expression;
and establishing a cost function of the external parameters between the camera and the laser radar based on the normal vector difference value expression, the direction vector difference value expression and the distance expression.
Optionally, the optimizing the cost function, determining the external parameters between the camera and the laser radar includes:
and optimizing the cost function by using a nonlinear optimization method, and determining the external parameters between the camera and the laser radar.
Optionally, the calibration plate is polygonal, the scanning lines of the laser radar are not parallel to any edge of the calibration plate, and each edge of the calibration plate is crossed by a scanning line of the laser radar.
In a second aspect, there is provided a system for determining an external parameter between a camera and a lidar, the system comprising: camera, lidar and computer device, wherein:
the camera is used for respectively shooting two-dimensional images of the calibration plates in a plurality of postures and sending the shot two-dimensional images to the computer equipment;
The laser radar is used for respectively shooting three-dimensional laser point cloud images of the calibration plates in the plurality of poses and sending the shot three-dimensional laser point cloud images to the computer equipment;
the computer equipment is used for taking a two-dimensional image and a three-dimensional laser point cloud image of the calibration plate under the same pose, which are respectively shot by the camera and the laser radar, as a group of calibration images, and determining a plurality of groups of calibration images; and determining external parameters between the camera and the laser radar based on the plurality of groups of calibration images.
Optionally, the computer device is configured to:
determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images; determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images; and determining external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera and the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system.
Optionally, the computer device is configured to:
determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system; and determining the external parameters between the camera and the laser radar according to the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera, the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system of the camera.
In a third aspect, there is provided an apparatus for determining an external parameter between a camera and a lidar, the apparatus comprising:
the acquisition module is used for acquiring a plurality of groups of calibration images, wherein each group of calibration images comprises a two-dimensional image and a three-dimensional laser point cloud image;
the determining module is used for determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images;
The determining module is used for determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images;
the establishing module is used for establishing a nonlinear relation of external parameters between the camera and the laser radar based on linear characteristics of each edge of the calibration plate under the imaging coordinate system and three-dimensional coordinates of an edge point of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images;
an optimization module for determining the external parameters between the camera and the laser radar based on the nonlinear relation.
Optionally, the establishing module is configured to:
determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system;
determining two-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the two-dimensional images in each group of calibration images;
determining a direction vector corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system;
Determining a direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images based on the three-dimensional coordinates of the edge point of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images;
and establishing an external parameter nonlinear relation between the camera and the laser radar based on the direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, the direction vectors corresponding to the edges of the calibration plate under the laser radar coordinate system, the linear characteristics of the edges of the calibration plate under the imaging coordinate system and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
Optionally, the establishing module is configured to:
determining a normal vector of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images;
determining the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system;
And establishing a nonlinear relation of external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector corresponding to each edge of the calibration plate under the camera coordinate system, the direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system and the three-dimensional coordinate of the edge point of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
Optionally, the establishing module is configured to:
establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration plate under the camera coordinate system, a normal vector of a plane of the calibration plate under the laser radar coordinate system, a direction vector of each edge of the calibration plate under the camera coordinate system, a direction vector of each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images;
The optimizing module is used for:
and optimizing the cost function, and determining the external parameters between the camera and the laser radar.
Optionally, the establishing module is configured to:
determining a normal vector expression of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the normal vector of the plane of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images, wherein unknown parameters in the normal vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a normal vector expression of the calibration plate under the camera coordinate system and a normal vector of the plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images to obtain a normal vector difference value expression;
determining a direction vector expression corresponding to each edge of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on a direction vector corresponding to each edge of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, wherein unknown parameters in the direction vector expression comprise external parameters between the camera and the laser radar;
Calculating a difference value of a direction vector expression corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images and a direction vector corresponding to each edge of the calibration plate under the camera coordinate system to obtain a direction vector difference value expression;
determining an edge point expression of the calibration plate corresponding to each group of calibration images under the imaging coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the internal parameters of the camera, wherein unknown parameters in the edge point expression comprise external parameters between the camera and the laser radar;
calculating Euclidean distance for the edge point expression of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and each edge linear characteristic of the calibration plate under the imaging coordinate system to obtain a distance expression;
and establishing a cost function of the external parameters between the camera and the laser radar based on the normal vector difference value expression, the direction vector difference value expression and the distance expression.
Optionally, the optimizing module is configured to:
and optimizing the cost function by using a nonlinear optimization method, and determining the external parameters between the camera and the laser radar.
Optionally, the calibration plate is polygonal, the scanning lines of the laser radar are not parallel to any edge of the calibration plate, and each edge of the calibration plate is crossed by a scanning line of the laser radar.
In a fourth aspect, there is provided a computer device comprising a processor and a memory, the memory having stored therein at least one instruction that is loaded and executed by the processor to implement the method of determining external parameters between a camera and a lidar as described in the first aspect above.
In a fifth aspect, there is provided a computer-readable storage medium having stored therein at least one instruction that is loaded and executed by a processor to implement the method of determining external parameters between a camera and a lidar as described in the first aspect above.
The technical scheme provided by the embodiment of the application has the beneficial effects that at least:
in the embodiment of the application, a plurality of groups of calibration images are acquired; then, based on the two-dimensional images in each group of calibration images, the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera are determined; and based on the three-dimensional point cloud images in each group of calibration images, the three-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system are determined. In this way, both the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system and the linear characteristics of the edges of the calibration plate under the imaging coordinate system of the camera are obtained, so a nonlinear relation of the external parameters between the camera and the laser radar can be established according to the relation between the two, and the external parameters between the camera and the laser radar can then be determined according to this nonlinear relation. The method does not need to acquire the relative external parameters of the laser radar, is therefore little affected by the sparseness of laser radar imaging points, and the obtained external parameters between the camera and the laser radar are accurate.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a diagram of an environment in which a method of determining an external parameter between a camera and a lidar is implemented, provided by an embodiment of the present application;
FIG. 2 is a schematic diagram of a laser radar scanning calibration plate according to an embodiment of the present application;
FIG. 3 is a flow chart of a method for determining an external parameter between a camera and a lidar according to an embodiment of the present application;
FIG. 4 is a schematic structural diagram of an apparatus for determining external parameters between a camera and a lidar according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a computer device according to an embodiment of the present application;
FIG. 6 is a flowchart of a method for determining an external parameter between a camera and a lidar according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
The embodiment of the application provides a method for determining external parameters between a camera and a laser radar, which can be implemented by computer equipment. The computer equipment may be a notebook computer, a desktop computer, or the like. The implementation environment of the method can be as shown in fig. 1: a camera is arranged below the laser radar, the positions of the camera and the laser radar are fixed, and a calibration plate is placed in the overlapping area of the camera field of view and the laser radar field of view. The laser radar can be a multi-line laser radar, the camera can be an RGB camera, and the surface of the calibration plate can be a checkerboard. The laser radar field of view referred to here is the field of view in the vertical direction, namely the region that the laser radar can scan in the vertical direction; in the horizontal direction the field of view of the laser radar can reach a 360-degree look-around region. The camera field of view is the region that the camera can capture. As shown in fig. 2, the calibration plate is placed so as to satisfy the following condition: the scan lines of the laser radar pass through each edge of the calibration plate. Before the external parameters between the camera and the laser radar are determined, the laser radar and the camera, with fixed positions, are used to capture a plurality of groups of calibration images as input data for the external parameter determination method. After the camera and the laser radar capture the groups of calibration images, the calibration images are input into the computer equipment, and the computer equipment processes the calibration images and determines the external parameters using the method for determining external parameters between a camera and a laser radar. The following is an exemplary embodiment of the method for determining external parameters between a camera and a laser radar.
As shown in fig. 3, the process flow of the method may include the following steps:
in step 301, a plurality of sets of calibration images are acquired, each set of calibration images including a two-dimensional image and a three-dimensional laser point cloud image.
The pose of the calibration plate is different when different groups of calibration images are captured and the same when the images within one group are captured; to achieve a better calibration effect, the camera and the laser radar can capture the calibration plate synchronously when capturing the same group of calibration images. The two-dimensional image may be an RGB image.
In implementation, before the calibration images are captured, the internal parameters of the camera can first be calibrated using a common monocular camera calibration method to obtain the camera intrinsics; the monocular camera calibration method can be Zhang Zhengyou's checkerboard calibration method or another general monocular camera calibration method. The computer equipment acquires a plurality of groups of calibration images, where each group of calibration images includes a two-dimensional image captured by the camera and a three-dimensional point cloud image captured by the laser radar. The images can be acquired in two ways: after the camera and the laser radar capture a calibration image, it can be transmitted directly to the computer equipment through wired or wireless data transmission; or the calibration image can first be stored locally after capture and, when needed, be acquired by the computer equipment from the camera and the laser radar. The computer equipment may then perform the following processing for each group of calibration images obtained.
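As a brief sketch of this intrinsic pre-calibration step, the following Python example uses OpenCV's standard checkerboard routines; the pattern size, square size, and the helper name calibrate_intrinsics are assumptions made only for illustration.

```python
import cv2
import numpy as np

def calibrate_intrinsics(gray_images, pattern_size=(8, 6), square_size_m=0.05):
    """Estimate the camera matrix K_c and distortion coefficients from several
    grayscale checkerboard views (Zhang-style calibration via OpenCV).
    pattern_size counts inner corners; square_size_m is the checker edge length."""
    # Planar object points of the checkerboard corners in the board frame (Z = 0).
    objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size_m

    obj_pts, img_pts = [], []
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
    for gray in gray_images:
        found, corners = cv2.findChessboardCorners(gray, pattern_size)
        if not found:
            continue
        # Sub-pixel refinement of the detected corners.
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
        obj_pts.append(objp)
        img_pts.append(corners)

    image_size = gray_images[0].shape[::-1]  # (width, height)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, image_size, None, None)
    return K, dist
```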
In step 302, based on the two-dimensional images in each set of calibration images, edge straight line characteristics of the calibration plate corresponding to each set of calibration images under the imaging coordinate system of the camera are determined.
The linear characteristic of each edge may be a line equation; in the following description, the linear characteristic is described as a line equation.
In implementation, the two-dimensional image in each calibration image group may first be distortion-corrected using the camera intrinsics obtained by the calibration in advance. Then, a Harris corner extraction algorithm, the growth-based corner extraction algorithm in OpenCV, or the like can be used to extract the corner points of the checkerboard on the calibration plate from the distortion-corrected two-dimensional image; further, to improve the accuracy of the subsequent processing, these or other corner extraction algorithms can be used to extract sub-pixel corner points of the checkerboard on the calibration plate. Then, with the extracted sub-pixel corner points as detection points, an LSD (Line Segment Detector) algorithm is used to detect the edges of the calibration plate on the distortion-corrected two-dimensional image, giving the two-dimensional coordinates of the edge points of the calibration plate under the imaging coordinate system. Straight-line fitting is then performed on these two-dimensional coordinates of the edge points of the calibration plate under the imaging coordinate system to obtain the line equation of each edge of the calibration plate under the imaging coordinate system.
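The line-fitting part of this step can be sketched in Python as follows; the sketch assumes the edge pixels have already been detected (for example with LSD) on the distortion-corrected image, and the helper name and the normalization of the line coefficients are illustrative assumptions.

```python
import cv2
import numpy as np

def fit_edge_lines(edge_pixels_per_side):
    """Fit each border of the calibration plate with a 2D line a*u + b*v + c = 0,
    normalized so that a^2 + b^2 = 1 (then |a*u + b*v + c| is the point-to-line distance).
    edge_pixels_per_side holds, per edge, the (u, v) pixels found by the line detector
    on the distortion-corrected image (an assumed upstream step)."""
    lines = []
    for pts in edge_pixels_per_side:
        pts = np.asarray(pts, dtype=np.float32)
        # cv2.fitLine returns a unit direction (vx, vy) and a point (x0, y0) on the line.
        vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
        a, b = float(vy), float(-vx)          # unit normal of the fitted line
        c = -(a * float(x0) + b * float(y0))
        lines.append((a, b, c))
    return lines
```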
In step 303, based on the three-dimensional point cloud image in each set of calibration images, the three-dimensional coordinates of the edge point of the calibration plate corresponding to each set of calibration images in the laser radar coordinate system are determined.
In implementation, for the three-dimensional point cloud image in each group of calibration images, a technician can manually screen out the three-dimensional laser points on the calibration plate and the three-dimensional laser points within a certain distance around the calibration plate. Since the calibration plate is usually a wooden board, the checkerboard can be printed on paper and then attached to the calibration plate; and because the reflectivity of the laser radar differs for objects of different materials and colors, the obtained three-dimensional laser points fluctuate. Therefore, before the three-dimensional laser points are used for subsequent processing, they can be smoothed: a RANSAC (Random Sample Consensus) algorithm is used to perform plane fitting on the screened three-dimensional laser points, and the three-dimensional laser points whose distance to the fitted plane is greater than a preset threshold are pulled back onto the plane. In this way, the three-dimensional laser points on the calibration plate and within a certain distance around it, smoothed and filtered under the laser radar coordinate system, are obtained. Then, laser points whose depth gradient change exceeds a preset threshold are searched for among the smoothed and filtered three-dimensional laser points and taken as the edge points of the calibration plate under the laser radar coordinate system, giving the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system.
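One possible implementation of this smoothing and edge-point extraction is sketched below in Python; it assumes the plate points have already been screened out and grouped by lidar scan line, and the RANSAC threshold, depth-jump threshold, and helper names are illustrative assumptions.

```python
import numpy as np

def ransac_plane(points, n_iters=500, threshold=0.02, seed=0):
    """Fit a plane n.p + d = 0 to Nx3 lidar points with a basic RANSAC loop."""
    rng = np.random.default_rng(seed)
    best_inliers, best_model = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                          # degenerate (collinear) sample
        n = n / norm
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < threshold
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers, best_model = inliers, (n, d)
    return best_model, best_inliers

def plate_edge_points(points_by_scanline, threshold=0.02, depth_jump=0.1):
    """Smooth near-plane points onto the RANSAC plane (damping the reflectivity-induced
    jitter described above), then keep, per scan line, points where the measured range
    changes by more than depth_jump as candidate edge points of the calibration plate."""
    (n, d), _ = ransac_plane(np.vstack(points_by_scanline), threshold=threshold)

    edges = []
    for line in points_by_scanline:
        line = np.array(line, dtype=float)            # copy so the input stays untouched
        off = line @ n + d
        near = np.abs(off) < 3 * threshold
        line[near] -= np.outer(off[near], n)          # pull near-plane points onto the plane
        ranges = np.linalg.norm(line, axis=1)
        jump = np.where(np.abs(np.diff(ranges)) > depth_jump)[0]
        idx = np.unique(np.concatenate([jump, jump + 1]))  # both sides of each jump
        edges.append(line[idx])
    return np.vstack(edges) if edges else np.empty((0, 3))
```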
In step 304, a nonlinear relationship of external parameters between the camera and the laser radar is established based on the linear characteristics of each edge of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system.
The nonlinear relation of the external parameters between the camera and the laser radar can be a cost function; in the following description, the nonlinear relation is described as a cost function.
In practice, the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system can be denoted $P_i^L$, $i = 1, \dots, k$, where $k$ is the number of edge points (edge laser points) of the calibration plate under the laser radar coordinate system, and the external parameters to be determined between the camera and the laser radar can be denoted $R_{CL}$ and $t_{CL}$, where $R_{CL}$ is the rotation matrix and $t_{CL}$ is the translation vector. Based on these parameters, the expression of the edge points of the calibration plate under the laser radar coordinate system in the camera coordinate system can be
$$P_i^C = R_{CL}\, P_i^L + t_{CL}.$$
Then, the expression of these edge points under the imaging coordinate system of the camera can be
$$p_i = \pi\!\left( K_c \left( R_{CL}\, P_i^L + t_{CL} \right) \right),$$
where $K_c$ is the camera intrinsic matrix obtained in step 301 and $\pi(\cdot)$ denotes normalization by the third (depth) component.
The line equation of each edge of the calibration plate under the imaging coordinate system can be written $a_j u + b_j v + c_j = 0$ (normalized so that $a_j^2 + b_j^2 = 1$), where $j$ indexes the edges of the calibration plate. For each projected edge point $p_i$ and the line equation of the edge it belongs to, the Euclidean distance between them can be computed, giving a distance expression. Then, according to the distance expression, a cost function of the external parameters between the camera and the laser radar can be established; specifically, the cost function can be
$$E_1(R_{CL}, t_{CL}) = \sum_{m=1}^{N} \sum_{i \in \mathcal{E}_m} \operatorname{dist}\!\left( p_i,\ \ell_i \right)^2,$$
where $\mathcal{E}_m$ is the set of edge laser points of the $m$-th group of calibration images, $\ell_i$ is the line equation of the edge to which point $i$ belongs, $N$ is the number of groups of calibration images, and $M$ is the total number of laser points on the edges of the calibration plate.
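For one plate edge, the distance term behind this cost function could be evaluated as in the following Python sketch; the helper name, the assumption that the line coefficients are normalized so that $a^2 + b^2 = 1$, and the use of signed distances are illustrative choices.

```python
import numpy as np

def reprojection_residuals(R, t, K, edge_points_lidar, line_abc):
    """Project the lidar edge points of one plate edge into the image with candidate
    extrinsics (R, t) and intrinsics K, and return their distances to the fitted
    2D edge line a*u + b*v + c = 0 (assumed normalized, a^2 + b^2 = 1)."""
    a, b, c = line_abc
    P_c = edge_points_lidar @ R.T + t.reshape(1, 3)   # lidar frame -> camera frame
    uvw = P_c @ K.T                                   # pinhole projection (homogeneous)
    uv = uvw[:, :2] / uvw[:, 2:3]
    return a * uv[:, 0] + b * uv[:, 1] + c            # signed point-to-line distances
```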
Optionally, in order to make the determined external parameters of the camera and the lidar more accurate, when the cost function is established, the cost function may be established based on more conditions, and several methods for establishing the cost function are given below, and other methods for establishing the cost function may be provided, which are not described herein.
In the first method, the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system are determined based on the corner coordinates of the calibration plate under the world coordinate system corresponding to each group of calibration images and the external parameters between the camera coordinate system and the world coordinate system. The two-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system are determined based on the two-dimensional images in each group of calibration images. The direction vector corresponding to each edge of the calibration plate under the camera coordinate system is then determined based on the plane characteristics of the calibration plate under the camera coordinate system and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system. The direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system is determined based on the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system. Finally, the nonlinear relation of the external parameters between the camera and the laser radar is established based on the direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, the direction vectors corresponding to the edges under the laser radar coordinate system, the linear characteristics of the edges under the imaging coordinate system, and the three-dimensional coordinates of the edge points under the laser radar coordinate system, all corresponding to each group of calibration images.
The plane feature may be a plane equation; in the following description, the plane feature is described as a plane equation.
In implementation, first, the external parameters of the camera coordinate system relative to the world coordinate system can be calibrated using a common monocular-camera external parameter calibration method; this can be Zhang Zhengyou's checkerboard calibration method or another similar external parameter calibration method, and the application is not limited in this respect.
Then, using the external parameters of the camera coordinate system relative to the world coordinate system, the three-dimensional coordinates of each corner point of the checkerboard on the calibration plate under the world coordinate system are converted into the camera coordinate system, giving the three-dimensional coordinates of each corner point under the camera coordinate system; performing plane fitting on these three-dimensional coordinates gives the plane equation of the calibration plate under the camera coordinate system. Then, using the camera intrinsics, the two-dimensional coordinates of the edge points of the calibration plate under the imaging coordinate system are back-projected into the camera coordinate system, and the result is substituted into the plane equation of the calibration plate under the camera coordinate system to obtain the three-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system. Performing straight-line fitting on these three-dimensional edge points gives the line equation of each edge of the calibration plate under the camera coordinate system, from which the direction vector corresponding to the $j$-th edge under the camera coordinate system can be computed; it can be denoted $d_j^C$.
Performing straight-line fitting on the edge points of the calibration plate under the laser radar coordinate system obtained in step 303 gives the line equation of each edge of the calibration plate under the laser radar coordinate system, from which the direction vector corresponding to the $j$-th edge under the laser radar coordinate system can be computed; it can be denoted $d_j^L$. The expression of this direction vector under the camera coordinate system is then $R_{CL}\, d_j^L$.
For $d_j^C$ and $R_{CL}\, d_j^L$, a difference can be calculated: first take the difference between $d_j^C$ and $R_{CL}\, d_j^L$, then compute the two-norm of the difference, which gives the direction-vector difference expression. Based on the direction-vector difference expression and the distance expression from step 304, a cost function of the external parameters between the camera and the laser radar can be established; specifically, the cost function can be
$$E_2(R_{CL}, t_{CL}) = \lambda_1 E_1(R_{CL}, t_{CL}) + \lambda_2 \sum_{m=1}^{N} \sum_{j} \left\lVert d_{m,j}^{C} - R_{CL}\, d_{m,j}^{L} \right\rVert_2^2,$$
where $m$ indexes the groups of calibration images, $E_1$ is the distance cost defined in step 304, and $\lambda_1$ and $\lambda_2$ are constant coefficients that can be adjusted as needed in actual use, but satisfy the conditions $0 < \lambda_1, \lambda_2 < 1$ and $\lambda_1 + \lambda_2 = 1$.
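The back-projection onto the plate plane and the direction-vector residual described above could look roughly like the Python sketch below; the helper names and the sign alignment of the fitted directions (a fitted line direction is only defined up to sign) are illustrative assumptions.

```python
import numpy as np

def backproject_to_plane(edge_uv, K, plane_n, plane_d):
    """Back-project undistorted edge pixels onto the calibration-plate plane
    n.P + d = 0 expressed in the camera frame, giving 3D edge points P^C.
    Each pixel defines a ray P = s * K^{-1} [u, v, 1]^T; s is fixed by the plane."""
    rays = np.linalg.solve(K, np.c_[edge_uv, np.ones(len(edge_uv))].T).T
    s = -plane_d / (rays @ plane_n)                 # ray/plane intersection depth factor
    return rays * s[:, None]

def edge_direction(points_3d):
    """Unit direction of one plate edge from its 3D points (SVD / PCA line fit)."""
    centered = points_3d - points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def direction_residual(R, dir_cam, dir_lidar):
    """Difference between the camera-frame edge direction and the lidar-frame edge
    direction rotated into the camera frame, after aligning signs."""
    d = R @ dir_lidar
    if d @ dir_cam < 0:
        d = -d
    return dir_cam - d
```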
In the second method, the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the camera coordinate system is determined based on the plane characteristics of the calibration plate under the camera coordinate system; the normal vector of the plane of the calibration plate under the laser radar coordinate system is determined based on the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system; and the nonlinear relation of the external parameters between the camera and the laser radar is established based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the linear characteristics of each edge of the calibration plate under the imaging coordinate system, and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images.
The plane feature may be a plane equation; in the following description, the plane feature is described as a plane equation.
In practice, after the plane equation of the calibration plate under the camera coordinate system is obtained by the above method, the normal vector of the plane expressed by this plane equation can be calculated; it can be denoted $n^C$. Then, the plane equation of the calibration plate under the laser radar coordinate system is calculated from the three-dimensional coordinates of the laser points of the calibration plate under the laser radar coordinate system screened in step 303, and the normal vector of the plane expressed by this plane equation can be calculated; it can be denoted $n^L$. The expression of this normal vector under the camera coordinate system is then $R_{CL}\, n^L$.
For the $n^C$ and $R_{CL}\, n^L$ obtained above, a difference can be calculated: first take the difference between $n^C$ and $R_{CL}\, n^L$, then compute the two-norm of the difference, which gives the normal-vector difference expression. According to the normal-vector difference expression and the distance expression obtained in step 304, a cost function of the external parameters between the camera and the laser radar can be established; specifically, the cost function can be
$$E_3(R_{CL}, t_{CL}) = \lambda_1 E_1(R_{CL}, t_{CL}) + \lambda_3 \sum_{m=1}^{N} \left\lVert n_{m}^{C} - R_{CL}\, n_{m}^{L} \right\rVert_2^2,$$
where $\lambda_1$ and $\lambda_3$ are constant coefficients that can be adjusted as needed in actual use, but satisfy the conditions $0 < \lambda_1, \lambda_3 < 1$ and $\lambda_1 + \lambda_3 = 1$.
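A possible sketch of the plane-normal estimation and the corresponding residual is given below; a least-squares SVD plane fit is used here as one common choice, and the helper names are illustrative assumptions.

```python
import numpy as np

def plane_from_points(points_3d):
    """Least-squares plane through 3D points: returns a unit normal n and offset d
    with n.P + d ~ 0 (the smallest right singular vector of the centered points)."""
    centroid = points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(points_3d - centroid, full_matrices=False)
    n = vt[-1]
    return n, -n @ centroid

def normal_residual(R, n_cam, n_lidar):
    """Camera-frame plate normal minus the lidar-frame plate normal rotated into the
    camera frame, after aligning signs (a plane normal is only defined up to sign)."""
    n = R @ n_lidar
    if n @ n_cam < 0:
        n = -n
    return n_cam - n
```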
In the third method, the nonlinear relation of the external parameters between the camera and the laser radar is established based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector of each edge of the calibration plate under the camera coordinate system, the direction vector of each edge of the calibration plate under the laser radar coordinate system, the linear characteristics of each edge of the calibration plate under the imaging coordinate system, and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images.
The cost function of the external parameters between the camera and the laser radar can be established based on the distance expression obtained in step 304, the direction-vector difference expression obtained in method one above, and the normal-vector difference expression obtained in method two above; specifically, the cost function can be
$$E(R_{CL}, t_{CL}) = \lambda_1 E_1(R_{CL}, t_{CL}) + \lambda_2 \sum_{m=1}^{N} \sum_{j} \left\lVert d_{m,j}^{C} - R_{CL}\, d_{m,j}^{L} \right\rVert_2^2 + \lambda_3 \sum_{m=1}^{N} \left\lVert n_{m}^{C} - R_{CL}\, n_{m}^{L} \right\rVert_2^2,$$
where $\lambda_1$, $\lambda_2$ and $\lambda_3$ are constant coefficients that can be adjusted as needed in actual use, but satisfy the conditions $0 < \lambda_1, \lambda_2, \lambda_3 < 1$ and $\lambda_1 + \lambda_2 + \lambda_3 = 1$.
In step 305, the external parameters between the camera and the laser radar are determined based on the nonlinear relationship described above.
In practice, the cost function obtained in step 304 and the related steps can be optimized using a common cost-function optimization method. When the value of the cost function is minimal, the corresponding external parameters of the camera and the laser radar are the optimal external parameters $R_{CL}$ and $t_{CL}$. The specific optimization method may be a nonlinear optimization method such as LM (Levenberg-Marquardt), gradient descent, or Newton's method; the specific optimization method adopted is not limited in the present application.
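Putting the pieces together, the optimization step can be sketched in Python with SciPy's Levenberg-Marquardt solver as below; the sketch reuses the residual helpers shown earlier, parameterizes the rotation as a rotation vector, and the layout of `samples`, the weight values, and the zero initial guess are assumptions made only for illustration.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def calibrate_extrinsics(samples, K, lam=(0.4, 0.3, 0.3), x0=np.zeros(6)):
    """Minimize the combined cost over (rotation vector, translation).
    Each entry of samples describes one calibration group: per plate edge the lidar
    edge points, the fitted image line (a, b, c), and the camera/lidar edge directions,
    plus the camera/lidar plate normals. lam = (lambda1, lambda2, lambda3), summing to 1."""
    # least_squares minimizes the sum of squared residuals, so weight by sqrt(lambda).
    w1, w2, w3 = np.sqrt(lam)

    def residuals(x):
        # reprojection_residuals, direction_residual, normal_residual: sketched earlier.
        R = Rotation.from_rotvec(x[:3]).as_matrix()
        t = x[3:]
        res = []
        for s in samples:
            for pts_L, line_abc, d_cam, d_lid in zip(s["edge_pts_lidar"], s["lines"],
                                                     s["dirs_cam"], s["dirs_lidar"]):
                res.append(w1 * reprojection_residuals(R, t, K, pts_L, line_abc))
                res.append(w2 * direction_residual(R, d_cam, d_lid))
            res.append(w3 * normal_residual(R, s["normal_cam"], s["normal_lidar"]))
        return np.concatenate(res)

    # A rough initial estimate is usually supplied in practice; zeros are a placeholder.
    sol = least_squares(residuals, x0, method="lm")   # Levenberg-Marquardt
    R_cl = Rotation.from_rotvec(sol.x[:3]).as_matrix()
    return R_cl, sol.x[3:]
```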
The embodiment of the application also provides a method for determining the external parameters between the camera and the laser radar, which can be completed by a system for determining the external parameters between the camera and the laser radar, wherein the system comprises the camera, the laser radar and computer equipment, and as shown in fig. 6, the processing flow of the method can comprise the following steps:
in step 601, the camera captures two-dimensional images of the calibration plate in a plurality of poses, respectively, and sends the captured two-dimensional images to the computer device.
Here, the plurality of poses are a plurality of different poses.
In step 602, the laser radar captures three-dimensional laser point cloud images of the calibration plate in the plurality of poses, respectively, and sends the captured three-dimensional laser point cloud images to the computer device.
In step 603, the computer device takes a two-dimensional image and a three-dimensional laser point cloud image of the calibration plate captured under the same pose by the camera and the laser radar, respectively, as one group of calibration images, thereby determining a plurality of groups of calibration images, and determines the external parameters between the camera and the laser radar based on the plurality of groups of calibration images.
Optionally, the computer device may make the determination based on a variety of information when determining the external parameters between the camera and the lidar, and the corresponding processing in step 603 may be as follows: determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images; determining the three-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images; and determining the external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera and the three-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system.
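The text does not prescribe at this point how the edge straight-line features are extracted from the two-dimensional image. One common approach, sketched below under the assumption that a binary mask of the calibration plate region is already available (the mask, the Canny thresholds and the Hough threshold are all assumptions of this sketch, not part of the original), is edge detection followed by a Hough transform:

```python
import cv2
import numpy as np


def edge_line_features(board_mask, num_edges=4):
    """Return the plate's edge lines in the imaging coordinate system as (a, b, c)
    with a*u + b*v + c = 0 and a**2 + b**2 == 1; board_mask is a uint8 binary mask."""
    edges = cv2.Canny(board_mask, 50, 150)
    # rho-theta parameterization: rho = u*cos(theta) + v*sin(theta)
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=80)
    feats = []
    if lines is None:
        return feats
    for rho, theta in lines[:num_edges, 0]:
        feats.append((np.cos(theta), np.sin(theta), -rho))
    return feats
```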
Optionally, in order to more accurately determine the external parameters between the camera and the lidar, there may be further processing in step 603 as follows: determining the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate corresponding to each group of calibration images under the world coordinate system and the external parameters between the camera coordinate system and the world coordinate system; and determining the external parameters between the camera and the laser radar according to the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera, the three-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, and the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system.
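For the plane characteristics in the camera coordinate system, the corner coordinates of the calibration plate in the world coordinate system are known from the plate geometry, and the external parameters between the camera coordinate system and the world coordinate system can be obtained with a standard PnP solution. The sketch below shows one possible realization with OpenCV; the helper name and the SVD-based normal estimate are illustrative rather than the procedure prescribed by the patent:

```python
import cv2
import numpy as np


def board_plane_in_camera(corners_world, corners_image, K, dist_coeffs=None):
    """Plane of the calibration plate in the camera coordinate system,
    returned as (unit normal, a point on the plane)."""
    ok, rvec, tvec = cv2.solvePnP(corners_world, corners_image, K, dist_coeffs)
    R_cw, _ = cv2.Rodrigues(rvec)                      # world -> camera rotation
    corners_cam = corners_world @ R_cw.T + tvec.reshape(1, 3)
    centroid = corners_cam.mean(axis=0)
    # the plane normal is the least-significant right-singular vector of the centred corners
    _, _, vt = np.linalg.svd(corners_cam - centroid)
    normal = vt[-1] / np.linalg.norm(vt[-1])
    return normal, centroid
```

The returned unit normal and centroid together describe the plate plane n·(X − X0) = 0 in the camera coordinate system.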
It should be noted that, the implementation of the method shown in fig. 6 is the same as that of the method shown in fig. 3, and will not be described here again.
Based on the same technical concept, the embodiment of the present application further provides an apparatus for determining external parameters between a camera and a lidar, where the apparatus may be the computer device in the foregoing embodiment. As shown in fig. 4, the apparatus includes: an acquisition module 410, a determining module 420, an establishing module 430 and an optimization module 440.
The acquisition module 410 is configured to acquire a plurality of groups of calibration images, each group of calibration images including a two-dimensional image and a three-dimensional laser point cloud image;
the determining module 420 is configured to determine, based on the two-dimensional images in each group of calibration images, the straight-line feature of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera;
the determining module 420 is configured to determine the three-dimensional coordinates of the edge points of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images;
the establishing module 430 is configured to establish a nonlinear relationship of the external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration plate under the imaging coordinate system and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, corresponding to each group of calibration images;
the optimization module 440 is configured to determine the external parameters between the camera and the laser radar based on the nonlinear relationship.
Optionally, the establishing module 430 is configured to:
determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system;
determining two-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the two-dimensional images in each group of calibration images;
determining a direction vector corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system;
determining a direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images based on the three-dimensional coordinates of the edge point of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images;
and establishing a nonlinear relationship of the external parameters between the camera and the laser radar based on the direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, the direction vectors corresponding to the edges of the calibration plate under the laser radar coordinate system, the linear characteristics of the edges of the calibration plate under the imaging coordinate system and the three-dimensional coordinates of the edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images (the direction-vector computations are sketched below).
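As a sketch of the two direction-vector steps just described, each edge direction can be fitted to its 3D points by SVD; for the camera-side vectors, the 2D edge points are first back-projected onto the plate plane obtained above. The function names are illustrative and the back-projection assumes an undistorted pinhole model, neither of which is prescribed by the patent:

```python
import numpy as np


def edge_direction(points_3d):
    """Unit direction vector of a plate edge, fitted to its 3D points as the
    principal direction of the centred point set (via SVD)."""
    centred = points_3d - points_3d.mean(axis=0)
    _, _, vt = np.linalg.svd(centred)
    return vt[0] / np.linalg.norm(vt[0])


def backproject_to_plane(uv, K, plane_normal, plane_point):
    """Intersect the camera ray through pixel (u, v) with the plate plane to obtain
    the corresponding edge point in the camera coordinate system."""
    ray = np.linalg.inv(K) @ np.array([uv[0], uv[1], 1.0])
    scale = (plane_normal @ plane_point) / (plane_normal @ ray)
    return scale * ray
```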
Optionally, the establishing module 430 is configured to:
determining a normal vector of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images;
determining the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system;
and establishing a nonlinear relation of external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector corresponding to each edge of the calibration plate under the camera coordinate system, the direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system and the three-dimensional coordinate of the edge point of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
Optionally, the establishing module 430 is configured to:
establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration plate under the camera coordinate system, a normal vector of a plane of the calibration plate under the laser radar coordinate system, a direction vector of each edge of the calibration plate under the camera coordinate system, a direction vector of each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images;
the optimizing module 440 is configured to:
and optimizing the cost function, and determining the external parameters between the camera and the laser radar.
Optionally, the establishing module 430 is configured to:
determining a normal vector expression of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the normal vector of the plane of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images, wherein unknown parameters in the normal vector expression comprise external parameters between the camera and the laser radar;
Calculating a difference value of a normal vector expression of the calibration plate under the camera coordinate system and a normal vector of the plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images to obtain a normal vector difference value expression;
determining a direction vector expression corresponding to each edge of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on a direction vector corresponding to each edge of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, wherein unknown parameters in the direction vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a direction vector expression corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images and a direction vector corresponding to each edge of the calibration plate under the camera coordinate system to obtain a direction vector difference value expression;
determining an edge point expression of the calibration plate corresponding to each group of calibration images under the imaging coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the internal parameters of the camera, wherein unknown parameters in the edge point expression comprise external parameters between the camera and the laser radar;
Calculating Euclidean distance for the edge point expression of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and each edge linear characteristic of the calibration plate under the imaging coordinate system to obtain a distance expression;
and establishing a cost function of the external parameters between the camera and the laser radar based on the normal vector difference value expression, the direction vector difference value expression and the distance expression.
Optionally, the optimizing module 440 is configured to:
and optimizing the cost function by using a nonlinear optimization device, and determining the external parameters between the camera and the laser radar.
Optionally, the calibration plate is polygonal, the scanning line of the laser radar is not parallel to any edge of the calibration plate, and each edge of the calibration plate has the scanning line of the laser radar passing through.
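This geometric condition also suggests one way, not spelled out in the text, to pick out the edge points of the plate from the point cloud: on every scan line that crosses the plate, the first and last points lying on the plate plane fall on two of its edges. A minimal sketch under that assumption, with the ring indices, the pre-fitted plane and the distance threshold all taken as assumed inputs:

```python
import numpy as np


def board_edge_points(scan, ring_ids, plane_normal, plane_point, tol=0.02):
    """Endpoints, per scan line, of the run of points lying on the plate plane;
    `scan` is an (N, 3) point cloud and `ring_ids` gives each point's scan-line index."""
    on_plane = np.abs((scan - plane_point) @ plane_normal) < tol
    edge_pts = []
    for ring in np.unique(ring_ids):
        pts = scan[(ring_ids == ring) & on_plane]
        if len(pts) < 2:
            continue
        # order the points along the scan direction by azimuth angle
        order = np.argsort(np.arctan2(pts[:, 1], pts[:, 0]))
        edge_pts.append(pts[order[0]])
        edge_pts.append(pts[order[-1]])
    return np.array(edge_pts)
```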
The specific manner in which the various modules perform the operations in the apparatus of the above embodiments has been described in detail in connection with the embodiments of the method, and will not be described in detail herein.
It should be noted that: the apparatus for determining external parameters between a camera and a laser radar provided in the above embodiment is illustrated only by the division of the above functional modules when determining the external parameters between the camera and the laser radar; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above. In addition, the apparatus for determining external parameters between a camera and a laser radar provided in the above embodiment belongs to the same concept as the method embodiment for determining external parameters between a camera and a laser radar; for the detailed implementation process, reference may be made to the method embodiment, which is not described here again.
In an exemplary embodiment, a computer-readable storage medium is also provided, in which at least one instruction is stored; the at least one instruction is loaded and executed by a processor to implement the method of determining external parameters between a camera and a laser radar in the above embodiments. For example, the computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Fig. 5 is a schematic structural diagram of a computer device according to an embodiment of the present application. The computer device 500 may vary considerably in configuration and performance, and may include one or more processors (central processing units, CPUs) 501 and one or more memories 502, where the memory 502 stores at least one instruction that is loaded and executed by the processor 501 to implement the above-mentioned method for determining external parameters between a camera and a laser radar.
It will be understood by those skilled in the art that all or part of the steps for implementing the above embodiments may be implemented by hardware, or may be implemented by a program for instructing relevant hardware, where the program may be stored in a computer readable storage medium, and the storage medium may be a read-only memory, a magnetic disk or an optical disk, etc.
The foregoing description of the preferred embodiments of the application is not intended to limit the application to the precise form disclosed, and any modifications, equivalents, and alternatives falling within the spirit and scope of the application are intended to be included within the scope of the application.

Claims (13)

1. A method of determining external parameters between a camera and a lidar, the method comprising:
acquiring a plurality of groups of calibration images, wherein each group of calibration images comprises a two-dimensional image and a three-dimensional laser point cloud image;
determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images;
determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images;
determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system;
determining two-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the two-dimensional images in each group of calibration images;
Determining a direction vector corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system;
determining a direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images based on the three-dimensional coordinates of the edge point of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images;
establishing a nonlinear relation of external parameters between the camera and the laser radar based on direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, direction vectors corresponding to the edges of the calibration plate under the laser radar coordinate system, linear characteristics of the edges of the calibration plate under the imaging coordinate system and three-dimensional coordinates of edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images;
determining, based on the nonlinear relationship, the external parameters between the camera and the lidar.
2. The method of claim 1, wherein the establishing the nonlinear relationship of the external parameters between the camera and the lidar based on the direction vector corresponding to the edges of the calibration plate under the camera coordinate system, the direction vector corresponding to the edges of the calibration plate under the lidar coordinate system, the edge straight line characteristics of the calibration plate under the imaging coordinate system, and the edge point three-dimensional coordinates of the calibration plate under the lidar coordinate system for each set of calibration images comprises:
Determining a normal vector of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images;
determining the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system;
and establishing a nonlinear relation of external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector corresponding to each edge of the calibration plate under the camera coordinate system, the direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system and the three-dimensional coordinate of the edge point of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
3. The method of claim 2, wherein the establishing a nonlinear relationship of the external parameters between the camera and the lidar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the lidar coordinate system, the direction vector of each edge of the calibration plate under the camera coordinate system, the direction vector of each edge of the calibration plate under the lidar coordinate system, each edge straight line feature of the calibration plate under the imaging coordinate system, and the edge point three-dimensional coordinates of the calibration plate under the lidar coordinate system for each set of calibration images comprises:
Establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration plate under the camera coordinate system, a normal vector of a plane of the calibration plate under the laser radar coordinate system, a direction vector of each edge of the calibration plate under the camera coordinate system, a direction vector of each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images;
the determining the external parameters between the camera and the laser radar based on the nonlinear relation comprises the following steps:
and optimizing the cost function, and determining the external parameters between the camera and the laser radar.
4. A method according to claim 3, wherein the establishing a cost function of the external parameters between the camera and the lidar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the lidar coordinate system, the direction vector of each edge of the calibration plate under the camera coordinate system, the direction vector of each edge of the calibration plate under the lidar coordinate system, the straight line feature of each edge of the calibration plate under the imaging coordinate system, and the three-dimensional coordinates of the edge points of the calibration plate under the lidar coordinate system for each set of calibration images comprises:
Determining a normal vector expression of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the normal vector of the plane of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images, wherein unknown parameters in the normal vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a normal vector expression of the calibration plate under the camera coordinate system and a normal vector of the plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images to obtain a normal vector difference value expression;
determining a direction vector expression corresponding to each edge of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on a direction vector corresponding to each edge of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, wherein unknown parameters in the direction vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a direction vector expression corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images and a direction vector corresponding to each edge of the calibration plate under the camera coordinate system to obtain a direction vector difference value expression;
Determining an edge point expression of the calibration plate corresponding to each group of calibration images under the imaging coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the internal parameters of the camera, wherein unknown parameters in the edge point expression comprise external parameters between the camera and the laser radar;
calculating Euclidean distance for the edge point expression of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and each edge linear characteristic of the calibration plate under the imaging coordinate system to obtain a distance expression;
and establishing a cost function of the external parameters between the camera and the laser radar based on the normal vector difference value expression, the direction vector difference value expression and the distance expression.
5. The method according to claim 3, wherein the optimizing the cost function and determining the external parameters between the camera and the lidar comprises:
and optimizing the cost function by using a nonlinear optimization method, and determining the external parameters between the camera and the laser radar.
6. The method of any one of claims 1-5, wherein the calibration plate is polygonal, and the scan lines of the lidar are not parallel to any edge of the calibration plate, and wherein each edge of the calibration plate has the scan lines of the lidar passing therethrough.
7. A system for determining an external parameter between a camera and a lidar, the system comprising: camera, lidar and computer device, wherein:
the camera is used for respectively shooting two-dimensional images of the calibration plates in a plurality of postures and sending the shot two-dimensional images to the computer equipment;
the laser radar is used for respectively shooting three-dimensional laser point cloud images of the calibration plates in the plurality of poses and sending the shot three-dimensional laser point cloud images to the computer equipment;
the computer equipment is used for taking a two-dimensional image and a three-dimensional laser point cloud image of the calibration plate under the same pose, which are respectively shot by the camera and the laser radar, as a group of calibration images, and determining a plurality of groups of calibration images; determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images; determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images; determining plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the coordinates of each angular point of the calibration plate under the world coordinate system corresponding to each group of calibration images and external parameters between the camera coordinate system and the world coordinate system; and determining external parameters between the camera and the laser radar based on the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera, the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system of the camera.
8. An apparatus for determining an external parameter between a camera and a lidar, the apparatus comprising:
the acquisition module is used for acquiring a plurality of groups of calibration images, wherein each group of calibration images comprises a two-dimensional image and a three-dimensional laser point cloud image;
the determining module is used for determining the linear characteristics of each edge of the calibration plate corresponding to each group of calibration images under the imaging coordinate system of the camera based on the two-dimensional images in each group of calibration images;
the determining module is used for determining three-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under a laser radar coordinate system based on the three-dimensional point cloud images in each group of calibration images;
the establishing module is used for determining the plane characteristics of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the angular point coordinates of the calibration plate corresponding to each group of calibration images under the world coordinate system and the external parameters between the camera coordinate system and the world coordinate system; determining two-dimensional coordinates of edge points of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on the two-dimensional images in each group of calibration images; determining a direction vector corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images and the two-dimensional coordinates of the edge points of the calibration plate under the camera coordinate system; determining a direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images based on the three-dimensional coordinates of the edge point of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images; establishing a nonlinear relation of external parameters between the camera and the laser radar based on direction vectors corresponding to the edges of the calibration plate under the camera coordinate system, direction vectors corresponding to the edges of the calibration plate under the laser radar coordinate system, linear characteristics of the edges of the calibration plate under the imaging coordinate system and three-dimensional coordinates of edge points of the calibration plate under the laser radar coordinate system, which correspond to each group of calibration images;
And the optimization module is used for determining external parameters between the camera and the laser radar based on the nonlinear relation.
9. The apparatus of claim 8, wherein the means for establishing is configured to:
determining a normal vector of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the plane characteristics of the calibration plate under the camera coordinate system corresponding to each group of calibration images;
determining the normal vector of the plane of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system;
and establishing a nonlinear relation of external parameters between the camera and the laser radar based on the normal vector of the plane of the calibration plate under the camera coordinate system, the normal vector of the plane of the calibration plate under the laser radar coordinate system, the direction vector corresponding to each edge of the calibration plate under the camera coordinate system, the direction vector corresponding to each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system and the three-dimensional coordinate of the edge point of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images.
10. The apparatus of claim 9, wherein the means for establishing is configured to:
establishing a cost function of external parameters between the camera and the laser radar based on a normal vector of a plane of the calibration plate under the camera coordinate system, a normal vector of a plane of the calibration plate under the laser radar coordinate system, a direction vector of each edge of the calibration plate under the camera coordinate system, a direction vector of each edge of the calibration plate under the laser radar coordinate system, each edge straight line characteristic of the calibration plate under the imaging coordinate system, and an edge point three-dimensional coordinate of the calibration plate under the laser radar coordinate system, which are corresponding to each group of calibration images;
the optimizing module is used for:
and optimizing the cost function, and determining the external parameters between the camera and the laser radar.
11. The apparatus of claim 10, wherein the means for establishing is configured to:
determining a normal vector expression of a plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images based on the normal vector of the plane of the calibration plate under the laser radar coordinate system corresponding to each group of calibration images, wherein unknown parameters in the normal vector expression comprise external parameters between the camera and the laser radar;
Calculating a difference value of a normal vector expression of the calibration plate under the camera coordinate system and a normal vector of the plane of the calibration plate under the camera coordinate system corresponding to each group of calibration images to obtain a normal vector difference value expression;
determining a direction vector expression corresponding to each edge of the calibration plate corresponding to each group of calibration images under the camera coordinate system based on a direction vector corresponding to each edge of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system, wherein unknown parameters in the direction vector expression comprise external parameters between the camera and the laser radar;
calculating a difference value of a direction vector expression corresponding to each edge of the calibration plate under the camera coordinate system corresponding to each group of calibration images and a direction vector corresponding to each edge of the calibration plate under the camera coordinate system to obtain a direction vector difference value expression;
determining an edge point expression of the calibration plate corresponding to each group of calibration images under the imaging coordinate system based on the three-dimensional coordinates of the edge point of the calibration plate corresponding to each group of calibration images under the laser radar coordinate system and the internal parameters of the camera, wherein unknown parameters in the edge point expression comprise external parameters between the camera and the laser radar;
Calculating Euclidean distance for the edge point expression of the calibration plate under the imaging coordinate system corresponding to each group of calibration images and each edge linear characteristic of the calibration plate under the imaging coordinate system to obtain a distance expression;
and establishing a cost function of the external parameters between the camera and the laser radar based on the normal vector difference value expression, the direction vector difference value expression and the distance expression.
12. The apparatus of claim 10, wherein the optimization module is configured to:
and optimizing the cost function by using a nonlinear optimization device, and determining the external parameters between the camera and the laser radar.
13. The apparatus of any one of claims 8-12, wherein the calibration plate is polygonal, and the scan lines of the lidar are not parallel to any edge of the calibration plate, and wherein each edge of the calibration plate has the scan lines of the lidar passing therethrough.
CN201910308370.4A 2019-04-17 2019-04-17 Method and device for determining external parameters between camera and laser radar Active CN111862224B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910308370.4A CN111862224B (en) 2019-04-17 2019-04-17 Method and device for determining external parameters between camera and laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910308370.4A CN111862224B (en) 2019-04-17 2019-04-17 Method and device for determining external parameters between camera and laser radar

Publications (2)

Publication Number Publication Date
CN111862224A CN111862224A (en) 2020-10-30
CN111862224B true CN111862224B (en) 2023-09-19

Family

ID=72951253

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910308370.4A Active CN111862224B (en) 2019-04-17 2019-04-17 Method and device for determining external parameters between camera and laser radar

Country Status (1)

Country Link
CN (1) CN111862224B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112446927A (en) * 2020-12-18 2021-03-05 广东电网有限责任公司 Combined calibration method, device and equipment for laser radar and camera and storage medium
CN112734857B (en) * 2021-01-08 2021-11-02 香港理工大学深圳研究院 Calibration method for camera internal reference and camera relative laser radar external reference and electronic equipment
CN113256729B (en) * 2021-03-17 2024-06-18 广西综合交通大数据研究院 External parameter calibration method, device and equipment for laser radar and camera and storage medium
CN113177988B (en) * 2021-04-30 2023-12-05 中德(珠海)人工智能研究院有限公司 Spherical screen camera and laser calibration method, device, equipment and storage medium
CN113436278A (en) * 2021-07-22 2021-09-24 深圳市道通智能汽车有限公司 Calibration method, calibration device, distance measurement system and computer readable storage medium
CN113610929B (en) * 2021-08-09 2023-08-18 西安外事学院 Combined calibration method of camera and multi-line laser
CN113838141B (en) * 2021-09-02 2023-07-25 中南大学 External parameter calibration method and system for single-line laser radar and visible light camera
CN114488097A (en) * 2022-01-26 2022-05-13 广州小鹏自动驾驶科技有限公司 External parameter calibration method of laser radar, computer equipment and computer storage medium
CN114758005B (en) * 2022-03-23 2023-03-28 中国科学院自动化研究所 Laser radar and camera external parameter calibration method and device
CN115994955B (en) * 2023-03-23 2023-07-04 深圳佑驾创新科技有限公司 Camera external parameter calibration method and device and vehicle
CN116137039B (en) * 2023-04-14 2023-09-12 深圳大学 Visual and laser sensor external parameter correction method and related equipment


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110573901A (en) * 2017-04-28 2019-12-13 深圳市大疆创新科技有限公司 calibration of laser sensor and vision sensor
US10474161B2 (en) * 2017-07-03 2019-11-12 Baidu Usa Llc High resolution 3D point clouds generation from upsampled low resolution lidar 3D point clouds and camera images

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101882313A (en) * 2010-07-14 2010-11-10 中国人民解放军国防科学技术大学 Calibration method of correlation between single line laser radar and CCD (Charge Coupled Device) camera
CN104156972A (en) * 2014-08-25 2014-11-19 西北工业大学 Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras
CN104484887A (en) * 2015-01-19 2015-04-01 河北工业大学 External parameter calibration method used when camera and two-dimensional laser range finder are used in combined mode
CN104656097A (en) * 2015-01-28 2015-05-27 武汉理工大学 Calibration device based on rotary type two-dimensional laser three-dimensional reconstruction system
CN106097348A (en) * 2016-06-13 2016-11-09 大连理工大学 A kind of three-dimensional laser point cloud and the fusion method of two dimensional image
CN107976669A (en) * 2016-10-21 2018-05-01 法乐第(北京)网络科技有限公司 A kind of device of outer parameter between definite camera and laser radar
CN107255443A (en) * 2017-07-14 2017-10-17 北京航空航天大学 Binocular vision sensor field calibration method and device under a kind of complex environment
CN107862719A (en) * 2017-11-10 2018-03-30 未来机器人(深圳)有限公司 Scaling method, device, computer equipment and the storage medium of Camera extrinsic
CN109615661A (en) * 2017-12-05 2019-04-12 西北工业大学 Light-field camera intrinsic parameter caliberating device and method
CN108389233A (en) * 2018-02-23 2018-08-10 大连理工大学 The laser scanner and camera calibration method approached based on boundary constraint and mean value
CN108399643A (en) * 2018-03-15 2018-08-14 南京大学 A kind of outer ginseng calibration system between laser radar and camera and method
CN109270534A (en) * 2018-05-07 2019-01-25 西安交通大学 A kind of intelligent vehicle laser sensor and camera online calibration method

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
Zhen Chen, et al. Extrinsic Calibration of a Camera and a Laser Range Finder using Point to Line Constraint. Procedia Engineering, vol. 29, 4348-4352 *
Vivien Potó, et al. Laser scanned point clouds to support autonomous vehicles. Transportation Research Procedia, vol. 27, 531-537 *
Yao Wentao, et al. An adaptive joint calibration algorithm for camera and lidar. Control Engineering of China, vol. 24, no. S0, 75-79 *
Han Zhengyong, et al. An extrinsic calibration method for a pinhole camera and a 3D lidar. Transducer and Microsystem Technologies, no. 04, 14-17+21 *
Li Lin, et al. Joint calibration method for an integrated 2D and 3D vision sensing system. Chinese Journal of Scientific Instrument, no. 11, 75-81 *
Zhou Chao, et al. Research on straight-line distance measurement based on camera calibration. Science & Technology Information, no. 14, 86+88 *
Yao Wentao, et al. An adaptive joint calibration algorithm for camera and lidar. Control Engineering of China, 2017, no. S1, 77-81 *
Sun Junfeng, et al. Camera parameter optimization method using a straight-line model. Optics and Precision Engineering, no. 10, 232-242 *

Also Published As

Publication number Publication date
CN111862224A (en) 2020-10-30

Similar Documents

Publication Publication Date Title
CN111862224B (en) Method and device for determining external parameters between camera and laser radar
EP3158532B1 (en) Local adaptive histogram equalization
CN110223226B (en) Panoramic image splicing method and system
CN110717942B (en) Image processing method and device, electronic equipment and computer readable storage medium
CN105279372B (en) A kind of method and apparatus of determining depth of building
WO2017076106A1 (en) Method and device for image splicing
WO2021196548A1 (en) Distance determination method, apparatus and system
CN110915193B (en) Image processing system, server device, image processing method, and recording medium
WO2016155110A1 (en) Method and system for correcting image perspective distortion
CN108734738B (en) Camera calibration method and device
CN113012234B (en) High-precision camera calibration method based on plane transformation
CN111383254A (en) Depth information acquisition method and system and terminal equipment
CN115082777A (en) Binocular vision-based underwater dynamic fish form measuring method and device
US20240054662A1 (en) Capsule endoscope image three-dimensional reconstruction method, electronic device, and readable storage medium
CN109496326B (en) Image processing method, device and system
CN105184736B (en) A kind of method of the image registration of narrow overlapping double-view field hyperspectral imager
RU2692970C2 (en) Method of calibration of video sensors of the multispectral system of technical vision
WO2020025509A1 (en) Method and system for mapping the non-uniformity of an image sensor
WO2023273158A1 (en) Method and apparatus for determining operating range of camera in cooperative vehicle infrastructure and roadside device
CN111062984B (en) Method, device, equipment and storage medium for measuring area of video image area
CN115170670A (en) External parameter calibration method, device and program product
US20220198814A1 (en) Image dewarping with curved document boundaries
CN111387987A (en) Height measuring method, device, equipment and storage medium based on image recognition
WO2022253043A1 (en) Facial deformation compensation method for facial depth image, and imaging apparatus and storage medium
JP2004354234A (en) Camera calibration method for photogrammetry

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant