CN115147495A - Calibration method, device and system for vehicle-mounted system - Google Patents

Calibration method, device and system for vehicle-mounted system

Info

Publication number
CN115147495A
CN115147495A (application CN202210622845.9A)
Authority
CN
China
Prior art keywords
relative
camera
calibration
calibration plate
laser radar
Prior art date
Legal status
Pending
Application number
CN202210622845.9A
Other languages
Chinese (zh)
Inventor
谢子锐
王一夫
张如高
虞正华
Current Assignee
Motovis Technology Shanghai Co ltd
Original Assignee
Motovis Technology Shanghai Co ltd
Priority date
Application filed by Motovis Technology Shanghai Co ltd filed Critical Motovis Technology Shanghai Co ltd
Priority to CN202210622845.9A priority Critical patent/CN115147495A/en
Publication of CN115147495A publication Critical patent/CN115147495A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T2219/20 Indexing scheme for editing of 3D models
    • G06T2219/2016 Rotation, translation, scaling

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The invention discloses a calibration method, a calibration device and a calibration system for a vehicle-mounted system. The vehicle-mounted system comprises a laser radar and a multi-camera system, the laser radar having a common view area with each camera. The calibration method comprises: when the camera shoots a calibration plate in the common view area, acquiring a first relative pose of the camera relative to a characteristic pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate; constructing a geometric constraint equation for constraining the relative pose of the camera relative to the laser radar by using the first relative pose and the second relative pose; and determining the relative pose of the camera relative to the laser radar based on the geometric constraint equation. The calibration precision is improved.

Description

Calibration method, device and system for vehicle-mounted system
Technical Field
The invention relates to the technical field of automatic driving, in particular to a calibration method, a calibration device and a calibration system for a vehicle-mounted system.
Background
Unmanned driving technology, also called automatic driving technology, refers to technology by which a vehicle senses its driving environment through on-board sensors without manual operation, autonomously decides a driving path, and controls the vehicle along the desired path to the desired destination. Unmanned driving is an integration of many technologies, mainly including real-time sensing and positioning, motion path planning, communication and data interaction, and intelligent vehicle control.
As in traditional driving, real-time sensing and positioning of the vehicle's running environment (Simultaneous Localization and Mapping, SLAM) is the basis on which an unmanned system makes decisions and exercises control. To meet the requirements of real-time performance and reliability, an unmanned vehicle is generally equipped with various sensors such as vision cameras, an IMU (Inertial Measurement Unit), laser radars, and wheel speed meters. Before these sensors are put into use, the external parameters between them need to be calibrated.
At present, calibration methods for a laser radar and a multi-camera system suffer from low precision.
Disclosure of Invention
In view of this, embodiments of the present invention provide a calibration method, apparatus, system and computer-readable storage medium for a vehicle-mounted system, which can improve calibration accuracy.
The invention provides a calibration method for a vehicle-mounted system, wherein the vehicle-mounted system comprises a laser radar and a multi-camera system, the laser radar and a camera have a common view area, and the method comprises the following steps:
when the camera shoots a calibration plate in the common view area, acquiring a first relative pose of the camera relative to a feature pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate;
constructing a geometric constraint equation for constraining the relative pose of the camera with respect to the lidar, using the first relative pose and the second relative pose; and
determining a relative pose of the camera with respect to the lidar based on the geometric constraint equation.
In some embodiments, each camera takes a plurality of shots of the calibration plate to obtain a plurality of frame images;
the multi-view geometric constraint equation is expressed as follows:
T^{ij}_{pc} · T^{j}_{cl} = T_{pb} · T^{ij}_{bl}
wherein T^{ij}_{pc} denotes the first relative pose of the jth camera with respect to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image, T^{ij}_{bl} denotes the second relative pose of the laser radar with respect to the calibration plate when the jth camera captures the ith frame image, T^{j}_{cl} denotes the relative pose of the laser radar with respect to the jth camera, and T_{pb} denotes the relative pose of the calibration plate with respect to the characteristic pattern of the calibration plate.
In some embodiments, the second relative pose T^{ij}_{bl} of the laser radar relative to the calibration plate is obtained based on the following expression:
R^{ij}_{bl} = (R^{ij}_{lb})^T, t^{ij}_{bl} = -(R^{ij}_{lb})^T · O_b
wherein R^{ij}_{lb} denotes the rotation matrix of the calibration plate relative to the laser radar when the jth camera captures the ith frame image, O_b denotes the origin of the calibration plate point cloud coordinate system, R^{ij}_{bl} denotes the rotation matrix of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, and t^{ij}_{bl} denotes the translation of the laser radar relative to the calibration plate when the jth camera captures the ith frame image.
In some embodiments, the rotation matrix R^{ij}_{lb} of the calibration plate relative to the laser radar can be obtained as follows:
sequentially taking the upper-left corner point, the top-edge straight line and the left-edge straight line of the calibration plate point cloud as the origin O_b, the x-axis v_x and the y-axis v_y of the calibration plate point cloud coordinate system, and meanwhile taking the normal vector of the three-dimensional plane corresponding to the calibration plate point cloud as the z-axis v_z of the calibration plate point cloud coordinate system;
using the x-axis v_x, the y-axis v_y and the z-axis v_z to construct an initial rotation matrix R_init of the calibration plate relative to the laser radar, performing singular value decomposition on the initial rotation matrix R_init, and regenerating the rotation matrix R^{ij}_{lb} of the calibration plate relative to the laser radar.
In some embodiments, regenerating the rotation matrix R^{ij}_{lb} of the calibration plate relative to the laser radar comprises: decomposing R_init = U W V^T using the singular value decomposition (SVD) method commonly used in mathematics, and setting W to the identity matrix to generate the rotation matrix R^{ij}_{lb} = U V^T of the calibration plate relative to the laser radar.
In some embodiments, determining the relative pose of the camera relative to the laser radar comprises: setting the rotation matrix R_{pb} of the relative pose T_{pb} to the identity matrix, converting the geometric constraint equation into the following linear equations, and determining the relative pose of the camera relative to the laser radar from the linear equations:
R^{ij}_{pc} · R^{j}_{cl} = R^{ij}_{bl}
R^{ij}_{pc} · t^{j}_{cl} + t^{ij}_{pc} = t^{ij}_{bl} + t_{pb}
wherein R^{ij}_{pc} and t^{ij}_{pc} respectively denote the rotation matrix and the translation of the jth camera relative to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image, R^{ij}_{bl} and t^{ij}_{bl} respectively denote the rotation matrix and the translation of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, R^{j}_{cl} and t^{j}_{cl} respectively denote the rotation matrix and the translation of the laser radar relative to the jth camera, and t_{pb} denotes the translation of the calibration plate relative to the characteristic pattern of the calibration plate.
In some embodiments, determining the relative pose of the camera relative to the laser radar from the linear equations comprises converting the linear equations into the following equation sets, and solving the equation sets to obtain the relative pose of each camera relative to the laser radar:
(I_3 ⊗ R^{ij}_{pc}) · vec(R^{j}_{cl}) = vec(R^{ij}_{bl})
R^{ij}_{pc} · t^{j}_{cl} - t_{pb} = t^{ij}_{bl} - t^{ij}_{pc}
wherein vec(·) stacks the columns of a matrix into a vector, I_3 denotes the 3×3 identity matrix, and ⊗ denotes the Kronecker product.
The invention also provides a calibration device, wherein the vehicle-mounted system comprises a laser radar and a multi-camera system, the laser radar and the cameras have a common view area, and the calibration device comprises:
the pose acquisition module is used for acquiring a first relative pose of the camera relative to a characteristic pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate when the camera shoots the calibration plate in the common view area;
an equation construction module for constructing a geometric constraint equation for constraining the relative pose of the camera with respect to the lidar, using the first relative pose and the second relative pose; and
and the calibration module is used for determining the relative pose of the camera relative to the laser radar based on the geometric constraint equation.
In another aspect, the present invention also provides a calibration system, which includes a processor and a memory, where the memory is used to store a computer program, and the computer program is executed by the processor to implement the method described above.
In another aspect, the present invention also provides a computer-readable storage medium for storing a computer program, which when executed by a processor implements the method as described above.
In the technical solutions provided by the foregoing embodiments of the present application, the lidar has a common-view area with each camera in the multi-camera system, and when each camera photographs a calibration board in the common-view area, a first relative pose of the camera with respect to a feature pattern of the calibration board and a second relative pose of the lidar with respect to the calibration board are obtained, and a geometric constraint equation that constrains relative poses of the cameras with respect to the lidar is constructed by using the first relative pose and the second relative pose, so as to determine relative poses between the lidar and each camera. The calibration precision can be improved.
Drawings
The features and advantages of the present invention will be more clearly understood by reference to the accompanying drawings, which are illustrative and not to be construed as limiting the invention in any way, and in which:
FIG. 1 illustrates a schematic diagram of an on-board system provided by one embodiment of the present application;
FIG. 2 is a schematic flow chart illustrating a calibration method according to an embodiment of the present application;
FIG. 3 is a functional block diagram of a calibration apparatus provided in an embodiment of the present application;
fig. 4 shows a schematic structural diagram of a calibration system provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of the embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of the present invention.
Please refer to fig. 1, which is a schematic diagram of a vehicle-mounted system according to an embodiment of the present application. The vehicle-mounted system comprises a laser radar and a multi-camera system. In fig. 1, the rectangular frame indicates the vehicle body, the small circles indicate cameras, and the large circle indicates the laser radar. A plurality of cameras arranged around the vehicle body form a multi-camera system, realizing real-time perception of the surroundings of the vehicle. There may be no overlapping field of view, or only a small one, between the cameras. The laser radar has a common view area with the cameras. The calibration method for the vehicle-mounted system of the present application can calibrate the external parameters between the multi-camera system and the laser radar, where the external parameters refer to the relative pose of each camera with respect to the laser radar.
In some embodiments, when calibrating the extrinsic parameters between the multi-camera system and the lidar by the calibration method of the present application, the relative positions of the cameras and the lidar need to be kept unchanged. In addition, each camera needs to complete internal reference calibration in advance, and time synchronization needs to be completed between each camera and the laser radar in advance.
In some embodiments, the multi-camera system and the laser radar may be disposed on a movable sensor platform. During the movement of the sensor platform, the calibration plate used for calibration can sequentially and completely appear in the common view area of each camera and the laser radar. Each camera may in turn photograph the calibration plate appearing in the common view area to obtain an image of the calibration plate. Meanwhile, when a camera photographs the calibration plate in the common view area, the calibration plate can be scanned by the laser radar to obtain the corresponding point cloud information. The calibration plate can be an ordinary calibration plate.
In other embodiments, the multi-camera system and the lidar may also remain stationary, i.e. the multi-camera system and the lidar are not arranged on a movable sensor platform. Through removing the calibration plate, the calibration plate is enabled to sequentially and completely appear in the common visual area of each camera and the laser radar, so that the cameras and the laser radar can conveniently acquire data.
The present application is described with the example of a multi-camera system and a lidar mounted on a movable sensor platform.
Please refer to fig. 2, which is a flowchart illustrating a calibration method according to an embodiment of the present application. The calibration method includes steps S21 to S23.
S21, when the camera shoots the calibration plate in the common view area, acquiring a first relative pose of the camera relative to the characteristic pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate.
In some embodiments, the camera described in the calibration method of the present application may be a camera in the multi-camera system whose external parameters with respect to the laser radar need to be calibrated. Specifically, it may be one or some of the cameras in the multi-camera system, or all of the cameras in the multi-camera system. The following description takes the external parameter calibration of all cameras in the multi-camera system with the laser radar as an example.
In some embodiments, a first relative pose of the camera with respect to a feature pattern (e.g., checkerboard pattern) of the calibration plate may be determined based on an image taken of the calibration plate in the common view region by the camera. Based on point cloud information obtained by scanning the calibration plate by the laser radar when the camera shoots the calibration plate in the common view area, the second relative pose of the laser radar relative to the calibration plate can be determined.
In some embodiments, for the case of performing extrinsic parameter calibration on multiple cameras and a laser radar, the multiple cameras may be controlled to sequentially shoot the calibration board in the common view area, and the laser radar is controlled to scan the calibration board while each camera shoots the calibration board. For any camera, according to an image obtained by shooting the calibration plate by the camera, a first relative pose of the camera relative to a characteristic pattern of the calibration plate can be determined, and according to point cloud information obtained by scanning the calibration plate by a laser radar when the camera shoots the calibration plate, a second relative pose of the laser radar relative to the calibration plate can be determined when the camera shoots the calibration plate. I.e. for the case of multiple cameras, there may be multiple first relative poses and multiple second relative poses. For example, assume that a multi-camera system includes a camera a and a camera B. According to the image obtained by shooting the calibration plate by the camera A, the first relative pose of the camera A relative to the characteristic pattern of the calibration plate can be determined, and according to the point cloud information obtained by scanning the calibration plate by the laser radar when the camera A shoots the calibration plate, the second relative pose of the laser radar relative to the calibration plate can be determined when the camera A shoots the calibration plate. 
Similarly, according to an image obtained by shooting the calibration plate by the camera B, a first relative pose of the camera B relative to the characteristic pattern of the calibration plate can be determined, and according to point cloud information obtained by scanning the calibration plate by the laser radar when the camera B shoots the calibration plate, a second relative pose of the laser radar relative to the calibration plate when the camera B shoots the calibration plate can be determined.
In some embodiments, each camera may take multiple shots of the calibration plate to obtain multiple frames of images. For any camera, according to any frame of image shot by the camera, the first relative pose of the camera relative to the characteristic pattern of the calibration plate when the camera shoots that frame can be determined. Correspondingly, from the point cloud information obtained by scanning the calibration plate with the laser radar when the camera shoots that frame, the second relative pose of the laser radar relative to the calibration plate at that moment can be determined. For example, based on the ith frame image captured by the jth camera, the first relative pose of the jth camera with respect to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image can be determined. Based on the point cloud information obtained by scanning the calibration plate with the laser radar when the jth camera captures the ith frame image, the second relative pose of the laser radar with respect to the calibration plate when the jth camera captures the ith frame image can be determined.
In some embodiments, for any camera, the camera may capture the calibration plate at different positions, thereby obtaining multiple frames of images captured by the camera. By moving the sensor platform, the position of the camera can be changed. When the same camera shoots the calibration board at different positions, the images of the calibration board obtained by shooting can be not identical due to the difference of shooting angles and the like. Therefore, the calibration method can ensure that the data for calibration are comprehensive and improve the precision of the calibration result. Specifically, each camera can take at least 3 frames of images. Of course, in other embodiments, each camera may only take one frame of image of the calibration board. In this way, the amount of data collected is reduced.
How to determine the first relative pose is explained below.
In some embodiments, based on the image of the calibration plate taken by the camera, 2D pixel coordinates of a feature pattern (e.g., checkerboard pattern) of the calibration plate in the camera may be determined, and then a first relative pose of the camera with respect to the feature pattern of the calibration plate may be calculated from the feature pattern on the calibration plate and the 2D pixel coordinates in the camera.
In some embodiments, a PnP algorithm in computer vision may be employed to calculate a first relative pose of each camera with respect to the feature pattern of the calibration plate based on the feature pattern on the calibration plate and the 2D pixel coordinates in the camera.
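The planar PnP computation can be sketched as follows. This is an illustrative numpy implementation, not the patent's own code; the function names and the decomposition route are choices of this sketch. For a planar pattern (Z = 0), the homography satisfies H ~ K [r1 r2 t], so the pose can be read off from K^-1 H after estimating H with the direct linear transform:

```python
import numpy as np

def estimate_homography(obj_xy, img_uv):
    # Direct linear transform: stack two equations per correspondence and take
    # the right singular vector of the smallest singular value as vec(H).
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_uv):
        A.append([X, Y, 1, 0, 0, 0, -u * X, -u * Y, -u])
        A.append([0, 0, 0, X, Y, 1, -v * X, -v * Y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)

def planar_pose(obj_xy, img_uv, K):
    # For a planar target (Z = 0): H ~ K [r1 r2 t], so the first two rotation
    # columns and the translation can be recovered from K^-1 H up to scale.
    B = np.linalg.inv(K) @ estimate_homography(obj_xy, img_uv)
    s = 1.0 / np.linalg.norm(B[:, 0])
    if s * B[2, 2] < 0:              # the plate must lie in front of the camera
        s = -s
    r1, r2, t = s * B[:, 0], s * B[:, 1], s * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)      # re-project onto a proper rotation matrix
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R, t
```

planar_pose returns the pose of the pattern in the camera frame; its inverse gives the pose of the camera with respect to the pattern. In practice a library routine such as OpenCV's solvePnP would typically be used for this step.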
How to determine the second relative pose is explained below.
In some embodiments, the calibration plate and the calibration plate point cloud are coplanar, and the second relative pose of the laser radar relative to the calibration plate may be taken as the pose of the laser radar relative to the calibration plate point cloud. Specifically, the calibration plate point cloud corresponding to the calibration plate may be manually selected from the point cloud information acquired by the laser radar. The present application is illustrated with a rectangular calibration plate as an example. Four corner points corresponding to the rectangular calibration plate can be manually selected from the point cloud information acquired by the laser radar, and the connecting lines of the four corner points are used as the four edges of the calibration plate. The point cloud enclosed by the four edges can be used as the calibration plate point cloud corresponding to the calibration plate.
In some embodiments, the upper-left corner point, the top-edge straight line and the left-edge straight line of the calibration plate point cloud are sequentially taken as the origin O_b, the x-axis v_x and the y-axis v_y of the calibration plate point cloud coordinate system. Meanwhile, the three-dimensional plane corresponding to the calibration plate point cloud is obtained by fitting based on the least-squares method and the RANSAC method, and the normal vector of this plane is taken as the z-axis v_z of the calibration plate point cloud coordinate system. The x-axis v_x, the y-axis v_y and the z-axis v_z are then used to construct an initial rotation matrix R_init of the calibration plate (i.e., the calibration plate point cloud) relative to the laser radar; singular value decomposition is performed on R_init, and the rotation matrix R^{ij}_{lb} of the calibration plate relative to the laser radar is regenerated. Specifically, the singular value decomposition (SVD) method commonly used in mathematics gives R_init = U W V^T; setting W to the identity matrix makes the three axes of the rotation matrix satisfy the orthogonality relation and regenerates the rotation matrix R^{ij}_{lb} = U V^T of the calibration plate relative to the laser radar.
Based on the obtained rotation matrix R^{ij}_{lb}, in some embodiments, the second relative pose T^{ij}_{bl} of the laser radar relative to the calibration plate can be found based on the following expression:
R^{ij}_{bl} = (R^{ij}_{lb})^T, t^{ij}_{bl} = -(R^{ij}_{lb})^T · O_b
wherein R^{ij}_{lb} denotes the rotation matrix of the calibration plate relative to the laser radar when the jth camera captures the ith frame image, O_b denotes the origin of the calibration plate point cloud coordinate system, R^{ij}_{bl} denotes the rotation matrix of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, and t^{ij}_{bl} denotes the translation of the laser radar relative to the calibration plate when the jth camera captures the ith frame image. As will be appreciated by those skilled in the art, a relative pose comprises a rotation matrix and a translation, so the second relative pose T^{ij}_{bl} of the laser radar relative to the calibration plate can be represented by the rotation matrix R^{ij}_{bl} and the translation t^{ij}_{bl}.
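Computing the second relative pose thus amounts to inverting the measured board-in-lidar pose (rotation R_lb of the plate relative to the laser radar, origin O_b of the board point cloud coordinate system). A minimal illustrative numpy sketch (the function name is an assumption):

```python
import numpy as np

def second_relative_pose(R_lb, O_b):
    # R_lb: rotation of the calibration plate w.r.t. the lidar (its columns are
    # the board axes v_x, v_y, v_z in lidar coordinates); O_b: origin of the
    # board point cloud coordinate system in lidar coordinates.
    # The pose of the lidar w.r.t. the board is the inverse transform:
    R_bl = R_lb.T
    t_bl = -R_lb.T @ O_b
    return R_bl, t_bl
```

Composing the measured pose with this inverse yields the identity transform, which is a convenient sanity check in practice.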
S22, constructing a geometric constraint equation for constraining the relative pose of the camera relative to the laser radar by using the first relative pose and the second relative pose.
In some embodiments, the geometric constraint equation may be expressed as follows:
T^{ij}_{pc} · T^{j}_{cl} = T_{pb} · T^{ij}_{bl}
wherein T^{ij}_{pc} denotes the first relative pose of the jth camera relative to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image, T^{ij}_{bl} denotes the second relative pose of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, T^{j}_{cl} denotes the relative pose of the laser radar relative to the jth camera, and T_{pb} denotes the relative pose of the calibration plate relative to the characteristic pattern of the calibration plate.
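Both sides of the constraint describe the same composite transform from lidar coordinates into pattern coordinates, which can be checked numerically. An illustrative numpy check with synthetic poses (all numeric values are assumed example values of this sketch, not from the patent):

```python
import numpy as np

def T(R, t):
    # 4x4 homogeneous transform from rotation R and translation t.
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M

c, s = np.cos(0.3), np.sin(0.3)
T_cl = T(np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]),
         np.array([0.2, 0.0, -0.1]))                 # lidar w.r.t. camera
T_bl = T(np.eye(3), np.array([0.0, 0.5, 3.0]))       # lidar w.r.t. board
T_pb = T(np.eye(3), np.array([0.05, 0.05, 0.0]))     # board w.r.t. pattern (pure shift)
# The first relative pose the camera would observe, chosen so the constraint holds:
T_pc = T_pb @ T_bl @ np.linalg.inv(T_cl)
# Geometric constraint: both sides map lidar coordinates into pattern coordinates.
assert np.allclose(T_pc @ T_cl, T_pb @ T_bl)
```

In the calibration problem T_pc and T_bl are measured per frame while T_cl and T_pb are the unknowns to be solved for.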
S23, determining the relative pose of the camera relative to the laser radar based on the geometric constraint equation.
In some embodiments, in the geometric constraint equation, the relative pose of the camera relative to the laser radar is the unknown, and it can be obtained by solving the equation. Considering that the calibration plate and the characteristic pattern of the calibration plate are coplanar and have no rotation between them, the rotation matrix R_{pb} of the relative pose T_{pb} can be set to the identity matrix when solving the geometric constraint equation, which converts it into the following linear equations to be solved:
R^{ij}_{pc} · R^{j}_{cl} = R^{ij}_{bl}
R^{ij}_{pc} · t^{j}_{cl} + t^{ij}_{pc} = t^{ij}_{bl} + t_{pb}
wherein R^{ij}_{pc} denotes the rotation matrix of the jth camera relative to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image, t^{ij}_{pc} denotes the translation of the jth camera relative to the characteristic pattern of the calibration plate when the jth camera captures the ith frame image, R^{ij}_{bl} denotes the rotation matrix of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, t^{ij}_{bl} denotes the translation of the laser radar relative to the calibration plate when the jth camera captures the ith frame image, R^{j}_{cl} denotes the rotation matrix of the laser radar relative to the jth camera, t^{j}_{cl} denotes the translation of the laser radar relative to the jth camera, and t_{pb} denotes the translation between the calibration plate and the characteristic pattern of the calibration plate.
Further, when solving the linear equations, they can be converted into the following equation sets:
(I_3 ⊗ R^{ij}_{pc}) · vec(R^{j}_{cl}) = vec(R^{ij}_{bl})
R^{ij}_{pc} · t^{j}_{cl} - t_{pb} = t^{ij}_{bl} - t^{ij}_{pc}
wherein vec(·) stacks the columns of a matrix into a vector, I_3 denotes the 3×3 identity matrix, and ⊗ denotes the Kronecker product.
In some embodiments, by solving the above equation system, the relative pose between each camera and the laser radar, i.e., the extrinsic parameters of the laser radar and the multi-camera system, can be obtained. Specifically, the first equation set can be solved with a standard linear-system method (for example, singular value decomposition) to obtain the rotation matrix R_{lc_j} of the laser radar relative to the jth camera. Substituting R_{lc_j} into the second equation set and solving it with a similar method yields the translation t_{lc_j} of the laser radar relative to the jth camera and the translation t_{pb} between the calibration plate and its feature pattern. The rotation matrix R_{lc_j} and the translation t_{lc_j} together constitute the relative pose between the laser radar and the jth camera. Based on the obtained relative pose between each camera and the laser radar, the laser radar and the multi-camera system can be calibrated, realizing the conversion between the laser radar coordinate system and each camera coordinate system.
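As a hedged illustration of this two-stage solve (this is not the patent's implementation; the function and variable names are invented), the sketch below recovers the rotation by stacking the per-frame rotation constraints and projecting onto SO(3) with an SVD, then obtains both translations by linear least squares, on synthetic noise-free data:

```python
import numpy as np

def solve_extrinsics(Rs_cp, ts_cp, Rs_lb, ts_lb):
    """Recover (R_lc, t_lc, t_pb) from per-frame camera/pattern poses
    (Rs_cp, ts_cp) and lidar/plate poses (Rs_lb, ts_lb).
    Rotations are 3x3 arrays, translations length-3 arrays."""
    # Stage 1: R_lc @ R_lb = R_cp for every frame. Summing R_cp @ R_lb.T and
    # projecting onto SO(3) via SVD is the classic orthogonal-Procrustes step.
    M = sum(R_cp @ R_lb.T for R_cp, R_lb in zip(Rs_cp, Rs_lb))
    U, _, Vt = np.linalg.svd(M)
    R_lc = U @ Vt
    if np.linalg.det(R_lc) < 0:        # keep a proper rotation (det = +1)
        U[:, -1] *= -1
        R_lc = U @ Vt

    # Stage 2: t_lc - R_cp @ t_pb = t_cp - R_lc @ t_lb is linear in the
    # stacked unknown x = (t_lc, t_pb); solve all frames jointly.
    A, b = [], []
    for R_cp, t_cp, t_lb in zip(Rs_cp, ts_cp, ts_lb):
        A.append(np.hstack([np.eye(3), -R_cp]))
        b.append(t_cp - R_lc @ t_lb)
    x, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
    return R_lc, x[:3], x[3:]

# Self-check on synthetic data (ground truth chosen arbitrarily).
rng = np.random.default_rng(1)
def rand_R():
    Q = np.linalg.qr(rng.standard_normal((3, 3)))[0]
    return Q if np.linalg.det(Q) > 0 else -Q

R_true = rand_R()
t_true = np.array([0.10, -0.20, 0.30])
t_pb_true = np.array([0.02, 0.05, 0.00])
Rs_lb = [rand_R() for _ in range(5)]
ts_lb = [rng.standard_normal(3) for _ in range(5)]
Rs_cp = [R_true @ R for R in Rs_lb]
ts_cp = [R_true @ t + t_true - R_cp @ t_pb_true
         for t, R_cp in zip(ts_lb, Rs_cp)]

R_lc, t_lc, t_pb = solve_extrinsics(Rs_cp, ts_cp, Rs_lb, ts_lb)
```

With noise-free input the recovered extrinsic matches the ground truth exactly; with real data the same structure gives the least-squares estimate over all frames.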
In some embodiments of the present application, the laser radar shares a common view area with each camera in the multi-camera system. When a camera photographs the calibration plate in the common view area, a first relative pose of the camera with respect to the feature pattern of the calibration plate and a second relative pose of the laser radar with respect to the calibration plate are obtained, and a geometric constraint equation constraining the relative pose of the camera with respect to the laser radar is constructed from the first and second relative poses to determine the relative poses between the laser radar and the cameras. This improves calibration precision. Meanwhile, no customized calibration plate or purpose-built scene is required, so the method adapts well to different environments.
Please refer to fig. 3, which is a schematic diagram of functional modules of a calibration apparatus according to an embodiment of the present disclosure. The calibration device is used for calibrating a vehicle-mounted system, the vehicle-mounted system comprises a laser radar and a multi-camera system, and the laser radar and the cameras have a common-view area. The calibration device comprises:
a pose acquisition module, configured to acquire a first relative pose of the camera relative to the feature pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate when the camera shoots the calibration plate in the common view area;
an equation construction module, configured to construct a geometric constraint equation for constraining the relative pose of the camera with respect to the laser radar, using the first relative pose and the second relative pose; and
a calibration module, configured to determine the relative pose of the camera relative to the laser radar based on the geometric constraint equation.
Please refer to fig. 4, which is a schematic structural diagram of a calibration system according to an embodiment of the present application. The calibration system comprises a processor and a memory, wherein the memory is used for storing a computer program, and the computer program realizes the calibration method when being executed by the processor.
The processor may be a central processing unit (CPU). The processor may also be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or a combination thereof.
The memory, as a non-transitory computer-readable storage medium, may be used to store non-transitory software programs, non-transitory computer-executable programs, and modules, such as the program instructions/modules corresponding to the methods in the embodiments of the present invention. By running the non-transitory software programs, instructions and modules stored in the memory, the processor executes various functional applications and data processing, thereby realizing the method in the above method embodiment.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
An embodiment of the present application further provides a computer-readable storage medium for storing a computer program, which when executed by a processor, implements the calibration method described above.
Although the embodiments of the present invention have been described in conjunction with the accompanying drawings, those skilled in the art may make various modifications and variations without departing from the spirit and scope of the invention, and such modifications and variations fall within the scope defined by the appended claims.

Claims (10)

1. A calibration method for an on-board system, wherein the on-board system comprises a laser radar and a multi-camera system, and the laser radar and the cameras have a common view area, and the method comprises the following steps:
acquiring a first relative pose of a camera relative to a characteristic pattern of a calibration plate and a second relative pose of the laser radar relative to the calibration plate when the camera shoots the calibration plate in the common view area;
constructing a geometric constraint equation for constraining the relative pose of the camera with respect to the lidar, using the first relative pose and the second relative pose; and
determining a relative pose of the camera with respect to the lidar based on the geometric constraint equation.
2. The method of claim 1, wherein each camera takes a plurality of shots of the calibration plate to obtain a plurality of frame images;
the geometric constraint equation is expressed as follows:
T^i_{c_j p} · T_{pb} = T_{lc_j} · T^i_{lb}

wherein T^i_{c_j p} represents the first relative pose of the jth camera relative to the feature pattern of the calibration plate when the jth camera takes the ith frame picture, T^i_{lb} represents the second relative pose of the laser radar relative to the calibration plate when the jth camera takes the ith frame picture, T_{lc_j} represents the relative pose of the laser radar relative to the jth camera, and T_{pb} represents the relative pose of the calibration plate relative to the feature pattern of the calibration plate.
3. The method of claim 2, wherein the second relative pose T^i_{lb} of the laser radar relative to the calibration plate is obtained based on the following expressions:

R^i_{lb} = (R^i_{bl})^{-1}, t^i_{lb} = −(R^i_{bl})^{-1} · o^i_b

wherein R^i_{bl} represents the rotation matrix of the calibration plate relative to the laser radar when the jth camera takes the ith frame picture, o^i_b represents the origin of the calibration plate point cloud coordinate system, R^i_{lb} represents the rotation matrix of the laser radar relative to the calibration plate when the jth camera takes the ith frame picture, and t^i_{lb} represents the translation of the laser radar relative to the calibration plate when the jth camera takes the ith frame picture.
4. The method of claim 3, wherein the rotation matrix R^i_{bl} of the calibration plate relative to the laser radar is obtained as follows:

sequentially taking the upper-left corner point, the top-edge line and the left-edge line of the calibration plate point cloud as the origin o^i_b, the x-axis v_x and the y-axis v_y of the calibration plate point cloud coordinate system, and taking the normal vector of the three-dimensional plane corresponding to the calibration plate point cloud as the z-axis v_z of the calibration plate point cloud coordinate system; and

using the x-axis v_x, the y-axis v_y and the z-axis v_z to construct an initial rotation matrix R_init of the calibration plate relative to the laser radar, performing singular value decomposition on the initial rotation matrix R_init, and regenerating the rotation matrix R^i_{bl} of the calibration plate relative to the laser radar.
5. The method of claim 4, wherein the regenerating of the rotation matrix R^i_{bl} of the calibration plate relative to the laser radar comprises:

decomposing R_init = U·W·V^T by singular value decomposition (SVD), and setting W to the identity matrix to generate the rotation matrix R^i_{bl} = U·V^T of the calibration plate relative to the laser radar.
6. The method of claim 2, wherein the determining the relative pose of the camera with respect to the lidar comprises:

setting the rotation matrix R_{pb} of the relative pose T_{pb} to the identity matrix, converting the geometric constraint equation into the following linear equations, and determining the relative pose of the camera relative to the laser radar according to the linear equations:

R^i_{c_j p} = R_{lc_j} · R^i_{lb}

R^i_{c_j p} · t_{pb} + t^i_{c_j p} = R_{lc_j} · t^i_{lb} + t_{lc_j}

wherein R^i_{c_j p}, t^i_{c_j p}, R^i_{lb}, t^i_{lb}, R_{lc_j}, t_{lc_j} and t_{pb} respectively represent the rotation matrix of the jth camera relative to the feature pattern of the calibration plate, the translation of the jth camera relative to the feature pattern of the calibration plate, the rotation matrix of the laser radar relative to the calibration plate, the translation of the laser radar relative to the calibration plate, the rotation matrix of the laser radar relative to the jth camera, the translation of the laser radar relative to the jth camera, and the translation of the calibration plate relative to the feature pattern of the calibration plate, when the jth camera takes the ith frame picture.
7. The method of claim 6, wherein the determining the relative pose of the camera with respect to the lidar according to the linear equations comprises:

converting the linear equations into the following equation system, and solving the equation system to obtain the relative pose of each camera relative to the laser radar:

((R^i_{lb})^T ⊗ I_3) · vec(R_{lc_j}) = vec(R^i_{c_j p})

t_{lc_j} − R^i_{c_j p} · t_{pb} = t^i_{c_j p} − R_{lc_j} · t^i_{lb}

wherein vec(·) stacks the columns of a matrix into a vector, I_3 is the 3×3 identity matrix, and ⊗ represents the Kronecker product.
8. A calibration device, for calibrating an on-board system, the on-board system including a laser radar and a multi-camera system, the laser radar and the camera having a common viewing area, the calibration device comprising:
a pose acquisition module, configured to acquire a first relative pose of the camera relative to the feature pattern of the calibration plate and a second relative pose of the laser radar relative to the calibration plate when the camera shoots the calibration plate in the common view area;
an equation construction module, configured to construct a geometric constraint equation for constraining the relative pose of the camera with respect to the laser radar, using the first relative pose and the second relative pose; and
a calibration module, configured to determine the relative pose of the camera relative to the laser radar based on the geometric constraint equation.
9. A calibration system, characterized in that it comprises a processor and a memory, the memory being used for storing a computer program which, when executed by the processor, implements the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium is used for storing a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
CN202210622845.9A 2022-06-01 2022-06-01 Calibration method, device and system for vehicle-mounted system Pending CN115147495A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210622845.9A CN115147495A (en) 2022-06-01 2022-06-01 Calibration method, device and system for vehicle-mounted system


Publications (1)

Publication Number Publication Date
CN115147495A true CN115147495A (en) 2022-10-04

Family

ID=83405901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210622845.9A Pending CN115147495A (en) 2022-06-01 2022-06-01 Calibration method, device and system for vehicle-mounted system

Country Status (1)

Country Link
CN (1) CN115147495A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116228875A (en) * 2022-11-30 2023-06-06 苏州魔视智能科技有限公司 Calibration method, device and system for multi-phase locomotive-mounted system and storage medium
CN116228875B (en) * 2022-11-30 2023-12-08 苏州魔视智能科技有限公司 Calibration method, device and system for multi-phase locomotive-mounted system and storage medium

Similar Documents

Publication Publication Date Title
CN110728715B (en) Intelligent inspection robot camera angle self-adaptive adjustment method
CN109920011B (en) External parameter calibration method, device and equipment for laser radar and binocular camera
CN110136208B (en) Joint automatic calibration method and device for robot vision servo system
CN112270713B (en) Calibration method and device, storage medium and electronic device
KR101666959B1 (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN111750820B (en) Image positioning method and system
KR101672732B1 (en) Apparatus and method for tracking object
CN106960454B (en) Depth of field obstacle avoidance method and equipment and unmanned aerial vehicle
CN106485753B (en) The method and apparatus of camera calibration for pilotless automobile
CN112396664A (en) Monocular camera and three-dimensional laser radar combined calibration and online optimization method
CN111489288B (en) Image splicing method and device
CN111383264B (en) Positioning method, positioning device, terminal and computer storage medium
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
JP2019032218A (en) Location information recording method and device
CN116433737A (en) Method and device for registering laser radar point cloud and image and intelligent terminal
CN112837207A (en) Panoramic depth measuring method, four-eye fisheye camera and binocular fisheye camera
JP2023505891A (en) Methods for measuring environmental topography
WO2020114433A1 (en) Depth perception method and apparatus, and depth perception device
Seo et al. A branch-and-bound algorithm for globally optimal calibration of a camera-and-rotation-sensor system
CN115147495A (en) Calibration method, device and system for vehicle-mounted system
CN113436267A (en) Visual inertial navigation calibration method and device, computer equipment and storage medium
CN210986289U (en) Four-eye fisheye camera and binocular fisheye camera
CN115546216B (en) Tray detection method, device, equipment and storage medium
CN114648639B (en) Target vehicle detection method, system and device
CN114754779B (en) Positioning and mapping method and device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination