CN117152270A - Laser radar and camera combined calibration method, device, equipment and medium - Google Patents

Laser radar and camera combined calibration method, device, equipment and medium

Info

Publication number: CN117152270A
Application number: CN202311108056.4A
Authority: CN (China)
Prior art keywords: target, determining, camera, information, calibration
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 张伟伟; 赵珊; 汤永俊; 张意贺; 陈�光
Current assignee: Faw Nanjing Technology Development Co., Ltd.; FAW Group Corp.
Original assignee: Faw Nanjing Technology Development Co., Ltd.; FAW Group Corp.
Application filed by Faw Nanjing Technology Development Co., Ltd. and FAW Group Corp.
Priority to: CN202311108056.4A
Publication of: CN117152270A
Classifications

    • G: PHYSICS
        • G01: MEASURING; TESTING
            • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
                • G01S 7/00: Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
                    • G01S 7/48: Details of systems according to group G01S 17/00
                        • G01S 7/497: Means for monitoring or calibrating
        • G06: COMPUTING; CALCULATING OR COUNTING
            • G06F: ELECTRIC DIGITAL DATA PROCESSING
                • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
                    • G06F 17/10: Complex mathematical operations
                        • G06F 17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
            • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00: Image analysis
                    • G06T 7/70: Determining position or orientation of objects or cameras
                        • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
                    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
                • G06T 2207/00: Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10: Image acquisition modality
                        • G06T 2207/10028: Range image; Depth image; 3D point clouds
                    • G06T 2207/30: Subject of image; Context of image processing
                        • G06T 2207/30244: Camera pose

Abstract

The embodiment of the invention discloses a method, device, equipment and medium for joint calibration of a laser radar and a camera. The method comprises the following steps: acquiring two-dimensional image information obtained by a camera detecting a target, and three-dimensional point cloud information obtained by a laser radar detecting the same target; determining first position information of a first key point in the two-dimensional image information and second position information of a second key point in the three-dimensional point cloud information, wherein the relative positional relationship between the first key point and a target hole is the same as that between the second key point and the target hole; and determining a homography matrix according to the first position information and the second position information, and calibrating the relative positional relationship between the camera and the laser radar according to the homography matrix. By jointly calibrating the camera and the laser radar based on the key-point positions in the two-dimensional image information and the three-dimensional point cloud information, the scheme improves calibration efficiency and accuracy and offers strong robustness and flexibility.

Description

Laser radar and camera combined calibration method, device, equipment and medium
Technical Field
The invention relates to the technical field of intelligent driving, in particular to a laser radar and camera combined calibration method, device, equipment and medium.
Background
In an intelligent driving system, each sensor inevitably has limitations; to further improve system robustness, targets are detected by fusing the perception results of multiple sensors. The accuracy of the laser radar and camera joint calibration result directly affects the accuracy of perception fusion and joint annotation at the intelligent driving back end.
Most current joint calibration methods for a laser radar and a camera rely on manual operation, which yields low calibration efficiency and accuracy and poor flexibility, making it difficult to meet practical requirements.
Disclosure of Invention
The invention provides a laser radar and camera joint calibration method, device, equipment and medium, which jointly calibrate a camera and a laser radar based on the key-point positions in two-dimensional image information and three-dimensional point cloud information, thereby improving calibration efficiency and accuracy while providing strong robustness and flexibility.
According to an aspect of the present invention, there is provided a joint calibration method of a laser radar and a camera, the method comprising:
acquiring two-dimensional image information obtained by a camera detecting a target, and three-dimensional point cloud information obtained by a laser radar detecting the target, wherein at least one target hole is arranged on the target;
determining first position information of a first key point in the two-dimensional image information, and second position information of a second key point in the three-dimensional point cloud information, wherein the relative positional relationship between the first key point and the target hole is the same as the relative positional relationship between the second key point and the target hole;
determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix.
According to another aspect of the present invention, there is provided a joint calibration device of a laser radar and a camera, including:
the target detection information acquisition module is used for acquiring two-dimensional image information obtained by a camera detecting a target, and three-dimensional point cloud information obtained by a laser radar detecting the target, wherein at least one target hole is arranged on the target;
the key point position information determining module is used for determining first position information of a first key point in the two-dimensional image information and second position information of a second key point in the three-dimensional point cloud information, wherein the relative positional relationship between the first key point and the target hole is the same as the relative positional relationship between the second key point and the target hole;
and the relative position relation calibration module is used for determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor, and the computer program is executed by the at least one processor, so that the at least one processor can execute the combined calibration method of the laser radar and the camera according to any embodiment of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to execute the method for joint calibration of a lidar and a camera according to any of the embodiments of the present invention.
According to the technical scheme, two-dimensional image information obtained by detecting the target by the camera is obtained, and three-dimensional point cloud information obtained by detecting the target by the laser radar is obtained; wherein, the target is provided with at least one target hole; determining first position information of a first key point in the two-dimensional image information, and determining second position information of a second key point in the three-dimensional point cloud information; the relative position relation between the first key point and the target hole is the same as that between the second key point and the target hole; and determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix. According to the technical scheme, the camera and the laser radar are calibrated in a combined mode based on the two-dimensional image information and the key point positions in the three-dimensional point cloud information, so that the calibration efficiency and accuracy are improved, and the method has high robustness and flexibility.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for describing the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention; a person skilled in the art may derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of a joint calibration method for a laser radar and a camera according to Embodiment 1 of the present invention;
Fig. 2 is a flowchart of a joint calibration method for a laser radar and a camera according to Embodiment 2 of the present invention;
Fig. 3 is a schematic structural diagram of a joint calibration device for a laser radar and a camera according to Embodiment 3 of the present invention;
Fig. 4 is a schematic structural diagram of an electronic device for implementing a joint calibration method for a laser radar and a camera according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," "target," and the like in the description and claims of the present invention and in the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
Embodiment 1
Fig. 1 is a flowchart of a joint calibration method for a laser radar and a camera according to an embodiment of the present invention. The method may be performed by a joint calibration device for a laser radar and a camera, which may be implemented in hardware and/or software and configured in an electronic device with data processing capability. As shown in Fig. 1, the method includes:
s110, acquiring two-dimensional image information obtained by detecting a target by a camera and three-dimensional point cloud information obtained by detecting the target by a laser radar; wherein, be provided with at least one target hole on the target.
The target may be a predetermined calibration reference on which at least one target hole is disposed. Optionally, the target is a rectangular calibration plate and the target hole is a circular hole. It should be noted that the number and positions of the target holes may be set according to actual requirements and are not specifically limited in this embodiment. The camera and the laser radar are pre-mounted on the target vehicle. Optionally, the camera is a wide-angle camera (e.g., with a 120-degree field of view) and the laser radar is a solid-state laser radar. Alternatively, the camera may be a narrow-angle camera and the laser radar a mechanical laser radar.
In this embodiment, two-dimensional image information (i.e., RGB image information) obtained by photographing the target with the camera is first acquired, along with three-dimensional point cloud information obtained by the laser radar sensing the target. It should be noted that because the target occupies only a small area within the 120-degree field of view of the front-view camera, running detection on the full image would take too long. To save computation, an image region of interest containing the target can be cropped from the original image captured by the camera, according to empirical values, and used as the two-dimensional image information. Similarly, because the target occupies only a small area within the field of view of the solid-state laser radar and running detection on the full point cloud would take too long, a point cloud region of interest containing the target can be cropped from the original point cloud perceived by the laser radar, according to empirical values, and used as the three-dimensional point cloud information.
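As an illustrative sketch (not part of the patent), the region-of-interest cropping described above could be implemented as follows; the crop bounds and the axis-aligned box limits stand in for the empirical values the text refers to:

```python
import numpy as np

def crop_image_roi(image, x0, y0, x1, y1):
    """Cut the image region of interest containing the target out of the full frame."""
    return image[y0:y1, x0:x1]

def crop_cloud_roi(points, bounds_min, bounds_max):
    """Keep only laser radar points inside an axis-aligned box around the target."""
    points = np.asarray(points, dtype=float)
    mask = np.all((points >= bounds_min) & (points <= bounds_max), axis=1)
    return points[mask]

# Toy data: a 100x100 "image" and three 3D points, two of which lie near the target.
image = np.zeros((100, 100, 3), dtype=np.uint8)
roi = crop_image_roi(image, 10, 20, 60, 80)
cloud = np.array([[1.0, 0.0, 0.5], [10.0, 5.0, 2.0], [1.5, 0.2, 0.4]])
near = crop_cloud_roi(cloud, np.array([0.0, -1.0, 0.0]), np.array([3.0, 1.0, 1.0]))
```

In practice the crop bounds would be tuned per sensor mounting, as the empirical values mentioned above suggest.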
S120, determining first position information of a first key point in the two-dimensional image information and determining second position information of a second key point in the three-dimensional point cloud information; the relative position relation between the first key point and the target hole is the same as that between the second key point and the target hole.
The first key point may refer to a detection point in the two-dimensional image information. The first location information may refer to location information of a first key point. The second key point may refer to a detection point in the three-dimensional point cloud information. The second location information may refer to location information of a second key point. It should be noted that, the relative positional relationship between the first key point and the target hole is the same as the relative positional relationship between the second key point and the target hole. For example, a center point of the target hole in the two-dimensional image information may be used as a first key point, and a center point of the target hole in the three-dimensional point cloud information may be used as a second key point.
In this embodiment, after the two-dimensional image information is acquired, the target in it is identified first, then all target holes on the target, and the center point coordinates (two-dimensional) of each target hole are determined as the first position information of the first key point according to the established two-dimensional image coordinate system. Similarly, after the three-dimensional point cloud information is acquired, the target in it is identified first, then all target holes on the target, and the center point coordinates (three-dimensional) of each target hole are determined as the second position information of the second key point according to the established three-dimensional point cloud coordinate system. It should be noted that the identification method for the target and the target holes is not specifically limited and may be chosen according to actual requirements; for example, identification may be based on an edge detection algorithm.
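As a minimal, hypothetical illustration of extracting such a key point, a circular hole's center can be estimated from its detected edge points. The centroid used here is adequate when edge points are sampled evenly around the rim; a least-squares circle fit would be more robust in general:

```python
import numpy as np

def hole_center(edge_points):
    """Estimate a circular hole's center as the centroid of its detected edge points."""
    return np.asarray(edge_points, dtype=float).mean(axis=0)

# Synthetic rim: 100 points sampled evenly on a circle of radius 5 around (12, 7).
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
rim = np.stack([12 + 5 * np.cos(theta), 7 + 5 * np.sin(theta)], axis=1)
center = hole_center(rim)
```

The same idea applies in three dimensions to the point cloud, with edge points taken from the hole boundary detected in the scan.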
S130, determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix.
In this embodiment, after the first position information and the second position information are determined, the homography matrix may be determined from them, so that the second position information can be mapped into the two-dimensional image coordinate system, and the relative positional relationship between the camera and the laser radar can then be calibrated according to the homography matrix. A homography matrix is a projection matrix from one plane to another and describes the positional mapping of an object between different planes. The computation of the homography matrix may follow the prior art and is not detailed in this embodiment. Because the homography matrix describes the relative positional relationship (including angle and distance) between the camera and the laser radar, it enables their joint calibration.
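The patent leaves the homography computation to the prior art. One standard approach is the Direct Linear Transform (DLT), sketched below under the assumption that the target is planar, so its key points can be expressed in 2D plane coordinates; all names and the sample correspondences are illustrative:

```python
import numpy as np

def find_homography(src, dst):
    """Direct Linear Transform: solve for the 3x3 matrix H with dst ~ H @ src
    (in homogeneous coordinates) from at least 4 planar point correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The solution is the right-singular vector for the smallest singular value.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

def apply_h(H, pts):
    """Map 2D points through H, dividing out the homogeneous scale."""
    pts = np.asarray(pts, dtype=float)
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return ph[:, :2] / ph[:, 2:3]

# Key points on the target plane (e.g. hole centers) and their image positions.
src = [[0, 0], [1, 0], [1, 1], [0, 1]]
dst = [[10, 10], [30, 12], [28, 35], [8, 33]]
H = find_homography(src, dst)
back = apply_h(H, src)
```

With exactly four correspondences the fit is exact; with more, the same SVD gives a least-squares estimate.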
Further, to improve calibration accuracy, after the homography matrix is determined, the position of the target relative to the camera and the laser radar may be changed, steps S110-S130 re-executed with the updated target position, and the initially computed homography matrix optimized according to the updated homography matrix.
According to the technical scheme, two-dimensional image information obtained by detecting the target by the camera is obtained, and three-dimensional point cloud information obtained by detecting the target by the laser radar is obtained; wherein, the target is provided with at least one target hole; determining first position information of a first key point in the two-dimensional image information, and determining second position information of a second key point in the three-dimensional point cloud information; the relative position relation between the first key point and the target hole is the same as that between the second key point and the target hole; and determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix. According to the technical scheme, the camera and the laser radar are calibrated in a combined mode based on the two-dimensional image information and the key point positions in the three-dimensional point cloud information, so that the calibration efficiency and accuracy are improved, and the method has high robustness and flexibility.
In this embodiment, optionally, a target identifier is further provided on the target, where the target identifier is used to calibrate profile information of the target.
The target identifier may be used to calibrate the profile information of the target; for example, it may be a two-dimensional code. It should be noted that when an edge detection algorithm is used to identify the target, the target may fail to be identified, which affects accurate calibration of the camera and the laser radar. The target can therefore be identified via the target identifier preset on it, improving identification accuracy.
For example, when the target is a rectangular calibration plate, in order to calibrate the profile information of the target, two-dimensional codes may be respectively set at four vertices of the rectangular calibration plate. When the target is identified, four vertexes of the rectangular calibration plate can be positioned by identifying the four two-dimensional codes, and the four vertexes can be sequentially connected to obtain the outline information of the rectangular calibration plate, so that the accurate and effective identification of the target is realized.
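As a hypothetical sketch of connecting the four located vertices in sequence, the detected marker positions can be sorted into a consistent angular order around their centroid before joining them into the plate outline:

```python
import numpy as np

def order_corners(pts):
    """Sort four detected marker positions into a consistent angular order
    around their centroid, so they can be joined into the plate outline."""
    pts = np.asarray(pts, dtype=float)
    c = pts.mean(axis=0)
    ang = np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0])
    return pts[np.argsort(ang)]

# Four marker centers detected in arbitrary order on a unit-square plate.
corners = order_corners([[1, 1], [0, 0], [0, 1], [1, 0]])
```

In a real pipeline the four positions would come from decoding the two-dimensional codes at the plate's vertices; this ordering step only guarantees a non-self-intersecting outline.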
With this arrangement, the scheme can identify the target by means of the target identifier preset on it, thereby improving the identification accuracy of the target.
Embodiment 2
Fig. 2 is a flowchart of a joint calibration method for a laser radar and a camera according to a second embodiment of the present invention, which is optimized on the basis of the foregoing embodiment. The specific optimization is as follows: after calibrating the relative positional relationship between the camera and the laser radar according to the homography matrix, the method further comprises: projecting all point clouds in the three-dimensional point cloud information into the two-dimensional image information according to the homography matrix; determining, according to the projection result, the number of invalid point clouds in the three-dimensional point cloud information that do not coincide with the two-dimensional image information; and determining a projection error according to the number of invalid point clouds and the total number of point clouds in the three-dimensional point cloud information, and determining a calibration result according to the projection error.
As shown in fig. 2, the method of this embodiment specifically includes the following steps:
s210, acquiring two-dimensional image information obtained by detecting a target by a camera and three-dimensional point cloud information obtained by detecting the target by a laser radar; wherein, be provided with at least one target hole on the target.
S220, determining first position information of a first key point in the two-dimensional image information, and determining second position information of a second key point in the three-dimensional point cloud information; the relative position relation between the first key point and the target hole is the same as that between the second key point and the target hole.
S230, determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relation between the camera and the laser radar according to the homography matrix.
The specific implementation of S210-S230 may be referred to in the detailed description of S110-S130, and will not be described herein.
And S240, projecting all point clouds in the three-dimensional point cloud information into the two-dimensional image information according to the homography matrix.
In this embodiment, after calibrating the relative position relationship between the camera and the laser radar according to the homography matrix, all the point clouds in the three-dimensional point cloud information can be projected into the two-dimensional image information according to the homography matrix, so as to verify the calibration accuracy. Specifically, position information of each point cloud in the three-dimensional point cloud information is obtained, and projection results of each point cloud can be determined according to the position information of each point cloud and the homography matrix. The projection result comprises that the point cloud is overlapped with the two-dimensional image information or the point cloud is not overlapped with the two-dimensional image information.
S250, determining the number of invalid point clouds which are not overlapped with the two-dimensional image information in the three-dimensional point cloud information according to the projection result.
The number of invalid point clouds is the count of points in the three-dimensional point cloud information that do not coincide with the two-dimensional image information. In this embodiment, when determining this number from the projection result, the points that do not coincide with the two-dimensional image information can simply be accumulated. If a projected point falls in the foreground region of the two-dimensional image, the point is determined to coincide with the two-dimensional image information; if it falls in the background region, the point is determined not to coincide.
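A simplified, assumed version of this count might project the points with the homography and treat those landing outside the image bounds as not coinciding with the image; the patent's foreground/background test would additionally require the actual image content:

```python
import numpy as np

def count_invalid(H, plane_pts, width, height):
    """Project target-plane points into the image with homography H and count
    those that land outside the image bounds (a stand-in for points that do
    not coincide with the two-dimensional image information)."""
    pts = np.asarray(plane_pts, dtype=float)
    ph = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    uv = ph[:, :2] / ph[:, 2:3]
    inside = (uv[:, 0] >= 0) & (uv[:, 0] < width) & (uv[:, 1] >= 0) & (uv[:, 1] < height)
    return int((~inside).sum())

H = np.eye(3)  # identity homography, purely for illustration
pts = [[10, 10], [50, 50], [120, 30], [-5, 40]]
invalid = count_invalid(H, pts, width=100, height=100)
```

Here two of the four points project outside the 100x100 image, so the invalid count is 2.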
And S260, determining projection errors according to the number of invalid point clouds and the total number of the point clouds in the three-dimensional point cloud information, and determining a calibration result according to the projection errors.
In this embodiment, after determining the number of invalid point clouds in the three-dimensional point cloud information, the projection error may be determined according to a ratio of the number of invalid point clouds to the total number of point clouds in the three-dimensional point cloud information, and then the calibration result may be determined according to the projection error. Optionally, determining the calibration result according to the projection error includes: if the projection error is larger than the preset error value, determining that the calibration result is failed in calibration; otherwise, determining the calibration result as successful calibration.
The preset error value may be a preset projection error reference value, set according to actual requirements. Specifically, a projection error greater than the preset error value indicates a large projection error (i.e., low calibration accuracy), and the calibration result is determined to be a failure; a projection error less than or equal to the preset error value indicates a small projection error (i.e., high calibration accuracy), and the calibration result is determined to be a success.
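The decision above reduces to a ratio and a threshold comparison. In the sketch below, the 0.05 threshold is an assumed preset error value, not one given in the patent:

```python
def calibration_result(num_invalid, num_total, max_error=0.05):
    """Projection error = invalid / total; calibration succeeds when the
    error does not exceed the preset error value (0.05 assumed here)."""
    error = num_invalid / num_total
    return error, error <= max_error

err, ok = calibration_result(3, 100)    # 3% invalid points
err2, bad = calibration_result(20, 100) # 20% invalid points
```

With these assumed numbers, 3% invalid points passes while 20% fails, matching the success/failure rule described above.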
With this arrangement, the scheme can verify the calibration accuracy quickly and accurately based on the comparison between the projection error and the preset error value.
According to the technical scheme, after the relative position relation between a camera and a laser radar is calibrated according to the homography matrix, all point clouds in the three-dimensional point cloud information are projected into the two-dimensional image information according to the homography matrix; determining the number of invalid point clouds which are not overlapped with the two-dimensional image information in the three-dimensional point cloud information according to the projection result; and determining a projection error according to the number of the invalid point clouds and the total number of the point clouds in the three-dimensional point cloud information, and determining a calibration result according to the projection error. According to the technical scheme, the camera and the laser radar are calibrated in a combined mode based on the two-dimensional image information and the key point positions in the three-dimensional point cloud information, calibration efficiency and accuracy are improved, and the method and the device can be used for rapidly and accurately verifying the calibration accuracy according to projection errors on the basis of high robustness and flexibility.
Embodiment 3
Fig. 3 is a schematic structural diagram of a combined calibration device for a laser radar and a camera according to a third embodiment of the present invention, where the device may execute the combined calibration method for a laser radar and a camera according to any embodiment of the present invention, and the device has functional modules and beneficial effects corresponding to the execution method. As shown in fig. 3, the apparatus includes:
the target detection information obtaining module 310 is configured to obtain two-dimensional image information obtained by detecting a target by a camera, and obtain three-dimensional point cloud information obtained by detecting the target by a laser radar; wherein, at least one target hole is arranged on the target;
a key point position information determining module 320, configured to determine first position information of a first key point in the two-dimensional image information, and determine second position information of a second key point in the three-dimensional point cloud information; wherein, the relative position relation between the first key point and the target hole is the same as the relative position relation between the second key point and the target hole;
the relative position relationship calibration module 330 is configured to determine a homography matrix according to the first position information and the second position information, and calibrate the relative position relationship between the camera and the lidar according to the homography matrix.
Optionally, the apparatus further includes:
the point cloud projection module is used for projecting all point clouds in the three-dimensional point cloud information into the two-dimensional image information according to the homography matrix after calibrating the relative position relation between the camera and the laser radar according to the homography matrix;
the invalid point cloud quantity determining module is used for determining the quantity of invalid point clouds which are not overlapped with the two-dimensional image information in the three-dimensional point cloud information according to a projection result;
and the calibration result determining module is used for determining a projection error according to the number of the invalid point clouds and the total number of the point clouds in the three-dimensional point cloud information, and for determining a calibration result according to the projection error.
Optionally, the calibration result determining module is configured to:
if the projection error is greater than a preset error value, determining that the calibration has failed;
otherwise, determining that the calibration has succeeded.
Optionally, the target is further provided with a target identifier, and the target identifier is used for calibrating profile information of the target.
Optionally, the target is a rectangular calibration plate, and the target hole is a circular hole.
Optionally, the camera is a wide-angle camera, and the laser radar is a solid-state laser radar.
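The embodiments leave open how the centre of each circular target hole is extracted from the image and the point cloud. One common choice, shown here purely as an assumed illustration, is a least-squares (Kåsa) circle fit to the hole's edge points; the fitted centre then serves as the key point in either modality:

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) circle fit: linearize x^2 + y^2 = a*x + b*y + c,
    solve for (a, b, c), then recover centre (a/2, b/2) and radius.
    'points' are 2D edge samples of a circular target hole."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x**2 + y**2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0] / 2, sol[1] / 2
    r = np.sqrt(sol[2] + cx**2 + cy**2)
    return (cx, cy), r
```

In the image this fit could run on edge pixels of the detected hole; in the point cloud, on boundary points of the return-free region, giving key points with the same relative position to the hole in both sensors, as the scheme requires.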
The laser radar and camera combined calibration device provided by this embodiment of the present invention can execute the laser radar and camera combined calibration method provided by any embodiment of the present invention, and has the functional modules and beneficial effects corresponding to the executed method.
Example IV
Fig. 4 shows a schematic diagram of the structure of an electronic device 10 that may be used to implement an embodiment of the invention. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. Electronic devices may also represent various forms of mobile devices, such as personal digital processing devices, cellular telephones, smartphones, wearable devices (e.g., helmets, glasses, watches, etc.), and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 4, the electronic device 10 includes at least one processor 11 and a memory communicatively connected to the at least one processor 11, such as a read-only memory (ROM) 12 and a random access memory (RAM) 13. The memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the ROM 12 or loaded from the storage unit 18 into the RAM 13. The RAM 13 may also store various programs and data required for the operation of the electronic device 10. The processor 11, the ROM 12, and the RAM 13 are connected to each other via a bus 14, and an input/output (I/O) interface 15 is also connected to the bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be any of various general-purpose and/or special-purpose processing components having processing and computing capabilities. Some examples of the processor 11 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various specialized artificial intelligence (AI) computing chips, various processors running machine learning model algorithms, digital signal processors (DSPs), and any suitable processor, controller, or microcontroller. The processor 11 performs the various methods and processes described above, such as the laser radar and camera joint calibration method.
In some embodiments, the combined lidar and camera calibration method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When the computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the lidar and camera joint calibration method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the joint calibration method of the lidar and the camera in any other suitable way (e.g. by means of firmware).
Various implementations of the systems and techniques described here above may be implemented in digital electronic circuitry, integrated circuitry, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chips (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: being implemented in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) through which a user can provide input to the electronic device. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local area networks (LANs), wide area networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. A client and a server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, also called a cloud computing server or cloud host, which is a host product in a cloud computing service system and overcomes the drawbacks of difficult management and weak service scalability found in traditional physical hosts and VPS services.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (14)

1. A method for joint calibration of a laser radar and a camera, the method comprising:
acquiring two-dimensional image information obtained by detecting a target by a camera and three-dimensional point cloud information obtained by detecting the target by a laser radar; wherein at least one target hole is arranged on the target;
determining first position information of a first key point in the two-dimensional image information, and determining second position information of a second key point in the three-dimensional point cloud information; wherein the relative position relationship between the first key point and the target hole is the same as the relative position relationship between the second key point and the target hole;
determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relationship between the camera and the laser radar according to the homography matrix.
2. The method of claim 1, further comprising, after calibrating the relative positional relationship of the camera and the lidar according to the homography matrix:
projecting all point clouds in the three-dimensional point cloud information into the two-dimensional image information according to the homography matrix;
determining the number of invalid point clouds which are not overlapped with the two-dimensional image information in the three-dimensional point cloud information according to a projection result;
and determining a projection error according to the number of the invalid point clouds and the total number of the point clouds in the three-dimensional point cloud information, and determining a calibration result according to the projection error.
3. The method of claim 2, wherein determining a calibration result from the projection error comprises:
if the projection error is greater than a preset error value, determining that the calibration has failed;
otherwise, determining that the calibration has succeeded.
4. The method of claim 1, wherein the target is further provided with a target identifier, and the target identifier is used for calibrating profile information of the target.
5. The method of claim 1, wherein the target is a rectangular calibration plate and the target hole is a circular hole.
6. The method of claim 1, wherein the camera is a wide angle camera and the lidar is a solid state lidar.
7. A laser radar and camera combined calibration device, the device comprising:
the target detection information acquisition module is used for acquiring two-dimensional image information obtained by detecting a target by a camera and three-dimensional point cloud information obtained by detecting the target by a laser radar; wherein at least one target hole is arranged on the target;
the key point position information determining module is used for determining first position information of a first key point in the two-dimensional image information and determining second position information of a second key point in the three-dimensional point cloud information; wherein the relative position relationship between the first key point and the target hole is the same as the relative position relationship between the second key point and the target hole;
and the relative position relationship calibration module is used for determining a homography matrix according to the first position information and the second position information, and calibrating the relative position relationship between the camera and the laser radar according to the homography matrix.
8. The apparatus of claim 7, wherein the apparatus further comprises:
the point cloud projection module is used for projecting all point clouds in the three-dimensional point cloud information into the two-dimensional image information according to the homography matrix after calibrating the relative position relation between the camera and the laser radar according to the homography matrix;
the invalid point cloud quantity determining module is used for determining the quantity of invalid point clouds which are not overlapped with the two-dimensional image information in the three-dimensional point cloud information according to a projection result;
and the calibration result determining module is used for determining a projection error according to the number of the invalid point clouds and the total number of the point clouds in the three-dimensional point cloud information, and for determining a calibration result according to the projection error.
9. The apparatus of claim 8, wherein the calibration result determination module is configured to:
if the projection error is greater than a preset error value, determining that the calibration has failed;
otherwise, determining that the calibration has succeeded.
10. The apparatus of claim 7, wherein the target is further provided with a target identifier, and wherein the target identifier is used to calibrate profile information of the target.
11. The apparatus of claim 7, wherein the target is a rectangular calibration plate and the target aperture is a circular aperture.
12. The apparatus of claim 7, wherein the camera is a wide angle camera and the lidar is a solid state lidar.
13. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of joint calibration of a lidar and a camera of any of claims 1-6.
14. A computer readable storage medium, characterized in that the computer readable storage medium stores computer instructions for causing a processor to execute the combined calibration method of the lidar and the camera according to any of claims 1 to 6.
CN202311108056.4A 2023-08-30 2023-08-30 Laser radar and camera combined calibration method, device, equipment and medium Pending CN117152270A (en)

Priority Applications (1)

Application Number: CN202311108056.4A
Priority Date / Filing Date: 2023-08-30
Title: Laser radar and camera combined calibration method, device, equipment and medium

Publications (1)

Publication Number: CN117152270A
Publication Date: 2023-12-01

Family ID: 88883758

Country Status (1)

CN: CN117152270A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination