CN111127563A - Combined calibration method and device, electronic equipment and storage medium - Google Patents
- Publication number
- CN111127563A (application CN201911308267.6A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- image data
- laser radar
- camera
- cloud data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/497—Means for monitoring or calibrating
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
- G06T2207/30208—Marker matrix
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optical Radar Systems And Details Thereof (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
According to the joint calibration method and apparatus, electronic device, and storage medium, point cloud data and image data collected by a laser radar and a camera for the same target acquisition area are acquired, where a target object usable for calibration is arranged in the target acquisition area and the target object includes at least one corner point; two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point are determined in the image data and the point cloud data, respectively; external parameters of the laser radar and the camera are calculated from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points; and the laser radar and the camera are jointly calibrated according to these external parameters. The laser radar and the camera are thus calibrated without a calibration plate: calibration is performed directly on the point cloud data and image data they collect for a target object in the same target acquisition area, which improves calibration efficiency, simplifies the calibration process, and facilitates repeated calibration of the laser radar and the camera.
Description
Technical Field
The embodiments of the present disclosure relate to the field of data processing, and in particular, to a joint calibration method and apparatus, an electronic device, and a storage medium.
Background
With the development of intelligent transportation technology, basic traffic information is collected through sensors. The sensors used to collect such information are generally of various kinds, including laser radar, cameras, radar, and GPS. Fusing the information collected by these sensors exploits the advantages of each kind, and the joint calibration of the sensors has therefore become an important problem in this process.
In the prior art, in order to calibrate different types of sensors, and in particular to jointly calibrate a laser radar and a camera, a calibration plate is generally used for the calibration operations. Specifically, a plate bearing a black-and-white checkerboard pattern is arranged in the calibration scene, and the laser radar and the camera are jointly calibrated based on this calibration plate.
However, this method relies on a calibration plate, so the calibration process is cumbersome and the calibration workload is large.
Disclosure of Invention
In order to solve the above problems, the present disclosure provides a joint calibration method, device, electronic device, and storage medium.
In a first aspect, the present disclosure provides a joint calibration method, including:
acquiring point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, where a target object usable for calibration is arranged in the target acquisition area, and the target object includes at least one corner point;
determining two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively;
calculating the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points;
and jointly calibrating the laser radar and the camera according to the external parameters of the laser radar and the camera.
In a second aspect, the present disclosure provides a joint calibration apparatus, including:
a data interface module, configured to acquire point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, where a target object usable for calibration is arranged in the target acquisition area, and the target object includes at least one corner point;
a first processing module, configured to determine the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, and to calculate the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points;
and a second processing module, configured to jointly calibrate the laser radar and the camera according to the external parameters of the laser radar and the camera.
In a third aspect, the present disclosure provides an electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing the computer-executable instructions stored by the memory causes the at least one processor to perform the joint calibration method as in any one of the preceding claims.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the joint calibration method according to any one of the preceding claims.
According to the joint calibration method and apparatus, electronic device, and storage medium provided by the present disclosure, point cloud data and image data collected by a laser radar and a camera for the same target acquisition area are acquired, where a target object usable for calibration is arranged in the target acquisition area and the target object includes at least one corner point; two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point are determined in the image data and the point cloud data, respectively; external parameters of the laser radar and the camera are calculated from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points; and the laser radar and the camera are jointly calibrated according to these external parameters. The laser radar and the camera are thus calibrated without a calibration plate: calibration is performed directly on the point cloud data and image data they collect for a target object in the same target acquisition area, which improves calibration efficiency, simplifies the calibration process, and facilitates repeated calibration of the laser radar and the camera.
Drawings
In order to more clearly illustrate the embodiments of the present disclosure or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show some embodiments of the present disclosure, and those skilled in the art can obtain other drawings from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a network architecture upon which the present disclosure is based;
fig. 2 is a schematic flow chart of a joint calibration method according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a coordinate system involved in a joint calibration method provided in the embodiment of the present disclosure;
- FIG. 4 is a schematic diagram of a calibration interface of a joint calibration method according to an embodiment of the present disclosure;
- fig. 5 is a block diagram of a joint calibration apparatus provided by an embodiment of the present disclosure;
fig. 6 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present disclosure.
Detailed Description
To make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are some, but not all embodiments of the present disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments disclosed herein without making any creative effort, shall fall within the protection scope of the present disclosure.
With the development of intelligent transportation technology, basic traffic information is collected through sensors. The sensors used to collect such information are generally of various kinds, including laser radar, cameras, radar, and GPS. Fusing the information collected by these sensors exploits the advantages of each kind, and the joint calibration of the sensors has therefore become an important problem in this process.
In the prior art, in order to calibrate different types of sensors, and in particular to jointly calibrate a laser radar and a camera, a calibration plate is generally used for the calibration operations. Specifically, a plate bearing a black-and-white checkerboard pattern is arranged in the calibration scene, and the laser radar and the camera are jointly calibrated based on this calibration plate.
However, this method relies on a calibration plate, so the calibration process is cumbersome and the calibration workload is large.
In order to solve the above problems, the present disclosure provides a joint calibration method, apparatus, electronic device, and storage medium.
Referring to fig. 1, fig. 1 is a schematic diagram of a network architecture on which the present disclosure is based, and as shown in fig. 1, one network architecture on which the present disclosure is based may include a joint calibration apparatus 1, a laser radar 2, and a camera 3.
The joint calibration apparatus 1 is hardware or software that can interact with the laser radar 2 and the camera 3 through a network, and can be used to perform the joint calibration method described in the following embodiments.
When the joint calibration apparatus 1 is hardware, it includes a cloud server with computing capability. When the joint calibration apparatus 1 is software, it can be installed in an electronic device with computing capability, where such electronic devices include, but are not limited to, laptop computers, desktop computers, and the like.
In addition, the laser radar 2 and the camera 3 are information acquisition devices erected on the same vehicle or the same automatic driving device, and can be used for shooting or acquiring signals for the same target acquisition area, and sending corresponding data to the combined calibration device 1 for processing by performing communication connection with the combined calibration device 1.
It should be noted that, based on different application scenarios, the laser radar 2 and the camera 3 may also be erected on other devices, and in some cases, the joint calibration apparatus 1 may also be disposed on the same device as the laser radar 2 and the camera 3.
In a first aspect, referring to fig. 2, fig. 2 is a schematic flowchart of a joint calibration method provided by an embodiment of the present disclosure. The method provided by the embodiment of the present disclosure comprises the following steps:
It should be noted that the execution subject of the joint calibration method provided by this embodiment is the aforementioned joint calibration apparatus.
Step 101: acquire the point cloud data and image data collected by the laser radar and the camera for the same target acquisition area.
Specifically, the laser radar and the camera are first mounted and fixed on a monitoring pole or other device. The laser radar scans the target acquisition area to obtain laser radar data, namely point cloud data, and the camera captures the target acquisition area to obtain image data of the area to be monitored.
It should be noted that, in the embodiment of the present disclosure, the laser radar and the camera may be mounted one above the other, that is, stacked vertically on the monitoring pole or device.
In addition, in order for the laser radar and the camera to collect data for the same target acquisition area, the field of view of the laser radar is restricted according to the field of view of the camera when the two are installed, so that both sensors can collect data for the same target object usable for calibration in the target acquisition area.
Generally, the 0-degree direction of the laser radar needs to coincide with the optical axis direction of the camera. If the horizontal field of view of the camera extends an angle θ to either side of the optical axis, the field of view of the laser radar is limited to an angle σ to either side of the optical axis, where σ may be slightly larger than θ.
The target object is any object appearing in the target acquisition area that can be used for calibration and has corner features. It generally refers to an object with a surface carrying obvious corner information, such as a truck or a bus, where the surface is perpendicular to the ground and the angle between its normal vector and the optical axis of the camera is less than 30 degrees. A corner point is a point on the object structure that belongs to two edges of the object, and the two edges should have different normal vectors. Examples include an automobile and its corners, or a tricycle and its handlebar ends.
In addition, in an optional example, the point cloud data and the image data are collected by the laser radar and the camera synchronously for the same target acquisition area. Acquiring the point cloud data and image data collected by the laser radar and the camera for the same target acquisition area may therefore specifically proceed as follows: obtain the original point cloud data and original image data collected by the laser radar and the camera for the same target acquisition area, perform time synchronization processing on the original point cloud data and original image data, and determine the point cloud data and image data collected at the same moment.
Further, the original point cloud data includes at least one frame of point cloud data and the timestamp recorded by the laser radar when collecting each frame; the original image data includes at least one frame of image data and the timestamp recorded by the camera when collecting each frame. That is to say, in this embodiment, the difference between the timestamps of each point cloud frame and each image frame may be calculated, and a point cloud frame and an image frame whose timestamp difference is smaller than a preset threshold are taken as the point cloud data and image data collected at the same moment.
For example, timestamps t1 and t2, accurate to the millisecond, are obtained from the laser radar and the camera, respectively. If the absolute value of the difference between the timestamps of the two data frames is smaller than a set threshold δ (usually set to 10 ms), that is, |t1 - t2| < δ, the two frames of laser radar data and image data are considered to have been collected at the same moment.
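The timestamp-matching rule above can be sketched as follows. The pairing logic and the 10 ms threshold follow the text; the function and variable names are illustrative, not part of the original disclosure:

```python
def match_frames(cloud_stamps, image_stamps, delta_ms=10.0):
    """Pair each point cloud frame with the image frame closest in time,
    keeping only pairs whose timestamp difference is below delta_ms."""
    pairs = []
    for i, t1 in enumerate(cloud_stamps):
        # find the image timestamp t2 minimizing |t1 - t2|
        j = min(range(len(image_stamps)), key=lambda k: abs(t1 - image_stamps[k]))
        if abs(t1 - image_stamps[j]) < delta_ms:
            pairs.append((i, j))
    return pairs

# lidar at 10 Hz, camera at ~30 Hz (timestamps in milliseconds)
cloud_ts = [0.0, 100.0, 200.0]
image_ts = [3.0, 36.0, 70.0, 103.0, 137.0, 170.0, 204.0]
print(match_frames(cloud_ts, image_ts))  # [(0, 0), (1, 3), (2, 6)]
```

Frames with no counterpart within δ are simply dropped, which matches the described behavior of keeping only data collected "at the same moment".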
Step 102: determine the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively.
Specifically, in the present example, the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point may be determined in the image data and the point cloud data collected at the same moment, respectively.
For the two-dimensional image coordinates, the image data is processed with a corner detection algorithm. Specifically, a region with obvious planar corners needs to be selected, and the two-dimensional image coordinates of the corner points are then detected with the corner detection algorithm: the target region is cropped from the image, and the corner points of the target object are detected in the two-dimensional image using a corner detection algorithm based on image gray levels.
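The gray-level corner detection step could be sketched with a minimal Harris-style response computed in NumPy. The 3x3 window and the constant k = 0.05 are illustrative choices not taken from the patent, and a real system would more likely call an existing detector (for example OpenCV's) rather than this sketch:

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response: large positive at corners, negative on edges."""
    Iy, Ix = np.gradient(img.astype(float))   # row- and column-direction gradients

    def box3(a):
        # sum each structure-tensor entry over a 3x3 window
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace

# white square on black background: its corners should dominate the response
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
print(R[5, 5] > R[5, 10])   # corner response exceeds edge response: True
```

Thresholding the response map and keeping local maxima then yields the two-dimensional image coordinates of the corner points.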
Meanwhile, the three-dimensional point cloud coordinates are obtained by fitting with a RANSAC plane fitting algorithm and subsequent calculation. Specifically, the planes in the point cloud are first fitted with the plane fitting algorithm. Because the plane of the target object is perpendicular to the ground, its normal vector is parallel to the ground; among the fitted planes, the plane whose normal vector is parallel to the ground is therefore searched for, and this plane is the plane of the target object, whose points constitute the three-dimensional point cloud coordinates. The angle between the normal vector and the optical axis is then calculated and denoted η, and according to this angle it is determined whether the plane in which the three-dimensional point cloud coordinates lie is consistent with the plane in which the corner points of the two-dimensional image coordinates lie.
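A minimal RANSAC plane fit over a synthetic point cloud might look as follows. The iteration count, inlier threshold, and the angle check against the optical axis mirror the description above, but all names and numeric values are illustrative assumptions:

```python
import numpy as np

def ransac_plane(points, n_iter=200, thresh=0.05, seed=0):
    """Fit a plane n.p + d = 0 to 3-D points; return the unit normal,
    the offset d, and the boolean inlier mask of the best model found."""
    rng = np.random.default_rng(seed)
    best = (None, None, np.zeros(len(points), bool))
    for _ in range(n_iter):
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(b - a, c - a)
        if np.linalg.norm(n) < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        d = -n @ a
        inliers = np.abs(points @ n + d) < thresh
        if inliers.sum() > best[2].sum():
            best = (n, d, inliers)
    return best

# synthetic scene: a vertical plane x = 2 (normal parallel to the ground)
# plus scattered outliers
rng = np.random.default_rng(1)
plane = np.c_[np.full(200, 2.0), rng.uniform(-5, 5, 200), rng.uniform(0, 3, 200)]
outliers = rng.uniform(-5, 5, (40, 3))
n, d, inl = ransac_plane(np.vstack([plane, outliers]))

# angle eta between the fitted normal and the camera optical axis
# (taken here, for illustration, as the x-axis)
eta = np.degrees(np.arccos(abs(n @ np.array([1.0, 0.0, 0.0]))))
print(round(eta, 1))  # 0.0
```

Checking η against the 30-degree bound from the text then decides whether the fitted plane is the one whose corners were detected in the image.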
Step 103: calculate the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points.
Specifically, fig. 3 is a schematic diagram of the coordinate systems involved in the joint calibration method provided by the embodiment of the present disclosure. For the two-dimensional image coordinates and three-dimensional point cloud coordinates corresponding to each corner point, a PnP algorithm is used to iteratively solve for the solution with the minimum reprojection error, which is taken as the external parameters of the laser radar and the camera, where the external parameters include the rotation matrix and the translation vector between the laser radar coordinate system and the camera coordinate system. Further, the specific solving formula is as follows:
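The formula itself did not survive the text conversion. Given the symbols defined below (R, t, image coordinates u, v, and laser radar coordinates x, y, z), it is in all likelihood the standard pinhole projection relation solved by PnP, reconstructed here with K denoting the camera intrinsic matrix and s a scale factor (two standard quantities not named in the surviving text):

```latex
s \begin{bmatrix} u \\ v \\ 1 \end{bmatrix}
  = K \left( R \begin{bmatrix} x \\ y \\ z \end{bmatrix} + t \right),
\qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}
```

The PnP step then searches for the R and t that minimize the sum of squared reprojection errors over all corner correspondences.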
where R denotes the rotation matrix; t denotes the translation vector; u and v denote the image coordinates; and x, y, and z denote the coordinates in the laser radar coordinate system.
Step 104: jointly calibrate the laser radar and the camera according to the external parameters of the laser radar and the camera.
Specifically, after the external parameters of the laser radar and the camera are obtained, the point cloud data of the laser radar is mapped into the image, objects and markers with obvious edges are located, and the alignment of their edges is checked. If the point cloud edges coincide with the image edges, the calibration accuracy is high, and vice versa.
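The verification step, mapping laser radar points into the image with the computed external parameters, can be sketched as follows. The intrinsic matrix K and all numeric values are illustrative assumptions, and the identity extrinsics stand in for a real calibration result:

```python
import numpy as np

def project_to_image(points_lidar, K, R, t):
    """Map 3-D lidar points to pixels via s*[u, v, 1]^T = K(R p + t)."""
    cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    uvw = cam @ K.T                       # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]       # perspective divide

# illustrative intrinsics and extrinsics (identity rotation, zero translation)
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.zeros(3)

pts = np.array([[0.0, 0.0, 5.0],    # on the optical axis
                [1.0, 0.0, 5.0]])   # 1 m to the right at 5 m depth
print(project_to_image(pts, K, R, t))
# [[320. 240.]
#  [420. 240.]]
```

Overlaying the returned pixel coordinates on the camera image and inspecting whether point cloud edges coincide with image edges is then the visual accuracy check described above.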
Fig. 4 is a schematic diagram of a calibration interface of the joint calibration method provided by the embodiment of the present disclosure. As shown in fig. 4, the data and images collected by the laser radar and the camera are jointly calibrated using the computed external parameters of the laser radar and the camera, and the external parameter calibration result can be verified with the reprojected image. As can be seen from fig. 4, the method provided by the present disclosure obtains good external parameters for the laser radar and the camera.
The joint calibration method provided by the present disclosure acquires point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, where a target object usable for calibration is arranged in the target acquisition area and the target object includes at least one corner point; determines two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively; calculates the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points; and jointly calibrates the laser radar and the camera according to these external parameters. The laser radar and the camera are thus calibrated without a calibration plate: calibration is performed directly on the point cloud data and image data they collect for a target object in the same target acquisition area, which improves calibration efficiency, simplifies the calibration process, and facilitates repeated calibration of the laser radar and the camera.
Fig. 5 is a block diagram of a joint calibration apparatus according to an embodiment of the present disclosure, which corresponds to the joint calibration method according to the foregoing embodiment. For ease of illustration, only portions that are relevant to embodiments of the present disclosure are shown. Referring to fig. 5, the present disclosure provides a joint calibration apparatus, including:
the data interface module 10 is configured to obtain point cloud data and image data acquired by a laser radar and a camera for a same target acquisition area, where a target object for calibration is arranged in the target acquisition area, and the target object includes at least one corner point;
a first processing module 20, configured to determine the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, and to calculate the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points;
and a second processing module 30, configured to jointly calibrate the laser radar and the camera according to the external parameters of the laser radar and the camera.
In an optional example, the point cloud data and the image data are acquired by a laser radar and a camera synchronously aiming at the same target acquisition area;
the data interface module 10 is specifically configured to obtain original point cloud data and original image data acquired by the laser radar and the camera for the same target acquisition area, respectively;
the first processing module 20 is specifically configured to perform time synchronization processing on the original point cloud data and the original image data, and determine point cloud data and image data acquired at the same time; and respectively determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the point cloud data and the image data acquired at the same time.
In an optional example, the original point cloud data includes at least one frame of point cloud data and the timestamp recorded by the laser radar when collecting each frame; the original image data includes at least one frame of image data and the timestamp recorded by the camera when collecting each frame;
the pair of first processing modules 20 is specifically configured to calculate a difference between time stamps of each point cloud data and each image data, and use the point cloud data and the image data whose difference is smaller than a preset threshold as point cloud data and image data acquired at the same time.
In an optional example, the first processing module 20 is specifically configured to: processing the point cloud data by using a plane fitting algorithm to determine a three-dimensional point cloud data plane of the target object, and determining a three-dimensional point cloud coordinate of each focus according to the three-dimensional point cloud data plane; and processing the image data by using an angular point detection algorithm of image gray to obtain a two-dimensional image coordinate of each angular point.
In an optional example, the first processing module 20 is specifically configured to: performing iterative operation processing on the two-dimensional image coordinate and the three-dimensional point cloud coordinate of the same angular point by using a PNP algorithm to obtain a solution with the minimum projection error; and taking the solution with the minimum projection error as external parameters of the laser radar and the camera.
The joint calibration apparatus provided by the present disclosure acquires point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, where a target object usable for calibration is arranged in the target acquisition area and the target object includes at least one corner point; determines two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively; calculates the external parameters of the laser radar and the camera from the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner points; and jointly calibrates the laser radar and the camera according to these external parameters. The laser radar and the camera are thus calibrated without a calibration plate: calibration is performed directly on the point cloud data and image data they collect for a target object in the same target acquisition area, which improves calibration efficiency, simplifies the calibration process, and facilitates repeated calibration of the laser radar and the camera.
The apparatus provided in this embodiment may be used to execute the technical solutions of the above method embodiments; its implementation principles and technical effects are similar and are not repeated here.
Referring to fig. 6, a schematic diagram of a structure of an electronic device 900 suitable for implementing an embodiment of the present disclosure is shown, where the electronic device 900 may be a terminal device or a server. Among them, the terminal Device may include, but is not limited to, a mobile terminal such as a mobile phone, a notebook computer, a Digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a car terminal (e.g., car navigation terminal), etc., and a fixed terminal such as a Digital TV, a desktop computer, etc. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device 900 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 901, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 902 or a program loaded from a storage means 908 into a Random Access Memory (RAM) 903. In the RAM 903, various programs and data necessary for the operation of the electronic device 900 are also stored. The processing apparatus 901, the ROM 902, and the RAM 903 are connected to each other through a bus 904. An input/output (I/O) interface 905 is also connected to bus 904.
Generally, the following devices may be connected to the I/O interface 905: input devices 906 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 907 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 908 including, for example, magnetic tape, hard disk, etc.; and a communication device 909. The communication device 909 may allow the electronic apparatus 900 to perform wireless or wired communication with other apparatuses to exchange data. While fig. 6 illustrates an electronic device 900 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 909, or installed from the storage device 908, or installed from the ROM 902. The computer program performs the above-described functions defined in the methods of the embodiments of the present disclosure when executed by the processing apparatus 901.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform the methods shown in the above embodiments.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a unit does not in some cases constitute a limitation of the unit itself, for example, the first retrieving unit may also be described as a "unit for retrieving at least two internet protocol addresses".
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: field Programmable Gate Arrays (FPGAs), Application Specific Integrated Circuits (ASICs), Application Specific Standard Products (ASSPs), systems on a chip (SOCs), Complex Programmable Logic Devices (CPLDs), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The following are some embodiments of the disclosure.
In a first aspect, the present disclosure provides a joint calibration method, including:
acquiring point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, wherein a target object usable for calibration is arranged in the target acquisition area, and the target object comprises at least one corner point;
determining two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively;
calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point;
and jointly calibrating the laser radar and the camera according to the external parameters between the laser radar and the camera.
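With the external parameters in hand, any lidar point can be projected into the camera image, which is what the joint calibration enables. A minimal numpy sketch follows; the intrinsic matrix K, rotation R, and translation t are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

# Hypothetical camera intrinsics (fx, fy, cx, cy) -- illustrative only.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Hypothetical external parameters: rotation R and translation t
# mapping lidar coordinates into the camera frame.
R = np.eye(3)
t = np.array([0.1, 0.0, 0.0])

def project_lidar_point(p_lidar, K, R, t):
    """Project a 3-D lidar point into pixel coordinates."""
    p_cam = R @ p_lidar + t   # lidar frame -> camera frame
    uvw = K @ p_cam           # camera frame -> homogeneous pixels
    return uvw[:2] / uvw[2]   # perspective division

u, v = project_lidar_point(np.array([1.0, 0.5, 5.0]), K, R, t)  # -> (496.0, 320.0)
```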
In an optional example, the point cloud data and the image data are acquired synchronously by the laser radar and the camera for the same target acquisition area.
Correspondingly, acquiring the point cloud data and the image data collected by the laser radar and the camera for the same target acquisition area comprises:
obtaining raw point cloud data and raw image data collected by the laser radar and the camera for the same target acquisition area, performing time synchronization on the raw point cloud data and the raw image data, and determining the point cloud data and the image data acquired at the same moment.
Correspondingly, determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data comprises:
determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the point cloud data and the image data acquired at the same moment.
In an optional example, the raw point cloud data comprises at least one frame of point cloud data together with the timestamp recorded by the laser radar when each frame is collected; the raw image data comprises at least one frame of image data together with the timestamp recorded by the camera when each frame is collected.
Performing time synchronization on the raw point cloud data and the raw image data to determine the point cloud data and the image data acquired at the same moment comprises:
calculating the difference between the timestamps of each point cloud frame and each image frame, and treating a point cloud frame and an image frame whose timestamp difference is smaller than a preset threshold as having been acquired at the same moment.
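The timestamp-difference test above can be sketched as follows; the frame rates and the threshold value are illustrative assumptions:

```python
def match_by_timestamp(cloud_stamps, image_stamps, threshold):
    """Pair each point cloud frame with the image frame whose timestamp is
    closest, keeping only pairs whose time difference is below the threshold."""
    pairs = []
    for i, tc in enumerate(cloud_stamps):
        # Index of the image whose timestamp is nearest to this cloud.
        j = min(range(len(image_stamps)), key=lambda k: abs(image_stamps[k] - tc))
        if abs(image_stamps[j] - tc) < threshold:
            pairs.append((i, j))
    return pairs

# Lidar at 10 Hz, camera at roughly 30 Hz (illustrative timestamps in seconds).
clouds = [0.00, 0.10, 0.20]
images = [0.01, 0.04, 0.07, 0.11, 0.14, 0.23]
pairs = match_by_timestamp(clouds, images, threshold=0.02)  # -> [(0, 0), (1, 3)]
```

The third cloud (t = 0.20 s) is dropped because its nearest image (t = 0.23 s) differs by more than the 20 ms threshold.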
In an optional example, determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, comprises:
processing the point cloud data with a plane-fitting algorithm to determine the three-dimensional point cloud plane of the target object, and determining the three-dimensional point cloud coordinates of each corner point from that plane;
and processing the image data with a gray-scale-based corner detection algorithm to obtain the two-dimensional image coordinates of each corner point.
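A least-squares plane fit of the kind referred to above can be sketched with numpy's SVD; real lidar returns are noisy, so in practice this would typically sit inside a RANSAC loop. The sample points below are illustrative:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit: returns (centroid, unit normal).
    The normal is the right singular vector for the smallest singular value."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

# Noise-free points lying on the plane z = 2 (illustrative).
pts = np.array([[0., 0., 2.],
                [1., 0., 2.],
                [0., 1., 2.],
                [1., 1., 2.]])
c, n = fit_plane(pts)
# The recovered normal is parallel to the z axis: abs(n[2]) == 1.
```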
In an optional example, calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point comprises:
iteratively solving the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point with a PnP (Perspective-n-Point) algorithm to obtain the solution with the minimum reprojection error;
and taking the solution with the minimum reprojection error as the external parameters between the laser radar and the camera.
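Library routines such as OpenCV's `cv2.solvePnP` are commonly used for this step. The sketch below shows only the reprojection-error objective that an iterative PnP solver minimizes over the external parameters (R, t); the intrinsic matrix K and the points are assumed for illustration:

```python
import numpy as np

def reprojection_error(points_3d, points_2d, K, R, t):
    """Mean reprojection error in pixels -- the objective an iterative
    PnP solver minimizes over the external parameters (R, t)."""
    cam = points_3d @ R.T + t            # world/lidar frame -> camera frame
    proj = cam @ K.T                     # -> homogeneous pixel coordinates
    proj = proj[:, :2] / proj[:, 2:3]    # perspective division
    return float(np.mean(np.linalg.norm(proj - points_2d, axis=1)))

K = np.array([[800., 0., 320.], [0., 800., 240.], [0., 0., 1.]])  # assumed intrinsics
R, t = np.eye(3), np.zeros(3)
pts3d = np.array([[0., 0., 4.], [1., 1., 4.]])
# Synthesize exact 2-D observations from the true pose, so the error is zero.
pts2d = np.array([(K @ p / p[2])[:2] for p in pts3d])
err = reprojection_error(pts3d, pts2d, K, R, t)  # -> 0.0 at the true pose
```

An iterative solver perturbs (R, t) to drive this error to its minimum; the minimizing pose is taken as the external parameters.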
In a second aspect, the present disclosure provides a joint calibration apparatus, including:
the data interface module is used for acquiring point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, wherein a target object usable for calibration is arranged in the target acquisition area, and the target object comprises at least one corner point;
the first processing module is used for determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, and for calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point;
and the second processing module is used for jointly calibrating the laser radar and the camera according to the external parameters between them.
In an optional example, the point cloud data and the image data are acquired synchronously by the laser radar and the camera for the same target acquisition area;
the data interface module is specifically used for obtaining raw point cloud data and raw image data collected by the laser radar and the camera for the same target acquisition area;
the first processing module is specifically used for performing time synchronization on the raw point cloud data and the raw image data to determine the point cloud data and the image data acquired at the same moment, and for determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the point cloud data and the image data acquired at the same moment.
In an optional example, the raw point cloud data comprises at least one frame of point cloud data together with the timestamp recorded by the laser radar when each frame is collected; the raw image data comprises at least one frame of image data together with the timestamp recorded by the camera when each frame is collected;
the first processing module is specifically used for calculating the difference between the timestamps of each point cloud frame and each image frame, and treating a point cloud frame and an image frame whose timestamp difference is smaller than a preset threshold as having been acquired at the same moment.
In an optional example, the first processing module is specifically configured to: process the point cloud data with a plane-fitting algorithm to determine the three-dimensional point cloud plane of the target object, and determine the three-dimensional point cloud coordinates of each corner point from that plane; and process the image data with a gray-scale-based corner detection algorithm to obtain the two-dimensional image coordinates of each corner point.
In an optional example, the first processing module is specifically configured to: iteratively solve the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point with a PnP (Perspective-n-Point) algorithm to obtain the solution with the minimum reprojection error; and take that solution as the external parameters between the laser radar and the camera.
In a third aspect, the present disclosure provides an electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, causing the at least one processor to perform the joint calibration method of the first aspect.
In a fourth aspect, the present disclosure provides a computer-readable storage medium having stored therein computer-executable instructions that, when executed by a processor, implement the joint calibration method of the first aspect.
The foregoing description presents only preferred embodiments of the disclosure and illustrates the principles of the technology employed. Those skilled in the art will appreciate that the scope of the disclosure is not limited to the particular combination of features described above; it also covers other technical solutions formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, solutions formed by replacing the above features with features having similar functions disclosed in (but not limited to) this disclosure.
Further, while operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. Under certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limitations on the scope of the disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
Claims (12)
1. A joint calibration method is characterized by comprising the following steps:
acquiring point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, wherein a target object usable for calibration is arranged in the target acquisition area, and the target object comprises at least one corner point;
determining two-dimensional image coordinates and three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively;
calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point;
and jointly calibrating the laser radar and the camera according to the external parameters between the laser radar and the camera.
2. The joint calibration method according to claim 1, wherein the point cloud data and the image data are acquired synchronously by the laser radar and the camera for the same target acquisition area;
correspondingly, acquiring the point cloud data and the image data collected by the laser radar and the camera for the same target acquisition area comprises:
obtaining raw point cloud data and raw image data collected by the laser radar and the camera for the same target acquisition area, performing time synchronization on the raw point cloud data and the raw image data, and determining the point cloud data and the image data acquired at the same moment;
correspondingly, determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data comprises:
determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the point cloud data and the image data acquired at the same moment.
3. The joint calibration method according to claim 2, wherein the raw point cloud data comprises at least one frame of point cloud data together with the timestamp recorded by the laser radar when each frame is collected; the raw image data comprises at least one frame of image data together with the timestamp recorded by the camera when each frame is collected;
performing time synchronization on the raw point cloud data and the raw image data to determine the point cloud data and the image data acquired at the same moment comprises:
calculating the difference between the timestamps of each point cloud frame and each image frame, and treating a point cloud frame and an image frame whose timestamp difference is smaller than a preset threshold as having been acquired at the same moment.
4. The joint calibration method according to claim 1, wherein determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, comprises:
processing the point cloud data with a plane-fitting algorithm to determine the three-dimensional point cloud plane of the target object, and determining the three-dimensional point cloud coordinates of each corner point from that plane;
and processing the image data with a gray-scale-based corner detection algorithm to obtain the two-dimensional image coordinates of each corner point.
5. The joint calibration method according to claim 1, wherein calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point comprises:
iteratively solving the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point with a PnP (Perspective-n-Point) algorithm to obtain the solution with the minimum reprojection error;
and taking the solution with the minimum reprojection error as the external parameters between the laser radar and the camera.
6. A joint calibration device, comprising:
the data interface module is used for acquiring point cloud data and image data collected by a laser radar and a camera for the same target acquisition area, wherein a target object usable for calibration is arranged in the target acquisition area, and the target object comprises at least one corner point;
the first processing module is used for determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the image data and the point cloud data, respectively, and for calculating the external parameters between the laser radar and the camera according to the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point;
and the second processing module is used for jointly calibrating the laser radar and the camera according to the external parameters between them.
7. The joint calibration device according to claim 6, wherein the point cloud data and the image data are acquired synchronously by the laser radar and the camera for the same target acquisition area;
the data interface module is specifically used for obtaining raw point cloud data and raw image data collected by the laser radar and the camera for the same target acquisition area;
the first processing module is specifically used for performing time synchronization on the raw point cloud data and the raw image data to determine the point cloud data and the image data acquired at the same moment, and for determining the two-dimensional image coordinates and the three-dimensional point cloud coordinates of each corner point in the point cloud data and the image data acquired at the same moment.
8. The joint calibration device according to claim 7, wherein the raw point cloud data comprises at least one frame of point cloud data together with the timestamp recorded by the laser radar when each frame is collected; the raw image data comprises at least one frame of image data together with the timestamp recorded by the camera when each frame is collected;
the first processing module is specifically used for calculating the difference between the timestamps of each point cloud frame and each image frame, and treating a point cloud frame and an image frame whose timestamp difference is smaller than a preset threshold as having been acquired at the same moment.
9. The joint calibration device according to claim 6, wherein the first processing module is specifically configured to: process the point cloud data with a plane-fitting algorithm to determine the three-dimensional point cloud plane of the target object, and determine the three-dimensional point cloud coordinates of each corner point from that plane; and process the image data with a gray-scale-based corner detection algorithm to obtain the two-dimensional image coordinates of each corner point.
10. The joint calibration device according to claim 6, wherein the first processing module is specifically configured to: iteratively solve the two-dimensional image coordinates and the three-dimensional point cloud coordinates of the same corner point with a PnP (Perspective-n-Point) algorithm to obtain the solution with the minimum reprojection error; and take that solution as the external parameters between the laser radar and the camera.
11. An electronic device, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executing computer-executable instructions stored by the memory causes the at least one processor to perform the joint calibration method as claimed in any one of claims 1-5.
12. A computer-readable storage medium having computer-executable instructions stored thereon which, when executed by a processor, implement the joint calibration method as claimed in any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911308267.6A CN111127563A (en) | 2019-12-18 | 2019-12-18 | Combined calibration method and device, electronic equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911308267.6A CN111127563A (en) | 2019-12-18 | 2019-12-18 | Combined calibration method and device, electronic equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111127563A true CN111127563A (en) | 2020-05-08 |
Family
ID=70499528
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911308267.6A Pending CN111127563A (en) | 2019-12-18 | 2019-12-18 | Combined calibration method and device, electronic equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111127563A (en) |
Cited By (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111580122A (en) * | 2020-05-28 | 2020-08-25 | 睿镞科技(北京)有限责任公司 | Space measuring apparatus, method, device, and computer-readable storage medium |
CN111721281A (en) * | 2020-05-27 | 2020-09-29 | 北京百度网讯科技有限公司 | Position identification method and device and electronic equipment |
CN111735479A (en) * | 2020-08-28 | 2020-10-02 | 中国计量大学 | Multi-sensor combined calibration device and method |
CN111814769A (en) * | 2020-09-02 | 2020-10-23 | 深圳市城市交通规划设计研究中心股份有限公司 | Information acquisition method and device, terminal equipment and storage medium |
CN111965624A (en) * | 2020-08-06 | 2020-11-20 | 北京百度网讯科技有限公司 | Calibration method, device and equipment for laser radar and camera and readable storage medium |
CN112017251A (en) * | 2020-10-19 | 2020-12-01 | 杭州飞步科技有限公司 | Calibration method and device, road side equipment and computer readable storage medium |
CN112017205A (en) * | 2020-07-27 | 2020-12-01 | 清华大学 | Automatic calibration method and system for space positions of laser radar and camera sensor |
CN112162263A (en) * | 2020-10-26 | 2021-01-01 | 苏州挚途科技有限公司 | Combined calibration method and device for sensor and electronic equipment |
CN112215905A (en) * | 2020-10-22 | 2021-01-12 | 北京易达恩能科技有限公司 | Automatic calibration method of mobile infrared temperature measurement system |
CN112270713A (en) * | 2020-10-14 | 2021-01-26 | 北京航空航天大学杭州创新研究院 | Calibration method and device, storage medium and electronic device |
CN112381873A (en) * | 2020-10-23 | 2021-02-19 | 北京亮道智能汽车技术有限公司 | Data labeling method and device |
CN112394347A (en) * | 2020-11-18 | 2021-02-23 | 杭州海康威视数字技术股份有限公司 | Target detection method, device and equipment |
CN112446926A (en) * | 2020-12-14 | 2021-03-05 | 北京易达恩能科技有限公司 | Method and device for calibrating relative position of laser radar and multi-eye fisheye camera |
CN112505663A (en) * | 2020-11-25 | 2021-03-16 | 上海交通大学 | Calibration method for multi-line laser radar and camera combined calibration |
CN112509058A (en) * | 2020-11-30 | 2021-03-16 | 北京百度网讯科技有限公司 | Method and device for calculating external parameters, electronic equipment and storage medium |
CN112526470A (en) * | 2020-12-22 | 2021-03-19 | 北京百度网讯科技有限公司 | Method and device for calibrating radar parameters, electronic equipment and storage medium |
CN112578367A (en) * | 2020-10-21 | 2021-03-30 | 上汽大众汽车有限公司 | System and method for measuring relative time of camera and laser radar in automatic driving system |
CN112764004A (en) * | 2020-12-22 | 2021-05-07 | 中国第一汽车股份有限公司 | Point cloud processing method, device, equipment and storage medium |
CN112802126A (en) * | 2021-02-26 | 2021-05-14 | 上海商汤临港智能科技有限公司 | Calibration method, calibration device, computer equipment and storage medium |
CN112819903A (en) * | 2021-03-02 | 2021-05-18 | 福州视驰科技有限公司 | Camera and laser radar combined calibration method based on L-shaped calibration plate |
CN112861660A (en) * | 2021-01-22 | 2021-05-28 | 上海西井信息科技有限公司 | Laser radar array and camera synchronization device, method, equipment and storage medium |
CN113176557A (en) * | 2021-04-29 | 2021-07-27 | 中国科学院自动化研究所 | Virtual laser radar online simulation method based on projection |
CN113192145A (en) * | 2021-05-08 | 2021-07-30 | 深圳市商汤科技有限公司 | Equipment calibration method and device, electronic equipment and storage medium |
CN113406604A (en) * | 2021-06-30 | 2021-09-17 | 山东新一代信息产业技术研究院有限公司 | Device and method for calibrating positions of laser radar and camera |
CN113538591A (en) * | 2021-06-18 | 2021-10-22 | 深圳奥锐达科技有限公司 | Calibration method and device for distance measuring device and camera fusion system |
CN113724338A (en) * | 2021-08-31 | 2021-11-30 | 上海西井信息科技有限公司 | Method, system, device and storage medium for shooting moving object based on table |
CN113740829A (en) * | 2021-11-05 | 2021-12-03 | 新石器慧通(北京)科技有限公司 | External parameter monitoring method and device for environment sensing equipment, medium and running device |
CN113763478A (en) * | 2020-09-09 | 2021-12-07 | 北京京东乾石科技有限公司 | Unmanned vehicle camera calibration method, device, equipment, storage medium and system |
CN113759346A (en) * | 2020-10-10 | 2021-12-07 | 北京京东乾石科技有限公司 | Laser radar calibration method and device, electronic equipment and storage medium |
CN113848541A (en) * | 2021-09-22 | 2021-12-28 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN114152935A (en) * | 2021-11-19 | 2022-03-08 | 苏州一径科技有限公司 | Method, device and equipment for evaluating radar external parameter calibration precision |
CN115236689A (en) * | 2022-09-23 | 2022-10-25 | 北京小马易行科技有限公司 | Method and device for determining relative positions of laser radar and image acquisition equipment |
CN115239815A (en) * | 2021-06-23 | 2022-10-25 | 上海仙途智能科技有限公司 | Camera calibration method and device |
CN115994955A (en) * | 2023-03-23 | 2023-04-21 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN116740197A (en) * | 2023-08-11 | 2023-09-12 | 之江实验室 | External parameter calibration method and device, storage medium and electronic equipment |
CN116736227A (en) * | 2023-08-15 | 2023-09-12 | 无锡聚诚智能科技有限公司 | Method for jointly calibrating sound source position by microphone array and camera |
WO2023217047A1 (en) * | 2022-05-07 | 2023-11-16 | 节卡机器人股份有限公司 | Positioning method and apparatus, and electronic device and readable storage medium |
CN117388831A (en) * | 2023-12-13 | 2024-01-12 | 中科视语(北京)科技有限公司 | Camera and laser radar combined calibration method and device, electronic equipment and medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104142157A (en) * | 2013-05-06 | 2014-11-12 | 北京四维图新科技股份有限公司 | Calibration method, device and equipment |
CN109920011A (en) * | 2019-05-16 | 2019-06-21 | 长沙智能驾驶研究院有限公司 | Extrinsic parameter calibration method, device and equipment for laser radar and binocular camera |
CN109949372A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | Laser radar and vision joint calibration method |
CN110349221A (en) * | 2019-07-16 | 2019-10-18 | 北京航空航天大学 | Joint calibration method for three-dimensional laser radar and binocular visible-light sensor |
2019-12-18: Application CN201911308267.6A filed in CN; published as CN111127563A; legal status: active, Pending
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104142157A (en) * | 2013-05-06 | 2014-11-12 | 北京四维图新科技股份有限公司 | Calibration method, device and equipment |
CN109949372A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | Laser radar and vision joint calibration method |
CN109920011A (en) * | 2019-05-16 | 2019-06-21 | 长沙智能驾驶研究院有限公司 | Extrinsic parameter calibration method, device and equipment for laser radar and binocular camera |
CN110349221A (en) * | 2019-07-16 | 2019-10-18 | 北京航空航天大学 | Joint calibration method for three-dimensional laser radar and binocular visible-light sensor |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111721281A (en) * | 2020-05-27 | 2020-09-29 | 北京百度网讯科技有限公司 | Position identification method and device and electronic equipment |
CN111721281B (en) * | 2020-05-27 | 2022-07-15 | 阿波罗智联(北京)科技有限公司 | Position identification method and device and electronic equipment |
CN111580122B (en) * | 2020-05-28 | 2022-12-06 | 睿镞科技(北京)有限责任公司 | Space measuring apparatus, method, device, and computer-readable storage medium |
CN111580122A (en) * | 2020-05-28 | 2020-08-25 | 睿镞科技(北京)有限责任公司 | Space measuring apparatus, method, device, and computer-readable storage medium |
CN112017205A (en) * | 2020-07-27 | 2020-12-01 | 清华大学 | Automatic calibration method and system for space positions of laser radar and camera sensor |
CN112017205B (en) * | 2020-07-27 | 2021-06-25 | 清华大学 | Automatic calibration method and system for space positions of laser radar and camera sensor |
CN111965624A (en) * | 2020-08-06 | 2020-11-20 | 北京百度网讯科技有限公司 | Calibration method, device and equipment for laser radar and camera and readable storage medium |
CN111965624B (en) * | 2020-08-06 | 2024-04-09 | 阿波罗智联(北京)科技有限公司 | Laser radar and camera calibration method, device, equipment and readable storage medium |
CN111735479A (en) * | 2020-08-28 | 2020-10-02 | 中国计量大学 | Multi-sensor combined calibration device and method |
CN111735479B (en) * | 2020-08-28 | 2021-03-23 | 中国计量大学 | Multi-sensor combined calibration device and method |
CN111814769A (en) * | 2020-09-02 | 2020-10-23 | 深圳市城市交通规划设计研究中心股份有限公司 | Information acquisition method and device, terminal equipment and storage medium |
CN113763478A (en) * | 2020-09-09 | 2021-12-07 | 北京京东乾石科技有限公司 | Unmanned vehicle camera calibration method, device, equipment, storage medium and system |
CN113759346A (en) * | 2020-10-10 | 2021-12-07 | 北京京东乾石科技有限公司 | Laser radar calibration method and device, electronic equipment and storage medium |
CN112270713A (en) * | 2020-10-14 | 2021-01-26 | 北京航空航天大学杭州创新研究院 | Calibration method and device, storage medium and electronic device |
CN112017251A (en) * | 2020-10-19 | 2020-12-01 | 杭州飞步科技有限公司 | Calibration method and device, road side equipment and computer readable storage medium |
CN112578367A (en) * | 2020-10-21 | 2021-03-30 | 上汽大众汽车有限公司 | System and method for measuring relative time of camera and laser radar in automatic driving system |
CN112215905A (en) * | 2020-10-22 | 2021-01-12 | 北京易达恩能科技有限公司 | Automatic calibration method of mobile infrared temperature measurement system |
CN112381873A (en) * | 2020-10-23 | 2021-02-19 | 北京亮道智能汽车技术有限公司 | Data labeling method and device |
CN112162263A (en) * | 2020-10-26 | 2021-01-01 | 苏州挚途科技有限公司 | Combined calibration method and device for sensor and electronic equipment |
CN112394347A (en) * | 2020-11-18 | 2021-02-23 | 杭州海康威视数字技术股份有限公司 | Target detection method, device and equipment |
CN112505663B (en) * | 2020-11-25 | 2022-09-13 | 上海交通大学 | Calibration method for multi-line laser radar and camera combined calibration |
CN112505663A (en) * | 2020-11-25 | 2021-03-16 | 上海交通大学 | Calibration method for multi-line laser radar and camera combined calibration |
CN112509058A (en) * | 2020-11-30 | 2021-03-16 | 北京百度网讯科技有限公司 | Method and device for calculating external parameters, electronic equipment and storage medium |
CN112509058B (en) * | 2020-11-30 | 2023-08-22 | 北京百度网讯科技有限公司 | External parameter calculating method, device, electronic equipment and storage medium |
CN112446926A (en) * | 2020-12-14 | 2021-03-05 | 北京易达恩能科技有限公司 | Method and device for calibrating relative position of laser radar and multi-eye fisheye camera |
CN112764004B (en) * | 2020-12-22 | 2024-05-03 | 中国第一汽车股份有限公司 | Point cloud processing method, device, equipment and storage medium |
CN112764004A (en) * | 2020-12-22 | 2021-05-07 | 中国第一汽车股份有限公司 | Point cloud processing method, device, equipment and storage medium |
CN112526470A (en) * | 2020-12-22 | 2021-03-19 | 北京百度网讯科技有限公司 | Method and device for calibrating radar parameters, electronic equipment and storage medium |
CN112861660B (en) * | 2021-01-22 | 2023-10-13 | 上海西井科技股份有限公司 | Laser radar array and camera synchronization device, method, equipment and storage medium |
CN112861660A (en) * | 2021-01-22 | 2021-05-28 | 上海西井信息科技有限公司 | Laser radar array and camera synchronization device, method, equipment and storage medium |
CN112802126A (en) * | 2021-02-26 | 2021-05-14 | 上海商汤临港智能科技有限公司 | Calibration method, calibration device, computer equipment and storage medium |
CN112819903B (en) * | 2021-03-02 | 2024-02-20 | 福州视驰科技有限公司 | L-shaped calibration plate-based camera and laser radar combined calibration method |
CN112819903A (en) * | 2021-03-02 | 2021-05-18 | 福州视驰科技有限公司 | Camera and laser radar combined calibration method based on L-shaped calibration plate |
CN113176557A (en) * | 2021-04-29 | 2021-07-27 | 中国科学院自动化研究所 | Virtual laser radar online simulation method based on projection |
CN113192145A (en) * | 2021-05-08 | 2021-07-30 | 深圳市商汤科技有限公司 | Equipment calibration method and device, electronic equipment and storage medium |
CN113538591B (en) * | 2021-06-18 | 2024-03-12 | 深圳奥锐达科技有限公司 | Calibration method and device for distance measuring device and camera fusion system |
CN113538591A (en) * | 2021-06-18 | 2021-10-22 | 深圳奥锐达科技有限公司 | Calibration method and device for distance measuring device and camera fusion system |
CN115239815B (en) * | 2021-06-23 | 2023-10-27 | 上海仙途智能科技有限公司 | Camera calibration method and device |
CN115239815A (en) * | 2021-06-23 | 2022-10-25 | 上海仙途智能科技有限公司 | Camera calibration method and device |
CN113406604A (en) * | 2021-06-30 | 2021-09-17 | 山东新一代信息产业技术研究院有限公司 | Device and method for calibrating positions of laser radar and camera |
CN113724338A (en) * | 2021-08-31 | 2021-11-30 | 上海西井信息科技有限公司 | Method, system, device and storage medium for shooting moving object based on table |
CN113724338B (en) * | 2021-08-31 | 2024-05-03 | 上海西井科技股份有限公司 | Method, system, equipment and storage medium for shooting mobile object based on table |
CN113848541B (en) * | 2021-09-22 | 2022-08-26 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN113848541A (en) * | 2021-09-22 | 2021-12-28 | 深圳市镭神智能系统有限公司 | Calibration method and device, unmanned aerial vehicle and computer readable storage medium |
CN113740829A (en) * | 2021-11-05 | 2021-12-03 | 新石器慧通(北京)科技有限公司 | External parameter monitoring method and device for environment sensing equipment, medium and running device |
CN114152935A (en) * | 2021-11-19 | 2022-03-08 | 苏州一径科技有限公司 | Method, device and equipment for evaluating radar external parameter calibration precision |
WO2023217047A1 (en) * | 2022-05-07 | 2023-11-16 | 节卡机器人股份有限公司 | Positioning method and apparatus, and electronic device and readable storage medium |
CN115236689A (en) * | 2022-09-23 | 2022-10-25 | 北京小马易行科技有限公司 | Method and device for determining relative positions of laser radar and image acquisition equipment |
CN115994955B (en) * | 2023-03-23 | 2023-07-04 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN115994955A (en) * | 2023-03-23 | 2023-04-21 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN116740197B (en) * | 2023-08-11 | 2023-11-21 | 之江实验室 | External parameter calibration method and device, storage medium and electronic equipment |
CN116740197A (en) * | 2023-08-11 | 2023-09-12 | 之江实验室 | External parameter calibration method and device, storage medium and electronic equipment |
CN116736227B (en) * | 2023-08-15 | 2023-10-27 | 无锡聚诚智能科技有限公司 | Method for jointly calibrating sound source position by microphone array and camera |
CN116736227A (en) * | 2023-08-15 | 2023-09-12 | 无锡聚诚智能科技有限公司 | Method for jointly calibrating sound source position by microphone array and camera |
CN117388831A (en) * | 2023-12-13 | 2024-01-12 | 中科视语(北京)科技有限公司 | Camera and laser radar combined calibration method and device, electronic equipment and medium |
CN117388831B (en) * | 2023-12-13 | 2024-03-15 | 中科视语(北京)科技有限公司 | Camera and laser radar combined calibration method and device, electronic equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111127563A (en) | Combined calibration method and device, electronic equipment and storage medium | |
CN110322500B (en) | Optimization method and device for instant positioning and map construction, medium and electronic equipment | |
CN110988849B (en) | Calibration method and device of radar system, electronic equipment and storage medium | |
CN110927708B (en) | Calibration method, device and equipment of intelligent road side unit | |
CN110095752B (en) | Positioning method, apparatus, device and medium | |
US9185289B2 (en) | Generating a composite field of view using a plurality of oblique panoramic images of a geographic area | |
CN110660098B (en) | Positioning method and device based on monocular vision | |
CN110349212B (en) | Optimization method and device for instant positioning and map construction, medium and electronic equipment | |
CN111222509B (en) | Target detection method and device and electronic equipment | |
CN114993328B (en) | Vehicle positioning evaluation method, device, equipment and computer readable medium | |
WO2023029893A1 (en) | Texture mapping method and apparatus, device and storage medium | |
CN116182878A (en) | Road curved surface information generation method, device, equipment and computer readable medium | |
CN113759348B (en) | Radar calibration method, device, equipment and storage medium | |
CN116758498B (en) | Obstacle information generation method, obstacle information generation device, electronic device, and computer-readable medium | |
CN117191080A (en) | Calibration method, device, equipment and storage medium for camera and IMU external parameters | |
CN116990830A (en) | Distance positioning method and device based on binocular and TOF, electronic equipment and medium | |
CN115760827A (en) | Point cloud data detection method, device, equipment and storage medium | |
CN111383337B (en) | Method and device for identifying objects | |
CN110634159A (en) | Target detection method and device | |
CN110348374B (en) | Vehicle detection method and device, electronic equipment and storage medium | |
CN115086538A (en) | Shooting position determining method, device, equipment and medium | |
CN115082516A (en) | Target tracking method, device, equipment and medium | |
CN112887793A (en) | Video processing method, display device, and storage medium | |
CN112037280A (en) | Object distance measuring method and device | |
CN115201796B (en) | External reference correction method of vehicle sensor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||