CN111179358A - Calibration method, device, equipment and storage medium - Google Patents
- Publication number
- CN111179358A (application number CN201911397497.4A)
- Authority
- CN
- China
- Prior art keywords
- point cloud
- calibration
- calibration plate
- camera
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G01S7/4802 — Details of systems according to group G01S17/00 using analysis of echo signal for target characterisation
- G01S7/497 — Means for monitoring or calibrating
- G06T2207/10016 — Video; image sequence
- G06T2207/10028 — Range image; depth image; 3D point clouds
- G06T2207/10032 — Satellite or aerial image; remote sensing
- G06T2207/10044 — Radar image
Abstract
The application provides a calibration method, a calibration device, calibration equipment, and a storage medium. The method comprises: acquiring a time-synchronized data pair, where the data pair comprises an image captured by a camera and a point cloud scanned by a lidar, the image contains a calibration plate, the point cloud contains the points corresponding to the calibration plate, and the calibration plate has a plurality of vertices; determining first position coordinates of each vertex of the calibration plate from the point cloud, the first position coordinates being coordinates in the lidar coordinate system; determining second position coordinates of each vertex of the calibration plate from the image and the intrinsic parameters of the camera, the second position coordinates being coordinates in the camera coordinate system; and determining an extrinsic calibration result between the camera and the lidar from the first position coordinates and the second position coordinates of each vertex. The method and device can improve the accuracy of extrinsic calibration.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular, to a calibration method, apparatus, device, and storage medium.
Background
During autonomous driving, a vehicle acquires information about the three-dimensional world through on-board sensors and perceives surrounding objects. Cameras and lidars are two sensors commonly used in autonomous driving. To obtain both the semantic information of an object and its position in the real world, the images captured by the camera must be associated with the point clouds scanned by the lidar; however, the information acquired by each sensor is expressed in that sensor's own coordinate system. To perform this association, the extrinsic parameters between the camera and the lidar must be calibrated.
Disclosure of Invention
The embodiment of the application provides a calibration method, a calibration device, calibration equipment and a storage medium.
In a first aspect, an embodiment of the present application provides a calibration method, including:
acquiring a time-synchronized data pair, where the data pair comprises an image captured by a camera and a point cloud scanned by a lidar, the image contains a calibration plate, the point cloud contains the points corresponding to the calibration plate, and the calibration plate has a plurality of vertices;
determining first position coordinates of each vertex of the calibration plate from the point cloud, the first position coordinates being coordinates in the lidar coordinate system;
determining second position coordinates of each vertex of the calibration plate from the image and the intrinsic parameters of the camera, the second position coordinates being coordinates in the camera coordinate system;
and determining an extrinsic calibration result between the camera and the lidar from the first position coordinates and the second position coordinates of each vertex.
In a possible embodiment, determining the first position coordinates of the vertices of the calibration plate from the point cloud comprises:
extracting the points lying on a target plane in the point cloud, where the target plane is the plane in which the calibration plate lies;
clustering the points on the target plane to obtain at least one point cloud cluster;
determining the point cloud cluster with the largest number of data points among all the clusters as the cluster corresponding to the calibration plate;
and determining the first position coordinates of each vertex of the calibration plate from the cluster corresponding to the calibration plate and the shape information of the calibration plate.
In one possible embodiment, the shape information of the calibration plate includes the size of the calibration plate;
determining the point cloud cluster with the largest number of data points among all the clusters as the cluster corresponding to the calibration plate comprises:
selecting the cluster with the largest number of data points among all the clusters as the cluster to be processed, and executing a first processing procedure; the first processing procedure comprises:
calculating the difference between the spatial size occupied by the cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold, determining the cluster to be processed as the cluster corresponding to the calibration plate;
and if the difference is larger than the first preset threshold, deleting the cluster to be processed and returning to the step of re-determining the cluster to be processed.
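The cluster-selection loop above can be sketched as follows. This is a minimal Python illustration, assuming the on-plane points have already been extracted and clustered (e.g. by RANSAC plane fitting and k-means clustering, both mentioned elsewhere in this application); the function name, the NumPy array representation, and the bounding-box measure of "spatial size" are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def select_plate_cluster(clusters, plate_size, threshold):
    """Pick the point cloud cluster corresponding to the calibration plate.

    clusters: list of (N_i, 3) arrays of points on the target plane.
    plate_size: (width, height) of the calibration plate, in metres.
    threshold: the first preset threshold on the size difference.
    """
    remaining = list(clusters)
    expected = sorted(plate_size)
    while remaining:
        # Take the cluster with the largest number of data points.
        idx = max(range(len(remaining)), key=lambda i: len(remaining[i]))
        candidate = remaining.pop(idx)
        # The two largest bounding-box extents approximate the plate's
        # width and height, since the points lie on one plane.
        extent = sorted(candidate.max(axis=0) - candidate.min(axis=0))[-2:]
        diff = sum(abs(e - s) for e, s in zip(extent, expected))
        if diff <= threshold:
            return candidate  # cluster corresponding to the calibration plate
        # Otherwise the cluster is deleted (popped) and the loop
        # re-determines the cluster to be processed.
    return None
```

A larger background object (e.g. a wall in the same plane) may form the biggest cluster, which is why the size check is needed before accepting it.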
In a possible embodiment, determining the extrinsic calibration result between the camera and the lidar from the first position coordinates and the second position coordinates of each vertex comprises:
performing nonlinear optimization on the extrinsic parameters between the camera and the lidar according to an optimization objective to obtain the extrinsic calibration result, where the optimization objective is determined by the distances between the first position coordinates and the second position coordinates of the vertices.
In one possible embodiment, the optimization objective is to minimize the average distance over all vertices of the calibration plate, where the average distance is the mean of the distances between the first and second position coordinates of the respective vertices.
In one possible embodiment, acquiring the time-synchronized data pair comprises:
acquiring multiple frames of video images captured by the camera;
selecting, from the video images, consecutive frames of first images in which the calibration plate is stationary;
selecting one frame of first image from the consecutive first images as a second image;
and treating the point cloud whose timestamp is closest in time to that of the second image, together with the second image, as a set of time-synchronized data pairs.
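The pairing step above can be illustrated with a short sketch, assuming each image and each point cloud carries a timestamp; the dict-based representation and field names are hypothetical.

```python
def make_data_pair(second_image, point_clouds):
    """Pair the selected image with the point cloud closest in time.

    second_image: dict with a 'timestamp' key (seconds).
    point_clouds: non-empty list of dicts, each with a 'timestamp' key.
    Returns the time-synchronized (image, point cloud) data pair.
    """
    nearest = min(point_clouds,
                  key=lambda pc: abs(pc['timestamp'] - second_image['timestamp']))
    return second_image, nearest
```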
In a possible embodiment, the calibration plate comprises a plurality of reference points;
selecting, from the video images, consecutive frames of first images in which the calibration plate is stationary comprises:
calculating, for every two adjacent frames of video images, the sum of the moving distances of all reference points on the calibration plate, and if the sum is smaller than a second preset threshold, judging that the calibration plate remains stationary across the two frames, where the moving distance of a reference point is the distance between its positions in the two frames.
In one possible embodiment, the method further comprises:
and storing and/or displaying the extrinsic calibration result.
In one possible embodiment, the method further comprises:
acquiring a time-synchronized test data pair, where the test data pair comprises a test image captured by the camera and a test point cloud scanned by the lidar, the test image contains a test object, and the test point cloud contains the points corresponding to the test object;
drawing the points corresponding to the test object at the corresponding positions on the test image in a preset pattern, according to the extrinsic calibration result and the intrinsic parameters of the camera;
and displaying the drawn test image.
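The verification rendering described above can be sketched with a pinhole projection, assuming the extrinsic result (R, t: lidar to camera) and an intrinsic matrix K are available; drawing each point as a single red pixel stands in for the "preset pattern", which the application leaves open.

```python
import numpy as np

def render_points_on_image(points_lidar, R, t, K, image):
    """Draw lidar points onto a test image to inspect the calibration.

    points_lidar: (N, 3) points in the lidar frame.
    R, t: extrinsic calibration result (lidar frame -> camera frame).
    K: 3x3 camera intrinsic matrix.
    image: (H, W, 3) uint8 array, modified in place and returned.
    """
    cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    cam = cam[cam[:, 2] > 0]              # keep points in front of the camera
    pix = cam @ K.T
    pix = pix[:, :2] / pix[:, 2:3]        # perspective division
    h, w = image.shape[:2]
    for u, v in pix:
        u, v = int(round(u)), int(round(v))
        if 0 <= u < w and 0 <= v < h:
            image[v, u] = (255, 0, 0)     # red pixel as the drawing pattern
    return image
```

If the extrinsics are accurate, the drawn points should land on the test object in the displayed image, giving a quick visual check of the calibration.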
In a second aspect, an embodiment of the present application provides a calibration apparatus, including:
an acquisition module, configured to acquire a time-synchronized data pair, where the data pair comprises an image captured by a camera and a point cloud scanned by a lidar, the image contains a calibration plate, the point cloud contains the points corresponding to the calibration plate, and the calibration plate has a plurality of vertices;
a first processing module, configured to determine first position coordinates of each vertex of the calibration plate from the point cloud, the first position coordinates being coordinates in the lidar coordinate system;
a second processing module, configured to determine second position coordinates of each vertex of the calibration plate from the image and the intrinsic parameters of the camera, the second position coordinates being coordinates in the camera coordinate system;
and a third processing module, configured to determine an extrinsic calibration result between the camera and the lidar from the first position coordinates and the second position coordinates of each vertex.
In a possible implementation manner, the first processing module is specifically configured to:
extract the points lying on a target plane in the point cloud, where the target plane is the plane in which the calibration plate lies;
cluster the points on the target plane to obtain at least one point cloud cluster;
determine the point cloud cluster with the largest number of data points among all the clusters as the cluster corresponding to the calibration plate;
and determine the first position coordinates of each vertex of the calibration plate from the cluster corresponding to the calibration plate and the shape information of the calibration plate.
In one possible embodiment, the shape information of the calibration plate includes the size of the calibration plate;
the first processing module is specifically configured to:
select the cluster with the largest number of data points among all the clusters as the cluster to be processed, and execute a first processing procedure; the first processing procedure comprises:
calculating the difference between the spatial size occupied by the cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold, determining the cluster to be processed as the cluster corresponding to the calibration plate;
and if the difference is larger than the first preset threshold, deleting the cluster to be processed and returning to the step of re-determining the cluster to be processed.
In a possible implementation manner, the third processing module is specifically configured to:
perform nonlinear optimization on the extrinsic parameters between the camera and the lidar according to an optimization objective to obtain the extrinsic calibration result, where the optimization objective is determined by the distances between the first position coordinates and the second position coordinates of the vertices.
In one possible embodiment, the optimization objective is to minimize the average distance over all vertices of the calibration plate, where the average distance is the mean of the distances between the first and second position coordinates of the respective vertices.
In a possible implementation manner, the acquisition module is specifically configured to:
acquire multiple frames of video images captured by the camera;
select, from the video images, consecutive frames of first images in which the calibration plate is stationary;
select one frame of first image from the consecutive first images as a second image;
and treat the point cloud whose timestamp is closest in time to that of the second image, together with the second image, as a set of time-synchronized data pairs.
In a possible embodiment, the calibration plate comprises a plurality of reference points;
the acquisition module is specifically configured to:
calculate, for every two adjacent frames of video images, the sum of the moving distances of all reference points on the calibration plate, and if the sum is smaller than a second preset threshold, judge that the calibration plate remains stationary across the two frames, where the moving distance of a reference point is the distance between its positions in the two frames.
In a possible embodiment, the apparatus further comprises an output module;
the output module is configured to:
and storing and/or displaying the extrinsic calibration result.
In a possible embodiment, the device further comprises a test module;
the test module is used for:
acquiring a time-synchronized test data pair, where the test data pair comprises a test image captured by the camera and a test point cloud scanned by the lidar, the test image contains a test object, and the test point cloud contains the points corresponding to the test object;
drawing the points corresponding to the test object at the corresponding positions on the test image in a preset pattern, according to the extrinsic calibration result and the intrinsic parameters of the camera;
and displaying the drawn test image.
In a third aspect, an embodiment of the present application provides a calibration apparatus, including: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes computer-executable instructions stored by the memory to cause the at least one processor to perform the calibration method as described above in the first aspect and various possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the calibration method according to the first aspect and various possible implementation manners of the first aspect is implemented.
With the calibration method, device, equipment, and storage medium provided by the embodiments of the application, a time-synchronized data pair is acquired, where the data pair comprises an image captured by a camera and a point cloud scanned by a lidar, the image contains a calibration plate, the point cloud contains the points corresponding to the calibration plate, and the calibration plate has a plurality of vertices. First position coordinates of each vertex of the calibration plate in the lidar coordinate system are determined from the point cloud; second position coordinates of each vertex in the camera coordinate system are determined from the image and the intrinsic parameters of the camera; and an extrinsic calibration result between the camera and the lidar is determined from the first and second position coordinates of each vertex. Because the extrinsic calibration uses the position coordinates of the same calibration plate vertices in both the lidar coordinate system and the camera coordinate system, the accuracy of the extrinsic calibration is improved.
Drawings
To illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a calibration system according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a calibration method according to an embodiment of the present application;
FIG. 3 is a schematic diagram of a test image after rendering according to an embodiment of the present application;
fig. 4 is a schematic flowchart of a calibration method according to another embodiment of the present application;
fig. 5 is a schematic diagram of extracting a point cloud on a target plane by a RANSAC algorithm according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a point cloud cluster obtained by clustering point clouds in FIG. 5;
FIG. 7 is a schematic diagram of restoring the point cloud cluster corresponding to the calibration plate in FIG. 6 to a three-dimensional space;
fig. 8 is a schematic flowchart of a calibration method according to another embodiment of the present application;
fig. 9 is a schematic structural diagram of a calibration device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a calibration device according to another embodiment of the present application;
fig. 11 is a schematic hardware structure diagram of a calibration apparatus according to an embodiment of the present application.
Detailed Description
To make the objectives, technical solutions, and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments are described below clearly and completely with reference to the drawings. Obviously, the described embodiments are some, but not all, of the embodiments of the present application. All other embodiments derived by those skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
Fig. 1 is a schematic structural diagram of a calibration system according to an embodiment of the present application. As shown in Fig. 1, the calibration system provided in this embodiment includes a camera 11, a lidar 12, and a calibration device 13. The calibration plate is placed within the shooting range of the camera 11 and the scanning range of the lidar 12. The camera 11 captures an image containing the calibration plate, and the lidar 12 scans a point cloud containing the points corresponding to the calibration plate. The calibration device 13 executes the calibration method provided in this embodiment to perform extrinsic calibration between the camera 11 and the lidar 12. The calibration device 13 may be a desktop computer, a server, a notebook computer, a vehicle-mounted terminal, a mobile phone, a robot, and the like, which is not limited herein.
In one possible application scenario, the camera and the lidar are on-board sensors fixedly mounted at designated positions on a vehicle. A worker may hold the calibration plate, or place it, within the sensing range of the camera and the lidar. The camera captures video images over a period of time while the lidar scans the area containing the calibration plate during the same period. An image is selected from the captured video and, together with the lidar point cloud at the corresponding moment, forms a set of data pairs. The worker may place the calibration plate at several different positions in turn to obtain multiple sets of data pairs. The calibration method provided by this embodiment then performs optimization over one or more sets of data pairs to obtain the extrinsic calibration result, realizing extrinsic calibration between the camera and the lidar on the vehicle.
Fig. 2 is a schematic flow chart of a calibration method according to an embodiment of the present application. As shown in fig. 2, the method includes:
s201, obtaining a data pair with synchronous time, wherein the data pair comprises an image shot by a camera and a point cloud scanned by a laser radar, the image comprises a calibration plate, the point cloud comprises a point cloud corresponding to the calibration plate, and the calibration plate comprises a plurality of vertexes.
In this embodiment, the shape and size of the calibration plate may be selected according to actual requirements and are not limited herein; accordingly, the number of vertices of the calibration plate is not limited either. As shown in Fig. 1, the calibration plate may be a rectangular flat plate with black and white squares drawn in an array on the side facing the camera and the lidar, giving the plate four vertices. The camera and the lidar may be fixedly installed at designated positions; for example, when they are on-board sensors, they may be fixedly installed at designated positions on a vehicle. The calibration plate may be held by a worker or mounted on a fixed bracket. The camera photographs the calibration plate to obtain an image containing it, and the lidar scans the plate to obtain a point cloud containing the corresponding points. Time synchronization concerns the moment the camera captures the image and the moment the lidar scans the point cloud: synchronization ensures that the calibration plate occupies the same position in three-dimensional space in both the image and the point cloud. It should be noted that, considering the limits of measurement precision, time synchronization in this application may be understood to mean that the interval between the capture time of the image and the scan time of the point cloud is smaller than a set reasonable duration; this duration is an empirical value and may be adjusted for different measurement scenarios and measurement tool precisions. There may be one or more groups of data pairs, which is not limited herein.
To acquire the time-synchronized data pair, the calibration device may collect the images captured by the camera and the point clouds scanned by the lidar, and select a time-synchronized image and point cloud to form the data pair.
S202, determining first position coordinates of each vertex of the calibration plate from the point cloud, the first position coordinates being coordinates in the lidar coordinate system.
In this embodiment, the coordinates of the points obtained by lidar scanning are expressed in the lidar coordinate system. The calibration device can search the scanned point cloud for the points corresponding to the calibration plate and then determine the coordinates of each vertex of the plate in the lidar coordinate system from those points. In this embodiment, the coordinates of a vertex in the lidar coordinate system are referred to as the first position coordinates, and its coordinates in the camera coordinate system as the second position coordinates.
S203, determining second position coordinates of each vertex of the calibration plate from the image and the intrinsic parameters of the camera, the second position coordinates being coordinates in the camera coordinate system.
In this embodiment, the intrinsic parameters of the camera may include, but are not limited to, one or more of the focal length, radial distortion coefficients, scaling factor, principal point, image width and height, and the like, which are not limited herein. The calibration device can compute the second position coordinates of each vertex of the calibration plate in the camera coordinate system from the image and the camera intrinsics.
Alternatively, the calibration device may first establish a calibration plate coordinate system; for example, when the calibration plate is a rectangular flat plate, the top-left vertex of the plate may serve as the coordinate origin, the long axis of the plate as the X axis, the short axis as the Y axis, and the normal vector of the plate, determined by the right-hand rule, as the Z axis. The positions of the plate vertices in the image are then obtained by corner detection; combining the known three-dimensional coordinates of the vertices in the plate coordinate system with the camera intrinsics, the extrinsics from the calibration plate to the camera are solved by a PnP (Perspective-n-Point) algorithm, and the coordinates of each vertex are converted into the camera coordinate system using those extrinsics, yielding the position coordinates of each vertex of the calibration plate in the camera coordinate system. The PnP algorithm solves for the camera extrinsics, given the camera intrinsics, from several pairs of matching points between three-dimensional space and the two-dimensional image. PnP algorithms include the Direct Linear Transformation (DLT) algorithm, the P3P algorithm, the EPnP (Efficient Perspective-n-Point) algorithm, nonlinear optimization algorithms, and the like.
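The final conversion step can be sketched as below, assuming the plate-to-camera extrinsics (R_pc, t_pc) have already been obtained from a PnP solver such as OpenCV's cv2.solvePnP (not reimplemented here); the vertex ordering and the plate coordinate convention follow the description above, and the function name is illustrative.

```python
import numpy as np

def plate_vertices_in_camera(plate_size, R_pc, t_pc):
    """Second position coordinates: calibration plate vertices in the camera frame.

    plate_size: (width, height) of the rectangular plate; the plate frame has
    its origin at the top-left vertex, X along the long axis, Y along the
    short axis, and Z by the right-hand rule, as in the description.
    R_pc, t_pc: plate -> camera extrinsics, assumed to come from a PnP solver.
    """
    w, h = plate_size
    vertices_plate = np.array([[0.0, 0.0, 0.0],   # top-left (origin)
                               [w,   0.0, 0.0],   # top-right
                               [w,   h,   0.0],   # bottom-right
                               [0.0, h,   0.0]])  # bottom-left
    # Rigid transform of each vertex into the camera coordinate system.
    return vertices_plate @ R_pc.T + t_pc
```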
S204, determining the extrinsic calibration result between the camera and the lidar from the first position coordinates and the second position coordinates of each vertex.
In this embodiment, the extrinsics between the camera and the lidar may include the rotation matrix and translation vector between the camera coordinate system and the lidar coordinate system. For a given vertex, the first and second position coordinates both correspond to the same actual spatial position. The calibration device can therefore optimize the extrinsics between the camera and the lidar according to the first and second position coordinates of each vertex to obtain the extrinsic calibration result.
In this embodiment, a time-synchronized data pair is acquired, comprising an image captured by the camera and a point cloud scanned by the lidar, where the image contains a calibration plate, the point cloud contains the points corresponding to the plate, and the plate has a plurality of vertices. The first position coordinates of each vertex in the lidar coordinate system are determined from the point cloud; the second position coordinates of each vertex in the camera coordinate system are determined from the image and the camera intrinsics; and the extrinsic calibration result between the camera and the lidar is determined from the first and second position coordinates of each vertex. Because the calibration uses the position coordinates of the same vertices in both coordinate systems, the accuracy of the extrinsic calibration is improved.
Optionally, S204 may include:
and carrying out nonlinear optimization on external parameters between the camera and the laser radar according to an optimization target to obtain an external parameter calibration result, wherein the optimization target is determined by the distance between the first position coordinate and the second position coordinate of each vertex.
In this embodiment, the calibration device may construct an equation set to be solved including an external parameter according to the first position coordinate and the second position coordinate of each vertex, determine an optimization target according to a distance between the first position coordinate and the second position coordinate of each vertex, and perform nonlinear optimization on the equation set to be solved according to the optimization target to obtain an optimal solution as an external parameter calibration result. The distance between the first position coordinate and the second position coordinate of one vertex may be a distance between two position points of the vertex, which are transformed to the same coordinate system (camera coordinate system or laser radar coordinate system) by the external reference. For example, a first position coordinate of a vertex is transformed into a camera coordinate system through an external parameter to obtain a position point, and the distance between the position point and a position point corresponding to a second position coordinate of the vertex is calculated.
In a traditional calibration mode, the position points of the calibration plate in the image are constrained to the plane where the calibration plate is located in the point cloud for optimization; however, a plane constraint cannot pin a point to a specific three-dimensional position, so the calibration accuracy is low. In this embodiment, the optimization target is determined by the distance between the first position coordinate and the second position coordinate of each vertex, and each vertex of the calibration plate in the image is constrained to the corresponding vertex of the calibration plate in the point cloud for optimization, so that the optimization effect of the optimization target is improved, and the accuracy of external parameter calibration is further improved.
Optionally, the optimization goal is to minimize an average distance of all vertices on the calibration plate, wherein the average distance is an average of distances between the first position coordinates and the second position coordinates of the respective vertices.
For example, assuming that the calibration plate has 4 vertexes in total, the optimization target is to minimize the average distance over the 4 vertexes of the calibration plate, where the distance corresponding to each vertex is the distance between the two position points obtained by transforming the first position coordinate and the second position coordinate of that vertex into the same coordinate system through the external parameters. The average distance of the 4 vertexes is the average of the distances corresponding to the 4 vertexes. Optionally, the distance between two position points is calculated as the Euclidean distance.
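As a toy illustration of this optimization target, the sketch below computes the average vertex distance and refines a candidate extrinsic by simple coordinate descent. To stay self-contained it optimizes only the translation with the rotation fixed at identity; a real implementation would jointly optimize all six degrees of freedom with a nonlinear least-squares solver, and all names and step sizes here are hypothetical.

```python
import math

def avg_vertex_distance(t, lidar_pts, cam_pts):
    """Average Euclidean distance between lidar-frame vertices shifted by a
    candidate translation t and the camera-frame vertices (rotation fixed
    at identity to keep the sketch short)."""
    total = 0.0
    for (lx, ly, lz), (cx, cy, cz) in zip(lidar_pts, cam_pts):
        total += math.dist((lx + t[0], ly + t[1], lz + t[2]), (cx, cy, cz))
    return total / len(lidar_pts)

def refine_translation(t0, lidar_pts, cam_pts, step=0.1, iters=50):
    """Crude coordinate descent on the translation: nudge each component
    while the average vertex distance keeps decreasing, halving the step
    once no nudge helps."""
    t = list(t0)
    best = avg_vertex_distance(t, lidar_pts, cam_pts)
    for _ in range(iters):
        improved = False
        for axis in range(3):
            for delta in (step, -step):
                cand = list(t)
                cand[axis] += delta
                d = avg_vertex_distance(cand, lidar_pts, cam_pts)
                if d < best:
                    t, best, improved = cand, d, True
        if not improved:
            step /= 2.0
    return t, best
```

A library solver (e.g. `scipy.optimize.least_squares` over a 6-parameter pose) would replace `refine_translation` in a production pipeline.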
Alternatively, a plurality of positions with uniform spatial distribution may be selected as positions for placing the calibration plate, a group of data pairs is measured at each position, and nonlinear optimization is performed on a certain number of measured data pairs (such as 10, 20, and the like) to obtain the external parameter calibration result. The average distance over the vertexes of the calibration plate is chosen to represent how closely the positions of the calibration plate in the camera coordinate system and the laser radar coordinate system coincide, so that the accuracy of external parameter calibration is improved.
Optionally, the method may further include:
and storing and/or displaying the external reference calibration result.
In this embodiment, the calibration device may store the external reference calibration result in a designated file or a designated database, so that the user may query and retrieve the external reference calibration result. The calibration device may also display the external reference calibration result on a screen for the user to view.
Optionally, the method may further include:
acquiring a time-synchronized test data pair, wherein the test data pair comprises a test image shot by a camera and a test point cloud scanned by a laser radar, the test image comprises a test object, and the test point cloud comprises a point cloud corresponding to the test object;
drawing the point cloud corresponding to the test object to a corresponding position point on the test image according to the external reference calibration result and the internal reference of the camera and a preset pattern;
and displaying the drawn test image.
In this embodiment, the test object may be an object such as a lane, a road sign, a street lamp, or a tree, or another designated object for verifying the external reference calibration result, which is not limited herein. A test image containing the test object, shot by the camera, and a test point cloud obtained by the laser radar scanning the test object may be acquired. Then, the point cloud of the test object is converted into the camera coordinate system according to the calibrated external parameters, converted into corresponding position points on the image according to the internal parameters of the camera, and drawn onto the test image with a preset pattern. The drawn test image is displayed, so that the user can judge the effect of external reference calibration by observing, on the test image, the degree of coincidence between the converted point cloud position points and the image of the test object. The preset pattern may use different colors, patterns, and the like to distinguish the points of the point cloud projected onto the test image.
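The conversion from camera-frame points to image position points follows the standard pinhole model: u = fx·x/z + cx, v = fy·y/z + cy. A minimal sketch, with hypothetical intrinsic values:

```python
def project_points(points_cam, fx, fy, cx, cy):
    """Pinhole projection of camera-frame 3-D points to pixel coordinates:
    u = fx * x / z + cx, v = fy * y / z + cy. Points behind the camera
    (z <= 0) are skipped, since they cannot land on the image."""
    pixels = []
    for (x, y, z) in points_cam:
        if z <= 0:
            continue
        pixels.append((fx * x / z + cx, fy * y / z + cy))
    return pixels

# Hypothetical intrinsics: focal lengths of 1000 px, principal point (640, 360).
pixels = project_points([(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)],
                        1000.0, 1000.0, 640.0, 360.0)
```

Each resulting pixel would then be drawn onto the test image with the chosen pattern or color.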
Fig. 3 is a schematic diagram of a drawn test image according to an embodiment of the present application. In the figure, one or more of a lane, a road sign, a street lamp, and a tree may serve as the test object. The multiple rows of position points drawn on the image are the positions, in the two-dimensional image space, of the point cloud scanned by the laser radar after the point cloud is multiplied by the calibrated external parameters and then by the internal parameters of the camera. Different colors may be drawn according to the distance between the point cloud and the camera plane, so that the calibration effect can be observed more easily. A scene rich in objects is preferably selected for projection verification. When the object edges fit the point cloud edges, the calibrated external parameters are good; when the edge difference is larger than a certain threshold value, the calibrated parameters are not good, and the calibration needs to be carried out again.
According to the embodiment, the point cloud corresponding to the test object is drawn to the corresponding position point on the test image according to the external reference calibration result and the internal reference of the camera and the preset pattern, so that the external reference calibration result can be checked, and the external reference calibration can be recalibrated when the external reference calibration has a deviation.
Fig. 4 is a schematic flowchart of a calibration method according to another embodiment of the present application. The embodiment describes in detail a specific implementation process for determining the first position coordinates of each vertex of the calibration plate according to the point cloud. As shown in fig. 4, the method includes:
S401, acquiring a data pair with synchronous time, wherein the data pair comprises an image shot by a camera and a point cloud scanned by a laser radar, the image comprises a calibration plate, the point cloud comprises a point cloud corresponding to the calibration plate, and the calibration plate comprises a plurality of vertexes.
In this embodiment, S401 is similar to S201 in the embodiment of fig. 2, and is not described here again.
S402, extracting the point cloud on a target plane in the point cloud, wherein the target plane is the plane where the calibration plate is located.
In this embodiment, the calibration device may extract the point cloud on the target plane in the point cloud by using a Random Sample Consensus (RANSAC) algorithm. Fig. 5 is a schematic diagram illustrating the extraction of a point cloud on a target plane by a RANSAC algorithm according to an embodiment of the present application. The three coordinate axes in fig. 5 are three coordinate axes of the coordinate system of the lidar.
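A minimal RANSAC plane extraction can be sketched as follows: repeatedly fit a plane to three random points and keep the model with the most inliers. This is an illustrative toy version with a fixed iteration count and hypothetical threshold, not the production implementation.

```python
import math
import random

def plane_from_points(p1, p2, p3):
    """Unit normal n and offset d of the plane n . x = d through 3 points;
    returns None when the points are collinear."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    if norm == 0.0:
        return None
    n = [c / norm for c in n]
    return n, sum(n[i] * p1[i] for i in range(3))

def ransac_plane(points, threshold=0.05, iters=200, seed=1):
    """Keep the plane model (fit to 3 random points) with the most inliers,
    i.e. points within `threshold` of the plane."""
    rng = random.Random(seed)
    best_inliers = []
    for _ in range(iters):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) - d) <= threshold]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    return best_inliers
```

Point cloud libraries provide tuned versions of this step, e.g. Open3D's `segment_plane`.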
Optionally, the point cloud may cover an area with a radius of several tens of meters centered on the laser radar, while the joint calibration only needs the partial area where the calibration plate is located in the forward view. Redundant point cloud information of the laser radar may therefore be filtered out through prior information, so as to reduce the interference of redundant areas on the calibration result. For example, the prior information may be the distance between the placement position of the calibration plate and the laser radar; assuming that this distance is 8 meters, the point cloud in the area between 7 meters and 9 meters from the laser radar in the direction of the calibration plate may be selected.
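This prior-information filtering amounts to keeping a thin range shell around the known board distance. A sketch, using the 7–9 m band from the example:

```python
import math

def filter_by_range(points, r_min, r_max):
    """Keep only points whose Euclidean range from the lidar origin lies in
    [r_min, r_max], e.g. 7-9 m when the board stands about 8 m away."""
    return [p for p in points
            if r_min <= math.dist((0.0, 0.0, 0.0), p) <= r_max]
```

A fuller version would also restrict the angular sector toward the board; this sketch filters on range alone.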
And S403, clustering the point clouds on the target plane to obtain at least one point cloud cluster.
In this embodiment, since the RANSAC algorithm extracts all point cloud information on a given plane, while external parameter calibration requires only the spatially continuous region of that plane corresponding to the calibration plate, the surrounding noise points are not needed. As shown in fig. 5, besides the point cloud of the middle rectangular area, which is the point cloud corresponding to the calibration plate, other noise point clouds are also included in the plane. Therefore, the extracted point cloud needs to be further filtered. The point cloud may be projected onto a two-dimensional plane and grouped into a plurality of point cloud clusters by a clustering algorithm. The clustering algorithm may be a mean-shift clustering algorithm, a k-means clustering algorithm, or the like, and is not limited herein. Fig. 6 is a schematic diagram of the point cloud clusters obtained by clustering the point cloud in fig. 5. Clustering the point cloud in the plane yields 4 point cloud clusters: A, B, C and D.
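As a stand-in for the mean-shift or k-means step mentioned above, the sketch below groups the projected 2-D points by a simple distance threshold (single-linkage clustering); the `eps` value is hypothetical.

```python
import math

def cluster_points(points, eps=0.3):
    """Greedy single-linkage clustering of 2-D points: two points join the
    same cluster when a chain of neighbours within `eps` connects them."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        seed_idx = unvisited.pop()
        stack, members = [seed_idx], [seed_idx]
        while stack:
            i = stack.pop()
            # Collect not-yet-assigned neighbours of point i.
            neighbours = [j for j in unvisited
                          if math.dist(points[i], points[j]) <= eps]
            for j in neighbours:
                unvisited.discard(j)
            stack.extend(neighbours)
            members.extend(neighbours)
        clusters.append([points[i] for i in members])
    return clusters
```

On the fig. 6 example, such a grouping would separate the four clusters A, B, C, and D.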
S404, determining the point cloud cluster with the largest number of data points in all the point cloud clusters as the point cloud cluster corresponding to the calibration plate.
In this embodiment, in the point cloud clusters of the target plane, the number of point clouds corresponding to the calibration plate is generally large, and the number of noise point clouds is generally small, so that the point cloud cluster with the largest number of data points in all the point cloud clusters can be determined as the point cloud cluster corresponding to the calibration plate. As shown in fig. 6, the point cloud cluster B may be determined to be the point cloud cluster corresponding to the calibration plate. After the point cloud cluster corresponding to the calibration plate is determined, the point cloud cluster can be restored to a three-dimensional space under a laser radar coordinate system from a two-dimensional plane. Fig. 7 is a schematic diagram illustrating the point cloud cluster corresponding to the calibration plate in fig. 6 restored to a three-dimensional space.
Optionally, the shape information of the calibration plate comprises the size of the calibration plate; s404 may include:
selecting the point cloud cluster with the maximum number of data points from all the point cloud clusters as a point cloud cluster to be processed, and executing a first processing process; the first processing procedure comprises:
calculating the difference between the space size occupied by the point cloud cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold value, determining the point cloud cluster to be processed as the point cloud cluster corresponding to the calibration plate;
and if the difference value is larger than the first preset threshold value, deleting the point cloud cluster to be processed, and returning to the step of re-determining the point cloud cluster to be processed.
In this embodiment, in order to prevent a noise point cloud cluster from being mistakenly recognized as the point cloud cluster of the calibration plate when the number of data points in that noise cluster exceeds the number of points corresponding to the calibration plate, the point cloud cluster with the largest number of data points is verified against the size of the calibration plate. The size of the calibration plate may include the side lengths of the calibration plate; for example, when the calibration plate is rectangular, the size may include, but is not limited to, information such as the length and width of the calibration plate. The space size occupied by the point cloud cluster with the largest number of data points is compared with the size of the calibration plate; if the difference between the two is smaller than or equal to the first preset threshold value, the sizes of the point cloud cluster and the calibration plate are relatively close, so the point cloud cluster is determined to be the point cloud cluster corresponding to the calibration plate. If the difference between the two is larger than the first preset threshold value, the sizes differ significantly, so the point cloud cluster is determined not to be the point cloud cluster corresponding to the calibration plate; the point cloud cluster is deleted, the point cloud cluster with the largest number of data points is selected from the remaining point cloud clusters and compared with the calibration plate size, and the process is repeated until the point cloud cluster corresponding to the calibration plate is found among all the point cloud clusters.
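The first processing procedure — take the largest cluster, check its spatial extent against the board size, discard it if the difference exceeds the threshold, and repeat — can be sketched as below. Bounding-box extent is used as a simple proxy for the occupied space size, and all threshold values are hypothetical.

```python
def pick_board_cluster(clusters, board_size, tol=0.1):
    """Repeatedly take the cluster with the most points and accept it only
    if its bounding-box extent matches (board_width, board_height) within
    `tol`; otherwise discard it and try the next largest. Returns None if
    no cluster matches."""
    remaining = list(clusters)
    while remaining:
        cand = max(remaining, key=len)
        xs = [p[0] for p in cand]
        ys = [p[1] for p in cand]
        w, h = max(xs) - min(xs), max(ys) - min(ys)
        if abs(w - board_size[0]) <= tol and abs(h - board_size[1]) <= tol:
            return cand
        remaining.remove(cand)
    return None
```

For example, a large ground-noise cluster would be rejected on size and the next-largest, board-sized cluster accepted.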
In the embodiment, the point cloud cluster with the largest number of data points is verified through the size of the calibration plate, so that the condition that the noise point cloud cluster is identified as the point cloud cluster of the calibration plate by mistake when the number of data points in a certain noise point cloud cluster exceeds the number of point clouds corresponding to the calibration plate can be prevented, and the accuracy of external reference calibration is improved.
S405, determining first position coordinates of each vertex of the calibration plate according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate.
In this embodiment, the shape information of the calibration board represents the shape of the calibration board, for example, the shape information of the calibration board may be a rectangle, a triangle, a pentagon, and the like, which is not limited herein. The calibration equipment can determine the first position coordinates of each vertex of the calibration plate according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate. For example, when the calibration plate is rectangular, the minimum bounding rectangle of the point cloud cluster corresponding to the calibration plate may be calculated, and the coordinates of four vertices of the minimum bounding rectangle may be used as the first position coordinates of each vertex of the calibration plate.
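For a rectangular board, the vertex estimate described above can be approximated in a sketch by the corners of the cluster's axis-aligned 2-D bounding box; a real implementation would compute the minimum-area enclosing rectangle (e.g. via a rotating-calipers method), which also handles rotated boards.

```python
def bounding_box_vertices(points_2d):
    """Corners of the axis-aligned 2-D bounding box of the board cluster,
    in the order upper-left, upper-right, lower-right, lower-left."""
    xs = [p[0] for p in points_2d]
    ys = [p[1] for p in points_2d]
    return [(min(xs), min(ys)), (max(xs), min(ys)),
            (max(xs), max(ys)), (min(xs), max(ys))]
```

The four corners, lifted back into the three-dimensional lidar frame, would serve as the first position coordinates of the vertexes.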
S406, determining second position coordinates of each vertex of the calibration plate according to the image and the internal reference of the camera, wherein the second position coordinates are coordinates in a camera coordinate system.
In this embodiment, S406 is similar to S203 in the embodiment of fig. 2, and is not described herein again.
S407, determining an external reference calibration result between the camera and the laser radar according to the first position coordinate and the second position coordinate of each vertex.
In this embodiment, S407 is similar to S204 in the embodiment of fig. 2, and is not described herein again.
In the embodiment, the point cloud on the plane where the calibration plate is located is clustered, the point cloud cluster with the largest number of data points is determined to be the point cloud cluster corresponding to the calibration plate, then the first position coordinates of each vertex of the calibration plate are determined according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate, and the coordinates of each vertex of the calibration plate under the laser radar coordinate system can be accurately determined.
Fig. 8 is a schematic flowchart of a calibration method according to another embodiment of the present application. The embodiment describes a specific implementation process of acquiring time-synchronized data pairs in detail. As shown in fig. 8, the method includes:
and S801, acquiring a plurality of frames of video images shot by the camera.
In this embodiment, the calibration device may acquire a plurality of frames of video images of the camera taken for the calibration board within a period of time.
S802, selecting continuous multi-frame first images of which the calibration plates are still from the multi-frame video images.
In this embodiment, the calibration device may identify a position of the calibration plate in the video image, and select a plurality of consecutive frames of first images, in which the calibration plate remains still, from the plurality of frames of video images according to the position of the calibration plate in the video image.
Optionally, S802 may include:
and calculating the sum of the moving distances of all reference points on the calibration board for every two adjacent frames of video images, and if the sum is smaller than a second preset threshold, judging that the calibration board keeps still on the two frames of video images, wherein the moving distance of one reference point is the distance between the positions of the reference point in the two frames of video images.
In this embodiment, the calibration device may sequentially determine the positions of the calibration plate in two adjacent frames of video images according to the time sequence, and determine whether the calibration plate remains still in the two frames of video images. The reference points may be vertices of the calibration plate or other points on the calibration plate that can be identified by image processing. As shown in fig. 1, when black and white squares are drawn in an array on the side of the calibration plate facing the camera and the lidar, the corner points of the black and white squares may be selected as reference points. If the sum of the moving distances of the reference points of the calibration board on the two frames of video images is less than the second preset threshold value, the position of the calibration board on the two frames of video images is very close, and therefore the calibration board is judged to be still on the two frames of video images.
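The stillness test can be sketched directly from the definition: sum the per-reference-point displacement between two adjacent frames and compare it with the second preset threshold (the threshold value used here is hypothetical).

```python
import math

def board_is_still(ref_pts_a, ref_pts_b, threshold):
    """Sum the movement of each reference point between two adjacent video
    frames; the board counts as still when the sum stays below the
    second preset threshold."""
    moved = sum(math.dist(a, b) for a, b in zip(ref_pts_a, ref_pts_b))
    return moved < threshold
```

The reference points would come from image processing, e.g. detected checkerboard corners.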
S803, selecting one frame of first image from the plurality of frames of first images as a second image.
S804, taking the point cloud closest in time to the timestamp of the second image and the second image as a group of time-synchronized data pairs.
In this embodiment, when the number of the first images that continuously remain still reaches a certain number, it may be determined that the calibration plate within the acquisition time corresponding to the first images remains still or moves only slightly, and the accuracy of the calibration result is not affected, so that one frame of the first images may be selected from the first images as the second image. The timestamp characterizes the acquisition time of the image. And finding the closest point cloud in time through the time stamp of the second image to serve as a group of data pairs which can be used for calibration. For example, to further ensure the calibration accuracy, the first image with the time stamp located at the middle may be selected from the multiple first images as the second image. By selecting the first image with the time stamp positioned at the middle frame as the second image, the calibration plate in the second image can be further ensured to be static, and the calibration accuracy is further improved.
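Selecting the middle-timestamp still image and pairing it with the temporally closest point cloud can be sketched as:

```python
def pick_sync_pair(image_stamps, cloud_stamps):
    """Pick the middle still-image timestamp and pair it with the point
    cloud whose timestamp is closest in time."""
    img_t = sorted(image_stamps)[len(image_stamps) // 2]
    cloud_t = min(cloud_stamps, key=lambda t: abs(t - img_t))
    return img_t, cloud_t
```

Here timestamps stand in for the images and point clouds themselves; a real pipeline would carry the associated data alongside each timestamp.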
S805, determining first position coordinates of each vertex of the calibration plate according to the point cloud, wherein the first position coordinates are coordinates in a laser radar coordinate system.
In this embodiment, S805 is similar to S202 in the embodiment of fig. 2, and is not described herein again.
S806, determining second position coordinates of each vertex of the calibration plate according to the image and the internal reference of the camera, wherein the second position coordinates are coordinates in a camera coordinate system.
In this embodiment, S806 is similar to S203 in the embodiment of fig. 2, and is not described herein again.
S807, determining an external reference calibration result between the camera and the laser radar according to the first position coordinate and the second position coordinate of each vertex.
In this embodiment, S807 is similar to S204 in the embodiment of fig. 2, and is not described herein again.
Generally, a camera is used for shooting a video image containing a calibration plate and a point cloud scanned by a laser radar within a period of time, data corresponding to the calibration plate in the time-synchronized image and point cloud are extracted, and a position point of the calibration plate in the image is constrained to a plane where the calibration plate in the point cloud is located for optimization, so that an external parameter between the camera and the laser radar is obtained. However, optimization is performed by constraining the position points of the calibration plate in the image to the plane where the calibration plate is located in the point cloud, resulting in lower accuracy of the calibrated external parameters.
The method includes the steps that time-synchronized data pairs are obtained, the data pairs comprise images shot by a camera and point clouds scanned by a laser radar, wherein the images comprise calibration plates, the point clouds comprise point clouds corresponding to the calibration plates, and the calibration plates comprise a plurality of vertexes; determining first position coordinates of each vertex of the calibration plate under a laser radar coordinate system according to the point cloud; determining second position coordinates of each vertex of the calibration plate under a camera coordinate system according to the image and the internal reference of the camera; according to the first position coordinates and the second position coordinates of each vertex, an external reference calibration result between the camera and the laser radar is determined, and external reference calibration can be performed on the camera and the laser radar by utilizing the corresponding position coordinates of each vertex of the calibration plate under a laser radar coordinate system and a camera coordinate system, so that the accuracy of external reference calibration is improved. In addition, in the embodiment, continuous multi-frame first images with the calibration plate kept still are selected from multi-frame video images shot by the camera, and then point cloud data with one of the first images synchronized with time is selected as a data pair for calibration, so that the actual spatial positions of the images and the calibration plate in the point cloud can be ensured to be the same, and the time synchronism of the images and the point cloud is ensured.
Fig. 9 is a schematic structural diagram of a calibration device according to an embodiment of the present application. As shown in fig. 9, the calibration device 90 includes: an obtaining module 901, a first processing module 902, a second processing module 903, and a third processing module 904.
An obtaining module 901, configured to obtain a time-synchronized data pair, where the data pair includes an image captured by a camera and a point cloud scanned by a laser radar, where the image includes a calibration plate, the point cloud includes a point cloud corresponding to the calibration plate, and the calibration plate includes multiple vertices.
A first processing module 902, configured to determine, according to the point cloud, first position coordinates of each vertex of the calibration board, where the first position coordinates are coordinates in a laser radar coordinate system.
A second processing module 903, configured to determine second position coordinates of each vertex of the calibration board according to the image and the internal reference of the camera, where the second position coordinates are coordinates in a camera coordinate system.
And a third processing module 904, configured to determine an external reference calibration result between the camera and the lidar according to the first position coordinate and the second position coordinate of each vertex.
The method includes the steps that time-synchronized data pairs are obtained, the data pairs comprise images shot by a camera and point clouds scanned by a laser radar, wherein the images comprise calibration plates, the point clouds comprise point clouds corresponding to the calibration plates, and the calibration plates comprise a plurality of vertexes; determining first position coordinates of each vertex of the calibration plate under a laser radar coordinate system according to the point cloud; determining second position coordinates of each vertex of the calibration plate under a camera coordinate system according to the image and the internal reference of the camera; according to the first position coordinates and the second position coordinates of each vertex, an external reference calibration result between the camera and the laser radar is determined, and external reference calibration can be performed on the camera and the laser radar by utilizing the corresponding position coordinates of each vertex of the calibration plate under a laser radar coordinate system and a camera coordinate system, so that the accuracy of external reference calibration is improved.
Fig. 10 is a schematic structural diagram of a calibration apparatus according to yet another embodiment of the present application. As shown in fig. 10, the calibration apparatus 90 provided in this embodiment may further include, on the basis of the calibration apparatus provided in the embodiment shown in fig. 9: an output module 905, and a test module 906.
Optionally, the first processing module 902 is specifically configured to:
extracting point clouds on a target plane in the point clouds, wherein the target plane is a plane where the calibration plate is located;
clustering the point clouds on the target plane to obtain at least one point cloud cluster;
determining the point cloud cluster with the largest number of data points in all the point cloud clusters as the point cloud cluster corresponding to the calibration board;
and determining the first position coordinates of each vertex of the calibration plate according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate.
Optionally, the shape information of the calibration plate comprises the size of the calibration plate;
the first processing module 902 is specifically configured to:
selecting the point cloud cluster with the maximum number of data points from all the point cloud clusters as a point cloud cluster to be processed, and executing a first processing process; the first processing procedure comprises:
calculating the difference between the space size occupied by the point cloud cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold value, determining the point cloud cluster to be processed as the point cloud cluster corresponding to the calibration plate;
and if the difference value is larger than the first preset threshold value, deleting the point cloud cluster to be processed, and returning to the step of re-determining the point cloud cluster to be processed.
Optionally, the third processing module 904 is specifically configured to:
and carrying out nonlinear optimization on external parameters between the camera and the laser radar according to an optimization target to obtain an external parameter calibration result, wherein the optimization target is determined by the distance between the first position coordinate and the second position coordinate of each vertex.
Optionally, the optimization goal is to minimize an average distance of all vertices on the calibration plate, wherein the average distance is an average of distances between the first position coordinates and the second position coordinates of the respective vertices.
Optionally, the obtaining module 901 is specifically configured to:
acquiring a plurality of frames of video images shot by the camera;
selecting continuous multi-frame first images of which the calibration plate is still from the multi-frame video images;
selecting one frame of first image from the plurality of frames of first images as a second image;
the point cloud closest in time to the timestamp of the second image is treated as a set of time-synchronized data pairs with the second image.
Optionally, the calibration plate comprises a plurality of reference points thereon;
the obtaining module 901 is specifically configured to:
and calculating the sum of the moving distances of all reference points on the calibration board for every two adjacent frames of video images, and if the sum is smaller than a second preset threshold, judging that the calibration board keeps still on the two frames of video images, wherein the moving distance of one reference point is the distance between the positions of the reference point in the two frames of video images.
Optionally, the output module 905 is configured to:
storing and/or displaying the external reference calibration result.
Optionally, the testing module 906 is configured to:
acquiring a time-synchronized test data pair, wherein the test data pair comprises a test image shot by the camera and a test point cloud scanned by the laser radar, the test image comprises a test object, and the test point cloud comprises a point cloud corresponding to the test object;
drawing, according to the external reference calibration result and the internal reference of the camera, the point cloud corresponding to the test object at the corresponding positions on the test image in a preset pattern;
and displaying the drawn test image.
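The drawing step of the test module amounts to projecting the test point cloud through the extrinsic calibration result and the camera's internal reference. A minimal pinhole-projection sketch, with the "preset pattern" (colour, marker) left to the display step; the undistorted pinhole model is a simplifying assumption:

```python
import numpy as np

def project_points(points, R, t, K):
    """Project lidar points into the test image using the extrinsic
    calibration result (R, t) and the camera intrinsic matrix K.

    points : (N, 3) lidar points of the test object
    Returns (M, 2) pixel coordinates for the points in front of the camera.
    """
    cam = points @ R.T + t                 # lidar frame -> camera frame
    cam = cam[cam[:, 2] > 0]               # keep points in front of the camera
    uv = cam @ K.T                         # pinhole projection
    return uv[:, :2] / uv[:, 2:3]
```

A production implementation would typically use OpenCV's `projectPoints`, which additionally models lens distortion.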
The calibration device provided in the embodiment of the present application can be used to implement the method embodiments described above, and the implementation principle and technical effect are similar, which are not described herein again.
Fig. 11 is a schematic hardware structure diagram of a calibration apparatus according to an embodiment of the present application. As shown in fig. 11, the calibration apparatus 110 provided in this embodiment includes: at least one processor 1101 and a memory 1102. The calibration apparatus 110 further comprises a communication component 1103. The processor 1101, the memory 1102, and the communication component 1103 are connected by a bus 1104.
In a specific implementation, the at least one processor 1101 executes the computer-executable instructions stored by the memory 1102, so that the at least one processor 1101 executes the calibration method as described above.
For the specific implementation process of the processor 1101, reference may be made to the above method embodiments; the implementation principles and technical effects are similar and are not described again in this embodiment.
In the embodiment shown in fig. 11, it should be understood that the Processor may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware processor, or may be implemented by a combination of hardware and software modules in the processor.
The memory may comprise a high-speed RAM memory, and may further comprise a non-volatile memory (NVM), such as at least one magnetic disk memory.
The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, the bus in the figures of the present application is represented by only one line, but this does not mean that there is only one bus or only one type of bus.
The application also provides a computer-readable storage medium, wherein computer-executable instructions are stored in the computer-readable storage medium, and when a processor executes the computer-executable instructions, the calibration method is realized.
The computer-readable storage medium may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disk. Readable storage media can be any available media that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Of course, the readable storage medium may also be an integral part of the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (ASIC). Of course, the processor and the readable storage medium may also reside as discrete components in the apparatus.
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present application, not for limiting them; although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features thereof may be equivalently replaced, and such modifications or substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
Claims (20)
1. A calibration method, comprising:
acquiring a time-synchronized data pair, wherein the data pair comprises an image shot by a camera and a point cloud scanned by a laser radar, the image comprises a calibration plate, the point cloud comprises a point cloud corresponding to the calibration plate, and the calibration plate comprises a plurality of vertexes;
determining first position coordinates of each vertex of the calibration plate according to the point cloud, wherein the first position coordinates are coordinates under a laser radar coordinate system;
determining second position coordinates of each vertex of the calibration plate according to the image and the internal reference of the camera, wherein the second position coordinates are coordinates in a camera coordinate system;
and determining an external reference calibration result between the camera and the laser radar according to the first position coordinate and the second position coordinate of each vertex.
2. The method of claim 1, wherein determining first location coordinates for each vertex of the calibration plate from the point cloud comprises:
extracting, from the point cloud, the points located on a target plane, wherein the target plane is the plane where the calibration plate is located;
clustering the point clouds on the target plane to obtain at least one point cloud cluster;
determining the point cloud cluster with the largest number of data points in all the point cloud clusters as the point cloud cluster corresponding to the calibration plate;
and determining the first position coordinates of each vertex of the calibration plate according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate.
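The extraction of the points on the target plane is not prescribed in detail by the claims; a RANSAC plane fit is one common choice, sketched here purely for illustration (the fitting method, thresholds, and iteration count are all assumptions):

```python
import numpy as np

def extract_plane_points(points, dist_thresh=0.02, iters=200, seed=0):
    """Extract the points lying on the dominant plane, taken to be the
    plane where the calibration plate is located.

    A plain RANSAC plane fit: repeatedly sample 3 points, form the
    plane through them, and keep the plane with the most inliers.
    """
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(points), dtype=bool)
    for _ in range(iters):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-9:
            continue                       # degenerate (collinear) sample
        n = n / np.linalg.norm(n)
        mask = np.abs((points - p0) @ n) < dist_thresh
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return points[best_mask]
```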
3. The method of claim 2, wherein the shape information of the calibration plate includes a size of the calibration plate;
the determining the point cloud cluster with the largest number of data points in all the point cloud clusters as the point cloud cluster corresponding to the calibration plate comprises the following steps:
selecting the point cloud cluster with the maximum number of data points from all the point cloud clusters as a point cloud cluster to be processed, and executing a first processing process; the first processing procedure comprises:
calculating the difference between the space size occupied by the point cloud cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold value, determining the point cloud cluster to be processed as the point cloud cluster corresponding to the calibration plate;
and if the difference value is larger than the first preset threshold value, deleting the point cloud cluster to be processed, and returning to the step of re-determining the point cloud cluster to be processed.
4. The method of claim 1, wherein determining the external reference calibration result between the camera and the lidar based on the first and second position coordinates of each vertex comprises:
carrying out nonlinear optimization on external parameters between the camera and the laser radar according to an optimization target to obtain an external parameter calibration result, wherein the optimization target is determined by the distance between the first position coordinate and the second position coordinate of each vertex.
5. The method of claim 4, wherein the optimization goal is to minimize an average distance of all vertices on the calibration plate, wherein the average distance is an average of distances between the first and second position coordinates of each vertex.
6. The method of claim 1, wherein the obtaining the time-synchronized data pair comprises:
acquiring a plurality of frames of video images shot by the camera;
selecting, from the plurality of frames of video images, a plurality of continuous frames of first images in which the calibration plate is still;
selecting one frame of first image from the plurality of frames of first images as a second image;
and taking the point cloud whose timestamp is closest to that of the second image, together with the second image, as a set of time-synchronized data pairs.
7. The method of claim 6, wherein the calibration plate includes a plurality of reference points thereon;
the selecting, from the plurality of frames of video images, a plurality of continuous frames of first images in which the calibration plate is still comprises:
calculating, for every two adjacent frames of video images, the sum of the moving distances of all reference points on the calibration plate, and if the sum is smaller than a second preset threshold, judging that the calibration plate remains still in the two frames of video images, wherein the moving distance of a reference point is the distance between the positions of the reference point in the two frames of video images.
8. The method according to any one of claims 1-7, further comprising:
storing and/or displaying the external reference calibration result.
9. The method according to any one of claims 1-7, further comprising:
acquiring a time-synchronized test data pair, wherein the test data pair comprises a test image shot by the camera and a test point cloud scanned by the laser radar, the test image comprises a test object, and the test point cloud comprises a point cloud corresponding to the test object;
drawing, according to the external reference calibration result and the internal reference of the camera, the point cloud corresponding to the test object at the corresponding positions on the test image in a preset pattern;
and displaying the drawn test image.
10. A calibration device, comprising:
an acquisition module, configured to acquire a time-synchronized data pair, wherein the data pair comprises an image shot by a camera and a point cloud scanned by a laser radar, the image comprises a calibration plate, the point cloud comprises a point cloud corresponding to the calibration plate, and the calibration plate comprises a plurality of vertexes;
the first processing module is used for determining first position coordinates of each vertex of the calibration plate according to the point cloud, wherein the first position coordinates are coordinates under a laser radar coordinate system;
the second processing module is used for determining second position coordinates of each vertex of the calibration plate according to the image and the internal reference of the camera, wherein the second position coordinates are coordinates in a camera coordinate system;
and the third processing module is used for determining an external reference calibration result between the camera and the laser radar according to the first position coordinate and the second position coordinate of each vertex.
11. The apparatus of claim 10, wherein the first processing module is specifically configured to:
extracting, from the point cloud, the points located on a target plane, wherein the target plane is the plane where the calibration plate is located;
clustering the point clouds on the target plane to obtain at least one point cloud cluster;
determining the point cloud cluster with the largest number of data points in all the point cloud clusters as the point cloud cluster corresponding to the calibration plate;
and determining the first position coordinates of each vertex of the calibration plate according to the point cloud cluster corresponding to the calibration plate and the shape information of the calibration plate.
12. The apparatus of claim 11, wherein the shape information of the calibration plate includes a size of the calibration plate;
the first processing module is specifically configured to:
selecting the point cloud cluster with the maximum number of data points from all the point cloud clusters as a point cloud cluster to be processed, and executing a first processing process; the first processing procedure comprises:
calculating the difference between the space size occupied by the point cloud cluster to be processed and the size of the calibration plate;
if the difference is smaller than or equal to a first preset threshold value, determining the point cloud cluster to be processed as the point cloud cluster corresponding to the calibration plate;
and if the difference value is larger than the first preset threshold value, deleting the point cloud cluster to be processed, and returning to the step of re-determining the point cloud cluster to be processed.
13. The apparatus according to claim 10, wherein the third processing module is specifically configured to:
carrying out nonlinear optimization on external parameters between the camera and the laser radar according to an optimization target to obtain an external parameter calibration result, wherein the optimization target is determined by the distance between the first position coordinate and the second position coordinate of each vertex.
14. The apparatus of claim 13, wherein the optimization objective is to minimize an average distance of all vertices on the calibration plate, wherein the average distance is an average of distances between the first position coordinate and the second position coordinate of each vertex.
15. The apparatus of claim 10, wherein the obtaining module is specifically configured to:
acquiring a plurality of frames of video images shot by the camera;
selecting, from the plurality of frames of video images, a plurality of continuous frames of first images in which the calibration plate is still;
selecting one frame of first image from the plurality of frames of first images as a second image;
and taking the point cloud whose timestamp is closest to that of the second image, together with the second image, as a set of time-synchronized data pairs.
16. The apparatus of claim 15, wherein the calibration plate includes a plurality of reference points thereon;
the acquisition module is specifically configured to:
calculating, for every two adjacent frames of video images, the sum of the moving distances of all reference points on the calibration plate, and if the sum is smaller than a second preset threshold, judging that the calibration plate remains still in the two frames of video images, wherein the moving distance of a reference point is the distance between the positions of the reference point in the two frames of video images.
17. The apparatus of any one of claims 10-16, further comprising an output module;
the output module is configured to:
storing and/or displaying the external reference calibration result.
18. The apparatus of any one of claims 10-16, further comprising a test module;
the test module is used for:
acquiring a time-synchronized test data pair, wherein the test data pair comprises a test image shot by the camera and a test point cloud scanned by the laser radar, the test image comprises a test object, and the test point cloud comprises a point cloud corresponding to the test object;
drawing, according to the external reference calibration result and the internal reference of the camera, the point cloud corresponding to the test object at the corresponding positions on the test image in a preset pattern;
and displaying the drawn test image.
19. A calibration apparatus, comprising: at least one processor and memory;
the memory stores computer-executable instructions;
the at least one processor executes the computer-executable instructions stored in the memory, so that the at least one processor performs the calibration method according to any one of claims 1-9.
20. A computer-readable storage medium, wherein the computer-readable storage medium has stored therein computer-executable instructions, which when executed by a processor, implement the calibration method as claimed in any one of claims 1 to 9.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911397497.4A CN111179358B (en) | 2019-12-30 | 2019-12-30 | Calibration method, device, equipment and storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911397497.4A CN111179358B (en) | 2019-12-30 | 2019-12-30 | Calibration method, device, equipment and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111179358A true CN111179358A (en) | 2020-05-19 |
CN111179358B CN111179358B (en) | 2024-01-05 |
Family
ID=70649027
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911397497.4A Active CN111179358B (en) | 2019-12-30 | 2019-12-30 | Calibration method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111179358B (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111965624A (en) * | 2020-08-06 | 2020-11-20 | 北京百度网讯科技有限公司 | Calibration method, device and equipment for laser radar and camera and readable storage medium |
CN112015231A (en) * | 2020-07-31 | 2020-12-01 | 中标慧安信息技术股份有限公司 | Method and system for processing surveillance video partition |
CN112180347A (en) * | 2020-09-09 | 2021-01-05 | 湖北亿咖通科技有限公司 | External orientation element calibration method, device, electronic device and storage medium |
CN112215896A (en) * | 2020-09-01 | 2021-01-12 | 深圳市瑞立视多媒体科技有限公司 | Camera frame data processing method and device for multi-camera calibration and computer equipment |
CN112230204A (en) * | 2020-10-27 | 2021-01-15 | 深兰人工智能(深圳)有限公司 | Combined calibration method and device for laser radar and camera |
CN112270713A (en) * | 2020-10-14 | 2021-01-26 | 北京航空航天大学杭州创新研究院 | Calibration method and device, storage medium and electronic device |
CN112305557A (en) * | 2020-10-20 | 2021-02-02 | 深圳无境智能机器人有限公司 | Panoramic camera and multi-line laser radar external parameter calibration system |
CN112577517A (en) * | 2020-11-13 | 2021-03-30 | 上汽大众汽车有限公司 | Multi-element positioning sensor combined calibration method and system |
CN112578367A (en) * | 2020-10-21 | 2021-03-30 | 上汽大众汽车有限公司 | System and method for measuring relative time of camera and laser radar in automatic driving system |
CN112710235A (en) * | 2020-12-21 | 2021-04-27 | 北京百度网讯科技有限公司 | Calibration method and device of structured light measuring sensor |
CN112750169A (en) * | 2021-01-13 | 2021-05-04 | 深圳瀚维智能医疗科技有限公司 | Camera calibration method, device and system and computer readable storage medium |
CN113066134A (en) * | 2021-04-23 | 2021-07-02 | 深圳市商汤科技有限公司 | Calibration method and device of visual sensor, electronic equipment and storage medium |
CN113138375A (en) * | 2021-04-27 | 2021-07-20 | 北京理工大学 | Combined calibration method, system and calibration plate |
CN113177989A (en) * | 2021-05-07 | 2021-07-27 | 深圳云甲科技有限公司 | Intraoral scanner calibration method and device |
CN113256740A (en) * | 2021-06-29 | 2021-08-13 | 湖北亿咖通科技有限公司 | Calibration method of radar and camera, electronic device and storage medium |
CN113256729A (en) * | 2021-03-17 | 2021-08-13 | 广西综合交通大数据研究院 | External parameter calibration method, device, equipment and storage medium for laser radar and camera |
CN113763478A (en) * | 2020-09-09 | 2021-12-07 | 北京京东乾石科技有限公司 | Unmanned vehicle camera calibration method, device, equipment, storage medium and system |
CN114076919A (en) * | 2020-08-20 | 2022-02-22 | 北京万集科技股份有限公司 | Millimeter wave radar and camera combined calibration method and device, server and computer readable storage medium |
CN114333418A (en) * | 2021-12-30 | 2022-04-12 | 深兰人工智能(深圳)有限公司 | Data processing method for automatic driving and related device |
CN114758005A (en) * | 2022-03-23 | 2022-07-15 | 中国科学院自动化研究所 | Laser radar and camera external parameter calibration method and device |
WO2022179094A1 (en) * | 2021-02-24 | 2022-09-01 | 长沙行深智能科技有限公司 | Vehicle-mounted lidar external parameter joint calibration method and system, medium and device |
CN115439561A (en) * | 2022-10-25 | 2022-12-06 | 杭州华橙软件技术有限公司 | Sensor calibration method for robot, robot and storage medium |
CN115994955A (en) * | 2023-03-23 | 2023-04-21 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN116485917A (en) * | 2023-06-19 | 2023-07-25 | 擎翌(上海)智能科技有限公司 | Combined calibration method, system, equipment and medium for shooting device and radar device |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107993268A (en) * | 2017-12-26 | 2018-05-04 | 广东工业大学 | A kind of method and system of Camera Self-Calibration |
CN108389233A (en) * | 2018-02-23 | 2018-08-10 | 大连理工大学 | The laser scanner and camera calibration method approached based on boundary constraint and mean value |
CN109636837A (en) * | 2018-12-21 | 2019-04-16 | 浙江大学 | A kind of evaluation method of monocular camera and ginseng calibration accuracy outside millimetre-wave radar |
US20190120948A1 (en) * | 2017-10-19 | 2019-04-25 | DeepMap Inc. | Lidar and camera synchronization |
CN109949371A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | A kind of scaling method for laser radar and camera data |
CN110221275A (en) * | 2019-05-21 | 2019-09-10 | 菜鸟智能物流控股有限公司 | Calibration method and device between laser radar and camera |
CN110349221A (en) * | 2019-07-16 | 2019-10-18 | 北京航空航天大学 | A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor |
CN110473262A (en) * | 2019-08-22 | 2019-11-19 | 北京双髻鲨科技有限公司 | Outer ginseng scaling method, device, storage medium and the electronic equipment of more mesh cameras |
CN110580724A (en) * | 2019-08-28 | 2019-12-17 | 贝壳技术有限公司 | method and device for calibrating binocular camera set and storage medium |
CN110599541A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method and device for calibrating multiple sensors and storage medium |
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20190120948A1 (en) * | 2017-10-19 | 2019-04-25 | DeepMap Inc. | Lidar and camera synchronization |
CN107993268A (en) * | 2017-12-26 | 2018-05-04 | 广东工业大学 | A kind of method and system of Camera Self-Calibration |
CN108389233A (en) * | 2018-02-23 | 2018-08-10 | 大连理工大学 | The laser scanner and camera calibration method approached based on boundary constraint and mean value |
CN109636837A (en) * | 2018-12-21 | 2019-04-16 | 浙江大学 | A kind of evaluation method of monocular camera and ginseng calibration accuracy outside millimetre-wave radar |
CN109949371A (en) * | 2019-03-18 | 2019-06-28 | 北京智行者科技有限公司 | A kind of scaling method for laser radar and camera data |
CN110221275A (en) * | 2019-05-21 | 2019-09-10 | 菜鸟智能物流控股有限公司 | Calibration method and device between laser radar and camera |
CN110349221A (en) * | 2019-07-16 | 2019-10-18 | 北京航空航天大学 | A kind of three-dimensional laser radar merges scaling method with binocular visible light sensor |
CN110473262A (en) * | 2019-08-22 | 2019-11-19 | 北京双髻鲨科技有限公司 | Outer ginseng scaling method, device, storage medium and the electronic equipment of more mesh cameras |
CN110580724A (en) * | 2019-08-28 | 2019-12-17 | 贝壳技术有限公司 | method and device for calibrating binocular camera set and storage medium |
CN110599541A (en) * | 2019-08-28 | 2019-12-20 | 贝壳技术有限公司 | Method and device for calibrating multiple sensors and storage medium |
Non-Patent Citations (2)
Title |
---|
YOONSU PARK ET AL.: "Calibration between Color Camera and 3D LIDAR Instruments with a Polygonal Planar Board", SENSORS *
HAN Zhengyong et al.: "An external parameter calibration method for a pinhole camera and a three-dimensional lidar", Transducer and Microsystem Technologies *
Cited By (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112015231A (en) * | 2020-07-31 | 2020-12-01 | 中标慧安信息技术股份有限公司 | Method and system for processing surveillance video partition |
CN111965624A (en) * | 2020-08-06 | 2020-11-20 | 北京百度网讯科技有限公司 | Calibration method, device and equipment for laser radar and camera and readable storage medium |
CN111965624B (en) * | 2020-08-06 | 2024-04-09 | 阿波罗智联(北京)科技有限公司 | Laser radar and camera calibration method, device, equipment and readable storage medium |
CN114076919A (en) * | 2020-08-20 | 2022-02-22 | 北京万集科技股份有限公司 | Millimeter wave radar and camera combined calibration method and device, server and computer readable storage medium |
CN112215896A (en) * | 2020-09-01 | 2021-01-12 | 深圳市瑞立视多媒体科技有限公司 | Camera frame data processing method and device for multi-camera calibration and computer equipment |
CN112215896B (en) * | 2020-09-01 | 2024-01-30 | 深圳市瑞立视多媒体科技有限公司 | Multi-camera calibrated camera frame data processing method and device and computer equipment |
CN113763478B (en) * | 2020-09-09 | 2024-04-12 | 北京京东尚科信息技术有限公司 | Unmanned vehicle camera calibration method, device, equipment, storage medium and system |
CN113763478A (en) * | 2020-09-09 | 2021-12-07 | 北京京东乾石科技有限公司 | Unmanned vehicle camera calibration method, device, equipment, storage medium and system |
CN112180347A (en) * | 2020-09-09 | 2021-01-05 | 湖北亿咖通科技有限公司 | External orientation element calibration method, device, electronic device and storage medium |
CN112270713A (en) * | 2020-10-14 | 2021-01-26 | 北京航空航天大学杭州创新研究院 | Calibration method and device, storage medium and electronic device |
CN112270713B (en) * | 2020-10-14 | 2024-06-14 | 北京航空航天大学杭州创新研究院 | Calibration method and device, storage medium and electronic device |
CN112305557A (en) * | 2020-10-20 | 2021-02-02 | 深圳无境智能机器人有限公司 | Panoramic camera and multi-line laser radar external parameter calibration system |
CN112305557B (en) * | 2020-10-20 | 2023-10-20 | 深圳市诺达通信技术有限公司 | Panoramic camera and multi-line laser radar external parameter calibration system |
CN112578367A (en) * | 2020-10-21 | 2021-03-30 | 上汽大众汽车有限公司 | System and method for measuring relative time of camera and laser radar in automatic driving system |
CN112230204A (en) * | 2020-10-27 | 2021-01-15 | 深兰人工智能(深圳)有限公司 | Combined calibration method and device for laser radar and camera |
CN112577517A (en) * | 2020-11-13 | 2021-03-30 | 上汽大众汽车有限公司 | Multi-element positioning sensor combined calibration method and system |
CN112710235A (en) * | 2020-12-21 | 2021-04-27 | 北京百度网讯科技有限公司 | Calibration method and device of structured light measuring sensor |
CN112750169A (en) * | 2021-01-13 | 2021-05-04 | 深圳瀚维智能医疗科技有限公司 | Camera calibration method, device and system and computer readable storage medium |
CN112750169B (en) * | 2021-01-13 | 2024-03-19 | 深圳瀚维智能医疗科技有限公司 | Camera calibration method, device, system and computer readable storage medium |
WO2022179094A1 (en) * | 2021-02-24 | 2022-09-01 | 长沙行深智能科技有限公司 | Vehicle-mounted lidar external parameter joint calibration method and system, medium and device |
CN113256729A (en) * | 2021-03-17 | 2021-08-13 | 广西综合交通大数据研究院 | External parameter calibration method, device, equipment and storage medium for laser radar and camera |
CN113066134A (en) * | 2021-04-23 | 2021-07-02 | 深圳市商汤科技有限公司 | Calibration method and device of visual sensor, electronic equipment and storage medium |
CN113138375B (en) * | 2021-04-27 | 2022-11-29 | 北京理工大学 | Combined calibration method |
CN113138375A (en) * | 2021-04-27 | 2021-07-20 | 北京理工大学 | Combined calibration method, system and calibration plate |
CN113177989A (en) * | 2021-05-07 | 2021-07-27 | 深圳云甲科技有限公司 | Intraoral scanner calibration method and device |
CN113256740A (en) * | 2021-06-29 | 2021-08-13 | 湖北亿咖通科技有限公司 | Calibration method of radar and camera, electronic device and storage medium |
CN114333418B (en) * | 2021-12-30 | 2022-11-01 | 深兰人工智能(深圳)有限公司 | Data processing method for automatic driving and related device |
CN114333418A (en) * | 2021-12-30 | 2022-04-12 | 深兰人工智能(深圳)有限公司 | Data processing method for automatic driving and related device |
CN114758005A (en) * | 2022-03-23 | 2022-07-15 | 中国科学院自动化研究所 | Laser radar and camera external parameter calibration method and device |
CN115439561B (en) * | 2022-10-25 | 2023-03-10 | 杭州华橙软件技术有限公司 | Robot sensor calibration method, robot and storage medium |
CN115439561A (en) * | 2022-10-25 | 2022-12-06 | 杭州华橙软件技术有限公司 | Sensor calibration method for robot, robot and storage medium |
CN115994955B (en) * | 2023-03-23 | 2023-07-04 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN115994955A (en) * | 2023-03-23 | 2023-04-21 | 深圳佑驾创新科技有限公司 | Camera external parameter calibration method and device and vehicle |
CN116485917B (en) * | 2023-06-19 | 2023-09-22 | 擎翌(上海)智能科技有限公司 | Combined calibration method, system, equipment and medium for shooting device and radar device |
CN116485917A (en) * | 2023-06-19 | 2023-07-25 | 擎翌(上海)智能科技有限公司 | Combined calibration method, system, equipment and medium for shooting device and radar device |
Also Published As
Publication number | Publication date |
---|---|
CN111179358B (en) | 2024-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111179358B (en) | Calibration method, device, equipment and storage medium | |
US11503275B2 (en) | Camera calibration system, target, and process | |
CN111291584B (en) | Method and system for identifying two-dimensional code position | |
CN111815707B (en) | Point cloud determining method, point cloud screening method, point cloud determining device, point cloud screening device and computer equipment | |
CN109801333B (en) | Volume measurement method, device and system and computing equipment | |
CN109919975B (en) | Wide-area monitoring moving target association method based on coordinate calibration | |
CN112270719B (en) | Camera calibration method, device and system | |
CN112184811B (en) | Monocular space structured light system structure calibration method and device | |
KR20110059506A (en) | System and method for obtaining camera parameters from multiple images and computer program products thereof | |
EP3093822B1 (en) | Displaying a target object imaged in a moving picture | |
CN110738703B (en) | Positioning method and device, terminal and storage medium | |
WO2022217988A1 (en) | Sensor configuration scheme determination method and apparatus, computer device, storage medium, and program | |
CN112257713A (en) | Image processing method, image processing device, electronic equipment and computer readable storage medium | |
CN110766731A (en) | Method and device for automatically registering panoramic image and point cloud and storage medium | |
CN110991297A (en) | Target positioning method and system based on scene monitoring | |
CN114387199A (en) | Image annotation method and device | |
CN107067441B (en) | Camera calibration method and device | |
CN113936010A (en) | Shelf positioning method and device, shelf carrying equipment and storage medium | |
CN111862208B (en) | Vehicle positioning method, device and server based on screen optical communication | |
CN112262411B (en) | Image association method, system and device | |
CN111598956A (en) | Calibration method, device and system | |
CN116380918A (en) | Defect detection method, device and equipment | |
CN115719387A (en) | 3D camera calibration method, point cloud image acquisition method and camera calibration system | |
CN117115434A (en) | Data dividing apparatus and method | |
CN112686962A (en) | Indoor visual positioning method and device and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||