CN111538032A - Time synchronization method and device based on independent drawing tracks of camera and laser radar - Google Patents

Time synchronization method and device based on independent drawing tracks of camera and laser radar

Info

Publication number
CN111538032A
Authority
CN
China
Prior art keywords
camera
laser radar
angle change
pose
lidar
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010422713.2A
Other languages
Chinese (zh)
Other versions
CN111538032B (en)
Inventor
刘继廷 (Liu Jiting)
Other inventors have requested that their names not be disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Digital Green Earth Technology Co.,Ltd.
Original Assignee
Beijing Greenvalley Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Greenvalley Technology Co ltd filed Critical Beijing Greenvalley Technology Co ltd
Priority to CN202010422713.2A priority Critical patent/CN111538032B/en
Publication of CN111538032A publication Critical patent/CN111538032A/en
Application granted granted Critical
Publication of CN111538032B publication Critical patent/CN111538032B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Electromagnetism (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An embodiment of the invention discloses a time synchronization method and device based on the independent mapping trajectories of a camera and a laser radar (lidar). The method comprises the following steps: acquiring the pose information of the camera and of the lidar from the data each sensor collects; calculating the angle change of the camera from its pose information, interpolating the lidar pose at each camera sampling moment with the camera timestamps as the reference, and then calculating the angle change of the lidar; and intercepting equal amounts of the angle-change data of the camera and the lidar over the same time span and constructing a correlation equation from which the synchronization time difference between the camera and the lidar is solved. The technical scheme of the invention is a software synchronization approach that uses angle changes together with interpolation and a correlation equation to calculate the time difference, thereby avoiding the influence of position components that change only slightly and degrade pose-based estimation of the time difference, improving the time synchronization precision, and reducing cost.

Description

Time synchronization method and device based on independent drawing tracks of camera and laser radar
Technical Field
The invention relates to the technical field of time synchronization, in particular to a time synchronization method and device based on independent drawing tracks of a camera and a laser radar.
Background
In the field of mobile mapping, a laser radar (lidar) is an excellent 3D mapping solution owing to its invariance to illumination conditions and its high-precision range information. Compared with a camera, however, its shortcoming is that it cannot provide rich visual appearance information, and such information is of great value for the acquisition and processing of point clouds. An effective solution is therefore to fuse the data of the lidar with the data of the camera, and the most important problem in combining range and color information is the time synchronization of the lidar and the camera.
However, the prior art mainly has the following defects. 1. Hardware synchronization methods trigger all sensors from the same clock source, solving multi-sensor data synchronization with high-precision hardware. Although such methods are accurate, they are costly, and registering the timestamps requires joint debugging of the hardware, which is very cumbersome; see, for example, the patent published as CN110217178A, "A hardware synchronization based unmanned sensing system and its working method". 2. Software synchronization methods exploit the fact that the camera and the lidar, mounted on the same rigid body, undergo the same motion, and a common approach is to estimate the time difference by comparing the pose changes of the camera and the lidar between consecutive frames. However, this approach has low accuracy, because only one or a few components of the 6-degree-of-freedom pose change noticeably during motion. There are also other time synchronization methods, such as the "map generation method and system based on crowd-sourced data open scene" disclosed in patent publication CN110599570A, which time-synchronizes extracted semantic information with GPS track information by using the vehicle ID and timestamp; such methods, however, are not suitable for a lidar and a camera that do not have accurate timestamps.
Disclosure of Invention
In view of this, the embodiment of the present invention provides a time synchronization method and apparatus based on independent mapping tracks of a camera and a laser radar.
An embodiment of the present invention provides a time synchronization method based on independent mapping tracks of a camera and a laser radar, including:
acquiring respective pose information according to the information acquired by the camera and the laser radar respectively;
calculating the angle change of the camera according to the pose information of the camera, interpolating according to the pose information of the laser radar at each sampling moment of the camera by taking a timestamp of the camera as a reference, and then calculating the angle change of the laser radar;
and equivalently intercepting the angle change data of the camera and the laser radar within the same time, and constructing a correlation equation between the camera and the laser radar according to the intercepted angle change data so as to solve the synchronous time difference between the camera and the laser radar.
Further, in the above method for time synchronization based on independent mapping trajectories of a camera and a lidar, the method further includes:
and performing the step of solving the synchronous time difference between the camera and the laser radar once every preset time interval.
Further, in the above time synchronization method based on independent mapping tracks of the camera and the lidar, the "acquiring respective pose information according to the information acquired by the camera and the lidar" includes:
extracting image feature points in each frame of image according to the image information acquired by the camera, and matching the image feature points of adjacent frames to construct a reprojection error function;
optimizing the re-projection error function based on a preset algorithm to solve and obtain pose information of the camera at each moment;
extracting point cloud feature points and surface features in each frame of point cloud according to the point cloud data acquired by the laser radar, and constructing a distance function between the point cloud feature points and the surface features;
and optimizing the distance function based on the preset algorithm to solve and obtain the pose information of the laser radar at each moment.
Further, in the above time synchronization method based on independent camera and lidar drawing tracks, the "taking the time stamp of the camera as a reference, interpolating according to the pose information of the lidar at each sampling time of the camera, and then calculating the angle change of the lidar" includes:
and taking the sampling time stamp of the camera as a reference, performing linear interpolation on the laser radar at each sampling time corresponding to the camera according to the acquired pose information of the laser radar to acquire the pose of the laser radar at the corresponding sampling time, and calculating the angle change of the laser radar at each sampling time according to each acquired pose.
Further, in the above time synchronization method based on independent camera and lidar mapping trajectories, the "calculating the angle change of the camera according to the pose information of the camera" includes:
if the pose of the camera at the i-th moment is $R_c^i$ and the pose at the (i+1)-th moment is $R_c^{i+1}$, the angle change of the camera at the i-th moment, $\theta_c^i$, satisfies the following formula:

$$\theta_c^i = \arccos\left(\frac{\mathrm{tr}\big((R_c^i)^{T} R_c^{i+1}\big) - 1}{2}\right)$$
further, in the above time synchronization method based on independent mapping trajectories of a camera and a lidar, the "calculating the angle change of the lidar at each sampling time according to each acquired pose" includes:
if the pose of the laser radar at the i-th moment is $R_l^i$ and the pose at the (i+1)-th moment is $R_l^{i+1}$, the angle change of the laser radar at the i-th moment, $\theta_l^i$, satisfies the following formula:

$$\theta_l^i = \arccos\left(\frac{\mathrm{tr}\big((R_l^i)^{T} R_l^{i+1}\big) - 1}{2}\right)$$
further, in the above time synchronization method based on independent camera and lidar mapping tracks, the "constructing a correlation equation between the camera and the lidar according to the intercepted angle variation data to solve the synchronization time difference between the camera and the lidar" includes:
respectively calculating respective corresponding angle change mean values according to the respective intercepted angle change data of the camera and the laser radar, and constructing a normalized cross-correlation equation based on the angle change data and the angle change mean values;
solving a time difference value corresponding to the maximum value of the normalized cross-correlation equation, and taking the time difference value as a synchronous time difference between the camera and the laser radar;
wherein, if NCC denotes the normalized cross-correlation value, T is the number of intercepted data points, $\theta_c^i$ and $\theta_l^i$ are the angle changes of the camera and the lidar at the i-th moment respectively, $\Delta t$ is the time difference between the camera and the lidar, and $m_c$ and $m_l$ are the mean angle changes of the camera and the lidar respectively, the normalized cross-correlation equation should satisfy:

$$NCC(\Delta t) = \frac{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)\big(\theta_l^{i+\Delta t} - m_l\big)}{\sqrt{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)^2}\;\sqrt{\sum_{i=1}^{T}\big(\theta_l^{i+\Delta t} - m_l\big)^2}}$$
another embodiment of the present invention provides a time synchronization apparatus based on independent mapping trajectories of a camera and a lidar, including:
the position and pose information acquisition module is used for acquiring position and pose information of the camera and the laser radar according to the information acquired by the camera and the laser radar respectively;
the angle change calculation module is used for calculating the angle change of the camera according to the pose information of the camera, interpolating according to the pose information of the laser radar at each sampling moment of the camera by taking a timestamp of the camera as a reference, and then calculating the angle change of the laser radar;
and the synchronous time difference calculation module is used for carrying out data equivalent interception on the respective angle change of the camera and the laser radar in the same time, and constructing a correlation equation between the camera and the laser radar according to the intercepted angle change data so as to solve the synchronous time difference between the camera and the laser radar.
Another embodiment of the present invention provides a terminal, including: a processor and a memory, the memory storing a computer program for executing the computer program to implement the camera and lidar independent mapping trajectory based time synchronization method described above.
Yet another embodiment of the invention proposes a computer-readable storage medium, which stores a computer program that, when executed, implements a method for time synchronization based on independent camera and lidar mapping trajectories according to the above.
The technical scheme of the invention has the following beneficial effects:
the method provided by the embodiment of the invention utilizes the angle change and takes the time stamp of the camera with high frequency as the reference to perform data interpolation on the laser radar, and further constructs the correlation equation of the angle change and the time stamp to solve the time difference, so that the influence of position information which does not change obviously on the estimation result when the pose is used for estimating the time difference is avoided, the time synchronization precision is improved, and the cost is reduced compared with a hardware synchronization method by using a software synchronization method.
Drawings
In order to more clearly illustrate the technical solution of the present invention, the drawings required to be used in the embodiments will be briefly described below, and it should be understood that the following drawings only illustrate some embodiments of the present invention, and therefore should not be considered as limiting the scope of the present invention. Like components are numbered similarly in the various figures.
FIG. 1 is a first flowchart of a time synchronization method based on independent mapping tracks of a camera and a lidar according to an embodiment of the present invention;
FIG. 2 is a second flow chart of the time synchronization method based on independent mapping tracks of the camera and the lidar according to the embodiment of the invention;
FIG. 3 is a third flowchart of a time synchronization method based on independent mapping tracks of a camera and a lidar according to an embodiment of the present invention;
FIG. 4 is a fourth flowchart illustrating a time synchronization method based on independent camera and lidar mapping tracks according to an embodiment of the present invention;
fig. 5 shows a schematic structural diagram of a time synchronization device based on independent drawing tracks of a camera and a lidar according to an embodiment of the present invention.
Description of the main element symbols:
10-a time synchronization device based on independent mapping trajectories of the camera and the lidar; 110-pose information acquisition module; 120-angle change calculation module; 130-synchronous time difference calculation module.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Hereinafter, the terms "including", "having", and their derivatives, which may be used in various embodiments of the present invention, are only intended to indicate specific features, numbers, steps, operations, elements, components, or combinations of the foregoing, and should not be construed as first excluding the existence of, or adding to, one or more other features, numbers, steps, operations, elements, components, or combinations of the foregoing.
Furthermore, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which various embodiments of the present invention belong. The terms (such as those defined in commonly used dictionaries) should be interpreted as having a meaning that is consistent with their contextual meaning in the relevant art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein in various embodiments of the present invention.
Example 1
Referring to fig. 1, the present embodiment provides a time synchronization method based on independent camera and lidar mapping trajectories, which may be applied to various scenarios such as unmanned aerial vehicles, autonomous navigation vehicles, or mobile backpack mapping. The method uses the change of angle to solve the synchronization time difference between a camera and a lidar that map independently, so that their information can be fused to build a more accurate map.
The time synchronization method based on independent mapping trajectories of the camera and the lidar is described in detail below, as shown in fig. 1.
And S100, acquiring respective pose information according to the information acquired by the camera and the laser radar respectively.
The step S100 mainly includes two steps, namely acquiring the trajectory of the camera and acquiring the trajectory of the lidar. It can be understood that the pose information (including position and attitude) obtained for the camera or the lidar at each moment in three-dimensional space constitutes the trajectory of that sensor.
Exemplarily, as shown in fig. 2, the step S100 mainly includes the following sub-steps:
and a substep S110, extracting image characteristic points in each frame of image according to the image information acquired by the camera, and matching the image characteristic points of adjacent frames to construct a reprojection error function.
Preferably, a simultaneous localization and mapping (SLAM) technique may be employed for the solution. Exemplarily, feature points can be extracted from each frame of image information collected by the camera, the image feature points of adjacent frames can then be matched, and a reprojection error function can be constructed from the feature matches between adjacent frames.
It is understood that the so-called reprojection error is the error between the pixel coordinates of an image point (i.e., the observed projection position) and the position at which the corresponding point in three-dimensional space is projected according to the currently estimated pose. Image feature points are representative local regions extracted from the image, such as corners, edges, and blocks, and can be chosen according to actual requirements. The construction of the reprojection error function from image feature points can follow the related art that has already been disclosed and is therefore not described in detail here.
And a substep S120 of optimizing the reprojection error function based on a preset algorithm to solve and obtain the pose information of the camera at each moment.
Since the pose of the camera is unknown and observation noise exists, in order to minimize the reprojection error, exemplarily, the reprojection error function can be optimized and solved with the Gauss-Newton method or the Levenberg-Marquardt method to obtain the relative pose between adjacent frame images, from which the pose of each frame image can be calculated. Further, if the reprojection error of mismatched points remains large, additional optimization can be performed with methods such as a graph optimization model.
Therefore, the pose of the camera at each moment can be obtained through the simultaneous localization and mapping method, and connecting the poses at all moments yields the trajectory of the camera.
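Exemplarily, a minimal sketch of such a reprojection-error optimization is given below; it is an illustrative assumption rather than the patented code. The helper names (reprojection_residuals, estimate_relative_pose), the use of scipy's Levenberg-Marquardt solver, and the rotation-vector-plus-translation pose parameterization are all assumptions; the camera intrinsics K, matched 3D points and their pixel observations are taken as given.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def reprojection_residuals(pose6, pts3d, pts2d, K):
    """pose6 = [rx, ry, rz, tx, ty, tz]; returns stacked pixel residuals."""
    R = Rotation.from_rotvec(pose6[:3]).as_matrix()
    t = pose6[3:]
    cam_pts = pts3d @ R.T + t              # transform 3D points into the camera frame
    proj = cam_pts @ K.T                   # apply pinhole intrinsics
    proj = proj[:, :2] / proj[:, 2:3]      # perspective division -> pixel coordinates
    return (proj - pts2d).ravel()          # reprojection error per observation

def estimate_relative_pose(pts3d, pts2d, K):
    """Solve the pose that minimizes the reprojection error (LM optimization)."""
    x0 = np.zeros(6)                       # identity pose as the initial guess
    sol = least_squares(reprojection_residuals, x0, args=(pts3d, pts2d, K), method="lm")
    return sol.x
```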
And a substep S130, extracting point cloud feature points and surface features in each frame of point cloud according to the point cloud data acquired by the laser radar, and constructing a distance function between the point cloud feature points and the surface features.
For the lidar, the trajectory, i.e., the pose information at each moment, can likewise be obtained with a simultaneous localization and mapping method from the collected point cloud data. Similarly, exemplarily, the feature points and surface features in each frame of point cloud may be extracted first, where the point cloud feature points are mainly points of maximum or minimum curvature, and a surface feature is the equation of a scanned surface fitted to the point cloud. A point-to-plane distance function is then constructed to solve for the pose information of the lidar. For example, the optimization can use an algorithm such as least squares with the Euclidean distance from a point to the nearest tangent plane as the objective function. Alternatively, the lidar pose can also be obtained through point cloud registration using a point-to-point distance.
And a substep S140 of optimizing the distance function based on the preset algorithm to solve and obtain the pose information of the laser radar at each moment. Exemplarily, for the point-to-plane distance function, a preset algorithm such as singular value decomposition, the orthogonal matrix method or the quaternion method can be used to solve for the relative pose between the current feature points and surfaces; the pose of each frame of point cloud can then be obtained, and connecting all the point cloud poses yields the trajectory of the lidar.
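As a rough illustration of the point-to-plane objective described above — a sketch under the assumption that feature points, their matched plane points and plane normals are already available, not the patented implementation — the cost for a candidate pose could be evaluated as follows:

```python
import numpy as np

def point_to_plane_cost(R, t, feat_pts, plane_pts, plane_normals):
    """Sum of squared point-to-plane distances for a candidate pose (R, t).

    feat_pts:      Nx3 feature points from the current point cloud frame
    plane_pts:     Nx3 points on the matched planes in the previous frame/map
    plane_normals: Nx3 unit normals of those planes
    """
    transformed = feat_pts @ R.T + t                                   # move features by the candidate pose
    dists = np.sum((transformed - plane_pts) * plane_normals, axis=1)  # signed point-to-plane distances
    return np.sum(dists ** 2)
```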
It is to be understood that, for sub-steps S110 to S120, which solve the pose information of the camera, and sub-steps S130 to S140, which solve the pose information of the lidar, the solving order is not limited; the two sets of pose information may be solved in parallel, one after the other, or the like.
After the pose information of the camera and the lidar is acquired through step S100, the angle-change information of both sensors is calculated, and the time difference is then solved based on the angle changes. This is because, during mobile mapping, the angle of the device usually changes obviously relative to the translation, especially when a backpack device performs many attitude changes while walking winding or S-shaped routes, or when an unmanned aerial vehicle is flying. In the prior art, by contrast, the time difference is mainly estimated directly from the pose information, so position components with insignificant change may degrade the estimation result.
And S200, calculating the angle change of the camera according to the pose information of the camera, interpolating according to the pose information of the laser radar at each sampling moment of the camera by taking the time stamp of the camera as a reference, and then calculating the angle change of the laser radar.
The step S200 is mainly used to obtain the angle changes of the camera and the lidar. For the angle change of the camera, exemplarily, if the pose of the camera at the i-th moment is $R_c^i$ and the pose at the (i+1)-th moment is $R_c^{i+1}$, the angle change $\theta_c^i$ of the camera at the i-th moment satisfies:

$$\theta_c^i = \arccos\left(\frac{\mathrm{tr}\big((R_c^i)^{T} R_c^{i+1}\big) - 1}{2}\right)$$

wherein the camera pose $R_c$ is a 4 × 4 matrix (containing translation information and spatial rotation information); $R_c^i$ denotes the pose of the camera relative to the camera coordinate system at the i-th moment, the camera coordinate system generally being taken as the pose of the first solved image frame; $(R_c^i)^{T}$ denotes the transpose of the pose at the i-th moment; and $\mathrm{tr}(\cdot)$ denotes the trace of a matrix.
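A minimal numerical sketch of this angle-change computation is given below; it is an illustration rather than the patented code, and it assumes 4 × 4 homogeneous pose matrices with the trace formula applied to their rotation blocks (the helper name angle_change is likewise an assumption).

```python
import numpy as np

def angle_change(pose_i, pose_ip1):
    """Rotation angle (radians) between two consecutive 4x4 homogeneous poses."""
    R_i = pose_i[:3, :3]                               # rotation block of the pose at time i
    R_ip1 = pose_ip1[:3, :3]                           # rotation block of the pose at time i+1
    R_rel = R_i.T @ R_ip1                              # relative rotation between the two moments
    cos_theta = (np.trace(R_rel) - 1.0) / 2.0
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))   # clip guards against numerical noise
```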
Exemplarily, as shown in fig. 3, the process of calculating the angle change of the lidar in step S200 mainly includes the following sub-steps:
and a substep S210, taking the sampling time stamp of the camera as a reference, and performing linear interpolation on the laser radar at each sampling time corresponding to the camera according to the acquired pose information of the laser radar so as to acquire the pose of the laser radar at the corresponding sampling time.
Because the camera and the lidar cannot be guaranteed to start and stop at the same time, and because they have different sampling frequencies — the sampling frequency of the camera is generally higher than that of the lidar — a correspondence must be established between the angle changes of the camera and of the lidar at the same moments. Preferably, a linear interpolation method is employed.
For example, if the sampling timestamps of the camera are 1.0, 1.5, 2.0, 2.5, 3.0, ..., the lidar pose is interpolated at each of these moments; exemplarily, the interpolated value may be calculated from the pose change between the two lidar poses at the times immediately before and after the current camera time. In this way, lidar pose data at every camera sampling time, including the interpolated poses, can be obtained.
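One possible sketch of this interpolation step is shown below. It assumes, as an illustrative choice, that the translation is interpolated linearly and the rotation by spherical linear interpolation (scipy's Slerp), which is one common way to realize a "linear interpolation" of poses and is not necessarily the exact scheme of the patent; the function and parameter names are assumptions.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def interpolate_lidar_poses(lidar_times, lidar_poses, camera_times):
    """Interpolate 4x4 lidar poses at the camera sampling timestamps.

    lidar_times:  (N,) sorted lidar timestamps
    lidar_poses:  (N, 4, 4) lidar poses at those timestamps
    camera_times: (M,) camera timestamps lying inside the lidar time range
    """
    rots = Rotation.from_matrix(lidar_poses[:, :3, :3])
    slerp = Slerp(lidar_times, rots)                       # rotation interpolator
    interp_rots = slerp(camera_times).as_matrix()
    interp_trans = np.stack([
        np.interp(camera_times, lidar_times, lidar_poses[:, i, 3]) for i in range(3)
    ], axis=1)                                             # linear interpolation of translation
    out = np.tile(np.eye(4), (len(camera_times), 1, 1))
    out[:, :3, :3] = interp_rots
    out[:, :3, 3] = interp_trans
    return out
```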
And a substep S220 of calculating the angle change of the laser radar at each sampling moment according to the acquired poses.
Exemplarily, if the pose of the lidar at the i-th moment is $R_l^i$ and the pose at the (i+1)-th moment is $R_l^{i+1}$, the angle change $\theta_l^i$ of the lidar at the i-th moment satisfies:

$$\theta_l^i = \arccos\left(\frac{\mathrm{tr}\big((R_l^i)^{T} R_l^{i+1}\big) - 1}{2}\right)$$

wherein the lidar pose $R_l^i$ is likewise a 4 × 4 matrix (containing translation information and spatial rotation information); $R_l^i$ denotes the pose of the lidar relative to the lidar coordinate system at the i-th moment, the lidar coordinate system generally being taken as the pose of the first solved point cloud frame; and $(R_l^i)^{T}$ denotes the transpose of the pose at the i-th moment.
And step S300, carrying out data equivalent interception on the respective angle change of the camera and the laser radar within the same time, and constructing a correlation equation between the camera and the laser radar according to the intercepted angle change data so as to solve the synchronous time difference between the camera and the laser radar.
Exemplarily, as shown in fig. 4, the step S300 mainly includes the following sub-steps:
and a substep S310 of equally intercepting the respective angle change data of the camera and the laser radar in the same time.
Exemplarily, the same number of angle-change values are intercepted, within the same time span, from the angle-change data of the camera and of the lidar. For example, the data within the 60 s between the 10th and the 70th second are intercepted for each sensor; taking the sampling times of the intercepted camera data as the reference, the same number of lidar values are intercepted at the corresponding times and any surplus is deleted, so that the two amounts are equal.
And a substep S320 of respectively calculating respective corresponding angle change mean values according to the respective intercepted angle change data of the camera and the laser radar, and constructing a normalized cross-correlation equation based on the obtained angle change data and the angle change mean values.
Exemplarily, the mean of the angle-change values in the intercepted data may be calculated separately for the camera and for the lidar, i.e., by simple averaging. For example, if sampling is performed every 1 s, then 60 angle-change values will be intercepted within 60 s, and the mean of these values is then calculated.
Since the camera and the lidar are mounted on the same rigid body, both should have the same angle change at the same moment, that is, $\theta_c^i = \theta_l^{i+\Delta t}$, where $\Delta t$ is the time difference between the camera and the lidar. If NCC denotes the normalized cross-correlation value, the constructed normalized cross-correlation equation should satisfy:

$$NCC(\Delta t) = \frac{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)\big(\theta_l^{i+\Delta t} - m_l\big)}{\sqrt{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)^2}\;\sqrt{\sum_{i=1}^{T}\big(\theta_l^{i+\Delta t} - m_l\big)^2}}$$

wherein T is the number of intercepted data points, $\theta_c^i$ and $\theta_l^{i+\Delta t}$ are the angle changes of the camera and the lidar at the corresponding moments, and $m_c$ and $m_l$ are the mean angle changes of the camera and the lidar respectively.
And a substep S330 of solving a time difference value corresponding to the maximum value of the normalized cross-correlation equation, and taking the time difference value as a synchronization time difference between the camera and the laser radar.
For the above normalized cross-correlation equation, the Δt at which the equation attains its maximum is solved, and this Δt is taken as the synchronization time difference between the camera and the lidar. It can be understood that, because the camera and the lidar work independently, their mapping trajectories are offset in time; the synchronization time difference obtained by this calculation allows the camera and the lidar to be precisely synchronized in time, providing the basis for the subsequent fusion of the two kinds of information.
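The following sketch illustrates one possible discrete realization of this search; it assumes that both angle-change series have already been resampled onto the camera timestamps and that Δt is searched over integer sample shifts, and the function names are illustrative rather than taken from the patent.

```python
import numpy as np

def ncc(theta_cam, theta_lidar):
    """Normalized cross-correlation of two equally long angle-change series."""
    a = theta_cam - theta_cam.mean()
    b = theta_lidar - theta_lidar.mean()
    return np.sum(a * b) / (np.sqrt(np.sum(a ** 2)) * np.sqrt(np.sum(b ** 2)))

def solve_time_offset(theta_cam, theta_lidar, dt, max_shift):
    """Return the time shift (seconds) that maximizes the NCC.

    theta_cam, theta_lidar: angle-change series sampled at interval dt
    max_shift: maximum shift to try, in samples
    """
    best_shift, best_score = 0, -np.inf
    for s in range(-max_shift, max_shift + 1):
        if s >= 0:
            a, b = theta_cam[: len(theta_cam) - s], theta_lidar[s:]
        else:
            a, b = theta_cam[-s:], theta_lidar[: len(theta_lidar) + s]
        n = min(len(a), len(b))
        score = ncc(a[:n], b[:n])
        if score > best_score:
            best_shift, best_score = s, score
    return best_shift * dt
```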
Further, because interpolation is introduced in the process of solving the time difference, the solved time difference deviates somewhat from the actual situation, and the error of one moment is carried over to the next, so an accumulated-error problem arises; a long-term accumulated error can make the calculated result unreliable. To reduce this accumulation, the time synchronization can be performed again at intervals. In one embodiment, preferably, the method further comprises: repeatedly executing the step of solving the synchronization time difference between the camera and the lidar once every preset time interval.
Exemplarily, the time difference synchronization may be performed at preset time intervals; for example, the synchronization time difference is re-solved every 40 s to 60 s according to the above steps S100 to S300, so that the long-term accumulation of time errors is reduced and the accuracy of the calculation result is ensured.
Further preferably, for a device equipped with the camera and the lidar, in order to improve the precision of the mapping trajectory, the trajectory should be closed as far as possible during movement, that is, the starting point and the end point should coincide as far as possible. The calculated time synchronization can then be further optimized by using the loop-closure detection function of the simultaneous localization and mapping technique, which effectively tightens each error bound and thus improves the precision of the trajectory.
According to the time synchronization method based on the independent mapping trajectories of the camera and the lidar described above, the trajectory information of the camera and of the lidar is solved separately to calculate their respective angle changes, and a correlation equation on the angle changes is constructed and solved to obtain the synchronization time difference between the camera and the lidar. Since the angle of the device usually changes noticeably while it translates — especially in backpack-mapping scenes with winding loops — this method, compared with the prior-art approach of estimating the time difference from the pose, avoids the large error that position components with insignificant change introduce into the estimation result, so the precision is improved; and because it is a software synchronization method, the cost can also be greatly reduced.
Example 2
Referring to fig. 1, based on the method of embodiment 1, this embodiment provides a time synchronization apparatus 10 based on independent mapping tracks of a camera and a lidar, including:
a pose information acquiring module 110, configured to acquire pose information of each camera and each lidar according to information acquired by each camera and each lidar;
an angle change calculation module 120, configured to calculate an angle change of the camera according to the pose information of the camera, perform interpolation according to the pose information of the laser radar at each sampling time of the camera by using a timestamp of the camera as a reference, and then calculate the angle change of the laser radar;
and a synchronous time difference calculation module 130, configured to perform data equivalent interception on the respective angle changes of the camera and the lidar within the same time, and construct a correlation equation between the camera and the lidar according to the intercepted angle change data, so as to solve a synchronous time difference between the camera and the lidar.
It is understood that the above-described time synchronization apparatus 10 based on independent mapping trajectories of the camera and the lidar corresponds to the time synchronization method of embodiment 1. Any of the options in embodiment 1 are also applicable to this embodiment and will not be described in detail here.
The present invention also provides a terminal, such as a computer, a server, etc., which includes a memory and a processor, wherein the memory stores a computer program, and the processor executes the computer program, so that the terminal device executes the functions of each module in the above-mentioned time synchronization method based on the independent mapping tracks of the camera and the lidar or the above-mentioned time synchronization device based on the independent mapping tracks of the camera and the lidar.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created according to the use of the terminal, and the like. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The present invention also provides a computer-readable storage medium for storing the computer program used in the above-mentioned terminal.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative and, for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention.

Claims (10)

1. A time synchronization method based on independent drawing tracks of a camera and a laser radar is characterized by comprising the following steps:
acquiring respective pose information according to the information acquired by the camera and the laser radar respectively;
calculating the angle change of the camera according to the pose information of the camera, interpolating according to the pose information of the laser radar at each sampling moment of the camera by taking a timestamp of the camera as a reference, and then calculating the angle change of the laser radar;
and equivalently intercepting the angle change data of the camera and the laser radar within the same time, and constructing a correlation equation between the camera and the laser radar according to the intercepted angle change data so as to solve the synchronous time difference between the camera and the laser radar.
2. The method of claim 1, further comprising:
and performing the step of solving the synchronous time difference between the camera and the laser radar once every preset time interval.
3. The method according to claim 1 or 2, wherein the obtaining respective pose information from the information collected by each of the camera and the lidar comprises:
extracting image feature points in each frame of image according to the image information acquired by the camera, and matching the image feature points of adjacent frames to construct a reprojection error function;
optimizing the re-projection error function based on a preset algorithm to solve and obtain pose information of the camera at each moment;
extracting point cloud feature points and surface features in each frame of point cloud according to the point cloud data acquired by the laser radar, and constructing a distance function between the point cloud feature points and the surface features;
and optimizing the distance function based on the preset algorithm to solve and obtain the pose information of the laser radar at each moment.
4. The method according to claim 1 or 2, wherein the interpolating according to the pose information of the lidar at each sampling time of the camera with reference to the time stamp of the camera and then calculating the angle change of the lidar comprises:
and taking the sampling time stamp of the camera as a reference, performing linear interpolation on the laser radar at each sampling time corresponding to the camera according to the acquired pose information of the laser radar to acquire the pose of the laser radar at the corresponding sampling time, and calculating the angle change of the laser radar at each sampling time according to each acquired pose.
5. The method of claim 4, wherein the calculating the angular change of the camera from the pose information of the camera comprises:
if the pose of the camera at the i-th moment is $R_c^i$ and the pose at the (i+1)-th moment is $R_c^{i+1}$, the angle change of the camera at the i-th moment, $\theta_c^i$, satisfies the following formula:

$$\theta_c^i = \arccos\left(\frac{\mathrm{tr}\big((R_c^i)^{T} R_c^{i+1}\big) - 1}{2}\right)$$
6. The method of claim 5, wherein the "calculating the angle change of the lidar at each sampling time according to each acquired pose" comprises:
if the pose of the laser radar at the i-th moment is $R_l^i$ and the pose at the (i+1)-th moment is $R_l^{i+1}$, the angle change of the laser radar at the i-th moment, $\theta_l^i$, satisfies the following formula:

$$\theta_l^i = \arccos\left(\frac{\mathrm{tr}\big((R_l^i)^{T} R_l^{i+1}\big) - 1}{2}\right)$$
7. The method of claim 6, wherein the "constructing a correlation equation between the camera and the lidar from the truncated angle change data to solve for a synchronization time difference between the camera and the lidar" comprises:
respectively calculating respective corresponding angle change mean values according to the respective intercepted angle change data of the camera and the laser radar, and constructing a normalized cross-correlation equation based on the angle change data and the angle change mean values;
solving a time difference value corresponding to the maximum value of the normalized cross-correlation equation, and taking the time difference value as a synchronous time difference between the camera and the laser radar;
wherein, if NCC denotes the normalized cross-correlation value, T is the number of intercepted data points, $\theta_c^i$ and $\theta_l^i$ are the angle changes of the camera and the lidar at the i-th moment respectively, $\Delta t$ is the time difference between the camera and the lidar, and $m_c$ and $m_l$ are the mean angle changes of the camera and the lidar respectively, the normalized cross-correlation equation should satisfy:

$$NCC(\Delta t) = \frac{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)\big(\theta_l^{i+\Delta t} - m_l\big)}{\sqrt{\sum_{i=1}^{T}\big(\theta_c^i - m_c\big)^2}\;\sqrt{\sum_{i=1}^{T}\big(\theta_l^{i+\Delta t} - m_l\big)^2}}$$
8. a time synchronization device based on independent drawing tracks of a camera and a laser radar is characterized by comprising:
the position and pose information acquisition module is used for acquiring position and pose information of the camera and the laser radar according to the information acquired by the camera and the laser radar respectively;
the angle change calculation module is used for calculating the angle change of the camera according to the pose information of the camera, interpolating according to the pose information of the laser radar at each sampling moment of the camera by taking a timestamp of the camera as a reference, and then calculating the angle change of the laser radar;
and the synchronous time difference calculation module is used for carrying out data equivalent interception on the respective angle change of the camera and the laser radar in the same time, and constructing a correlation equation between the camera and the laser radar according to the intercepted angle change data so as to solve the synchronous time difference between the camera and the laser radar.
9. A terminal device, comprising: a processor and a memory, the memory storing a computer program for executing the computer program to implement the camera and lidar independent mapping trajectory based time synchronization method according to any of claims 1 to 7.
10. A computer-readable storage medium, characterized in that it stores a computer program that, when executed, implements the camera and lidar independent mapping trajectory-based time synchronization method according to any of claims 1 to 7.
CN202010422713.2A 2020-05-19 2020-05-19 Time synchronization method and device based on independent drawing tracks of camera and laser radar Active CN111538032B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010422713.2A CN111538032B (en) 2020-05-19 2020-05-19 Time synchronization method and device based on independent drawing tracks of camera and laser radar

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010422713.2A CN111538032B (en) 2020-05-19 2020-05-19 Time synchronization method and device based on independent drawing tracks of camera and laser radar

Publications (2)

Publication Number Publication Date
CN111538032A true CN111538032A (en) 2020-08-14
CN111538032B CN111538032B (en) 2021-04-13

Family

ID=71977884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010422713.2A Active CN111538032B (en) 2020-05-19 2020-05-19 Time synchronization method and device based on independent drawing tracks of camera and laser radar

Country Status (1)

Country Link
CN (1) CN111538032B (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112184768A (en) * 2020-09-24 2021-01-05 杭州易现先进科技有限公司 SFM reconstruction method and device based on laser radar and computer equipment
CN112214019A (en) * 2020-09-21 2021-01-12 国网浙江省电力有限公司 Non-blind area intelligent feedback control system, method and terminal for unmanned inspection equipment
CN112671499A (en) * 2021-03-16 2021-04-16 深圳裹动智驾科技有限公司 Multi-sensor synchronization method and system and main control equipment
CN112964291A (en) * 2021-04-02 2021-06-15 清华大学 Sensor calibration method and device, computer storage medium and terminal
CN114217665A (en) * 2021-12-21 2022-03-22 清华大学 Camera and laser radar time synchronization method, device and storage medium
CN114710228A (en) * 2022-05-31 2022-07-05 杭州闪马智擎科技有限公司 Time synchronization method and device, storage medium and electronic device
CN115994934A (en) * 2023-03-16 2023-04-21 福思(杭州)智能科技有限公司 Data time alignment method and device and domain controller

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109218562A (en) * 2018-09-07 2019-01-15 百度在线网络技术(北京)有限公司 Clock synchronizing method, device, equipment, storage medium and vehicle
US20190065863A1 (en) * 2017-08-23 2019-02-28 TuSimple Feature matching and correspondence refinement and 3d submap position refinement system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN109587405A (en) * 2018-10-24 2019-04-05 科大讯飞股份有限公司 Method for synchronizing time and device
CN109887057A (en) * 2019-01-30 2019-06-14 杭州飞步科技有限公司 The method and apparatus for generating high-precision map
CN110217178A (en) * 2019-06-18 2019-09-10 浙江大学 A kind of unmanned sensory perceptual system and its working method based on hardware synchronization
CN111045017A (en) * 2019-12-20 2020-04-21 成都理工大学 Method for constructing transformer substation map of inspection robot by fusing laser and vision

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190065863A1 (en) * 2017-08-23 2019-02-28 TuSimple Feature matching and correspondence refinement and 3d submap position refinement system and method for centimeter precision localization using camera-based submap and lidar-based global map
CN109218562A (en) * 2018-09-07 2019-01-15 百度在线网络技术(北京)有限公司 Clock synchronizing method, device, equipment, storage medium and vehicle
CN109587405A (en) * 2018-10-24 2019-04-05 科大讯飞股份有限公司 Method for synchronizing time and device
CN109887057A (en) * 2019-01-30 2019-06-14 杭州飞步科技有限公司 The method and apparatus for generating high-precision map
CN110217178A (en) * 2019-06-18 2019-09-10 浙江大学 A kind of unmanned sensory perceptual system and its working method based on hardware synchronization
CN111045017A (en) * 2019-12-20 2020-04-21 成都理工大学 Method for constructing transformer substation map of inspection robot by fusing laser and vision

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
SHUHUAN WEN等: "Camera Recognition and Laser Detection based on EKF-SLAM in the Autonomous Navigation of Humanoid Robot", 《JOURNAL OF INTELLIGENT&ROBOTIC SYSTEMS》 *
盛淼: "基于双目视觉惯导的SLAM算法研究", 《中国优秀硕士学位论文全文数据库》 *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112214019A (en) * 2020-09-21 2021-01-12 国网浙江省电力有限公司 Non-blind area intelligent feedback control system, method and terminal for unmanned inspection equipment
CN112184768A (en) * 2020-09-24 2021-01-05 杭州易现先进科技有限公司 SFM reconstruction method and device based on laser radar and computer equipment
CN112184768B (en) * 2020-09-24 2023-10-31 杭州易现先进科技有限公司 SFM reconstruction method and device based on laser radar and computer equipment
CN112671499A (en) * 2021-03-16 2021-04-16 深圳裹动智驾科技有限公司 Multi-sensor synchronization method and system and main control equipment
CN112671499B (en) * 2021-03-16 2022-04-01 深圳安途智行科技有限公司 Multi-sensor synchronization method and system and main control equipment
CN112964291A (en) * 2021-04-02 2021-06-15 清华大学 Sensor calibration method and device, computer storage medium and terminal
CN114217665A (en) * 2021-12-21 2022-03-22 清华大学 Camera and laser radar time synchronization method, device and storage medium
CN114710228A (en) * 2022-05-31 2022-07-05 杭州闪马智擎科技有限公司 Time synchronization method and device, storage medium and electronic device
CN115994934A (en) * 2023-03-16 2023-04-21 福思(杭州)智能科技有限公司 Data time alignment method and device and domain controller

Also Published As

Publication number Publication date
CN111538032B (en) 2021-04-13

Similar Documents

Publication Publication Date Title
CN111538032B (en) Time synchronization method and device based on independent drawing tracks of camera and laser radar
CN111024066B (en) Unmanned aerial vehicle vision-inertia fusion indoor positioning method
CN111561923B (en) SLAM (simultaneous localization and mapping) mapping method and system based on multi-sensor fusion
CN110068335B (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
US9967463B2 (en) Method for camera motion estimation and correction
CN110675450B (en) Method and system for generating orthoimage in real time based on SLAM technology
US9652864B2 (en) Three-dimensional object recognition device and three-dimensional object recognition method
CN107167826B (en) Vehicle longitudinal positioning system and method based on variable grid image feature detection in automatic driving
CN111462207A (en) RGB-D simultaneous positioning and map creation method integrating direct method and feature method
Le Gentil et al. Idol: A framework for imu-dvs odometry using lines
EP3786891A1 (en) Method and system for visual localization based on dual dome cameras
CN110749308B (en) SLAM-oriented outdoor positioning method using consumer-grade GPS and 2.5D building models
CN113763548B (en) Vision-laser radar coupling-based lean texture tunnel modeling method and system
CN114136315A (en) Monocular vision-based auxiliary inertial integrated navigation method and system
KR20230020845A (en) Electronic deivce and method for tracking object thereof
CN115371673A (en) Binocular camera target positioning method based on Bundle Adjustment in unknown environment
CN117232499A (en) Multi-sensor fusion point cloud map construction method, device, equipment and medium
CN115218906A (en) Indoor SLAM-oriented visual inertial fusion positioning method and system
CN113345032A (en) Wide-angle camera large-distortion image based initial image construction method and system
CN113129422A (en) Three-dimensional model construction method and device, storage medium and computer equipment
KR102406240B1 (en) Robust stereo visual inertial navigation apparatus and method
Reina et al. Robust gyroscope-aided camera self-calibration
CN115128655B (en) Positioning method and device for automatic driving vehicle, electronic equipment and storage medium
CN108801248B (en) Planar vision inertial navigation method based on UKF
CN117782161A (en) IMU and panoramic camera external parameter calibration method for mobile mapping system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CP01 Change in the name or title of a patent holder
CP01 Change in the name or title of a patent holder

Address after: Room 2301-2308, third floor, building 2, incubator, Zhongguancun Software Park, Dongbeiwang, Haidian District, Beijing 100094

Patentee after: Beijing Digital Green Earth Technology Co.,Ltd.

Address before: Room 2301-2308, third floor, building 2, incubator, Zhongguancun Software Park, Dongbeiwang, Haidian District, Beijing 100094

Patentee before: BEIJING GREENVALLEY TECHNOLOGY Co.,Ltd.