CN112561824B - Data alignment restoration method and system for fusion of 2D laser and depth image - Google Patents
- Publication number: CN112561824B (application CN202011519696.0A)
- Authority: CN (China)
- Prior art keywords: sequence, laser, data, depth image, point
- Legal status: Active (assumed status; not a legal conclusion)
Classifications
- G06T5/77
- G06F18/22 — Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06T5/50 — Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
- G06T2207/10016 — Image acquisition modality: video; image sequence
- G06T2207/10028 — Image acquisition modality: range image; depth image; 3D point clouds
- G06T2207/20212 — Image combination
- G06T2207/20221 — Image fusion; image merging
Abstract
The invention relates to a data alignment repair method and system for the fusion of 2D laser and depth-image data. The method comprises: extracting 2D laser data and depth image data acquired for the same target at the same time; determining the sequence data with the highest similarity value between the 2D laser data and the depth image data; and determining hole regions and/or burr regions in the sequence data according to that most-similar sequence data, then repairing those regions based on it. The method extracts the 2D laser data and depth image data captured at the same time from their timestamps, screens out the sequence pair with the highest similarity value, and uses the correlation between the sequences to transplant the trend of the valid sequence onto the invalid data interval of the other modality, yielding a new repair sequence. The repair sequence enhances the visual quality of both the 2D laser map and the depth image, and lets the two sensors' data compensate each other.
Description
Technical Field
The invention relates to the technical field of image restoration, in particular to a data alignment restoration method and system for fusing 2D laser and depth images.
Background
Autonomous positioning and navigation of a robot rely on the sensors' detection of, and information acquisition from, the environment. In real scenes, single sensors such as laser scanners and vision sensors each have advantages for environment detection but also unavoidable shortcomings: a laser sensor has a low acquisition frequency and poor relocalization capability, making it difficult for the robot to recover its previous working state after tracking is lost, while a vision sensor is strongly affected by ambient light, carries a heavy computational load, and captures dynamic information poorly.
In recent years, multi-sensor fusion detection methods have developed rapidly and can compensate for the limitations of a single sensor to some extent. However, the vision systems these methods adopt still struggle to determine depth information effectively, and in some particularly complex environments the sensors may suffer data loss during acquisition.
Disclosure of Invention
In order to solve the above problems in the prior art, that is, to improve the accuracy of data information acquisition, the present invention aims to provide a data alignment restoration method and system for fusing a 2D laser and a depth image.
In order to solve the technical problems, the invention provides the following scheme:
a data alignment repair method for fusing a 2D laser and a depth image comprises the following steps:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
Optionally, the extracting 2D laser data and depth image data acquired at the same time for the same target specifically includes:
let D_i denote the i-th depth image captured by an RGB-D camera and f(i) the acquisition time of D_i, with 1 ≤ i ≤ M, where M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor and g(j) the acquisition time of L_j, with 1 ≤ j ≤ N, where N denotes the number of 2D laser scans;
determining the 2D laser data L_{j,t} and depth image data D_{i,t} collected for the same target at the same time according to the formula f(i) = g(j) = t, where t denotes a set time.
Optionally, the determining sequence data with the highest similarity value in the 2D laser data and the depth image data specifically includes:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w, where L_u^w denotes a laser sequence of w points selected from the length-ω 2D laser sequence with point u as its starting point (u + w ≤ ω), and D_v^w denotes a two-dimensional point sequence of length w with starting point v along the x-axis coordinate;
the 2D laser sequence L_u^w and two-dimensional point sequence D_v^w corresponding to the highest similarity value are taken as the 2D laser reference sequence and the two-dimensional point reference sequence.
Optionally, aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w specifically comprises:
u denotes the starting point of the 2D laser sequence L_u^w, v the starting point of the two-dimensional point sequence D_v^w, W the width and H the height of the depth image data D_{i,t}, w the length of the two-dimensional point sequence (1 ≤ w ≤ W), Ψ the length of the 2D laser data L_{j,t} (Ψ ≥ W), and ω the length of the 2D laser sequence (1 ≤ ω ≤ Ψ);
determining the 2D laser sequence L_u^w from the x-axis coordinate u, and the two-dimensional point sequence D_v^w from the y-axis coordinate v.
Optionally, the similarity value between the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w is calculated according to any one of the following formulas (the original formula images are reconstructed from the accompanying definitions):
Formula I: S_1(D_v^w, L_u^w) = Cov(D_v^w, L_u^w) / sqrt(Var(D_v^w) · Var(L_u^w)), where Cov(·,·) denotes the covariance of the two sequences, Var(·) the variance, and E(·) the mathematical expectation, with Cov(X, Y) = E[(X − E(X))(Y − E(Y))];
Formula II: S_2(D_v^w, L_u^w) = (D_v^w · L_u^w) / (‖D_v^w‖ ‖L_u^w‖), i.e. 1 minus the cosine distance between the two sequences;
Formula III: S_3(D_v^w, L_u^w) = 1 − ‖D̄_v^w − L̄_u^w‖, i.e. 1 minus the Euclidean distance between the normalized sequences.
Optionally, determining the hole region and/or burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or burr region based on the sequence data, specifically comprise:
when point L_{u,β} in the 2D laser sequence L_u^w and point D_{v,β} in the two-dimensional point sequence D_v^w are both 0, the noise-pair regions of both sequences are holes, where β denotes a noise pair;
deleting point L_{u,β} and replacing it with a suitable point adjacent to it in the 2D laser sequence, and deleting point D_{v,β} and replacing it with a suitable point adjacent to it in the two-dimensional point sequence, a suitable point being one that belongs to a well-matched corresponding pair between the 2D laser sequence and the two-dimensional point sequence;
when points L_{u,β} and D_{v,β} are not both 0, determining the points with holes or burrs in the two sequences by the constraint ‖D_{v,β} − L_{u,β}·A‖ > ε, where ε denotes the threshold for discriminating the pairing distance and A denotes a 3 × 3 homography matrix that maps the 2D laser sequence onto the two-dimensional point sequence;
determining the repair data by mapping through the homography, consistently with the relation D = L × A below: D̂_{v,β} = L_{u,β}·A and L̂_{u,β} = D_{v,β}·A⁻¹, where L̂_{u,β} is the point used to repair noise point L_{u,β} in the 2D laser sequence, D̂_{v,β} the point used to repair noise point D_{v,β} in the two-dimensional point sequence, and α denotes a well-matched pair between the two sequences (the original formula images are not reproduced; these expressions follow from the homography relation).
Optionally, the homography matrix A is calculated as follows:
the points of the 2D laser sequence L_u^w on the x–z two-dimensional plane form a point set L; L is an m × 3 point-set matrix whose rows are the values of the laser points on the x–z plane;
the points of the two-dimensional point sequence D_v^w on the x–z plane form a point set D; D is an m × 3 point-set matrix whose e-th row (1 ≤ e ≤ m) is the value of the corresponding point on the x–z plane;
D = L × A.
in order to solve the technical problems, the invention also provides the following scheme:
a 2D laser and depth image fused data alignment repair system, the data alignment repair system comprising:
an extraction unit for extracting 2D laser data and depth image data acquired for the same target at the same time;
a determination unit for determining the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising a 2D laser reference sequence and a two-dimensional point reference sequence;
and the repairing unit is used for determining a hole area and/or a burr area in the sequence data according to the sequence data with the highest similarity value and repairing the hole area and/or the burr area based on the sequence data.
In order to solve the technical problem, the invention also provides the following scheme:
a 2D laser and depth image fused data alignment repair system comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
In order to solve the technical problems, the invention also provides the following scheme:
a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
According to the embodiment of the invention, the invention discloses the following technical effects:
according to the method, 2D laser data and depth image data at the same time are extracted based on the time stamp, sequence data with the highest similarity value are screened out, the change trend of an effective sequence is transplanted to an invalid data interval of another mode by utilizing the correlation of the sequence data, a new repairing sequence is obtained, the visual effect of a 2D laser map and a depth image can be enhanced by the repairing sequence, and the two sensor data can be mutually compensated.
Drawings
FIG. 1 is a flow chart of a method for data alignment repair with fusion of 2D laser and depth image according to the present invention;
fig. 2 is a schematic block diagram of a 2D laser and depth image fused data alignment repair system according to the present invention.
Description of the symbols:
an extraction unit-1, a determination unit-2, and a repair unit-3.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and are not intended to limit the scope of the present invention.
The invention aims to provide a data alignment restoration method for the fusion of a 2D laser and a depth image. The method extracts the 2D laser data and depth image data captured at the same time based on the timestamp, screens out the sequence data with the highest similarity value, and uses the correlation between the sequence data to transplant the trend of the valid sequence onto the invalid data interval of the other modality, obtaining a new repair sequence. The repair sequence enhances the visualization of the 2D laser map and the depth image, and the two sensors' data compensate each other.
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, the present invention is described in detail with reference to the accompanying drawings and the detailed description thereof.
As shown in fig. 1, the data alignment restoration method for fusing 2D laser and depth image of the present invention includes:
step 100: extracting 2D laser data and depth image data acquired for the same target at the same time;
step 200: determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
step 300: and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
In step 100, the extracting 2D laser data and depth image data acquired at the same time for the same target specifically includes:
Let D_i denote the i-th depth image captured by an RGB-D camera and f(i) the acquisition time of D_i, with 1 ≤ i ≤ M, where M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor and g(j) the acquisition time of L_j, with 1 ≤ j ≤ N, where N denotes the number of 2D laser scans;
the 2D laser data L_{j,t} and depth image data D_{i,t} acquired for the same target at the same time are determined from the timestamps according to the formula f(i) = g(j) = t, where t denotes the set time.
The functions f(i) and g(j) make the depth image data D_{i,t} and the 2D laser data L_{j,t} closely adjacent in the time series; with 1 ≤ t = f(M) = g(N), depth images can be aligned with the 2D laser data captured at the same time.
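In practice the two sensors rarely emit samples at exactly the same timestamp, so f(i) = g(j) = t is implemented as nearest-neighbour matching within a tolerance. A minimal sketch of such pairing (the function name and tolerance value are illustrative, not from the patent):

```python
from bisect import bisect_left

def pair_by_timestamp(depth_times, laser_times, tolerance=0.01):
    """For each depth-image timestamp f(i), find the laser scan whose
    timestamp g(j) is nearest, keeping pairs with |f(i) - g(j)| <= tolerance.
    Both lists are assumed sorted; returns 0-based (i, j) index pairs."""
    pairs = []
    for i, t in enumerate(depth_times):
        k = bisect_left(laser_times, t)
        # candidates: the laser timestamps just before and just after t
        candidates = [c for c in (k - 1, k) if 0 <= c < len(laser_times)]
        j = min(candidates, key=lambda c: abs(laser_times[c] - t))
        if abs(laser_times[j] - t) <= tolerance:
            pairs.append((i, j))
    return pairs
```

With sorted timestamp lists the bisection keeps the pairing at O(M log N) rather than O(M·N).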
In step 200, the determining sequence data with the highest similarity value in the 2D laser data and the depth image data specifically includes:
Step 210: align the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information, obtaining groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w. Here L_u^w denotes a laser sequence of w points selected from the length-ω 2D laser sequence with point u as its starting point (u + w ≤ ω), and D_v^w denotes a two-dimensional point sequence of length w with starting point v along the x-axis coordinate.
In these sequences, u is the starting point of the 2D laser sequence L_u^w, v the starting point of the two-dimensional point sequence D_v^w, W the width and H the height of the depth image data D_{i,t}, w the length of the two-dimensional point sequence (1 ≤ w ≤ W), Ψ the length of the 2D laser data L_{j,t} (Ψ ≥ W), and ω the length of the 2D laser sequence (1 ≤ ω ≤ Ψ). The goal of depth alignment is to find an appropriate pair (u, v) with 1 ≤ u ≤ Ψ − W and 1 ≤ v ≤ H.
Step 212: determine the 2D laser sequence L_u^w from the x-axis coordinate u, and the two-dimensional point sequence D_v^w from the y-axis coordinate v.
The 2D laser sequence L_u^w and two-dimensional point sequence D_v^w corresponding to the highest similarity value are taken as the 2D laser reference sequence and the two-dimensional point reference sequence.
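The search for the reference pair (u, v) in steps 210–212 can be sketched as an exhaustive sliding-window scan. Pearson correlation stands in here for whichever of the three similarity formulas is chosen, and all names are illustrative rather than taken from the patent:

```python
import numpy as np

def best_window(laser, depth_rows, w):
    """Slide a length-w window (start u) over the 1-D laser range profile and
    compare it against the first w values of each depth-image row (index v);
    the (u, v) pair with the highest Pearson correlation wins."""
    psi = len(laser)
    best = (-2.0, 0, 0)              # correlation lies in [-1, 1]
    for u in range(psi - w + 1):
        seg = laser[u:u + w]
        if np.std(seg) == 0:         # constant segment: correlation undefined
            continue
        for v, row in enumerate(depth_rows):
            ref = row[:w]
            if np.std(ref) == 0:
                continue
            r = np.corrcoef(seg, ref)[0, 1]
            if r > best[0]:
                best = (r, u, v)
    return best                      # (similarity, u, v)
```

A real implementation would also slide the depth-side start index along the x axis; this sketch fixes it at 0 to keep the scan structure visible.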
Optionally, the similarity value between the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w is calculated according to any one of the following formulas (the original formula images are reconstructed from the accompanying descriptions):
Formula I: S_1(D_v^w, L_u^w) = Cov(D_v^w, L_u^w) / sqrt(Var(D_v^w) · Var(L_u^w)), where Cov(·,·) denotes the covariance of the two sequences, Var(·) the variance, and E(·) the mathematical expectation.
Formula II: S_2(D_v^w, L_u^w) = (D_v^w · L_u^w) / (‖D_v^w‖ ‖L_u^w‖), i.e. 1 minus the cosine distance between the two sequences; the larger the value, the more similar the two sequences.
Formula III: S_3(D_v^w, L_u^w) = 1 − ‖D̄_v^w − L̄_u^w‖, i.e. 1 minus the Euclidean distance between the normalized sequences; the larger the value, the more similar the two sequences.
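The three similarity measures can be written down directly. The sketch below assumes one-dimensional sequences and, for Formula III, a normalization by the sum of the two norms, which the source does not specify:

```python
import numpy as np

def pearson_sim(d, l):
    """Formula I: covariance over the product of standard deviations."""
    d, l = np.asarray(d, float), np.asarray(l, float)
    return np.cov(d, l)[0, 1] / np.sqrt(np.var(d, ddof=1) * np.var(l, ddof=1))

def cosine_sim(d, l):
    """Formula II: 1 minus the cosine distance, i.e. plain cosine similarity."""
    d, l = np.asarray(d, float), np.asarray(l, float)
    return d @ l / (np.linalg.norm(d) * np.linalg.norm(l))

def euclidean_sim(d, l):
    """Formula III: 1 minus a normalized Euclidean distance (the patent's
    normalization is not reproduced; dividing by the norm sum is one choice)."""
    d, l = np.asarray(d, float), np.asarray(l, float)
    return 1.0 - np.linalg.norm(d - l) / (np.linalg.norm(d) + np.linalg.norm(l))
```

All three return larger values for more similar sequences, so any of them can drive the window search unchanged.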
In step 300, determining the hole region and/or burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or burr region based on the sequence data, specifically comprise:
when point L_{u,β} in the 2D laser sequence L_u^w and point D_{v,β} in the two-dimensional point sequence D_v^w are both 0, the noise-pair regions of both sequences are holes, where β denotes a noise pair;
deleting point L_{u,β} and replacing it with a suitable point adjacent to it in the 2D laser sequence, and deleting point D_{v,β} and replacing it with a suitable point adjacent to it in the two-dimensional point sequence, a suitable point being one that belongs to a well-matched corresponding pair between the 2D laser sequence and the two-dimensional point sequence;
when points L_{u,β} and D_{v,β} are not both 0, determining the points with holes or burrs in the two sequences by the constraint ‖D_{v,β} − L_{u,β}·A‖ > ε, where ε denotes the threshold for discriminating the pairing distance and A denotes a 3 × 3 homography matrix that maps the 2D laser sequence onto the two-dimensional point sequence;
determining the repair data by mapping through the homography, consistently with the relation D = L × A below: D̂_{v,β} = L_{u,β}·A and L̂_{u,β} = D_{v,β}·A⁻¹, where L̂_{u,β} is the point used to repair noise point L_{u,β} in the 2D laser sequence, D̂_{v,β} the point used to repair noise point D_{v,β} in the two-dimensional point sequence, and α denotes a well-matched pair between the two sequences (the original formula images are not reproduced; these expressions follow from the homography relation).
The shape transformation between two sets of similar points on a plane can be described by a homography matrix. Using this mapping, an m × 3 point set L on the two-dimensional lidar plane is converted into an m × 3 point set D on the vertical plane of the RGB-D depth image through a 3 × 3 homography matrix A.
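A hedged sketch of the repair idea under the D = L × A relation given below: depth-side points flagged as holes (all-zero rows) or burrs (rows violating the pairing-distance constraint) are replaced by their laser counterparts mapped through A. The function name and threshold are illustrative, and the symmetric laser-side repair via the inverse of A would look the same:

```python
import numpy as np

def repair_depth_points(L, D, A, eps=0.05):
    """Points are rows of m x 3 matrices with D approximately L @ A.
    Any D row that is a hole (all zeros) or a burr (||D_e - L_e @ A|| > eps)
    is replaced by its laser counterpart mapped through the homography A."""
    L, D = np.asarray(L, float), np.asarray(D, float).copy()
    mapped = L @ A                                  # laser points in depth frame
    residual = np.linalg.norm(D - mapped, axis=1)   # pairing distance per row
    bad = np.all(D == 0, axis=1) | (residual > eps)
    D[bad] = mapped[bad]
    return D
```

This is how "the trend of the valid sequence is transplanted onto the invalid data interval of the other modality": valid laser points stand in for missing depth points after a change of frame.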
Further, the homography matrix A is calculated as follows:
the points of the 2D laser sequence L_u^w on the x–z two-dimensional plane form a point set L; L is an m × 3 point-set matrix whose rows are the values of the laser points on the x–z plane;
the points of the two-dimensional point sequence D_v^w on the x–z plane form a point set D; D is an m × 3 point-set matrix whose e-th row (1 ≤ e ≤ m) is the value of the corresponding point on the x–z plane;
D = L × A.
Suppose the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w contain α well-matched pairs and β noise pairs, with K = α + β; a noise pair is formed by a burr or hole point in either sequence. Because of the noise pairs, the practical approach is to select n pairs from the α good pairs and compute the matrix A, where n < sizeof(α) < K. In general, n pairs are drawn at random from the K pairs; if all n selected pairs are suitable, the value of A can be obtained. Letting c = α/(α + β), the probability that all n selected pairs are suitable is c^n, and the maximum number of attempts needed to find a consistent sample with 97.5% confidence is approximately 4/c^n.
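The trial-count estimate above follows the standard RANSAC formula; for confidence p = 0.975, log(1 − p)/log(1 − cⁿ) reduces to roughly 4/cⁿ when cⁿ is small. A small sketch (illustrative name):

```python
import math

def ransac_trials(inlier_ratio, n, confidence=0.975):
    """Number of random n-pair samples needed so that, with the given
    confidence, at least one sample contains only inliers."""
    c_n = inlier_ratio ** n            # P(all n selected pairs are suitable)
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - c_n))
```

For example, with half the pairs suitable (c = 0.5) and n = 4, the exact formula gives 58 trials, close to the 4/cⁿ = 64 approximation in the text.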
When a laser scanner and an RGB-D camera are used together as front-end sensors for information acquisition, loose coupling at the data-fusion layer causes acquisition errors and data loss. To address this, the invention extracts 2D laser data and depth image data of the same target at the same time based on the timestamp, screens out the sequence data with the highest similarity value, and then, using the correlation of the sequence data, transplants the trend of the valid sequence of one modality (laser or depth image) onto the invalid data interval of the other, obtaining a new repair sequence. The repair sequence enhances the visualization of the 2D laser map and the depth image, and the data of the two sensors compensate each other.
In addition, the invention also provides a data alignment restoration system for fusing the 2D laser and the depth image, which can improve the accuracy of data information acquisition.
As shown in fig. 2, the data alignment restoration system for fusing 2D laser and depth image according to the present invention includes an extraction unit 1, a determination unit 2, and a restoration unit 3.
Specifically, the extraction unit 1 is configured to extract 2D laser data and depth image data acquired at the same time for the same target;
the determining unit 2 is configured to determine sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
the repairing unit 3 is configured to determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repair the hole region and/or the burr region based on the sequence data.
In addition, the invention also provides a data alignment restoration system for fusing the 2D laser and the depth image, which comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
The invention also provides the following scheme:
a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
Compared with the prior art, the data alignment restoration system and the computer-readable storage medium for fusing the 2D laser and the depth image have the same beneficial effects as the data alignment restoration method for fusing the 2D laser and the depth image, and are not described herein again.
So far, the technical solutions of the present invention have been described in connection with the preferred embodiments shown in the drawings, but it is easily understood by those skilled in the art that the scope of the present invention is obviously not limited to these specific embodiments. Equivalent changes or substitutions of related technical features can be made by those skilled in the art without departing from the principle of the invention, and the technical scheme after the changes or substitutions can fall into the protection scope of the invention.
Claims (8)
1. A data alignment repairing method for fusing a 2D laser and a depth image is characterized by comprising the following steps:
extracting 2D laser data and depth image data acquired for the same target at the same time; the method specifically comprises the following steps:
let D_i denote the i-th depth image captured by an RGB-D camera and f(i) the acquisition time of D_i, with 1 ≤ i ≤ M, where M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor and g(j) the acquisition time of L_j, with 1 ≤ j ≤ N, where N denotes the number of 2D laser scans;
determining the 2D laser data L_{j,t} and depth image data D_{i,t} collected for the same target at the same time according to the formula f(i) = g(j) = t, where t denotes a set time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the method specifically comprises the following steps:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain a plurality of groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w, where L_u^w denotes a laser sequence of w points selected from the 2D laser sequence of length ω with the point u as the starting point, u + w ≤ ω, and D_v^w denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
taking the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
the sequence data comprises the 2D laser reference sequence and the two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
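As one possible reading of the timestamp-matching step in claim 1, the sketch below pairs a depth image D_i with a laser scan L_j whenever their acquisition times satisfy f(i) = g(j) = t. This is purely illustrative; the function and variable names are assumptions, not part of the claimed method.

```python
# Illustrative sketch (not the patented implementation) of the timestamp
# pairing in claim 1: depth image D_i and 2D laser scan L_j form an aligned
# pair D_{i,t}, L_{j,t} when f(i) = g(j) = t. All names are assumptions.

def pair_by_timestamp(depth_times, laser_times):
    """Return (i, j, t) triples with f(i) = g(j) = t, where f and g are
    supplied as lists of acquisition times indexed by i and j."""
    laser_index = {t: j for j, t in enumerate(laser_times)}
    pairs = []
    for i, t in enumerate(depth_times):
        if t in laser_index:          # f(i) == g(j) == t
            pairs.append((i, laser_index[t], t))
    return pairs

# Three depth frames and four laser scans share two acquisition times.
pairs = pair_by_timestamp([0.10, 0.20, 0.30], [0.05, 0.10, 0.20, 0.25])
# pairs == [(0, 1, 0.10), (1, 2, 0.20)]
```

In practice, sensor clocks rarely coincide exactly, so a real system would likely match the nearest timestamps within a tolerance rather than require exact equality; exact matching keeps the sketch faithful to the formula f(i) = g(j) = t.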
2. The data alignment repairing method for fusing a 2D laser and a depth image according to claim 1, wherein aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain a plurality of groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w specifically comprises:
wherein u denotes the starting point of the 2D laser sequence L_u^w, v denotes the starting point of the two-dimensional point sequence D_v^w, W denotes the width of the depth image data D_{i,t}, H denotes the height of the depth image data D_{i,t}, w denotes the length of the two-dimensional point sequence D_v^w, with 1 ≤ w ≤ W; Ψ denotes the length of the 2D laser data L_{j,t}, with Ψ ≥ w; and ω denotes the length of the 2D laser sequence, with 1 ≤ ω ≤ Ψ;
3. The data alignment repairing method for fusing a 2D laser and a depth image according to claim 1, wherein the similarity value of the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w is calculated according to any one of the following formulas:
Formula I:
ρ(D_v^w, L_u^w) = Cov(D_v^w, L_u^w) / ( sqrt(Var(D_v^w)) · sqrt(Var(L_u^w)) )
wherein ρ(D_v^w, L_u^w) denotes the similarity value of the two-dimensional point sequence D_v^w and the 2D laser sequence L_u^w, Cov(·,·) denotes the covariance matrix of the two-dimensional point sequence D_v^w and the 2D laser sequence L_u^w, Var(·) denotes the variance matrix, and E(·) denotes the mathematical expectation, with Cov(X, Y) = E[(X − E(X))(Y − E(Y))];
Formula II:
wherein ρ(D_v^w, L_u^w) denotes the similarity value of the two-dimensional point sequence D_v^w and the 2D laser sequence L_u^w;
Formula III:
4. The data alignment repairing method for fusing a 2D laser and a depth image according to claim 1, wherein determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data, specifically comprises:
when the point L_{u,β} in the 2D laser sequence L_u^w or the point D_{v,β} in the two-dimensional point sequence D_v^w takes the value 0, the region of the noise pair in the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w is a hole, where β denotes the index of the noise pair;
deleting the point L_{u,β} and replacing it with a suitable point adjacent to L_{u,β} in the 2D laser sequence L_u^w; deleting the point D_{v,β} and replacing it with a suitable point adjacent to D_{v,β} in the two-dimensional point sequence D_v^w; a suitable point is a corresponding point that is well matched between the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w;
when neither the point L_{u,β} in the 2D laser sequence L_u^w nor the point D_{v,β} in the two-dimensional point sequence D_v^w takes the value 0, determining the points with holes or burrs in the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w based on the following constraint:
where ε denotes the threshold for discriminating the pairing distance, and A denotes a 3 × 3 homography matrix used to perform the mapping transformation between the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w;
the repair data is determined according to the following formula:
wherein the repair value for the noise point L_{u,β} is a point taken from the 2D laser sequence L_u^w, the repair value for the noise point D_{v,β} is a point taken from the two-dimensional point sequence D_v^w, and α denotes a well-matched pair between the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w;
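The hole-repair rule of claim 4 (a zero reading in either sequence marks a noise pair, which is replaced by an adjacent well-matched point) might be sketched as below. This is a minimal illustrative reading; the replacement policy (nearest well-matched index) and all names are assumptions.

```python
def repair_holes(laser, points):
    """Sketch of the hole repair in claim 4: wherever either aligned
    sequence reads 0 at index beta (a noise pair), replace both values
    with those of the nearest index where BOTH sequences are non-zero
    (a well-matched pair). Illustrative only; names are assumptions."""
    n = len(laser)
    # indices of well-matched pairs: both readings are valid (non-zero)
    good = [k for k in range(n) if laser[k] != 0 and points[k] != 0]
    laser, points = list(laser), list(points)
    if not good:                      # nothing reliable to borrow from
        return laser, points
    for beta in range(n):
        if laser[beta] == 0 or points[beta] == 0:
            # nearest well-matched neighbour stands in for the hole
            k = min(good, key=lambda g: abs(g - beta))
            laser[beta], points[beta] = laser[k], points[k]
    return laser, points

laser, points = repair_holes([5, 0, 6, 7], [5, 5, 6, 0])
# indices 1 and 3 are noise pairs; each borrows from an adjacent matched pair
```

Note that the `good` index list is computed before any replacement, so repaired values never serve as donors, which mirrors the claim's requirement that replacements come from well-matched corresponding points.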
5. The 2D laser and depth image fused data alignment repairing method according to claim 4, wherein the calculation method of the homography matrix A comprises the following steps:
the points of the 2D laser sequence L_u^w on the x, z two-dimensional plane form a point set L, where L is an m × 3 point-set matrix whose e-th row contains the values of the e-th point of the 2D laser sequence L_u^w on the x, z two-dimensional plane;
the points of the two-dimensional point sequence D_v^w on the x, z two-dimensional plane form a point set D, where D is an m × 3 point-set matrix whose e-th row contains the values of the e-th point of the two-dimensional point sequence D_v^w on the x, z two-dimensional plane, with 1 ≤ e ≤ m;
D=L×A。
6. A 2D laser and depth image fused data alignment repair system, comprising:
the device comprises an extraction unit, a data acquisition unit and a data acquisition unit, wherein the extraction unit is used for extracting 2D laser data and depth image data acquired for the same target at the same time; the method specifically comprises the following steps:
let D_i denote the i-th depth image captured by the RGB-D camera, and let the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor, and let the function g(j) denote the acquisition time of the 2D laser data L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser scans;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes the set time;
a determination unit configured to determine the sequence data having the highest similarity value between the 2D laser data and the depth image data, the sequence data comprising the 2D laser reference sequence and the two-dimensional point reference sequence; specifically:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain a plurality of groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w, where L_u^w denotes a laser sequence of w points selected from the 2D laser sequence of length ω with the point u as the starting point, u + w ≤ ω, and D_v^w denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
taking the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and the repairing unit is used for determining a hole area and/or a burr area in the sequence data according to the sequence data with the highest similarity value and repairing the hole area and/or the burr area based on the sequence data.
7. A 2D laser and depth image fused data alignment repair system comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extracting 2D laser data and depth image data acquired for the same target at the same time; the method specifically comprises the following steps:
let D_i denote the i-th depth image captured by the RGB-D camera, and let the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor, and let the function g(j) denote the acquisition time of the 2D laser data L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser scans;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes the set time;
determining the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising the 2D laser reference sequence and the two-dimensional point reference sequence; specifically:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain a plurality of groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w, where L_u^w denotes a laser sequence of w points selected from the 2D laser sequence of length ω with the point u as the starting point, u + w ≤ ω, and D_v^w denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
taking the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
8. A computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extracting 2D laser data and depth image data acquired for the same target at the same time; the method specifically comprises the following steps:
let D_i denote the i-th depth image captured by the RGB-D camera, and let the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor, and let the function g(j) denote the acquisition time of the 2D laser data L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser scans;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes the set time;
determining the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising the 2D laser reference sequence and the two-dimensional point reference sequence; specifically:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain a plurality of groups of 2D laser sequences L_u^w and two-dimensional point sequences D_v^w, where L_u^w denotes a laser sequence of w points selected from the 2D laser sequence of length ω with the point u as the starting point, u + w ≤ ω, and D_v^w denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
taking the 2D laser sequence L_u^w and the two-dimensional point sequence D_v^w corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011519696.0A CN112561824B (en) | 2020-12-21 | 2020-12-21 | Data alignment restoration method and system for fusion of 2D laser and depth image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011519696.0A CN112561824B (en) | 2020-12-21 | 2020-12-21 | Data alignment restoration method and system for fusion of 2D laser and depth image |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112561824A CN112561824A (en) | 2021-03-26 |
CN112561824B true CN112561824B (en) | 2023-04-07 |
Family
ID=75032102
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011519696.0A Active CN112561824B (en) | 2020-12-21 | 2020-12-21 | Data alignment restoration method and system for fusion of 2D laser and depth image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112561824B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110428372A (en) * | 2019-07-08 | 2019-11-08 | 希格斯动力科技(珠海)有限公司 | Depth data and 2D laser data fusion method and device, storage medium |
CN111291708A (en) * | 2020-02-25 | 2020-06-16 | 华南理工大学 | Transformer substation inspection robot obstacle detection and identification method integrated with depth camera |
CN111624622A (en) * | 2020-04-24 | 2020-09-04 | 库卡机器人(广东)有限公司 | Obstacle detection method and device |
CN112016612A (en) * | 2020-08-26 | 2020-12-01 | 四川阿泰因机器人智能装备有限公司 | Monocular depth estimation-based multi-sensor fusion SLAM method |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109300190B (en) * | 2018-09-06 | 2021-08-10 | 百度在线网络技术(北京)有限公司 | Three-dimensional data processing method, device, equipment and storage medium |
Also Published As
Publication number | Publication date |
---|---|
CN112561824A (en) | 2021-03-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111882612B (en) | Vehicle multi-scale positioning method based on three-dimensional laser detection lane line | |
CN110555901B (en) | Method, device, equipment and storage medium for positioning and mapping dynamic and static scenes | |
CN110807809B (en) | Light-weight monocular vision positioning method based on point-line characteristics and depth filter | |
CN110599545B (en) | Feature-based dense map construction system | |
CN111830953A (en) | Vehicle self-positioning method, device and system | |
CN112419497A (en) | Monocular vision-based SLAM method combining feature method and direct method | |
CN112862881B (en) | Road map construction and fusion method based on crowd-sourced multi-vehicle camera data | |
CN102201058A (en) | Cat eye effect object recognition algorithm of active and passive imaging system sharing same aperture | |
CN110634138A (en) | Bridge deformation monitoring method, device and equipment based on visual perception | |
CN111028271A (en) | Multi-camera personnel three-dimensional positioning and tracking system based on human skeleton detection | |
CN112652020A (en) | Visual SLAM method based on AdaLAM algorithm | |
CN113781532B (en) | Automatic matching and searching method for SAR satellite image and optical image | |
Chen et al. | Camera geolocation from mountain images | |
CN112580683B (en) | Multi-sensor data time alignment system and method based on cross correlation | |
CN112561824B (en) | Data alignment restoration method and system for fusion of 2D laser and depth image | |
Praczyk et al. | Concept and first results of optical navigational system | |
CN117274627A (en) | Multi-temporal snow remote sensing image matching method and system based on image conversion | |
Shen et al. | Plant image mosaic based on depth and color dual information feature source from Kinect | |
CN111080712A (en) | Multi-camera personnel positioning, tracking and displaying method based on human body skeleton detection | |
CN114283199A (en) | Dynamic scene-oriented dotted line fusion semantic SLAM method | |
CN114140494A (en) | Single-target tracking system and method in complex scene, electronic device and storage medium | |
CN113409334A (en) | Centroid-based structured light angle point detection method | |
CN111178264A (en) | Estimation algorithm for tower footing attitude of iron tower in aerial image of unmanned aerial vehicle | |
Rasyidy et al. | A Framework for Road Boundary Detection based on Camera-LIDAR Fusion in World Coordinate System and Its Performance Evaluation Using Carla Simulator | |
CN116385502B (en) | Image registration method based on region search under geometric constraint |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||