CN112561824B - Data alignment restoration method and system for fusion of 2D laser and depth image - Google Patents


Info

Publication number
CN112561824B
CN112561824B (application CN202011519696.0A)
Authority
CN
China
Prior art keywords: sequence, laser, data, depth image, point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011519696.0A
Other languages
Chinese (zh)
Other versions
CN112561824A (en)
Inventor
杨明浩
瞿元昊
强保华
张家清
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Institute of Automation of Chinese Academy of Science
Guilin University of Electronic Technology
Original Assignee
Institute of Automation of Chinese Academy of Science
Guilin University of Electronic Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Institute of Automation of Chinese Academy of Science and Guilin University of Electronic Technology
Priority to CN202011519696.0A
Publication of CN112561824A
Application granted
Publication of CN112561824B
Legal status: Active
Anticipated expiration

Classifications

    • G06T5/77
    • G06F18/22 Matching criteria, e.g. proximity measures (Pattern recognition; Analysing)
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T2207/10016 Video; Image sequence (Image acquisition modality)
    • G06T2207/10028 Range image; Depth image; 3D point clouds (Image acquisition modality)
    • G06T2207/20221 Image fusion; Image merging (Image combination; Special algorithmic details)

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a data alignment restoration method and system for fusion of a 2D laser and a depth image. The method comprises the following steps: extracting 2D laser data and depth image data acquired for the same target at the same time; determining the sequence data with the highest similarity value in the 2D laser data and the depth image data; and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data. The method extracts 2D laser data and depth image data at the same time based on the timestamp and screens out the sequence data with the highest similarity value; then, using the correlation of the sequence data, it transplants the change trend of a valid sequence of one modality onto the invalid data interval of the other modality to obtain a new repair sequence. The repair sequence can enhance the visualization effect of the 2D laser map and the depth image, and the data of the two sensors can compensate each other.

Description

Data alignment restoration method and system for fusion of 2D laser and depth image
Technical Field
The invention relates to the technical field of image restoration, in particular to a data alignment restoration method and system for fusing 2D laser and depth images.
Background
Autonomous positioning and navigation of a robot rely on the sensors' detection of the environment and acquisition of information. In a real scene, although single sensors such as laser scanners and vision sensors each have certain advantages for environment detection, they also have inevitable shortcomings. For example, a laser sensor has a low acquisition frequency and poor relocalization capability, so the robot has difficulty returning to its previous working state after tracking is lost; a vision sensor is strongly affected by ambient light, carries a heavy computational load, and is poor at dynamically capturing information.
In recent years, multi-sensor fusion detection methods have developed rapidly and can compensate for the limitations of a single sensor to some extent. However, it is difficult for the vision systems adopted by these methods to determine depth information effectively. In addition, in some particularly complex environments, sensor acquisition may suffer data loss.
Disclosure of Invention
In order to solve the above problems in the prior art, that is, to improve the accuracy of data information acquisition, the present invention aims to provide a data alignment restoration method and system for fusing a 2D laser and a depth image.
In order to solve the technical problems, the invention provides the following scheme:
a data alignment repair method for fusing a 2D laser and a depth image comprises the following steps:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprise a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
Optionally, the extracting 2D laser data and depth image data acquired for the same target at the same time specifically includes:
letting D_i denote the i-th depth image captured by an RGB-D camera, and letting the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
letting L_j denote the j-th 2D laser scan obtained by a 2D lidar sensor, and letting the function g(j) denote the acquisition time of the 2D laser data L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser scans;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} collected for the same target at the same time, where t denotes the set time.
Optionally, the determining sequence data with the highest similarity value in the 2D laser data and the depth image data specifically includes:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L_{j,t}^{u,w} and two-dimensional point sequences D_{i,t}^{v,w}; wherein L_{j,t}^{u,w} denotes a laser sequence composed of w points selected from the 2D laser sequence of length ω with the point u as the starting point, where u + w ≤ ω, and D_{i,t}^{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
for each group of 2D laser sequence L_{j,t}^{u,w} and two-dimensional point sequence D_{i,t}^{v,w}, computing the similarity value between L_{j,t}^{u,w} and D_{i,t}^{v,w};
taking the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence.
Optionally, the aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L_{j,t}^{u,w} and two-dimensional point sequences D_{i,t}^{v,w} specifically includes:
determining a suitable point (u, v) according to the formula
(u, v) = argmax_{(u,v)} ρ(L_{j,t}^{u,w}, D_{i,t}^{v,w}), with 1 ≤ u ≤ Ψ − W and 1 ≤ v ≤ H;
wherein u denotes the starting point of the 2D laser sequence L_{j,t}^{u,w}, v denotes the starting point of the two-dimensional point sequence D_{i,t}^{v,w}, W denotes the width of the depth image data D_{i,t}, H denotes the height of the depth image data D_{i,t}, and w denotes the length of the two-dimensional point sequence D_{i,t}^{v,w}, with 1 ≤ w ≤ W; Ψ is the length of the 2D laser data L_{j,t}, with Ψ ≥ W; ω denotes the length of the 2D laser sequence, with 1 ≤ ω ≤ Ψ;
determining the 2D laser sequence L_{j,t}^{u,w} from the x-axis coordinate u, and determining the two-dimensional point sequence D_{i,t}^{v,w} from the y-axis coordinate v.
Optionally, the similarity value between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} is calculated according to any one of the following formulas:
Formula one:
ρ1(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = Cov(D_{i,t}^{v,w}, L_{j,t}^{u,w}) / sqrt(Var(D_{i,t}^{v,w}) · Var(L_{j,t}^{u,w})),
Cov(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = E[(D_{i,t}^{v,w} − E(D_{i,t}^{v,w})) · (L_{j,t}^{u,w} − E(L_{j,t}^{u,w}))];
wherein ρ1(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}; Cov(·, ·) denotes the covariance of the two sequences; Var(·) denotes the variance of a sequence; E(·) denotes the mathematical expectation;
Formula two:
ρ2(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = ⟨D_{i,t}^{v,w}, L_{j,t}^{u,w}⟩ / (‖D_{i,t}^{v,w}‖ · ‖L_{j,t}^{u,w}‖);
wherein ρ2(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}, namely 1 minus the cosine distance between the two sequences;
Formula three:
ρ3(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = 1 − ‖D̄_{i,t}^{v,w} − L̄_{j,t}^{u,w}‖, where D̄_{i,t}^{v,w} and L̄_{j,t}^{u,w} denote the normalized sequences;
wherein ρ3(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}, namely 1 minus the Euclidean distance between the normalized sequences.
Optionally, the determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data specifically includes:
when the point L_{u,β} in the 2D laser sequence L_{j,t}^{u,w} and the point D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w} both take the value 0, the noise-pair regions in the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} are both holes; β denotes a noise pair;
deleting the point L_{u,β} and replacing it with a suitable point adjacent to L_{u,β} in the 2D laser sequence L_{j,t}^{u,w}; deleting the point D_{v,β} and replacing it with a suitable point adjacent to D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w}; a suitable point is a point belonging to a good matching pair between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w};
when the point L_{u,β} in the 2D laser sequence L_{j,t}^{u,w} and the point D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w} are not 0, determining the points with holes or burrs in the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} based on the following constraint:
‖D_{v,β} − L_{u,β} × A‖ > ε;
wherein ε represents the threshold for discriminating the pairing distance, and A represents a 3 × 3 homography matrix used to perform the mapping transformation between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w};
the repair data is determined according to the following formula:
Figure BDA00028490803100000428
or>
Figure BDA00028490803100000429
Wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0002849080310000051
is a 2D laser sequence->
Figure BDA0002849080310000052
Point in for repairing a noise point &>
Figure BDA0002849080310000053
Accordingly; />
Figure BDA0002849080310000054
As a two-dimensional sequence of points
Figure BDA0002849080310000055
Point of (1) for repairing noise point D v,β (ii) a Alpha denotes a 2D laser sequence>
Figure BDA0002849080310000056
And a two-dimensional point sequence>
Figure BDA0002849080310000057
A good matching pair between;
will be dotted
Figure BDA0002849080310000058
Replacement noise point pick-up>
Figure BDA0002849080310000059
Will->
Figure BDA00028490803100000510
Replacement of noise points D v,β
Optionally, the calculation method of the homography matrix A includes:
the points of the 2D laser sequence L_{j,t}^{u,w} on the x–z two-dimensional plane form a point set L; L is an m × 3 point-set matrix whose e-th row holds the values of the e-th point of the 2D laser sequence L_{j,t}^{u,w} on the x–z two-dimensional plane;
the points of the two-dimensional point sequence D_{i,t}^{v,w} on the x–z two-dimensional plane form a point set D; D is an m × 3 point-set matrix whose e-th row holds the values of the e-th point of the two-dimensional point sequence D_{i,t}^{v,w} on the x–z two-dimensional plane, where 1 ≤ e ≤ m;
D = L × A.
in order to solve the technical problems, the invention also provides the following scheme:
a 2D laser and depth image fused data alignment repair system, the data alignment repair system comprising:
an extraction unit, configured to extract 2D laser data and depth image data acquired for the same target at the same time;
a determination unit, configured to determine the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising a 2D laser reference sequence and a two-dimensional point reference sequence; and
a repairing unit, configured to determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and to repair the hole region and/or the burr region based on the sequence data.
In order to solve the technical problem, the invention also provides the following scheme:
a 2D laser and depth image fused data alignment repair system comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprise a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
In order to solve the technical problems, the invention also provides the following scheme:
a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprise a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
According to the embodiment of the invention, the invention discloses the following technical effects:
The method extracts 2D laser data and depth image data at the same time based on the timestamp and screens out the sequence data with the highest similarity value; then, using the correlation of the sequence data, it transplants the change trend of a valid sequence of one modality onto the invalid data interval of the other modality to obtain a new repair sequence. The repair sequence can enhance the visualization effect of the 2D laser map and the depth image, and the data of the two sensors can compensate each other.
Drawings
FIG. 1 is a flow chart of a method for data alignment repair with fusion of 2D laser and depth image according to the present invention;
fig. 2 is a schematic block diagram of a 2D laser and depth image fused data alignment repair system according to the present invention.
Description of the symbols:
an extraction unit-1, a determination unit-2, and a repair unit-3.
Detailed Description
Preferred embodiments of the present invention are described below with reference to the accompanying drawings. It should be understood by those skilled in the art that these embodiments are only for explaining the technical principle of the present invention, and are not intended to limit the scope of the present invention.
The invention aims to provide a data alignment restoration method for fusion of a 2D laser and a depth image. The method extracts 2D laser data and depth image data at the same time based on the timestamp and screens out the sequence data with the highest similarity value; then, using the correlation of the sequence data, it transplants the change trend of a valid sequence of one modality onto the invalid data interval of the other modality to obtain a new repair sequence. The repair sequence can enhance the visualization effect of the 2D laser map and the depth image, and the data of the two sensors can compensate each other.
In order to make the aforementioned objects, features and advantages of the present invention more comprehensible, the present invention is described in detail with reference to the accompanying drawings and the detailed description thereof.
As shown in fig. 1, the data alignment restoration method for fusing 2D laser and depth image of the present invention includes:
step 100: extracting 2D laser data and depth image data acquired for the same target at the same time;
step 200: determining sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprise a 2D laser reference sequence and a two-dimensional point reference sequence;
step 300: and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
In step 100, the extracting 2D laser data and depth image data acquired for the same target at the same time specifically includes:
Let D_i denote the i-th depth image captured by an RGB-D camera, and let the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
Let L_j denote the j-th 2D laser scan obtained by the 2D lidar sensor, and let the function g(j) denote the acquisition time of the 2D laser data L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser scans;
According to the formula f(i) = g(j) = t, determine, based on the timestamp, the 2D laser data L_{j,t} and the depth image data D_{i,t} collected for the same target at the same time, where t denotes the set time.
The functions f(i) and g(j) make the depth image data D_{i,t} and the 2D laser data L_{j,t} closely adjacent in the time series, where 1 ≤ t ≤ f(M) = g(N), so that the depth image and the 2D laser data at the same time can be aligned.
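As an illustration of the timestamp pairing f(i) = g(j) = t, the sketch below (ours, not the patent's implementation; the stream values and the matching tolerance are assumptions) pairs each depth-image timestamp with the nearest laser timestamp:

```python
# Pair depth-image timestamps f(i) with laser-scan timestamps g(j).
# Two samples are treated as "the same time" when they differ by at
# most `tol` seconds; the tolerance and the sample values are invented.
from bisect import bisect_left

def pair_by_timestamp(depth_stamps, laser_stamps, tol):
    """Return (i, j) index pairs with |f(i) - g(j)| <= tol seconds."""
    pairs = []
    for i, t in enumerate(depth_stamps):
        k = bisect_left(laser_stamps, t)      # laser_stamps must be sorted
        for j in (k - 1, k):                  # nearest candidates on each side
            if 0 <= j < len(laser_stamps) and abs(laser_stamps[j] - t) <= tol:
                pairs.append((i, j))
                break
    return pairs

depth_stamps = [0.00, 0.10, 0.20, 0.30]                  # f(i), seconds
laser_stamps = [0.00, 0.05, 0.10, 0.15, 0.20, 0.31]      # g(j), seconds
print(pair_by_timestamp(depth_stamps, laser_stamps, tol=0.02))
# → [(0, 0), (1, 2), (2, 4), (3, 5)]
```

Depth images without a laser scan inside the tolerance are simply dropped, matching the idea that only data acquired at the same time are fused.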
In step 200, the determining sequence data with the highest similarity value in the 2D laser data and the depth image data specifically includes:
Step 210: aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L_{j,t}^{u,w} and two-dimensional point sequences D_{i,t}^{v,w}, wherein L_{j,t}^{u,w} denotes a laser sequence composed of w points selected from the 2D laser sequence of length ω with the point u as the starting point, where u + w ≤ ω, and D_{i,t}^{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate.
Aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L_{j,t}^{u,w} and two-dimensional point sequences D_{i,t}^{v,w} specifically includes:
Step 211: determining a suitable point (u, v) according to the formula
(u, v) = argmax_{(u,v)} ρ(L_{j,t}^{u,w}, D_{i,t}^{v,w});
wherein u denotes the starting point of the 2D laser sequence L_{j,t}^{u,w}, v denotes the starting point of the two-dimensional point sequence D_{i,t}^{v,w}, W denotes the width of the depth image data D_{i,t}, H denotes the height of the depth image data D_{i,t}, and w denotes the length of the two-dimensional point sequence D_{i,t}^{v,w}, with 1 ≤ w ≤ W; Ψ is the length of the 2D laser data L_{j,t}, with Ψ ≥ W; ω denotes the length of the 2D laser sequence, with 1 ≤ ω ≤ Ψ.
D_{i,t}^{h,w} is a two-dimensional point sequence with starting point h and length w along the x-axis coordinate. The goal of depth alignment is to find the suitable value of (u, v), with 1 ≤ u ≤ Ψ − W and 1 ≤ v ≤ H.
Step 212: determining the 2D laser sequence L_{j,t}^{u,w} from the x-axis coordinate u, and determining the two-dimensional point sequence D_{i,t}^{v,w} from the y-axis coordinate v.
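Steps 211 and 212 can be sketched as an exhaustive sliding-window search. This is our illustrative reconstruction, with Pearson correlation standing in for the similarity ρ and the arrays invented for the example:

```python
# Slide a window of length w over the laser scan (start u) and over the
# depth-image rows (row v); keep the (u, v) pair whose two sequences are
# most similar. Pearson correlation is used here as the similarity rho.
import numpy as np

def find_uv(laser, depth, w):
    """laser: 1-D array of length Psi; depth: H x W depth image."""
    H = depth.shape[0]
    best, best_uv = -np.inf, (0, 0)
    for u in range(len(laser) - w + 1):       # 0-based version of 1 <= u <= Psi - w
        L = laser[u:u + w].astype(float)
        for v in range(H):                    # 0-based version of 1 <= v <= H
            D = depth[v, :w].astype(float)
            if L.std() == 0 or D.std() == 0:
                continue                      # constant window: correlation undefined
            rho = np.corrcoef(L, D)[0, 1]
            if rho > best:
                best, best_uv = rho, (u, v)
    return best_uv

depth = np.array([[1, 2, 3, 4, 5, 6],
                  [6, 5, 4, 3, 2, 1],
                  [2, 4, 1, 5, 3, 6],
                  [1, 3, 2, 6, 4, 5]], dtype=float)
laser = np.array([9.0, 8.0, 7.0, 2.0, 4.0, 1.0, 5.0, 3.0, 6.0])  # row 2 embedded at u=3
print(find_uv(laser, depth, w=6))  # → (3, 2)
```

The exhaustive search is O(Ψ · H) window comparisons; since the laser row copied into the scan correlates perfectly with depth row 2, the search recovers the planted offset.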
Step 220: for each set of 2D laser sequences
Figure BDA0002849080310000098
And a two-dimensional point sequence->
Figure BDA0002849080310000099
Computing 2D laser sequences
Figure BDA00028490803100000910
And a two-dimensional point sequence->
Figure BDA00028490803100000911
A similarity value of (a);
2D laser sequence corresponding to highest similarity value
Figure BDA00028490803100000912
And a two-dimensional point sequence>
Figure BDA00028490803100000913
The reference sequence of the 2D laser and the reference sequence of the two-dimensional points are obtained.
Optionally, the similarity value between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} is calculated according to any one of the following formulas:
Formula one:
ρ1(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = Cov(D_{i,t}^{v,w}, L_{j,t}^{u,w}) / sqrt(Var(D_{i,t}^{v,w}) · Var(L_{j,t}^{u,w})),
Cov(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = E[(D_{i,t}^{v,w} − E(D_{i,t}^{v,w})) · (L_{j,t}^{u,w} − E(L_{j,t}^{u,w}))];
wherein ρ1(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}; Cov(·, ·) denotes the covariance of the two sequences; Var(·) denotes the variance of a sequence; E(·) denotes the mathematical expectation.
Formula two:
ρ2(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = ⟨D_{i,t}^{v,w}, L_{j,t}^{u,w}⟩ / (‖D_{i,t}^{v,w}‖ · ‖L_{j,t}^{u,w}‖);
wherein ρ2(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}, specifically 1 minus the cosine distance between the two sequences; the larger the value obtained, the more similar the two sequences.
Formula three:
ρ3(D_{i,t}^{v,w}, L_{j,t}^{u,w}) = 1 − ‖D̄_{i,t}^{v,w} − L̄_{j,t}^{u,w}‖, where D̄_{i,t}^{v,w} and L̄_{j,t}^{u,w} denote the normalized sequences;
wherein ρ3(D_{i,t}^{v,w}, L_{j,t}^{u,w}) represents the similarity value between the two-dimensional point sequence D_{i,t}^{v,w} and the 2D laser sequence L_{j,t}^{u,w}, specifically 1 minus the Euclidean distance between the normalized two-dimensional point sequence and the normalized 2D laser sequence; the larger the value obtained, the more similar the two sequences.
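The three similarity measures can be sketched as follows (our illustration; function names are ours, and formula two is written directly as cosine similarity, i.e. 1 minus the cosine distance):

```python
# Formula one: Pearson correlation; formula two: 1 - cosine distance
# (= cosine similarity); formula three: 1 - Euclidean distance between
# unit-normalized sequences. D plays the depth sequence, L the laser one.
import numpy as np

def rho_pearson(D, L):
    cov = np.mean((D - D.mean()) * (L - L.mean()))
    return cov / np.sqrt(D.var() * L.var())

def rho_cosine(D, L):
    # cosine distance = 1 - cos(theta), so 1 - distance = cos(theta)
    return np.dot(D, L) / (np.linalg.norm(D) * np.linalg.norm(L))

def rho_euclidean(D, L):
    # normalize each sequence to unit norm before taking the distance
    Dn, Ln = D / np.linalg.norm(D), L / np.linalg.norm(L)
    return 1.0 - np.linalg.norm(Dn - Ln)

D = np.array([1.0, 2.0, 3.0, 4.0])
L = np.array([2.0, 4.0, 6.0, 8.0])   # L = 2 * D: maximally similar
print(rho_pearson(D, L))             # ≈ 1.0
print(rho_cosine(D, L))              # ≈ 1.0
print(rho_euclidean(D, L))           # ≈ 1.0
```

All three measures rate a scaled copy of a sequence as maximally similar, which is the desired behavior when laser and depth readings of the same surface differ only in scale.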
In step 300, the determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data specifically includes:
when the point L_{u,β} in the 2D laser sequence L_{j,t}^{u,w} and the point D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w} both take the value 0, the noise-pair regions in the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} are both holes; β denotes a noise pair;
deleting the point L_{u,β} and replacing it with a suitable point adjacent to L_{u,β} in the 2D laser sequence L_{j,t}^{u,w}; deleting the point D_{v,β} and replacing it with a suitable point adjacent to D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w}; a suitable point is a point belonging to a good matching pair between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w};
when the point L_{u,β} in the 2D laser sequence L_{j,t}^{u,w} and the point D_{v,β} in the two-dimensional point sequence D_{i,t}^{v,w} are not 0, determining the points with holes or burrs in the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w} based on the following constraint:
‖D_{v,β} − L_{u,β} × A‖ > ε;
wherein ε represents the threshold for discriminating the pairing distance, and A represents a 3 × 3 homography matrix used to perform the mapping transformation between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w};
the repair data are determined according to the following formula:
L̂_{u,β} = D_{v,β} × A^(−1), or D̂_{v,β} = L_{u,β} × A;
wherein L̂_{u,β} is the point in the 2D laser sequence L_{j,t}^{u,w} used to repair the noise point L_{u,β}; D̂_{v,β} is the point in the two-dimensional point sequence D_{i,t}^{v,w} used to repair the noise point D_{v,β}; α denotes a good matching pair between the 2D laser sequence L_{j,t}^{u,w} and the two-dimensional point sequence D_{i,t}^{v,w};
the noise point L_{u,β} is replaced with the point L̂_{u,β}, and the noise point D_{v,β} is replaced with the point D̂_{v,β}.
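The cross-modal replacement can be sketched as follows. This is a hypothetical illustration consistent with the relation D = L × A; the concrete homography values and point coordinates are invented for the example:

```python
# A noisy depth point is repaired by mapping the paired laser point
# through the homography A; a noisy laser point is repaired by mapping
# the paired depth point through A's inverse. Row-vector convention:
# homogeneous points (x, z, 1) multiply A from the left.
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.2, -0.1, 1.0]])     # assumed 3x3 homography, illustrative only

def repair_depth_point(laser_pt):
    """laser_pt: homogeneous row vector (x, z, 1) on the lidar plane."""
    d = laser_pt @ A
    return d / d[2]                  # re-normalize the homogeneous coordinate

def repair_laser_point(depth_pt):
    l = depth_pt @ np.linalg.inv(A)
    return l / l[2]

l_pt = np.array([2.0, 3.0, 1.0])
d_hat = repair_depth_point(l_pt)     # stands in for the repaired point D_hat
print(d_hat)                         # → approximately (2.2, 2.9, 1.0)
print(repair_laser_point(d_hat))     # round trip recovers the laser point
```

Because A is invertible, the same matrix supports repair in both directions, which is what lets the two sensors compensate for each other's invalid intervals.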
The shape transformation between two sets of similar points on a plane can be described by a homography matrix. Mapping with the homography-matrix method converts the m × 3 point set L on the two-dimensional lidar plane into the m × 3 point set D on the vertical plane of the RGB-D depth image through the 3 × 3 homography matrix A:
D = L × A.
Further, the calculation method of the homography matrix A comprises the following steps:
2D laser sequence
Figure BDA00028490803100001116
Points on the x, z two-dimensional plane form a point set L; l is an mx 3 point set matrix, which is combined with a plurality of pixel sets>
Figure BDA00028490803100001117
Is a 2D laser sequence>
Figure BDA00028490803100001118
Values of points on an x, z two-dimensional plane;
two-dimensional point sequence
Figure BDA00028490803100001119
Points on the x, z two-dimensional plane form a point set D; d is an mx 3 point set matrix, based on the sum of the values of the two points>
Figure BDA00028490803100001120
Is a two-dimensional point sequence->
Figure BDA00028490803100001121
Values of points on the x, z two-dimensional plane, wherein e is more than or equal to 1 and less than or equal to m;
D=L×A。
Suppose that between L_{j,t}^{u,w} and D_{i,t}^{v,w} there are α good matching pairs and β noise pairs, with K = α + β; a noise pair is formed by a burr or hole point in either L_{j,t}^{u,w} or D_{i,t}^{v,w}. Because of the noise pairs, the ideal method is to select n pairs from the α good pairs and calculate the value of matrix A, where n < sizeof(α) < K. In general, n pairs are randomly selected from the K pairs; if the n selected pairs are all good pairs, the value of A can be obtained. Assuming c = α/(α + β), the probability that all n selected pairs are good is c^n. The maximum number of attempts needed to find a consistent sample with 97.5% confidence is approximately 4/c^n.
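The trial-count estimate can be checked numerically. The sketch below (ours) computes the exact confidence-based bound alongside the 4/c^n approximation stated above; the inlier counts are invented:

```python
# With inlier ratio c = alpha / (alpha + beta), the chance that n randomly
# drawn pairs are all good is c**n. The number of trials needed to see at
# least one all-good sample with the given confidence follows from
# log(1 - confidence) / log(1 - c**n); since -log(0.025) ~ 3.7, this is
# roughly 4 / c**n when c**n is small, matching the figure in the text.
import math

def max_trials(alpha, beta, n, confidence=0.975):
    c = alpha / (alpha + beta)
    p_good = c ** n                  # probability one sample of n pairs is all good
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_good))

print(max_trials(alpha=80, beta=20, n=3))   # c = 0.8, c^3 = 0.512 → 6
print(round(4 / 0.8 ** 3))                  # the 4/c^n approximation → 8
```

The approximation overshoots the exact bound when c^n is not small, so it errs on the safe side for the stopping criterion.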
When a laser scanner and an RGB-D camera simultaneously acquire information as front-end sensors, loose data-layer fusion causes acquisition errors and data loss. To solve this, the invention extracts the 2D laser data and the depth image data of the same target at the same time based on timestamps, screens out the pair of sequences with the highest similarity value, and then uses the correlation between the sequences to transplant the change trend of the valid sequence of one modality (laser or depth image) onto the invalid data interval of the other modality, yielding a new repaired sequence. The repaired sequence enhances the visualization of both the 2D laser map and the depth image, and the data of the two sensors compensate each other.
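The screening step described above can be sketched as a sliding-window search that maximizes Pearson correlation (the covariance-based similarity of the claims). The data, window length, and function name below are invented for illustration:

```python
import numpy as np

def best_match(laser, depth_row, w):
    """Slide a length-w window over both 1-D sequences and return the
    (similarity, u, v) of the best-matching pair of sub-sequences."""
    best = (-2.0, 0, 0)                      # Pearson r lies in [-1, 1]
    for u in range(len(laser) - w + 1):
        Lw = laser[u:u + w]
        for v in range(len(depth_row) - w + 1):
            Dw = depth_row[v:v + w]
            r = np.corrcoef(Lw, Dw)[0, 1]    # Cov / sqrt(Var * Var)
            if r > best[0]:
                best = (r, u, v)
    return best

# Toy data: the depth row repeats the laser profile, shifted by 3 samples.
laser = np.array([0.0, 1.0, 4.0, 9.0, 4.0, 1.0, 0.0, 0.0])
depth_row = np.concatenate([np.full(3, 2.0), laser + 2.0])
r, u, v = best_match(laser, depth_row, w=5)
print(u, v, round(r, 3))   # the best windows are offset by v - u = 3
```

The returned offsets (u, v) identify the reference sub-sequences whose trend one modality can borrow to fill the other's invalid interval.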
In addition, the invention also provides a data alignment restoration system for fusing the 2D laser and the depth image, which can improve the accuracy of data information acquisition.
As shown in fig. 2, the data alignment restoration system for fusing 2D laser and depth image according to the present invention includes an extraction unit 1, a determination unit 2, and a restoration unit 3.
Specifically, the extraction unit 1 is configured to extract 2D laser data and depth image data acquired at the same time for the same target;
the determining unit 2 is configured to determine the sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
the repairing unit 3 is configured to determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repair the hole region and/or the burr region based on the sequence data.
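A hypothetical skeleton of this three-unit decomposition is sketched below; the class name, signatures, and stub behaviors are illustrative assumptions, not taken from the patent:

```python
from dataclasses import dataclass
from typing import Callable, Sequence, Tuple

@dataclass
class AlignmentRepairSystem:
    # Unit 1: pair laser/depth data captured at the same timestamp.
    extract: Callable[[dict, dict, int], Tuple[Sequence, Sequence]]
    # Unit 2: locate the most similar sub-sequences, returning (u, v, similarity).
    determine: Callable[[Sequence, Sequence], Tuple[int, int, float]]
    # Unit 3: repair holes/burrs in the matched sequences.
    repair: Callable[[Sequence, Sequence, int, int], Sequence]

    def run(self, laser_stream, depth_stream, t):
        laser, depth = self.extract(laser_stream, depth_stream, t)
        u, v, _ = self.determine(laser, depth)
        return self.repair(laser, depth, u, v)

# Stub wiring just to show the data flow: a zero (hole) in the laser
# sequence is filled from the corresponding depth value.
system = AlignmentRepairSystem(
    extract=lambda ls, ds, t: (ls[t], ds[t]),
    determine=lambda l, d: (0, 0, 1.0),
    repair=lambda l, d, u, v: [x if x != 0 else y for x, y in zip(l, d)],
)
print(system.run({0: [3, 0, 5]}, {0: [3, 4, 5]}, 0))   # [3, 4, 5]
```

The design choice mirrors the text: each unit can be swapped independently, e.g. a different similarity measure in the determining unit.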
In addition, the invention also provides a data alignment restoration system for fusing the 2D laser and the depth image, which comprises:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining the sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
The invention also provides the following scheme:
a computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extracting 2D laser data and depth image data acquired for the same target at the same time;
determining the sequence data with the highest similarity value in the 2D laser data and the depth image data; the sequence data comprises a 2D laser reference sequence and a two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
Compared with the prior art, the data alignment restoration system and the computer-readable storage medium for fusing the 2D laser and the depth image have the same beneficial effects as the data alignment restoration method for fusing the 2D laser and the depth image, and are not described herein again.
The technical solutions of the present invention have thus been described with reference to the preferred embodiments shown in the drawings, but those skilled in the art will readily understand that the scope of the present invention is not limited to these specific embodiments. Those skilled in the art may make equivalent changes or substitutions to the related technical features without departing from the principle of the invention, and the technical solutions after such changes or substitutions fall within the protection scope of the invention.

Claims (8)

1. A data alignment repair method for fusing a 2D laser and a depth image, characterized by comprising the following steps:
extracting 2D laser data and depth image data acquired for the same target at the same time, specifically comprising:
letting D_i denote the i-th depth image captured by an RGB-D camera and the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
letting L_j denote the j-th 2D laser datum obtained by the 2D lidar sensor and the function g(j) denote the acquisition time of the 2D laser datum L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser data;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes a set time;
determining the sequence data with the highest similarity value in the 2D laser data and the depth image data, specifically comprising:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L^t_{u,w} and two-dimensional point sequences D^t_{v,w}, where L^t_{u,w} denotes a laser sequence of w points, starting at point u, selected from the 2D laser sequence L^t of length ω, with u + w ≤ ω, and D^t_{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
for each group of 2D laser sequence L^t_{u,w} and two-dimensional point sequence D^t_{v,w}, computing the similarity value of the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
taking the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
the sequence data comprises the 2D laser reference sequence and the two-dimensional point reference sequence;
and determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repairing the hole region and/or the burr region based on the sequence data.
2. The data alignment repair method for fusing a 2D laser and a depth image according to claim 1, wherein aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L^t_{u,w} and two-dimensional point sequences D^t_{v,w} specifically comprises:
determining a suitable point (u, v) according to a formula [formula not legible in the source text];
wherein u denotes the starting point of the 2D laser sequence L^t_{u,w}; v denotes the starting point of the two-dimensional point sequence D^t_{v,w}; W denotes the width of the depth image data D_{i,t}; H denotes the height of the depth image data D_{i,t}; w denotes the length of the two-dimensional point sequence D^t_{v,w}, with 1 ≤ w ≤ W; Ψ denotes the length of the 2D laser data L_{j,t}, with Ψ ≥ w; and ω denotes the length of the 2D laser sequence L^t, with 1 ≤ ω ≤ Ψ;
determining the 2D laser sequence L^t_{u,w} according to the x-axis coordinate u, and determining the two-dimensional point sequence D^t_{v,w} according to the y-axis coordinate v.
3. The data alignment repair method for fusing a 2D laser and a depth image according to claim 1, wherein the similarity value of the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} is calculated according to any one of the following formulas:
Formula I:
ρ₁(D^t_{v,w}, L^t_{u,w}) = Cov(D^t_{v,w}, L^t_{u,w}) / (√Var(D^t_{v,w}) · √Var(L^t_{u,w}))
wherein ρ₁(D^t_{v,w}, L^t_{u,w}) denotes the similarity value of the two-dimensional point sequence D^t_{v,w} and the two-dimensional laser sequence L^t_{u,w}; Cov(·,·) denotes solving the covariance of the two-dimensional point sequence D^t_{v,w} and the two-dimensional laser sequence L^t_{u,w}, with Cov(X, Y) = E(XY) − E(X)E(Y); Var(·) denotes solving the variance of a sequence; and E(·) denotes the mathematical expectation;
Formula II:
ρ₂(D^t_{v,w}, L^t_{u,w}) = [formula not legible in the source text]
wherein ρ₂(D^t_{v,w}, L^t_{u,w}) denotes the similarity value of the two-dimensional point sequence D^t_{v,w} and the two-dimensional laser sequence L^t_{u,w};
Formula III:
ρ₃(D^t_{v,w}, L^t_{u,w}) = [formula not legible in the source text]
wherein ρ₃(D^t_{v,w}, L^t_{u,w}) denotes the similarity value of the two-dimensional point sequence D^t_{v,w} and the two-dimensional laser sequence L^t_{u,w}.
4. The data alignment repair method for fusing a 2D laser and a depth image according to claim 1, wherein determining a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value and repairing the hole region and/or the burr region based on the sequence data specifically comprises:
when the value of point L^t_{u,β} in the 2D laser sequence L^t_{u,w} or of point D_{v,β} in the two-dimensional point sequence D^t_{v,w} is 0, the noise-pair region of the 2D laser sequence L^t_{u,w} or of the two-dimensional point sequence D^t_{v,w} is a hole, where β denotes a noise pair;
deleting point L^t_{u,β} and replacing it with the suitable point adjacent to L^t_{u,β} in the 2D laser sequence L^t_{u,w}; deleting point D_{v,β} and replacing it with the suitable point adjacent to D_{v,β} in the two-dimensional point sequence D^t_{v,w}; the suitable points are the corresponding points that match well between the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
when the value of point L^t_{u,β} in the 2D laser sequence L^t_{u,w} or of point D_{v,β} in the two-dimensional point sequence D^t_{v,w} is not 0, determining the points with hollows or burrs in the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} based on the following constraint:
‖D_{v,β} − L^t_{u,β} × A‖ > ε
wherein ε denotes the threshold for discriminating the pairing distance, and A denotes a 3 × 3 homography matrix that maps the 2D laser sequence L^t_{u,w} onto the two-dimensional point sequence D^t_{v,w};
determining the repair data according to the following formula:
L̂^t_{u,β} = [formula not legible in the source text] or D̂_{v,β} = [formula not legible in the source text]
wherein L̂^t_{u,β} is the point in the 2D laser sequence L^t_{u,w} used to repair the noise point L^t_{u,β}; D̂_{v,β} is the point in the two-dimensional point sequence D^t_{v,w} used to repair the noise point D_{v,β}; and α denotes a good matching pair between the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
replacing the noise point L^t_{u,β} with the point L̂^t_{u,β}, and replacing the noise point D_{v,β} with the point D̂_{v,β}.
5. The data alignment repair method for fusing a 2D laser and a depth image according to claim 4, wherein the homography matrix A is calculated as follows:
the points of the 2D laser sequence L^t_{u,w} on the x, z two-dimensional plane form the point set L; L is an m × 3 point-set matrix whose e-th row holds the value of the e-th point of the 2D laser sequence L^t_{u,w} on the x, z two-dimensional plane;
the points of the two-dimensional point sequence D^t_{v,w} on the x, z two-dimensional plane form the point set D; D is an m × 3 point-set matrix whose e-th row holds the value of the e-th point of the two-dimensional point sequence D^t_{v,w} on the x, z two-dimensional plane, where 1 ≤ e ≤ m;
D = L × A.
6. A data alignment repair system for fusing a 2D laser and a depth image, characterized by comprising:
an extraction unit configured to extract 2D laser data and depth image data acquired for the same target at the same time, specifically comprising:
letting D_i denote the i-th depth image captured by an RGB-D camera and the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
letting L_j denote the j-th 2D laser datum obtained by the 2D lidar sensor and the function g(j) denote the acquisition time of the 2D laser datum L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser data;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes a set time;
a determining unit configured to determine the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising a 2D laser reference sequence and a two-dimensional point reference sequence, specifically comprising:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L^t_{u,w} and two-dimensional point sequences D^t_{v,w}, where L^t_{u,w} denotes a laser sequence of w points, starting at point u, selected from the 2D laser sequence L^t of length ω, with u + w ≤ ω, and D^t_{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
for each group of 2D laser sequence L^t_{u,w} and two-dimensional point sequence D^t_{v,w}, computing the similarity value of the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
taking the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and a repair unit configured to determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and to repair the hole region and/or the burr region based on the sequence data.
7. A data alignment repair system for fusing a 2D laser and a depth image, comprising:
a processor; and
a memory arranged to store computer executable instructions that, when executed, cause the processor to:
extract 2D laser data and depth image data acquired for the same target at the same time, specifically comprising:
letting D_i denote the i-th depth image captured by an RGB-D camera and the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
letting L_j denote the j-th 2D laser datum obtained by the 2D lidar sensor and the function g(j) denote the acquisition time of the 2D laser datum L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser data;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes a set time;
determine the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising a 2D laser reference sequence and a two-dimensional point reference sequence, specifically comprising:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L^t_{u,w} and two-dimensional point sequences D^t_{v,w}, where L^t_{u,w} denotes a laser sequence of w points, starting at point u, selected from the 2D laser sequence L^t of length ω, with u + w ≤ ω, and D^t_{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
for each group of 2D laser sequence L^t_{u,w} and two-dimensional point sequence D^t_{v,w}, computing the similarity value of the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
taking the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repair the hole region and/or the burr region based on the sequence data.
8. A computer-readable storage medium storing one or more programs that, when executed by an electronic device including a plurality of application programs, cause the electronic device to:
extract 2D laser data and depth image data acquired for the same target at the same time, specifically comprising:
letting D_i denote the i-th depth image captured by an RGB-D camera and the function f(i) denote the acquisition time of the depth image D_i, where 1 ≤ i ≤ M and M denotes the number of depth images;
letting L_j denote the j-th 2D laser datum obtained by the 2D lidar sensor and the function g(j) denote the acquisition time of the 2D laser datum L_j, where 1 ≤ j ≤ N and N denotes the number of 2D laser data;
determining, according to the formula f(i) = g(j) = t, the 2D laser data L_{j,t} and the depth image data D_{i,t} acquired for the same target at the same time, where t denotes a set time;
determine the sequence data with the highest similarity value in the 2D laser data and the depth image data, the sequence data comprising a 2D laser reference sequence and a two-dimensional point reference sequence, specifically comprising:
aligning the 2D laser data and the depth image data along the horizontal x axis or the vertical y axis according to the depth information to obtain multiple groups of 2D laser sequences L^t_{u,w} and two-dimensional point sequences D^t_{v,w}, where L^t_{u,w} denotes a laser sequence of w points, starting at point u, selected from the 2D laser sequence L^t of length ω, with u + w ≤ ω, and D^t_{v,w} denotes a two-dimensional point sequence with starting point v and length w along the x-axis coordinate;
for each group of 2D laser sequence L^t_{u,w} and two-dimensional point sequence D^t_{v,w}, computing the similarity value of the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w};
taking the 2D laser sequence L^t_{u,w} and the two-dimensional point sequence D^t_{v,w} corresponding to the highest similarity value as the 2D laser reference sequence and the two-dimensional point reference sequence;
and determine a hole region and/or a burr region in the sequence data according to the sequence data with the highest similarity value, and repair the hole region and/or the burr region based on the sequence data.
CN202011519696.0A 2020-12-21 2020-12-21 Data alignment restoration method and system for fusion of 2D laser and depth image Active CN112561824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011519696.0A CN112561824B (en) 2020-12-21 2020-12-21 Data alignment restoration method and system for fusion of 2D laser and depth image

Publications (2)

Publication Number Publication Date
CN112561824A CN112561824A (en) 2021-03-26
CN112561824B true CN112561824B (en) 2023-04-07

Family

ID=75032102

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011519696.0A Active CN112561824B (en) 2020-12-21 2020-12-21 Data alignment restoration method and system for fusion of 2D laser and depth image

Country Status (1)

Country Link
CN (1) CN112561824B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110428372A (en) * 2019-07-08 2019-11-08 希格斯动力科技(珠海)有限公司 Depth data and 2D laser data fusion method and device, storage medium
CN111291708A (en) * 2020-02-25 2020-06-16 华南理工大学 Transformer substation inspection robot obstacle detection and identification method integrated with depth camera
CN111624622A (en) * 2020-04-24 2020-09-04 库卡机器人(广东)有限公司 Obstacle detection method and device
CN112016612A (en) * 2020-08-26 2020-12-01 四川阿泰因机器人智能装备有限公司 Monocular depth estimation-based multi-sensor fusion SLAM method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109300190B (en) * 2018-09-06 2021-08-10 百度在线网络技术(北京)有限公司 Three-dimensional data processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN112561824A (en) 2021-03-26

Similar Documents

Publication Publication Date Title
CN111882612B (en) Vehicle multi-scale positioning method based on three-dimensional laser detection lane line
CN110555901B (en) Method, device, equipment and storage medium for positioning and mapping dynamic and static scenes
CN110807809B (en) Light-weight monocular vision positioning method based on point-line characteristics and depth filter
CN110599545B (en) Feature-based dense map construction system
CN111830953A (en) Vehicle self-positioning method, device and system
CN112419497A (en) Monocular vision-based SLAM method combining feature method and direct method
CN112862881B (en) Road map construction and fusion method based on crowd-sourced multi-vehicle camera data
CN102201058A (en) Cat eye effect object recognition algorithm of active and passive imaging system sharing same aperture
CN110634138A (en) Bridge deformation monitoring method, device and equipment based on visual perception
CN111028271A (en) Multi-camera personnel three-dimensional positioning and tracking system based on human skeleton detection
CN112652020A (en) Visual SLAM method based on AdaLAM algorithm
CN113781532B (en) Automatic matching and searching method for SAR satellite image and optical image
Chen et al. Camera geolocation from mountain images
CN112580683B (en) Multi-sensor data time alignment system and method based on cross correlation
CN112561824B (en) Data alignment restoration method and system for fusion of 2D laser and depth image
Praczyk et al. Concept and first results of optical navigational system
CN117274627A (en) Multi-temporal snow remote sensing image matching method and system based on image conversion
Shen et al. Plant image mosaic based on depth and color dual information feature source from Kinect
CN111080712A (en) Multi-camera personnel positioning, tracking and displaying method based on human body skeleton detection
CN114283199A (en) Dynamic scene-oriented dotted line fusion semantic SLAM method
CN114140494A (en) Single-target tracking system and method in complex scene, electronic device and storage medium
CN113409334A (en) Centroid-based structured light angle point detection method
CN111178264A (en) Estimation algorithm for tower footing attitude of iron tower in aerial image of unmanned aerial vehicle
Rasyidy et al. A Framework for Road Boundary Detection based on Camera-LIDAR Fusion in World Coordinate System and Its Performance Evaluation Using Carla Simulator
CN116385502B (en) Image registration method based on region search under geometric constraint

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant