CN113516682A - Loop detection method of laser SLAM - Google Patents

Loop detection method of laser SLAM

Info

Publication number
CN113516682A
CN113516682A
Authority
CN
China
Prior art keywords
frame
current frame
descriptor
point cloud
loop detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110775299.8A
Other languages
Chinese (zh)
Other versions
CN113516682B (en)
Inventor
张立伟
刘洁琳
徐光进
何炳蔚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fuzhou University
Original Assignee
Fuzhou University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN202110775299.8A priority Critical patent/CN113516682B/en
Publication of CN113516682A publication Critical patent/CN113516682A/en
Application granted granted Critical
Publication of CN113516682B publication Critical patent/CN113516682B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/97Determining parameters from multiple pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a loop detection method of laser SLAM, comprising the following steps. Step 1: acquire single-frame three-dimensional point cloud data and its synchronized IMU information. Step 2: segment the current frame point cloud into R×L unit spaces and store the unit-space data in an A_{R×L} matrix, whereby a first descriptor is constructed; store the segmented point cloud data and IMU information in a B vector, thereby constructing a second descriptor. Step 3: construct a KDTree from the second descriptors of the historical frames. Step 4: perform a nearest-neighbor search with the second descriptor to find n candidate similar frames in the KDTree; match each candidate similar frame against the current point cloud according to the first descriptor, and judge from whether the matching result meets the threshold whether a closed loop exists; if so, obtain the relative rotation between the current frame and the loop frame. The method achieves loop detection and map construction for mobile robots with higher precision and stronger robustness in outdoor scenes and in scenes where local road sections are highly similar.

Description

Loop detection method of laser SLAM
Technical Field
The invention relates to the field of mobile robots, in particular to a loop detection method of a laser SLAM.
Background
With the development of the field, increasing attention is being paid to deploying mobile robots in real environments. Enabling a robot to autonomously build a map of its surroundings while localizing itself within it (simultaneous localization and mapping, SLAM) is a core problem in mobile robotics, and despite many advances, robots' ability to map and localize reliably is still quite limited. At present, SLAM methods based only on lidar are prone to accumulating drift after long-term operation when the robot moves fast or the outdoor environment changes little.
Loop detection can effectively correct this motion drift in laser SLAM. Existing loop detection methods work by processing feature vectors built from point cloud features, such as histograms, and are limited when outdoor scene changes are small or when the robot traverses the same road section in the opposite direction.
Disclosure of Invention
In view of this, the present invention provides a loop detection method for a laser SLAM, which can reduce the complexity of point cloud calculation and ensure good loop detection accuracy and robustness.
The invention is realized by adopting the following scheme: a loop detection method of laser SLAM comprises the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and its synchronized inertial measurement unit (IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, whereby a first descriptor is constructed; storing the segmented point cloud data and IMU information in a B_{R+3} vector, thereby constructing a second descriptor;
step S3: constructing a KDTree from the second descriptors of all historical key frames; calculating similarity values between the second descriptor of the current frame and those of the historical key frames, and using the inter-frame similarity value as the nearest-neighbor search metric to find, in the KDTree, the n candidate similar frames with the highest similarity to the current frame;
step S4: according to the similarity values of step S3, performing loop matching between the column vectors of the first descriptors of each candidate similar frame and the current frame to obtain the relative rotation between the current frame and the candidate similar frame with the highest similarity; using the relative rotation value and the IMU information as the initial value of NDT matching to obtain the relative pose change between the current frame and the candidate similar frame; if the change is smaller than a threshold, loop detection succeeds and loop correction is performed; otherwise, loop detection fails and loop detection of the next frame begins.
Further, the specific content of step S1 is: taking the point cloud with the NaN value removed as the input point cloud of the loop detection method; and acquiring synchronous IMU data information of the current frame, and integrating the acquired IMU information to obtain position data of the current frame.
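A minimal sketch of step S1, under illustrative assumptions: the function and variable names are hypothetical, and the naive double integration below stands in for a proper IMU preintegration. It drops NaN returns from the scan, then integrates the synchronized accelerations to estimate the frame position.

```python
import numpy as np

def preprocess_frame(points, imu_accel, imu_dt, v0=None):
    """Hypothetical sketch of step S1: remove NaN points (range anomalies /
    reflective-surface interference), then integrate synchronized IMU
    accelerations twice to estimate the frame position. `points` is an
    (N, 3) array; `imu_accel` is an (M, 3) array sampled at interval
    `imu_dt` (all names are illustrative)."""
    # Keep only rows with no NaN coordinate.
    cloud = points[~np.isnan(points).any(axis=1)]

    # Naive double integration: acceleration -> velocity -> position.
    v = np.zeros(3) if v0 is None else np.asarray(v0, dtype=float)
    p = np.zeros(3)
    for a in imu_accel:
        v = v + a * imu_dt
        p = p + v * imu_dt
    return cloud, p

pts = np.array([[1.0, 0.0, 0.0], [np.nan, 1.0, 0.0], [0.0, 2.0, 0.5]])
accel = np.array([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])  # constant 1 m/s^2 in x
cloud, pos = preprocess_frame(pts, accel, imu_dt=0.5)
print(cloud.shape, pos)
```

In a real pipeline the integration would also account for gravity, orientation, and bias, which this sketch omits.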
Further, the specific content of step S2 is:
the method comprises the steps of taking a laser radar as an origin, averagely dividing a current frame point cloud space into L rows along a clockwise direction according to a polar coordinate mode, averagely dividing the current frame point cloud space into R rows along a radius increasing direction, dividing the current frame point cloud space into R multiplied by L unit spaces, selecting a point closest to the origin in each unit space as a unit space value, and sequentially storing the unit space value in AR×LThe matrix completes the construction of the first descriptor;
the number of points of the point cloud in each unit space is counted; if it is greater than a threshold, the unit space is considered non-empty. For the r-th ring (r ∈ [1, R]) of the current frame point cloud space, the number of non-empty unit spaces is used as the r-th data value of the vector B_{R+3}. After all R data values have been computed, the position data of the current point cloud are stored as the last three values of B_{R+3}, completing the construction of the second descriptor.
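The two descriptors of step S2 can be sketched as follows; the parameter values (R, L, maximum radius, point threshold) and all names are illustrative assumptions, not values fixed by the patent.

```python
import numpy as np

def build_descriptors(cloud, position, R=20, L=60, r_max=80.0, min_pts=30):
    """Sketch of step S2: partition the frame into R rings x L sectors in
    polar coordinates; the smallest range in each cell forms the A_{R x L}
    entry (first descriptor), and the per-ring non-empty cell counts plus
    the IMU position form the B_{R+3} vector (second descriptor)."""
    x, y = cloud[:, 0], cloud[:, 1]
    rng = np.hypot(x, y)                    # range from the lidar origin
    ang = np.arctan2(y, x)                  # angle in [-pi, pi]
    ring = np.minimum((rng / r_max * R).astype(int), R - 1)
    sect = np.minimum(((ang + np.pi) / (2 * np.pi) * L).astype(int), L - 1)

    A = np.zeros((R, L))                    # first descriptor A_{R x L}
    counts = np.zeros((R, L), dtype=int)
    for r, s, d in zip(ring, sect, rng):
        counts[r, s] += 1
        if A[r, s] == 0 or d < A[r, s]:     # keep the point closest to origin
            A[r, s] = d
    nonempty = (counts >= min_pts).sum(axis=1)   # R per-ring counts
    B = np.concatenate([nonempty, position])     # second descriptor B_{R+3}
    return A, B

# Tiny example: 3 points, 2 rings x 4 sectors, 1-point threshold.
cloud = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])
A, B = build_descriptors(cloud, np.zeros(3), R=2, L=4, r_max=2.0, min_pts=1)
print(A.shape, B)
```

The grid layout mirrors Scan Context-style polar descriptors; the exact cell-value choice here (minimum range) follows the text's "point closest to the origin".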
Further, the similarity in step S3 is calculated as follows: constructing a KDTree by using a second descriptor, firstly carrying out Euclidean distance calculation on the last three values of the current frame second descriptor and the historical frame second descriptor, if the calculation result is higher than a threshold value, discarding the historical frame, otherwise, calculating the distance between the current frame second descriptor and the historical frame second descriptor to obtain a candidate similar frame, wherein the distance calculation formula is as follows:
[distance formula, shown as an image in the original patent]

where b_m and b_n are the second descriptors of the m-th and n-th frame point clouds, respectively.
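Since the distance formula itself is only reproduced as an image in the original, the following normalized Euclidean distance over the ring-count part of B_{R+3} (dropping the last three position values) is an assumed stand-in, not the patented formula.

```python
import numpy as np

def second_descriptor_distance(b_m, b_n):
    """Assumed form of the second-descriptor distance: compare only the R
    ring counts (the last three entries of B_{R+3} are position data, which
    the patent handles by a separate Euclidean prefilter), normalized by R."""
    u = np.asarray(b_m[:-3], dtype=float)
    v = np.asarray(b_n[:-3], dtype=float)
    return np.linalg.norm(u - v) / len(u)

b1 = np.array([3.0, 5.0, 2.0, 0.0, 0.0, 0.0])  # R = 3 ring counts + position
b2 = np.array([3.0, 1.0, 2.0, 1.0, 0.0, 0.0])
d = second_descriptor_distance(b1, b2)
print(d)
```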
Further, the specific content of step S4 is:
performing the similarity calculation of step S3 using the column vectors of the first descriptors of the candidate similar frame and the current frame; translating the first descriptors of the current frame according to the column vectors to obtain L first descriptors of the current frame, performing similarity calculation on the L first descriptors and the candidate similar frame one by one, and selecting a value with the highest similarity result as a similarity value of the current frame and the candidate similar frame; selecting a frame with the highest similarity with the current frame from the n candidate similar frames as a loop detection frame, wherein the similarity judgment formula is as follows:
[similarity formula, shown as an image in the original patent]

where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, s_m^i is the i-th column of the current frame, and s_n^k is the k-th column of the candidate similar frame;
if the x-th (x ∈ [1, L]) first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is:
[rotation formula, shown as an image in the original patent]
Q and the last three values of the vector B_{R+3} are used as the initial pose for NDT registration between the current frame and the loop detection frame. If the registration result is smaller than a threshold, loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and loop detection of the next key frame begins.
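A hedged sketch of the column-shift matching in step S4, modeled on Scan Context-style descriptor matching because the patent's similarity and rotation formulas are only available as images; the mean column-wise cosine score and the 360/L scaling of the best shift are assumptions.

```python
import numpy as np

def best_shift_similarity(S_m, S_n):
    """Circularly shift the current frame's first descriptor S_m over all L
    column offsets, score each shift by the mean column-wise cosine
    similarity against the candidate frame's descriptor S_n, and return the
    best shift x, its score, and the implied rotation in degrees."""
    R, L = S_m.shape
    best_x, best_score = 0, -1.0
    for x in range(L):
        shifted = np.roll(S_m, x, axis=1)
        num = (shifted * S_n).sum(axis=0)
        den = np.linalg.norm(shifted, axis=0) * np.linalg.norm(S_n, axis=0)
        valid = den > 0
        score = (num[valid] / den[valid]).mean() if valid.any() else 0.0
        if score > best_score:
            best_x, best_score = x, score
    # One sector of shift corresponds to 360/L degrees of yaw (assumed).
    Q_deg = best_x * 360.0 / L
    return best_x, best_score, Q_deg

# Example: S_m is S_n rotated by two sectors, so the best shift is x = 2.
S_n = np.array([[1.0, 2.0, 3.0, 4.0], [4.0, 3.0, 2.0, 1.0]])
S_m = np.roll(S_n, -2, axis=1)
x, score, Q = best_shift_similarity(S_m, S_n)
print(x, score, Q)
```

In the full pipeline, Q and the last three values of B_{R+3} would then seed an NDT registration (e.g. PCL's NormalDistributionsTransform), whose fitness score is compared against the threshold to accept or reject the loop.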
Compared with the prior art, the invention has the following beneficial effects:
the method is suitable for scenes with obvious characteristic changes, and the calculation complexity of point cloud data is reduced; in an outdoor scene with unobvious local road section feature changes and a scene with large difference between the loop driving direction and the historical driving direction, the loop detection precision and robustness can be guaranteed while the point cloud calculation complexity is reduced.
Drawings
FIG. 1 is a flow chart of a method according to an embodiment of the present invention.
Fig. 2 is a trace truth value in the scenario of KITTI data set 00 according to the embodiment of the present invention and a trace comparison diagram of the method of the present invention.
Fig. 3 is a trace truth value in the scenario of KITTI data set 08 according to an embodiment of the present invention and a trace comparison diagram of the method according to the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 1, the present embodiment provides a loop detection method for a laser SLAM, including the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and its synchronized inertial measurement unit (IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, whereby a first descriptor is constructed; storing the segmented point cloud data and IMU information in a B_{R+3} vector, thereby constructing a second descriptor;
step S3: constructing a KDTree (k-dimensional tree) from the second descriptors of all historical key frames; calculating similarity values between the second descriptor of the current frame and those of the historical key frames, and using the inter-frame similarity value as the nearest-neighbor search metric to find, in the KDTree, the n candidate similar frames with the highest similarity to the current frame;
step S4: according to the similarity values of step S3, performing loop matching between the column vectors of the first descriptors of each candidate similar frame and the current frame to obtain the relative rotation between the current frame and the candidate similar frame with the highest similarity; using the relative rotation value and the IMU information as the initial value of NDT matching to obtain the relative pose change between the current frame and the candidate similar frame; if the change is smaller than a threshold (such as 0.01), loop detection succeeds and loop correction is performed; otherwise, loop detection fails and loop detection of the next frame begins.
in this embodiment, the specific content of step S1 is: taking the point cloud with the NaN value (the distance abnormality in the point cloud data or the interference point reflected by the surface) removed as an input point cloud; and acquiring synchronous IMU data information of the current point cloud frame, and integrating the acquired IMU information to obtain the position data of the current frame.
In this embodiment, the specific content of step S2 is:
the method comprises the steps of taking a laser radar as an origin, averagely dividing a current frame point cloud space into L rows along a clockwise direction according to a polar coordinate mode, averagely dividing the current frame point cloud space into R rows along a radius increasing direction, dividing the current frame point cloud space into R multiplied by L unit spaces, selecting a point closest to the origin in each unit space as a unit space value, and sequentially storing the unit space value in AR×LThe matrix completes the construction of the first descriptor;
counting the point number of the point cloud in the unit space, if the point number is greater than a threshold (such as 30 points), considering that the unit space is not empty, and determining the rr < th > of the current frame point cloud space as [1, R ∈ [ ]]Line, R ∈ [1, R ]]The number of the hollow and non-hollow unit spaces is used as a vector BR+3After the R data values are all calculated, the position data of the current point cloud is used as a vector BR+3And (4) finishing the construction of the second descriptor.
Storing a first descriptor and a second descriptor of the point cloud of each key frame into a descriptor library in a matrix form by taking the index value of the corresponding key frame as a name so as to facilitate subsequent calling;
in the present embodiment, the similarity calculation in step S3 is as follows: constructing a KDTree by using a second descriptor, firstly carrying out Euclidean distance calculation on the last three values of the current frame second descriptor and the historical frame second descriptor, if the calculation result is higher than a threshold value (such as 3 meters), discarding the historical frame, otherwise, calculating the distance between the current frame second descriptor and the historical frame second descriptor to obtain a candidate similar frame, wherein the distance calculation formula is as follows:
[distance formula, shown as an image in the original patent]

where b_m and b_n are the second descriptors of the m-th and n-th frame point clouds, respectively.
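Step S3's candidate retrieval can be sketched as below; the linear scan stands in for the KDTree the patent builds (in practice scipy.spatial.cKDTree or similar), the 3-metre prefilter follows the example value in the text, and everything else (names, distance form) is an illustrative assumption.

```python
import numpy as np

def candidate_frames(b_cur, history, n=3, pos_thresh=3.0):
    """Sketch of step S3: discard history frames whose integrated IMU
    position (the last three values of B_{R+3}) is farther than
    `pos_thresh` metres from the current frame, then rank the survivors
    by an assumed normalized descriptor distance and keep the n best."""
    cur_pos = b_cur[-3:]
    scored = []
    for idx, b in history.items():
        if np.linalg.norm(b[-3:] - cur_pos) > pos_thresh:
            continue  # too far away to be a loop candidate
        d = np.linalg.norm(b[:-3] - b_cur[:-3]) / (len(b) - 3)
        scored.append((d, idx))
    scored.sort()
    return [idx for _, idx in scored[:n]]

hist = {
    0: np.array([3.0, 5.0, 2.0, 0.0, 0.0, 0.0]),
    1: np.array([3.0, 1.0, 2.0, 10.0, 0.0, 0.0]),  # ~9.5 m away: filtered
    2: np.array([3.0, 5.0, 1.0, 1.0, 0.0, 0.0]),
}
cur = np.array([3.0, 5.0, 2.0, 0.5, 0.0, 0.0])
cands = candidate_frames(cur, hist, n=2)
print(cands)
```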
In this embodiment, the specific content of step S4 is:
performing the similarity calculation of step S3 using the column vectors of the first descriptors of the candidate similar frame and the current frame; translating the first descriptors of the current frame according to the column vectors to obtain L first descriptors of the current frame, performing similarity calculation on the L first descriptors and the candidate similar frame one by one, and selecting a value with the highest similarity result as a similarity value of the current frame and the candidate similar frame; selecting a frame with the highest similarity with the current frame from the n candidate similar frames as a loop detection frame, wherein the similarity judgment formula is as follows:
[similarity formula, shown as an image in the original patent]

where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, s_m^i is the i-th column of the current frame, and s_n^k is the k-th column of the candidate similar frame;
if the x-th (x ∈ [1, L]) first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is:
[rotation formula, shown as an image in the original patent]
Q and the last three values of the vector B_{R+3} are used as the initial pose for NDT registration between the current frame and the loop detection frame. If the registration result is smaller than a threshold (such as 0.01), loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and loop detection of the next key frame begins.
Preferably, in this embodiment, the point cloud data subjected to the time consistency processing is fused with the IMU data to construct a second descriptor, which is used for fast retrieval to obtain candidate similar frames; and finally, registering a local map formed by the current frame and the candidate similar frame to finish loop detection.
Preferably, in the present embodiment,
step 1: acquiring single key frame point cloud data and synchronous IMU information thereof; the main implementation mode is as follows: removing the interference points of distance abnormality or surface reflection in the point cloud data; acquiring synchronous IMU data information of a current point cloud frame; integrating the acquired IMU information to obtain the position data of the current point cloud frame;
Step 2: divide the current frame point cloud space into R×L unit spaces and store the unit-space data in an A_{R×L} matrix, whereby a first descriptor is constructed; store the segmented point cloud data and IMU information in a B vector, thereby constructing a second descriptor. The main implementation is as follows: taking the laser radar as the origin, the current frame point cloud space is divided evenly, in polar coordinates, into L sectors along the clockwise direction and into R rings along the direction of increasing radius, yielding R×L unit spaces; in each unit space the point closest to the origin is selected as the unit-space value, and these values are stored in order in the A_{R×L} matrix, completing the construction of the first descriptor;

the number of points of the point cloud in each unit space is counted; if it is greater than a threshold (such as 30 points), the unit space is considered non-empty. The number of non-empty unit spaces in the r-th ring is used as the r-th data value of the vector B; after all R data values have been computed, the position data of the current point cloud are stored in the vector B, completing the construction of the second descriptor;
storing a first descriptor and a second descriptor of the point cloud of each key frame into a descriptor library in a matrix form by taking the index value of the corresponding key frame as a name so as to facilitate subsequent calling;
and step 3: and constructing a KDTree by using the second descriptor, firstly carrying out Euclidean distance calculation on the last three values of the second descriptor of the current frame and the second descriptor of the historical frame, if the calculation result is higher than a threshold (such as 3 meters), discarding the historical frame, otherwise,
and obtaining a candidate similar frame by calculating the similarity between the current frame second descriptor and the historical frame second descriptor, wherein the similarity calculation formula is as follows:
[similarity formula, shown as an image in the original patent]

where b_m and b_n are the second descriptors of the m-th and n-th frame point clouds, respectively;
and 4, step 4: performing the similarity calculation in step 3 by using the column vectors of the first descriptors of the candidate similar frame and the current frame; translating the first descriptors of the current frame according to the column vectors to obtain L first descriptors of the current frame, performing similarity calculation on the L first descriptors and the candidate similar frame one by one, and selecting a value with the highest similarity result as a similarity value of the current frame and the candidate similar frame; selecting a frame with the highest similarity with the current frame from the n candidate similar frames as a loop detection frame, wherein the similarity judgment formula is as follows:
[similarity formula, shown as an image in the original patent]

where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, s_m^i is the i-th column of the current frame, and s_n^k is the k-th column of the candidate similar frame;
if the x-th (x ∈ [1, L]) first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is:
[rotation formula, shown as an image in the original patent]
Q and the last three values of the vector B_{R+3} are used as the initial pose for NDT registration between the current frame and the loop detection frame. If the registration result is smaller than a threshold (such as 0.01), loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and loop detection of the next key frame begins.
In this embodiment, to verify the feasibility of the method, the 00 and 05 sequences of the KITTI dataset, which contain more loop closures, were selected to implement the scheme described above. The results are shown in fig. 2 (00 sequence) and fig. 3 (05 sequence), which compare the full trajectory built with the method (solid line) against the ground-truth trajectory of the sequence (dotted line, i.e. the reference); the trajectory comparison illustrates the soundness and effectiveness of the method.
The above description is only a preferred embodiment of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention should be covered by the present invention.

Claims (5)

1. A loop detection method of laser SLAM is characterized in that: the method comprises the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and its synchronized inertial measurement unit (IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, whereby a first descriptor is constructed; storing the segmented point cloud data and IMU information in a B_{R+3} vector, thereby constructing a second descriptor;
step S3: constructing a KDTree by using second descriptors of all historical key frames; calculating a similarity value between the current frame and the second descriptor of the historical key frame, and using the inter-frame similarity value as a nearest neighbor searching edge to find n candidate similar frames with the highest similarity with the current frame from the KDTree;
step S4: performing loop matching of the candidate similar frame and the current frame on the column vectors of the first descriptors of the candidate similar frame and the current frame according to the similarity value in the step S3 to obtain a relative rotation value of the current frame and the candidate similar frame with the highest similarity, taking the relative rotation value and IMU information as an initial value of NDT matching to obtain relative pose change of the current frame and the candidate similar frame, and if the variation is smaller than a threshold value, successfully performing loop detection and performing loop correction; otherwise, the loop detection fails, and the loop detection of the next frame is entered.
2. The method of claim 1, wherein the loop detection of the laser SLAM comprises: the specific content of step S1 is: taking the point cloud with the NaN value removed as an input point cloud; and acquiring synchronous IMU data information of the current point cloud frame, and integrating the acquired IMU information to obtain the position data of the current frame.
3. The method of claim 1, wherein the loop detection of the laser SLAM comprises: the specific content of step S2 is:
the method comprises the steps of taking a laser radar as an origin, averagely dividing a current frame point cloud space into L rows along a clockwise direction according to a polar coordinate mode, averagely dividing the current frame point cloud space into R rows along a radius increasing direction, dividing the current frame point cloud space into R multiplied by L unit spaces, selecting a point closest to the origin in each unit space as a unit space value, and sequentially storing the unit space value in AR×LThe matrix completes the construction of the first descriptor;
counting the point number of the point cloud in the unit space, if the point number is greater than a threshold value, determining that the unit space is not empty, and determining the R-th line of the current frame point cloud space as R belonging to [1, R ∈]The number of the hollow and non-hollow unit spaces is used as a vector BR+3After the R data values are all calculated, the position data of the current point cloud is used as a vector BR+3And (4) finishing the construction of the second descriptor.
4. The method of claim 1, wherein the loop detection of the laser SLAM comprises: the similarity in step S3 is calculated as follows: constructing a KDTree by using a second descriptor, firstly carrying out Euclidean distance calculation on the last three values of the current frame second descriptor and the historical frame second descriptor, if the calculation result is higher than a threshold value, discarding the historical frame, otherwise, calculating the similarity of the current frame second descriptor and the historical frame second descriptor to obtain a candidate similar frame, wherein the similarity calculation formula is as follows:
[similarity formula, shown as an image in the original patent]

where b_m and b_n are the second descriptors of the m-th and n-th frame point clouds, respectively.
5. The method of claim 1, wherein the loop detection of the laser SLAM comprises: the specific content of step S4 is:
performing the similarity calculation of step S3 using the column vectors of the first descriptors of the candidate similar frame and the current frame; translating the first descriptors of the current frame according to the column vectors to obtain L first descriptors of the current frame, performing similarity calculation on the L first descriptors and the candidate similar frame one by one, and selecting a value with the highest similarity result as a similarity value of the current frame and the candidate similar frame; selecting a frame with the highest similarity with the current frame from the n candidate similar frames as a loop detection frame, wherein the similarity judgment formula is as follows:
[similarity formula, shown as an image in the original patent]

where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, s_m^i is the i-th column of the current frame, and s_n^k is the k-th column of the candidate similar frame;
if the x-th (x ∈ [1, L]) first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is:
[rotation formula, shown as an image in the original patent]
Q and the last three values of the vector B_{R+3} are used as the initial pose for NDT registration between the current frame and the loop detection frame. If the registration result is smaller than a threshold, loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and loop detection of the next key frame begins.
CN202110775299.8A 2021-07-08 2021-07-08 Loop detection method of laser SLAM Active CN113516682B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110775299.8A CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110775299.8A CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Publications (2)

Publication Number Publication Date
CN113516682A true CN113516682A (en) 2021-10-19
CN113516682B CN113516682B (en) 2023-08-11

Family

ID=78066455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110775299.8A Active CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Country Status (1)

Country Link
CN (1) CN113516682B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115047487A (en) * 2022-03-11 2022-09-13 武汉科技大学 Loop detection method and device based on point cloud intensity and height information
CN115200588A (en) * 2022-09-14 2022-10-18 煤炭科学研究总院有限公司 SLAM autonomous navigation method and device for mobile robot
CN116363371A (en) * 2023-05-26 2023-06-30 山东大学 Point cloud segmentation method based on inter-frame similarity

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN111563442A (en) * 2020-04-29 2020-08-21 上海交通大学 Slam method and system for fusing point cloud and camera image data based on laser radar
CN112665575A (en) * 2020-11-27 2021-04-16 重庆大学 SLAM loop detection method based on mobile robot
CN112767490A (en) * 2021-01-29 2021-05-07 福州大学 Outdoor three-dimensional synchronous positioning and mapping method based on laser radar
CN112907491A (en) * 2021-03-18 2021-06-04 中煤科工集团上海有限公司 Laser point cloud loopback detection method and system suitable for underground roadway


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
ALIVE一路向东: "Rereading '14 Lectures on Visual SLAM', ch. 11: Loop Closure Detection", Retrieved from the Internet <URL:https://zhuanlan.zhihu.com/p/378328947> *
QIN YE ET AL.: "A Novel Loop Closure Detection Approach Using Simplified Structure for Low-Cost LiDAR", <https://doi.org/10.3390/s20082299> *
HE Bingwei: "Data sampling and meshing for a laser-machine vision measurement system", Modern Manufacturing Engineering *
LIN Junqin; HAN Baoling; LUO Qingsheng; ZHAO Jiaheng; GE Zhuo: "Research on SLAM based on NDT matching and improved loop closure detection", Optical Technique, no. 02 *
CAI Binbin: "Research on high-precision mapping and localization with laser point clouds for autonomous driving", China Master's Theses Full-text Database (Engineering Science & Technology II) *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115047487A (en) * 2022-03-11 2022-09-13 武汉科技大学 Loop detection method and device based on point cloud intensity and height information
CN115200588A (en) * 2022-09-14 2022-10-18 煤炭科学研究总院有限公司 SLAM autonomous navigation method and device for mobile robot
CN115200588B (en) * 2022-09-14 2023-01-06 煤炭科学研究总院有限公司 SLAM autonomous navigation method and device for mobile robot
CN116363371A (en) * 2023-05-26 2023-06-30 山东大学 Point cloud segmentation method based on inter-frame similarity

Also Published As

Publication number Publication date
CN113516682B (en) 2023-08-11

Similar Documents

Publication Publication Date Title
CN113516682A (en) Loop detection method of laser SLAM
CN110322511B (en) Semantic SLAM method and system based on object and plane features
CN105606102B (en) Grid model based PDR indoor positioning method and system
CN106780631B (en) Robot closed-loop detection method based on deep learning
CN113313763B (en) Monocular camera pose optimization method and device based on neural network
CN110136174B (en) Target object tracking method and device
CN113592015B (en) Method and device for positioning and training feature matching network
CN116592897B (en) Improved ORB-SLAM2 positioning method based on pose uncertainty
CN114187418A (en) Loop detection method, point cloud map construction method, electronic device and storage medium
CN114219022B (en) Multi-sensor multi-target tracking method combining cluster analysis and particle swarm optimization algorithm
WO2020194079A1 (en) Method and system for performing localization of an object in a 3d
CN117213470B (en) Multi-machine fragment map aggregation updating method and system
CN116817887B (en) Semantic visual SLAM map construction method, electronic equipment and storage medium
US20230281867A1 (en) Methods performed by electronic devices, electronic devices, and storage media
CN117053779A (en) Tightly coupled laser SLAM method and device based on redundant key frame removal
CN113643322B (en) Dynamic object detection method based on deep Labv3+ _SLAM
Yao et al. Dynamic feature point tracking in an image sequence
CN115187614A (en) Real-time simultaneous positioning and mapping method based on STDC semantic segmentation network
CN115239902A (en) Method, device and equipment for establishing surrounding map of mobile equipment and storage medium
CN115690737A (en) Parking space detection method and device, electronic equipment and storage medium
CN115239974A (en) Vision synchronous positioning and map construction closed-loop detection method integrating attention mechanism
CN118351186A (en) SLAM system and electronic equipment thereof
CN114332222B (en) Autonomous positioning method for dynamic scene
CN114529577B (en) Road side visual angle multi-target tracking method
CN115393432B (en) Robot positioning method, storage medium and equipment for eliminating static characteristic mismatching

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant