CN113516682B - Loop detection method of laser SLAM

Info

Publication number
CN113516682B
CN113516682B (application CN202110775299.8A)
Authority
CN
China
Prior art keywords
frame
descriptor
current frame
point cloud
loop detection
Prior art date
Legal status
Active
Application number
CN202110775299.8A
Other languages
Chinese (zh)
Other versions
CN113516682A (en
Inventor
张立伟 (Zhang Liwei)
刘洁琳 (Liu Jielin)
徐光进 (Xu Guangjin)
何炳蔚 (He Bingwei)
Current Assignee
Fuzhou University
Original Assignee
Fuzhou University
Priority date
2021-07-08
Filing date
2021-07-08
Publication date
2023-08-11
Application filed by Fuzhou University filed Critical Fuzhou University
Priority to CN202110775299.8A priority Critical patent/CN113516682B/en
Publication of CN113516682A publication Critical patent/CN113516682A/en
Application granted granted Critical
Publication of CN113516682B publication Critical patent/CN113516682B/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/97: Determining parameters from multiple pictures
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10028: Range image; Depth image; 3D point clouds

Abstract

The application relates to a loop detection method for laser SLAM. Step 1: acquire single-frame three-dimensional point cloud data and its synchronized IMU information. Step 2: partition the current frame point cloud into R×L unit spaces and store the unit-space data in an A_{R×L} matrix, from which a first descriptor is constructed; store the partitioned point cloud data and the IMU information in a B vector, from which a second descriptor is constructed. Step 3: construct a KDTree from the second descriptors of the history frames of step 2. Step 4: perform a nearest-neighbor search with the second descriptor to find n candidate similar frames in the KDTree; match the candidate similar frames against the current point cloud according to the first descriptor, judge whether a closed loop exists among the candidate similar frames according to whether the matching result meets a threshold, and if so, obtain the relative rotation change between the current frame and the loop frame. The application realizes loop detection and map construction for mobile robots with higher precision and stronger robustness in outdoor scenes and in scenes where local road sections are highly similar.

Description

Loop detection method of laser SLAM
Technical Field
The application relates to a loop detection method of laser SLAM, belonging to the field of mobile robots.
Background
With the development of the times, interest in applying mobile robots to real environments keeps growing. Enabling a robot to automatically construct a map of its surroundings while locating itself within it (Simultaneous Localization And Mapping, SLAM) is a core problem in the mobile-robot field, yet this capability remains very limited. Current SLAM methods based only on lidar are prone to drift when the robot moves quickly or when the outdoor environment changes little.
Loop detection can effectively mitigate motion drift in laser SLAM. Existing loop detection methods mostly process point cloud features through feature vectors such as histograms; they are computationally expensive and break down when outdoor scene changes are small or when the robot travels the same road section in the opposite direction.
Disclosure of Invention
Therefore, the application aims to provide a loop detection method for laser SLAM that reduces the computational complexity of point cloud processing while ensuring good loop detection precision and robustness.
The application is realized by adopting the following scheme: a loop detection method of laser SLAM comprises the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and synchronized inertial measurement unit (Inertial Measurement Unit, IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, from which a first descriptor is constructed; storing the partitioned point cloud data and IMU information in a B_{R+3} vector, from which a second descriptor is constructed;
step S3: constructing a KDTree from the second descriptors of all historical key frames; calculating the similarity between the second descriptors of the current frame and the historical key frames, and using this inter-frame similarity as the nearest-neighbor search metric to find, in the KDTree, the n candidate similar frames most similar to the current frame;
step S4: performing loop matching between the candidate similar frames and the current frame according to the similarity of their first descriptors from step S3 to obtain the relative rotation value of the most similar candidate frame; using this relative rotation value together with the IMU information as the initial value of NDT matching to obtain the relative pose change between the current frame and the candidate similar frame; if the change is smaller than a threshold, loop detection succeeds and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next frame.
Further, the specific content of step S1 is: taking the point cloud with NaN values removed as the input point cloud of the loop detection method; acquiring the IMU data synchronized with the current frame, and integrating the acquired IMU information to obtain the position data of the current frame.
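As a rough sketch of step S1 (the helper names below are ours, and we assume the IMU delivers world-frame linear acceleration at a fixed rate; the patent does not specify an implementation):

```python
import numpy as np

def preprocess_cloud(points: np.ndarray) -> np.ndarray:
    """Drop NaN returns (range dropouts and reflective-surface noise)."""
    return points[~np.isnan(points).any(axis=1)]

def integrate_imu(accel: np.ndarray, dt: float,
                  v0: np.ndarray, p0: np.ndarray) -> np.ndarray:
    """Naive double integration of acceleration samples to a frame position."""
    v, p = v0.astype(float).copy(), p0.astype(float).copy()
    for a in accel:          # accel: (K, 3) world-frame accelerations
        v += a * dt          # velocity update
        p += v * dt          # position update
    return p
```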
Further, the specific content of the step S2 is as follows:
taking the lidar as the origin, dividing the current frame point cloud space, in polar-coordinate form, into L columns in the clockwise direction and into R rows in the direction of increasing radius, yielding R×L unit spaces; selecting in each unit space the point closest to the origin as the unit-space value, and storing the unit-space values in order in the A_{R×L} matrix, completing the construction of the first descriptor;
counting the number of point cloud points in each unit space; if the count is greater than a threshold, the unit space is considered non-empty; taking the number of non-empty unit spaces in row r, r ∈ [1, R], of the current frame point cloud space as the rth data value of the vector B_{R+3}; after all R data values are computed, storing the position data of the current point cloud as the last three values of the vector B_{R+3}, completing the construction of the second descriptor.
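The sketch below illustrates one way to build both descriptors, assuming the "unit space value" is the range of the cell's point closest to the origin, and using R = 20, L = 60, an 80 m maximum radius, and a 30-point non-empty threshold as stand-in parameters (the patent fixes only the 30-point example):

```python
import numpy as np

R, L, R_MAX, MIN_PTS = 20, 60, 80.0, 30

def build_descriptors(points: np.ndarray, position: np.ndarray):
    """points: (N, 3) NaN-free cloud; position: (3,) IMU-integrated position."""
    rng = np.linalg.norm(points[:, :2], axis=1)                    # planar range
    ang = np.arctan2(points[:, 1], points[:, 0]) % (2 * np.pi)     # azimuth
    row = np.minimum((rng / R_MAX * R).astype(int), R - 1)         # radial bin
    col = np.minimum((ang / (2 * np.pi) * L).astype(int), L - 1)   # angular bin

    A = np.zeros((R, L))                   # first descriptor A_{R x L}
    counts = np.zeros((R, L), dtype=int)
    for r, c, d in zip(row, col, rng):
        counts[r, c] += 1
        if A[r, c] == 0.0 or d < A[r, c]:  # keep range of the closest point
            A[r, c] = d

    ring = (counts > MIN_PTS).sum(axis=1)  # non-empty cells per row
    B = np.concatenate([ring, position])   # second descriptor B_{R+3}
    return A, B
```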
Further, the similarity in step S3 is calculated as follows: a KDTree is constructed from the second descriptors; the Euclidean distance between the last three values of the second descriptor of the current frame and those of a historical frame is computed first; if the result is above a threshold, the historical frame is discarded; otherwise, the distance between the second descriptors of the current frame and the historical frame is calculated to obtain candidate similar frames. The distance calculation formula is as follows:
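The formula itself appears only as an image in the source; a plausible reconstruction, assuming a plain Euclidean distance over the R ring-count entries (the metric a KDTree supports natively), is:

```latex
d(b_m, b_n) = \lVert b_m - b_n \rVert_2
            = \sqrt{\sum_{r=1}^{R} \bigl(b_m(r) - b_n(r)\bigr)^2}
```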
where b_m and b_n are the second descriptors of the mth and nth frame point clouds.
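A minimal sketch of the step S3 retrieval with scipy's cKDTree, using the 3 m position gate from the embodiment; stacking all second descriptors into one array, the parameter n = 10, and applying the gate before building the tree (the text builds the tree over all frames first) are our simplifications:

```python
import numpy as np
from scipy.spatial import cKDTree

def find_candidates(B_hist: np.ndarray, b_cur: np.ndarray,
                    R: int, n: int = 10, pos_gate: float = 3.0):
    """B_hist: (M, R+3) historical second descriptors; b_cur: (R+3,)."""
    # Position pre-filter on the last three (IMU position) entries.
    near = np.linalg.norm(B_hist[:, R:] - b_cur[R:], axis=1) < pos_gate
    idx = np.flatnonzero(near)
    if idx.size == 0:
        return []
    tree = cKDTree(B_hist[idx, :R])        # KDTree over ring counts only
    _, nn = tree.query(b_cur[:R], k=min(n, idx.size))
    return idx[np.atleast_1d(nn)].tolist()
```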
Further, the specific content of step S4 is:
performing the similarity calculation of step S3 using the column vectors of the first descriptors of the candidate similar frames and the current frame; translating the first descriptor of the current frame column by column to obtain L first descriptors of the current frame, computing their similarities with the candidate similar frame one by one, and taking the highest result as the similarity value between the current frame and that candidate; selecting, from the n candidate similar frames, the frame with the highest similarity to the current frame as the loop detection frame, wherein the similarity judgment formula is as follows:
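The formula is likewise an image in the source; a plausible Scan-Context-style reconstruction, assuming a mean column-wise cosine similarity evaluated at a column shift x, is:

```latex
\operatorname{sim}(s_m, s_n) = \frac{1}{L} \sum_{i=1}^{L}
  \frac{c_i^{m} \cdot c_k^{n}}{\lVert c_i^{m} \rVert \, \lVert c_k^{n} \rVert},
\qquad k = \bigl((i + x - 1) \bmod L\bigr) + 1
```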
where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, c_i^m is the ith column of the current frame's descriptor, and c_k^n is the kth column of the candidate similar frame's descriptor;
if the xth (x ∈ [1, L]) shifted first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is:
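The source elides this formula too; assuming each column spans 2π/L radians, a shift of x columns would give:

```latex
Q = \frac{2\pi x}{L}
```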
with Q and the last three values of the vector B_{R+3} as the initial pose, NDT registration between the current frame and the loop detection frame is performed; if the registration result is smaller than a threshold, loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next key frame.
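A sketch of the column-shift search of step S4; the cosine column similarity mirrors the hedged reconstruction above, and rotation_from_shift is our name for the Q computation:

```python
import numpy as np

def shift_similarity(s_m: np.ndarray, s_n: np.ndarray):
    """Best (similarity, column shift x) between two first descriptors."""
    L = s_m.shape[1]
    best_sim, best_x = -1.0, 0
    for x in range(L):
        shifted = np.roll(s_m, x, axis=1)        # translate columns by x
        num = (shifted * s_n).sum(axis=0)
        den = (np.linalg.norm(shifted, axis=0)
               * np.linalg.norm(s_n, axis=0)) + 1e-12
        sim = float((num / den).mean())          # mean column cosine
        if sim > best_sim:
            best_sim, best_x = sim, x
    return best_sim, best_x

def rotation_from_shift(x: int, L: int) -> float:
    """Relative yaw Q implied by a shift of x columns (assumed 2*pi*x/L)."""
    return 2.0 * np.pi * x / L
```

The best shift, converted through rotation_from_shift, together with the IMU positions stored in B_{R+3}, then seeds the NDT registration that finally accepts or rejects the loop.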
Compared with the prior art, the application has the following beneficial effects:
the method is suitable for scenes with significant feature changes, and the calculation complexity of the point cloud data is reduced; in an outdoor scene with insignificant local road section characteristic change and a scene with large difference between the loop running direction and the historical running direction, the method can reduce the complexity of point cloud calculation and ensure good loop detection precision and robustness.
Drawings
FIG. 1 is a flow chart of the method according to an embodiment of the application.
FIG. 2 compares the trajectory ground truth of the KITTI dataset 00 scene with the trajectory of the method of the application, according to an embodiment of the application.
FIG. 3 compares the trajectory ground truth of the KITTI dataset 08 scene with the trajectory of the method of the application, according to an embodiment of the application.
Detailed Description
The application will be further described with reference to the accompanying drawings and examples.
It should be noted that the following detailed description is illustrative and is intended to provide further explanation of the application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit exemplary embodiments according to the present application. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise; furthermore, it is to be understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.
As shown in fig. 1, the present embodiment provides a loop detection method of a laser SLAM, including the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and synchronized inertial measurement unit (Inertial Measurement Unit, IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, from which a first descriptor is constructed; storing the partitioned point cloud data and IMU information in a B_{R+3} vector, from which a second descriptor is constructed;
step S3: constructing a KDTree (k-dimensional tree) from the second descriptors of all historical key frames; calculating the similarity between the second descriptors of the current frame and the historical key frames, and using this inter-frame similarity as the nearest-neighbor search metric to find, in the KDTree, the n candidate similar frames most similar to the current frame;
step S4: performing loop matching between the candidate similar frames and the current frame according to the similarity of their first descriptors from step S3 to obtain the relative rotation value of the most similar candidate frame; using this relative rotation value together with the IMU information as the initial value of NDT matching to obtain the relative pose change between the current frame and the candidate similar frame; if the change is smaller than a threshold (such as 0.01), loop detection succeeds and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next frame;
in this embodiment, the specific content of step S1 is as follows: taking a point cloud with the NaN value (the interference point of the distance abnormality or the surface reflection in the point cloud data) removed as an input point cloud; and acquiring the synchronous IMU data information of the current point cloud frame, and integrating the acquired IMU information to obtain the position data of the current frame.
In this embodiment, the specific content of step S2 is as follows:
taking the lidar as the origin, dividing the current frame point cloud space, in polar-coordinate form, into L columns in the clockwise direction and into R rows in the direction of increasing radius, yielding R×L unit spaces; selecting in each unit space the point closest to the origin as the unit-space value, and storing the unit-space values in order in the A_{R×L} matrix, completing the construction of the first descriptor;
counting the number of point cloud points in each unit space; if the count is greater than a threshold (such as 30 points), the unit space is considered non-empty; taking the number of non-empty unit spaces in row r, r ∈ [1, R], of the current frame point cloud space as the rth data value of the vector B_{R+3}; after all R data values are computed, storing the position data of the current point cloud as the last three values of the vector B_{R+3}, completing the construction of the second descriptor.
The first and second descriptors of each key frame's point cloud are stored in matrix form in a descriptor library, named by the index value of the corresponding key frame, to facilitate subsequent retrieval;
in this embodiment, the similarity in step S3 is calculated as follows: constructing KDTree by using the second descriptor, firstly performing Euclidean distance calculation on the last three values of the second descriptor of the current frame and the second descriptor of the historical frame, discarding the historical frame if the calculation result is higher than a threshold value (such as 3 meters), otherwise, calculating the distance between the second descriptor of the current frame and the second descriptor of the historical frame to obtain a candidate similar frame, wherein the distance calculation formula is as follows:
wherein b and b n Is the second descriptor of the mth frame point cloud and the nth frame point cloud.
In this embodiment, the specific content of step S4 is as follows:
performing the similarity calculation of step S3 using the column vectors of the first descriptors of the candidate similar frames and the current frame; translating the first descriptor of the current frame column by column to obtain L first descriptors of the current frame, computing their similarities with the candidate similar frame one by one, and taking the highest result as the similarity value between the current frame and that candidate; selecting, from the n candidate similar frames, the frame with the highest similarity to the current frame as the loop detection frame, using the similarity judgment formula described above, where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, c_i^m is the ith column of the current frame's descriptor, and c_k^n is the kth column of the candidate similar frame's descriptor;
if the xth (x ∈ [1, L]) shifted first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is obtained from the column shift x as described above;
with Q and the last three values of the vector B_{R+3} as the initial pose, NDT registration between the current frame and the loop detection frame is performed; if the registration result is smaller than a threshold (such as 0.01), loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next key frame.
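As a small illustration of seeding the NDT matching, the sketch below assembles a 4×4 initial transform from Q and the IMU positions stored in the two B_{R+3} vectors; the pure-yaw assumption and the function name are ours, and the NDT solver itself (e.g. pcl::NormalDistributionsTransform on the C++ side) is not shown:

```python
import numpy as np

def ndt_initial_guess(Q: float, p_cur: np.ndarray,
                      p_loop: np.ndarray) -> np.ndarray:
    """4x4 homogeneous transform used as the NDT registration seed."""
    c, s = np.cos(Q), np.sin(Q)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]   # yaw-only rotation from Q
    T[:3, 3] = p_loop - p_cur       # translation from integrated IMU positions
    return T
```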
Preferably, in this embodiment, the second descriptor is constructed by fusing the time-synchronized point cloud data with the IMU data and is used for fast retrieval of candidate similar frames; finally, the current frame is registered against the local map formed by the candidate similar frame to complete loop detection.
Preferably, in this embodiment, the method proceeds as follows:
step 1: acquiring single-key-frame point cloud data and its synchronized IMU information; the main implementation is: removing interference points with abnormal range or surface reflection from the point cloud data; acquiring the IMU data synchronized with the current point cloud frame; integrating the acquired IMU information to obtain the position data of the current point cloud frame;
step 2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, from which a first descriptor is constructed; storing the partitioned point cloud data and IMU information in a B vector, from which a second descriptor is constructed; the main implementation is: taking the lidar as the origin, dividing the current frame point cloud space, in polar-coordinate form, into L columns in the clockwise direction and into R rows in the direction of increasing radius, yielding R×L unit spaces; selecting in each unit space the point closest to the origin as the unit-space value, and storing the unit-space values in order in the A_{R×L} matrix, completing the construction of the first descriptor;
counting the number of point cloud points in each unit space; if the count is greater than a threshold (such as 30 points), the unit space is considered non-empty; taking the number of non-empty units in the rth row as the rth data value of the vector B; after all R data values are computed, storing the position data of the current point cloud into the vector B, completing the construction of the second descriptor;
storing the first and second descriptors of each key frame's point cloud in matrix form in a descriptor library, named by the index value of the corresponding key frame, to facilitate subsequent retrieval;
step 3: constructing a KDTree from the second descriptors; computing first the Euclidean distance between the last three values of the second descriptor of the current frame and those of a historical frame, and discarding the historical frame if the result is above a threshold (such as 3 meters); otherwise, obtaining candidate similar frames by calculating the similarity between the second descriptors of the current frame and the historical frame with the formula described above, where b_m and b_n are the second descriptors of the mth and nth frame point clouds;
step 4: performing the similarity calculation of step 3 using the column vectors of the first descriptors of the candidate similar frames and the current frame; translating the first descriptor of the current frame column by column to obtain L first descriptors of the current frame, computing their similarities with the candidate similar frame one by one, and taking the highest result as the similarity value between the current frame and that candidate; selecting, from the n candidate similar frames, the frame with the highest similarity to the current frame as the loop detection frame, using the similarity judgment formula described above, where s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, c_i^m is the ith column of the current frame's descriptor, and c_k^n is the kth column of the candidate similar frame's descriptor;
if the xth (x ∈ [1, L]) shifted first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is obtained from the column shift x as described above;
with Q and the last three values of the vector B_{R+3} as the initial pose, performing NDT registration between the current frame and the loop detection frame; if the registration result is smaller than a threshold (such as 0.01), loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next key frame.
In this embodiment, to verify the feasibility of the method, sequences 00 and 05 of the KITTI dataset, which contain relatively many loops, were selected to implement the scheme described in the embodiment. The results are shown in FIG. 2 (sequence 00) and FIG. 3 (sequence 05); each comparison plots the full trajectory constructed by the method (solid line) against the trajectory ground truth of the sequence (dashed line, reference), illustrating the rationality and effectiveness of the method.
The foregoing description is only of the preferred embodiments of the application, and all changes and modifications that come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.

Claims (3)

1. A loop detection method of laser SLAM is characterized in that: the method comprises the following steps:
step S1: acquiring single-key-frame three-dimensional environment point cloud data and synchronized inertial measurement unit (IMU) information;
step S2: dividing the current frame point cloud space into R×L unit spaces and storing the unit-space data in an A_{R×L} matrix, from which a first descriptor is constructed; storing the partitioned point cloud data and IMU information in a B_{R+3} vector, from which a second descriptor is constructed;
step S3: constructing a KDTree from the second descriptors of all historical key frames; calculating the similarity between the second descriptors of the current frame and the historical key frames, and using this inter-frame similarity as the nearest-neighbor search metric to find, in the KDTree, the n candidate similar frames most similar to the current frame;
step S4: performing loop matching between the candidate similar frames and the current frame according to the similarity of their first descriptors from step S3 to obtain the relative rotation value of the most similar candidate frame; using this relative rotation value together with the IMU information as the initial value of NDT matching to obtain the relative pose change between the current frame and the candidate similar frame; if the change is smaller than a threshold, loop detection succeeds and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next frame;
the specific content of the step S2 is as follows:
taking the lidar as the origin, dividing the current frame point cloud space, in polar-coordinate form, into L columns in the clockwise direction and into R rows in the direction of increasing radius, yielding R×L unit spaces; selecting in each unit space the point closest to the origin as the unit-space value, and storing the unit-space values in order in the A_{R×L} matrix, completing the construction of the first descriptor;
counting the number of point cloud points in each unit space; if the count is greater than a threshold, the unit space is considered non-empty; taking the number of non-empty unit spaces in row r, r ∈ [1, R], of the current frame point cloud space as the rth data value of the vector B_{R+3}; after all R data values are computed, storing the position data of the current point cloud as the last three values of the vector B_{R+3}, completing the construction of the second descriptor;
the specific content of the step S4 is as follows:
performing the similarity calculation using the column vectors of the first descriptors of the candidate similar frames and the current frame; translating the first descriptor of the current frame column by column to obtain L first descriptors of the current frame, computing their similarities with the candidate similar frame one by one, and taking the highest result as the similarity value between the current frame and that candidate; selecting, from the n candidate similar frames, the frame with the highest similarity to the current frame as the loop detection frame, wherein, in the similarity judgment formula, s_m is the first descriptor of the current frame, s_n is the first descriptor of the candidate similar frame, c_i^m is the ith column of the current frame's descriptor, and c_k^n is the kth column of the candidate similar frame's descriptor;
if the xth (x ∈ [1, L]) shifted first descriptor of the current frame has the highest similarity value with the loop detection frame, the relative rotation value Q between the current frame and the loop detection frame is determined by the column shift x;
with Q and the last three values of the vector B_{R+3} as the initial pose, performing NDT registration between the current frame and the loop detection frame; if the registration result is smaller than a threshold, loop detection is considered successful and loop correction is performed; otherwise, loop detection fails and detection proceeds to the next key frame.
2. The loop detection method of laser SLAM according to claim 1, wherein the specific content of step S1 is: taking the point cloud with NaN values removed as the input point cloud; acquiring the IMU data synchronized with the current point cloud frame, and integrating the acquired IMU information to obtain the position data of the current frame.
3. The loop detection method of laser SLAM according to claim 1, wherein the similarity is calculated as follows: constructing a KDTree from the second descriptors; computing first the Euclidean distance between the last three values of the second descriptor of the current frame and those of a historical frame, and discarding the historical frame if the result is above a threshold; otherwise, calculating the similarity between the second descriptors of the current frame and the historical frame to obtain candidate similar frames, where, in the similarity calculation formula, b_m and b_n are the second descriptors of the mth and nth frame point clouds.
CN202110775299.8A 2021-07-08 2021-07-08 Loop detection method of laser SLAM Active CN113516682B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110775299.8A CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110775299.8A CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Publications (2)

Publication Number Publication Date
CN113516682A CN113516682A (en) 2021-10-19
CN113516682B (en) 2023-08-11

Family

ID=78066455

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110775299.8A Active CN113516682B (en) 2021-07-08 2021-07-08 Loop detection method of laser SLAM

Country Status (1)

Country Link
CN (1) CN113516682B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115200588B (en) * 2022-09-14 2023-01-06 煤炭科学研究总院有限公司 SLAM autonomous navigation method and device for mobile robot
CN116363371B (en) * 2023-05-26 2023-08-01 山东大学 Point cloud segmentation method based on inter-frame similarity

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019169540A1 (en) * 2018-03-06 2019-09-12 斯坦德机器人(深圳)有限公司 Method for tightly-coupling visual slam, terminal and computer readable storage medium
CN111563442A (en) * 2020-04-29 2020-08-21 上海交通大学 Slam method and system for fusing point cloud and camera image data based on laser radar
CN112665575A (en) * 2020-11-27 2021-04-16 重庆大学 SLAM loop detection method based on mobile robot
CN112767490A (en) * 2021-01-29 2021-05-07 福州大学 Outdoor three-dimensional synchronous positioning and mapping method based on laser radar
CN112907491A (en) * 2021-03-18 2021-06-04 中煤科工集团上海有限公司 Laser point cloud loopback detection method and system suitable for underground roadway


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Data sampling and gridding for a laser-machine-vision measurement system; 何炳蔚 (He Bingwei); Modern Manufacturing Engineering (《现代制造工程》); full text *

Also Published As

Publication number Publication date
CN113516682A (en) 2021-10-19

Similar Documents

Publication Publication Date Title
CN109631855B (en) ORB-SLAM-based high-precision vehicle positioning method
CN112179330B (en) Pose determination method and device of mobile equipment
CN110322511B (en) Semantic SLAM method and system based on object and plane features
US8798357B2 (en) Image-based localization
CN110930495A (en) Multi-unmanned aerial vehicle cooperation-based ICP point cloud map fusion method, system, device and storage medium
CN113516682B (en) Loop detection method of laser SLAM
US9651388B1 (en) System and method for improved simultaneous localization and mapping
WO2012155121A2 (en) Systems and methods for estimating the geographic location at which image data was captured
CN112541944B (en) Probability twin target tracking method and system based on conditional variational encoder
CN113298014B (en) Closed loop detection method, storage medium and equipment based on reverse index key frame selection strategy
CN113514843A (en) Multi-subgraph laser radar positioning method and system and terminal
CN111899280A (en) Monocular vision odometer method adopting deep learning and mixed pose estimation
CN116403139A (en) Visual tracking and positioning method based on target detection
CN114187418A (en) Loop detection method, point cloud map construction method, electronic device and storage medium
Zhang et al. A visual-inertial dynamic object tracking SLAM tightly coupled system
CN117213470A (en) Multi-machine fragment map aggregation updating method and system
CN116592897B (en) Improved ORB-SLAM2 positioning method based on pose uncertainty
Jo et al. Mixture density-PoseNet and its application to monocular camera-based global localization
CN111951341A (en) Closed loop detection improvement method based on RGB-D SLAM
CN114413882B (en) Global initial positioning method and device based on multi-hypothesis tracking
CN116295354A (en) Unmanned vehicle active global positioning method and system
CN115962773A (en) Method, device and equipment for synchronous positioning and map construction of mobile robot
CN115187614A (en) Real-time simultaneous positioning and mapping method based on STDC semantic segmentation network
CN115239902A (en) Method, device and equipment for establishing surrounding map of mobile equipment and storage medium
CN114219022A (en) Multi-sensor multi-target tracking method combining cluster analysis and particle swarm optimization algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant