CN113111081A - Mobile robot mapping method under weak characteristic environment - Google Patents

Mobile robot mapping method under weak characteristic environment

Info

Publication number
CN113111081A
CN113111081A
Authority
CN
China
Prior art keywords
matrix
robot
map
mapping
cloud data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110408720.1A
Other languages
Chinese (zh)
Inventor
彭倍
曾双耀
邵继业
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Artigent Robotics Equipment Co ltd
Original Assignee
Sichuan Artigent Robotics Equipment Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Artigent Robotics Equipment Co ltd filed Critical Sichuan Artigent Robotics Equipment Co ltd
Priority to CN202110408720.1A priority Critical patent/CN113111081A/en
Publication of CN113111081A publication Critical patent/CN113111081A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval of structured data, e.g. relational data
    • G06F16/24 Querying
    • G06F16/245 Query processing
    • G06F16/23 Updating
    • G06F16/29 Geographical information databases

Abstract

The invention discloses a mobile robot mapping method for weak-feature environments, applied in the field of mobile robots. Addressing the low mapping accuracy of prior-art methods in weak-feature environments, the method improves scan matching within a graph-optimization SLAM framework and can effectively improve the mapping quality of a mobile robot. Specifically: in graph-optimization SLAM, the robot poses serve as vertices and the relationships between poses form edges; once the graph is constructed, the robot poses are adjusted by an improved scan-matching algorithm so that the edge constraints are satisfied as far as possible, yielding the final map. The method of the invention significantly improves mapping accuracy in weak-feature environments.

Description

Mobile robot mapping method under weak characteristic environment
Technical Field
The invention belongs to the field of mobile robots, and particularly relates to a mobile robot map building technology.
Background
At present, most indoor autonomous mobile robots use sensors such as wheel encoders (code wheels), gyroscopes, IMUs, and lidar to acquire information, and estimate the robot's real-time pose through a fusion algorithm. A wheel encoder works like a car odometer: it computes the distance travelled by counting wheel-hub revolutions. However, wheel slip during motion and factors such as tire wear cause encoder detection errors, and these errors accumulate gradually as the robot moves and cannot be effectively eliminated. Gyroscopes and IMUs (inertial measurement units) exhibit null drift (a nonzero output when the input is zero) due to the electrical characteristics of electronic devices; their errors likewise accumulate over time and cannot be effectively eliminated.
In environments such as prisons, the high similarity between locations presents a weak-feature environment, which poses a great challenge to effective mapping by a robot. A weak-feature environment contains a small number of similar straight-line features but few corner features, and the similarity of feature points can directly cause mapping drift or errors.
Disclosure of Invention
In order to solve the technical problem, the invention provides a mobile robot mapping method under a weak characteristic environment.
The technical scheme adopted by the invention is as follows: a mobile robot mapping method for a weak-feature environment, in which the robot poses are taken as vertices and the relationships between poses as edges, and mapping is completed by optimizing the vertices so as to satisfy the constraints formed by the edges.
The method specifically comprises the following steps:
s1, acquiring pose data of the robot through sensors;
s2, moving the robot to the next position and collecting point cloud data using a voxel grid filter;
s3, processing the collected point cloud data with the improved scan-matching algorithm; if the scan-matching result lies on a known sub-map, optimizing the known sub-map and then executing step S4; otherwise, inserting a new pose point, updating the sub-map, and executing step S4;
s4, judging whether map building is finished; if finished, ending the mapping; otherwise, returning to step S2.
Step S3 specifically includes the following substeps:
s31, taking two frames of point cloud data and dividing each frame equally into k groups; calculating the centroid of each group, and removing a group if its centroid coincides with one of the group's points;
s32, respectively taking one group from each of the two frames of point cloud data processed in step S31 to form a matrix W, and calculating the rotation matrix and the translation matrix of the matrix W;
s33, calculating the distance between the two groups of point clouds according to the rotation matrix and the translation matrix; if the distance is smaller than a set threshold, the two groups match; otherwise they do not match;
s34, repeating steps S32-S33 until all groups have been matched, then judging whether the number of matched groups meets the requirement; if so, matching succeeds, the known sub-map is optimized, and step S4 is executed; otherwise, a new pose point is inserted, the sub-map is updated, and step S4 is executed.
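Substeps S31-S34 can be sketched in code as follows. This is an illustrative reading of the method, not the patent's implementation: the function names, the equal-index pairing of groups, and the default threshold values are assumptions, and the per-group alignment uses the SVD closed form derived later in the description.

```python
import numpy as np

def best_fit_transform(p, q):
    """Closed-form alignment of group q onto group p (SVD, per S32).
    Returns R and t minimizing sum ||p_i - (R q_i + t)||^2."""
    mu_p, mu_q = p.mean(axis=0), q.mean(axis=0)
    W = (q - mu_q).T @ (p - mu_p)        # cross-covariance: sum q'_i p'_i^T
    U, _, Vt = np.linalg.svd(W)
    R = Vt.T @ U.T                       # R = V U^T from the SVD derivation
    if np.linalg.det(R) < 0:             # guard against a reflection solution
        D = np.diag([1.0] * (W.shape[0] - 1) + [-1.0])
        R = Vt.T @ D @ U.T
    t = mu_p - R @ mu_q                  # t = mu_p - R mu_q
    return R, t

def match_scans(P, Q, k=10, dist_thresh=0.012, ratio_req=0.95):
    """S31-S34: split both frames into k groups, align each pair of groups,
    and count groups whose mean squared residual is below the threshold."""
    matched = 0
    for gp, gq in zip(np.array_split(P, k), np.array_split(Q, k)):
        R, t = best_fit_transform(gp, gq)
        d = np.mean(np.sum((gp - (gq @ R.T + t)) ** 2, axis=1))   # S33 residual
        if d < dist_thresh:
            matched += 1
    return matched / k >= ratio_req      # S34: enough groups matched?
```

For two scans related by a single rigid transform, every group aligns with near-zero residual and the scans are reported as matching.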
Step S32 specifically includes: if the matrix W is full-rank, calculating the rotation matrix and the translation matrix of the matrix W by singular value decomposition; otherwise, minimizing

E(R, t) = (1/n) Σ_{i=1}^{n} ‖p_i − (Rq_i + t)‖²

and iteratively solving for the rotation matrix and the translation matrix of the matrix W, where R denotes the rotation matrix of the matrix W, t denotes the translation matrix of the matrix W, p_i denotes the i-th point of the group taken from point set P, q_i denotes the i-th point of the group taken from point set Q, and n denotes the number of points in each group.
Step S34, inserting a new pose point and updating the sub-map, specifically: when matching fails, the current position of the robot is judged to be a new pose point; if the robot is not on the known sub-map, the new pose point is inserted and the sub-map is updated.
The invention has the following beneficial effects: the method takes the robot poses as nodes or vertices, with the relationships between poses forming edges, and then adjusts the robot poses through the improved scan-matching computation so as to satisfy the constraints formed by the edges as far as possible, thereby obtaining the final map. The method of the invention has the following advantages:
1. the map-building accuracy in weak-feature environments is significantly higher than that of several open-source SLAM algorithms;
2. because no iteration is needed (in the full-rank case, R and t are obtained in closed form), the time complexity of the algorithm is low and map building runs in real time.
Drawings
FIG. 1 is a schematic diagram of the information-matrix acquisition process of the graph-optimization SLAM algorithm;
wherein FIG. 1(a) shows observing landmark m_1, FIG. 1(b) shows the robot moving from x_1 to x_2, and FIG. 1(c) shows the state after several further movements;
FIG. 2 is a flow chart of a method of the present invention;
FIG. 3 is the simulation environment provided by the embodiment of the present invention;
FIG. 4 is the map-building result of the Gmapping algorithm provided by the embodiment of the present invention;
FIG. 5 is the map-building result of the Hector-SLAM algorithm provided by the embodiment of the present invention;
FIG. 6 is the map-building result of the Cartographer-SLAM algorithm provided by the embodiment of the present invention;
FIG. 7 is the map-building result of the method of the present invention;
FIG. 8 is a comparison of pose errors for the method of the present invention and prior art;
FIG. 9 is a comparison of the angular error of the method of the present invention with that of the prior art.
Detailed Description
In order to facilitate the understanding of the technical contents of the present invention by those skilled in the art, the present invention will be further explained with reference to the accompanying drawings.
The invention provides a mobile robot mapping method for weak-feature environments. It improves scan matching mainly within a graph-optimization SLAM (Simultaneous Localization and Mapping) framework and can effectively improve the map-building quality of a mobile robot. In graph-optimization SLAM, each robot pose is a node or vertex, and the relationships between poses form edges: for example, the odometry relationship between time t and time t+1, or a pose transformation matrix computed from vision, constitutes an edge. Once the graph is constructed, the robot poses are adjusted to satisfy the constraints formed by the edges as far as possible. FIG. 1 is a schematic diagram of the information-matrix acquisition process of the graph-optimization SLAM algorithm, wherein FIG. 1(a) shows observing landmark m_1, FIG. 1(b) shows the robot moving from x_1 to x_2, and FIG. 1(c) shows the state after several further movements. Graph-optimization SLAM can be decomposed into two tasks:
1. Graph construction: taking the robot poses as vertices and the pose relationships as edges; this is called the front end.
2. Graph optimization: adjusting the robot pose vertices to satisfy the edge constraints as far as possible; this is called the back end.
In the graph-optimization process, data are accumulated first, and the robot poses are the constructed vertices. The edges are relationships between poses, which may be computed from encoder data, from ICP matching, or from loop-closure detection.
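The front-end bookkeeping described above (poses as vertices, pose relationships as edges from encoders, ICP matching, or loop closure) can be illustrated with a minimal container; the class and field names here are hypothetical, not from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class PoseGraph:
    """Vertices are robot poses (x, y, theta); edges are relative-pose
    constraints obtained from odometry, scan matching, or loop closure."""
    vertices: list = field(default_factory=list)
    edges: list = field(default_factory=list)   # (i, j, relative_pose, source)

    def add_vertex(self, pose):
        self.vertices.append(pose)
        return len(self.vertices) - 1           # vertex index

    def add_edge(self, i, j, rel_pose, source):
        self.edges.append((i, j, rel_pose, source))

# Accumulate data: two poses linked by an encoder edge and a scan-match edge.
g = PoseGraph()
a = g.add_vertex((0.0, 0.0, 0.0))
b = g.add_vertex((1.0, 0.0, 0.0))
g.add_edge(a, b, (1.0, 0.0, 0.0), source="encoder")
g.add_edge(a, b, (1.02, 0.0, 0.0), source="scan_match")
```

The back end would then adjust the vertex poses to best satisfy all stored edge constraints.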
Because ICP-based matching in indoor environments is computationally complex, and because the ICP algorithm depends heavily on the initial estimate, its practical applicability is limited; an improved lidar-based scan-matching algorithm is therefore proposed. Since graph construction is not the main source of computational cost or accuracy gain, the algorithm optimizes only the scan matching within the optimization graph; the algorithm flow chart is shown in FIG. 2. The method comprises the following steps:
s1: collecting the data input by sensors such as the lidar and the odometer. Those skilled in the art will understand that in this step the lidar collects point cloud data and the odometer collects travelled-distance data.
s2: performing graph construction (the front end) with the graph-optimization SLAM algorithm: the robot moves to the next position, and point cloud data are collected using a voxel grid filter.
s3: the improved scan-matching algorithm:
s3.1: taking two frames of point cloud data, denoted as point sets P and Q, and dividing each into k groups. According to

μ_k = (1/N_k) Σ_{i=1}^{N_k} x_i

the centroid of each group is calculated, where N_k denotes the number of points in the k-th group and x_i denotes the coordinates of the i-th point of that group; if a group's centroid coincides with one of its points, the group is removed.
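The grouping and centroid step S3.1 can be sketched as follows; `group_centroids` is a hypothetical helper name, and the equal split into adjacent groups is an assumption consistent with the embodiment.

```python
import numpy as np

def group_centroids(scan, k):
    """Split one frame into k equal groups of adjacent points and compute
    mu_k = (1/N_k) * sum_i x_i for each group."""
    groups = np.array_split(np.asarray(scan, dtype=float), k)
    return groups, [g.mean(axis=0) for g in groups]
```

A group whose centroid coincides with one of its own points would then be discarded before matching, as described above.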
S3.2: calculate matrix W (available from equation (5)), and if W is full rank, calculate R and t from singular value decomposition, i.e. rotation matrix R ═ VUTAnd the translation matrix t ═ pi-Rqi(ii) a Otherwise, pass through
Figure BDA0003023309850000041
Minimizing E (R, t), and iteratively solving for R and t.
Wherein p isiRepresenting the p-th group of i-th point cloud data, qiRepresenting the qth group of jth point cloud data;
The derivation of solving R and t by singular value decomposition is as follows. Let any two groups of point clouds be

P = {p_1, p_2, …, p_n},  Q = {q_1, q_2, …, q_n}.

R and t can be obtained by minimizing

E(R, t) = Σ_{i=1}^{n} ‖p_i − (Rq_i + t)‖²,

whose optimal translation is t = μ_p − Rμ_q, where μ_q and μ_p are the centroids of the Q point set and the P point set respectively. Substituting t, expanding, and simplifying gives

E(R) = Σ_{i=1}^{n} ‖p′_i − Rq′_i‖²,

where q′_i = q_i − μ_q and p′_i = p_i − μ_p are the original points with the respective centroids subtracted. This expression depends only on R, so we seek the R that minimizes it. Since R is an orthogonal matrix, R^T R = I, and minimizing the above is equivalent to maximizing

Σ_{i=1}^{n} p′_i^T R q′_i,

which, represented by the trace of a matrix, is

Σ_{i=1}^{n} p′_i^T R q′_i = tr(RW),

where

W = Σ_{i=1}^{n} q′_i p′_i^T.    (5)

Here tr(·) denotes the trace of a matrix. Performing singular value decomposition on W gives W = UΛV^T, where U and V are both orthogonal matrices, Λ has nonzero values (the singular values) only on its main diagonal, and ^T denotes the matrix transpose; U is called the left singular matrix and V the right singular matrix.

Let K = VU^T. Then KW = VU^T UΛV^T = VΛV^T, so KW is a positive semi-definite symmetric matrix. By the theorem that for a positive definite symmetric matrix A and any orthogonal matrix B, tr(A) ≥ tr(BA), the trace tr(RW) is maximized at R = VU^T. Therefore:

R = VU^T,  t = μ_p − Rμ_q.
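The closed-form result can be checked numerically: for centred point sets related by a known rotation, R = VU^T computed from the SVD of W recovers that rotation and maximizes tr(RW). This is an illustrative verification, not part of the patent.

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 0.7
R0 = np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]])   # ground-truth rotation
q = rng.standard_normal((50, 2))
q -= q.mean(axis=0)                  # centred q'_i
p = q @ R0.T                         # p'_i = R0 q'_i
W = q.T @ p                          # W = sum_i q'_i p'_i^T  (equation (5))
U, _, Vt = np.linalg.svd(W)
R = Vt.T @ U.T                       # R = V U^T
```

`np.trace(R @ W)` then equals the objective Σ p′_i^T R q′_i, and no other rotation attains a larger value.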
s3.3: determining a distance
Figure BDA0003023309850000052
If d is less than the threshold, two groups of matching are illustrated, and the number of matched groups is increased; otherwise, the set does not match. The value of the threshold here is based on: for example, the same object is scanned at different positions or angles, if the matching degree of two groups of point clouds is more than 95%, the threshold value is reasonable, different objects are selected to repeat the process, and the most appropriate threshold value of the matching degree is selected.
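The S3.3 residual and threshold test can be sketched as follows; the function names and the reading of d as a mean squared distance are assumptions consistent with the preceding step.

```python
import numpy as np

def group_distance(p, q, R, t):
    """Mean squared distance between group p and group q transformed by (R, t)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.mean(np.sum((p - (q @ R.T + t)) ** 2, axis=1)))

def groups_match(p, q, R, t, thresh=0.012):
    """S3.3 decision: the two groups match when d falls below the threshold."""
    return group_distance(p, q, R, t) < thresh
```

An identical group under the identity transform yields d = 0 and a match; a far-away group does not.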
S3.4: judging whether all groups are matched, if so, judging whether the matching success rate meets the requirement, if so, representing success, otherwise, failing; if not all groups have completed matching, then matching continues.
S4: and judging whether the scanning matching result is successful according to the S3.4. If yes, optimizing the sub-map with known matching; otherwise, inserting a new pose point and updating the sub-map.
In step S4, it is determined whether all groups have completed matching, for example, the matching rate is set to 95%, if the matching rate of all groups reaches 95%, it indicates that matching is successful, otherwise, matching is failed.
When the matching fails, the current position of the robot is judged to be a new pose point and is not on the known sub-map, and the mapping process is continuous, so that the position of the new pose point can be deduced through the pose points on the established known sub-map.
S5: and judging whether the map building is finished or not. If yes, ending the algorithm; otherwise, go to S2. In this step, whether the image building is finished or not is judged according to whether the scanning of the actual environment characteristics is finished or not.
The embodiment is run on Ubuntu 16.04 with the simulation software Gazebo 7. A simulation environment as shown in fig. 3 is established: the black circle represents a robot model carrying an RPLIDAR A1 lidar, and the surroundings represent obstacle walls, mainly simulating a weak-feature environment (such as a long corridor in a prison) with few feature points, similar features, and prominent straight-line segments.
Parameter setting: two frames of point cloud data processed by steps S1 and S2 are extracted. The scanning angle of the robot is set to 200°, and this model of lidar returns one feature point per 1°, so according to step S3.1 the points are divided into k = 10 groups, each containing 20 adjacent point cloud points. According to

μ_k = (1/N_k) Σ_{i=1}^{N_k} x_i

the centroid coordinates are calculated, and if the computed centroid coincides exactly with any of the 20 points, the group is removed. The W matrix is then calculated according to step S3.2 and tested for singularity: if W is non-singular, the rotation matrix is R = VU^T and the translation matrix is t = μ_p − Rμ_q; otherwise R and t are solved iteratively from the formula E(R, t). Finally d is calculated according to step S3.3; if d is within the threshold of 0.012 cm, the two groups of point clouds match successfully, otherwise they do not. These steps are repeated over all group data of the two frames, and finally the judgments of steps S4 and S5 are made.
Map-building analyses were carried out with the open-source algorithms Gmapping, Hector-SLAM, and Cartographer-SLAM, and with the improved graph-optimization SLAM algorithm. The results are as follows:
As shown in fig. 4 to 7, the Gmapping algorithm drifts in the later stage of map building, while the Hector-SLAM algorithm produces a fault in the middle stage that directly causes map building to fail; the Cartographer-SLAM algorithm performs better than the former two, and the improved algorithm performs best.
The statistical pose errors and angle errors are shown in fig. 8 and fig. 9 respectively. The error of the Gmapping algorithm grows with time, caused by the later-stage drift; the error of the Hector-SLAM algorithm is very large from the start, so it cannot cope with weak-feature environments; the Cartographer-SLAM algorithm performs best among the open-source algorithms, but its error stability and map-building quality are still inferior to those of the improved algorithm. The improved scan-matching algorithm proposed here therefore achieves better results in weak-feature environments.
In fig. 8 and 9, the symbol t represents time, in seconds; the ordinate represents the pose error, i.e. the square root of the sum of the squared x-direction and y-direction errors, in centimeters; "Improved Proposal" represents the improved Graph-SLAM algorithm.
The invention mainly proposes an improved scan-matching algorithm for prison-like long-corridor environments with few corner features and high environmental similarity, addressing the high complexity and unsatisfactory practical performance of the basic Graph-SLAM ICP algorithm.
It will be appreciated by those of ordinary skill in the art that the embodiments described herein are intended to help the reader understand the principles of the invention, which is not limited to the specifically recited embodiments and examples. Various modifications and alterations will be apparent to those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall fall within the scope of the claims of the present invention.

Claims (5)

1. A mobile robot mapping method under a weak characteristic environment is characterized in that the robot pose is used as a vertex, pose relations are used as edges, and mapping is completed by optimizing the vertex to meet the constraint formed by the edges.
2. The method for mapping a mobile robot in a weak feature environment according to claim 1, comprising the following steps:
s1, acquiring pose data of the robot through a sensor;
s2, moving the robot to the next position, and collecting point cloud data by using a voxel grid filter;
s3, calculating the collected point cloud data by adopting an improved scanning matching algorithm, optimizing a known sub-map if the scanning matching result is the known sub-map, and then executing the step S4; otherwise, inserting a new pose point, updating the sub-map, and executing the step S4;
s4, judging whether map building is finished; if finished, ending the mapping; otherwise, returning to step S2.
3. The method for mapping a mobile robot in a weak feature environment as claimed in claim 2, wherein the step S3 specifically includes the following sub-steps:
s31, taking two frames of point cloud data and dividing each frame equally into k groups; calculating the centroid of each group, and removing a group if its centroid coincides with one of the group's points;
s32, respectively taking a group from the two frames of point cloud data processed in the step S31 to form a matrix W, and calculating a rotation matrix and a translation matrix of the matrix W;
s33, calculating the distance between the two groups of point clouds according to the rotation matrix and the translation matrix; if the distance is smaller than a set threshold, the two groups match; otherwise they do not match;
s34, repeating the steps S32-S33 until the matching of all groups is completed, judging whether the number of the matched groups meets the requirement, if so, successfully matching, optimizing the known sub-map, and then executing the step S4; otherwise, inserting a new pose point, updating the sub-map, and executing the step S4.
4. The method for mapping a mobile robot in a weak feature environment according to claim 3, wherein the step S32 specifically comprises: if the matrix W is full-rank, calculating the rotation matrix and the translation matrix of the matrix W by singular value decomposition; otherwise, minimizing

E(R, t) = (1/n) Σ_{i=1}^{n} ‖p_i − (Rq_i + t)‖²

and iteratively solving for the rotation matrix and the translation matrix of the matrix W, wherein R denotes the rotation matrix of the matrix W, t denotes the translation matrix of the matrix W, p_i denotes the i-th point of the group taken from point set P, q_i denotes the i-th point of the group taken from point set Q, and n denotes the number of points in each group.
5. The method for mapping a mobile robot in a weak feature environment according to claim 4, wherein the step S34 of inserting a new pose point and updating the sub-map specifically comprises: when matching fails, judging that the current position of the robot is a new pose point; if the robot is not on the known sub-map, inserting the new pose point and updating the sub-map.
CN202110408720.1A 2021-04-16 2021-04-16 Mobile robot mapping method under weak characteristic environment Pending CN113111081A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110408720.1A CN113111081A (en) 2021-04-16 2021-04-16 Mobile robot mapping method under weak characteristic environment


Publications (1)

Publication Number Publication Date
CN113111081A true CN113111081A (en) 2021-07-13

Family

ID=76717697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110408720.1A Pending CN113111081A (en) 2021-04-16 2021-04-16 Mobile robot mapping method under weak characteristic environment

Country Status (1)

Country Link
CN (1) CN113111081A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109100731A (en) * 2018-07-17 2018-12-28 重庆大学 A kind of method for positioning mobile robot based on laser radar scanning matching algorithm
CN109974712A (en) * 2019-04-22 2019-07-05 广东亿嘉和科技有限公司 It is a kind of that drawing method is built based on the Intelligent Mobile Robot for scheming optimization
CN110333720A (en) * 2019-07-10 2019-10-15 国网四川省电力公司电力科学研究院 A kind of SLAM optimization method based on particle filter
CN110645974A (en) * 2019-09-26 2020-01-03 西南科技大学 Mobile robot indoor map construction method fusing multiple sensors
CN112230243A (en) * 2020-10-28 2021-01-15 西南科技大学 Indoor map construction method for mobile robot
US20210078174A1 (en) * 2019-09-17 2021-03-18 Wuyi University Intelligent medical material supply robot based on internet of things and slam technology


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
翁潇文 (Weng Xiaowen) et al., "基于图优化的二维激光SLAM研究" [Research on 2D laser SLAM based on graph optimization], 《自动化与仪表》 [Automation & Instrumentation] *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination