CN110645974A - Mobile robot indoor map construction method fusing multiple sensors


Info

Publication number
CN110645974A
CN110645974A
Authority
CN
China
Prior art keywords
mobile robot
pose
constructing
information
vertex
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910915091.4A
Other languages
Chinese (zh)
Other versions
CN110645974B (en)
Inventor
刘冉
秦正泓
张华
何永平
肖宇峰
付文鹏
张静
刘满禄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Southwest University of Science and Technology
Original Assignee
Southwest University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Southwest University of Science and Technology filed Critical Southwest University of Science and Technology
Priority to CN201910915091.4A
Publication of CN110645974A
Application granted
Publication of CN110645974B
Expired - Fee Related

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00-G01C19/00
    • G01C21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/206: Instruments for performing navigational calculations specially adapted for indoor navigation
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval of structured data, e.g. relational data
    • G06F16/29: Geographical information databases

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses a method for constructing an indoor map of a mobile robot by fusing multiple sensors, which comprises the following steps: acquiring the distance information between the mobile robot and anchor points, the pose information of the mobile robot, and the environment information through a UWB (ultra-wideband) module, an odometer, and a lidar, respectively; constructing a vertex-constraint graph from the distance, pose, and environment information; optimizing the vertex-constraint graph with a graph optimization algorithm to obtain optimized trajectory data of the mobile robot; and constructing a grid map from the optimized trajectory data and the environment information. The odometer provides accurate pose changes of the robot over short periods, the fused UWB positioning information provides accurate pose changes over long periods, and map construction combines both with the lidar data, thereby solving the problem of low accuracy when a lidar alone builds a map of a complex indoor environment.

Description

Mobile robot indoor map construction method fusing multiple sensors
Technical Field
The invention belongs to the technical field of indoor mapping for mobile robots, and particularly relates to a method for constructing an indoor map of a mobile robot by fusing multiple sensors.
Background
In recent years, mobile robot technology has played an important role in the industrial, medical, and service fields, and has been applied successfully in hazardous settings such as national defense and space exploration. Within mobile robot research, SLAM has long been a popular topic: it provides the robot with a navigation map and a real-time position, which are prerequisites for path planning and path tracking, so it occupies an important position in mobile robot navigation.
Because lidar offers high precision, wide range, and fast data transmission, it is increasingly used in mobile robot navigation, and indoor mapping based on laser scanning is widely applied to robot localization, environment map construction, and path planning. General-purpose laser scanners are very expensive, and although many cheap lasers are now on the market, their measurement range is limited and their resolution is low. The robot's odometry can be obtained from a photoelectric encoder, but its error grows over time, causing severe deviations in the pose estimate. A two-dimensional lidar likewise suffers severely reduced mapping accuracy in a complex indoor environment.
Disclosure of Invention
To address these deficiencies of the prior art, the present method for constructing an indoor map of a mobile robot by fusing multiple sensors solves the problem of low indoor map accuracy in the prior art.
To achieve this purpose, the invention adopts the following technical scheme. A method for constructing an indoor map of a mobile robot by fusing multiple sensors comprises the following steps:
S1, acquiring the distance information between the mobile robot and anchor points, the pose information of the mobile robot, and the environment information through UWB, an odometer, and a lidar, respectively;
S2, constructing a vertex-constraint graph from the distance information, the pose information, and the environment information;
S3, optimizing the vertex-constraint graph with a graph optimization algorithm to obtain optimized trajectory data of the mobile robot;
S4, constructing a grid map from the optimized trajectory data and the environment information.
Further, step S1 comprises the following steps:
S1.1, mounting a UWB (ultra-wideband) tag and a lidar on the mobile robot, the robot being equipped with an odometer and an encoder;
S1.2, acquiring the distance information between the mobile robot and the anchor points through the UWB tag;
S1.3, acquiring the pose data of the robot through the odometer, the odometer acquiring its data from the encoder;
S1.4, obtaining the environment information by lidar scanning.
Further, step S2 comprises the following steps:
S2.1, forming vertices from the pose information of the mobile robot and constructing odometer-based edges to obtain a first initial graph;
S2.2, adding UWB constraints to the first initial graph and constructing UWB-based edges to obtain a second initial graph;
S2.3, performing closed-loop detection on the environment data and adding laser-based edges to the second initial graph, constructing laser closed-loop edges, to obtain the vertex-constraint graph.
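To make the structure built in steps S2.1-S2.3 concrete, the following is a minimal Python sketch of one way such a vertex-constraint graph could be represented; the data layout, field names, and example values are illustrative assumptions, not prescribed by the patent.

```python
# Illustrative sketch of a vertex-constraint graph (assumed layout).
from dataclasses import dataclass, field

@dataclass
class PoseVertex:
    id: int
    pose: tuple          # (x, y, theta) of the robot at this moment

@dataclass
class Edge:
    i: int               # index of the first pose vertex
    j: int               # index of the second pose vertex (or a UWB anchor)
    kind: str            # "odom" (S2.1), "uwb" (S2.2), or "laser_loop" (S2.3)
    measurement: object  # pose delta, anchor range, or scan-match transform
    information: float   # confidence weight of this constraint

@dataclass
class VertexConstraintGraph:
    vertices: list = field(default_factory=list)
    edges: list = field(default_factory=list)

g = VertexConstraintGraph()
g.vertices += [PoseVertex(0, (0.0, 0.0, 0.0)), PoseVertex(1, (1.0, 0.0, 0.0))]
g.edges.append(Edge(0, 1, "odom", (1.0, 0.0, 0.0), 1.0))   # odometer-based edge
g.edges.append(Edge(0, 100, "uwb", 4.2, 0.5))              # range to anchor 100
g.edges.append(Edge(1, 0, "laser_loop", (-1.0, 0.0, 0.0), 2.0))  # closed-loop edge
```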
Further, the laser closed-loop edges in step S2.3 are constructed as follows:
A1, construct a source point cloud set Q = {q_1, q_2, ..., q_N} and a target point cloud set P = {p_1, p_2, ..., p_N};
A2, construct a rotation matrix R and a translation matrix T for the target point cloud set P, and construct an objective function E(R, T) from R and T;
A3, set a threshold and judge whether E(R, T) is smaller than the threshold; if so, the laser closed loop is deemed closed and construction of the laser closed-loop edge is finished; otherwise, go to step A4;
A4, apply the rotation matrix R and the translation matrix T to the source point cloud set Q to obtain a point set M;
A5, register the point set M against the target point cloud set P to obtain a new rotation matrix R' and a new translation matrix T';
A6, assign R' to R and T' to T, substitute the updated R and T into the objective function E(R, T), and return to step A3.
Further, the objective function E(R, T) is:
E(R, T) = (1/N) * Σ_{i=1}^{N} ||p_i - (R*q_i + T)||^2
where N is the total number of points in the point cloud sets, i = 1, 2, ..., N, q_i is the ith point in the source point cloud set, and p_i is the ith point in the target point cloud set.
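As a concrete illustration of the A1-A6 iteration, here is a runnable Python sketch under two simplifying assumptions: point q_i already corresponds to point p_i (the patent does not detail the correspondence search), and the clouds are 2D. The function names are ours, not the patent's.

```python
# ICP-style loop per steps A1-A6 (minimal sketch, index correspondences assumed).
import numpy as np

def best_fit_transform(M, P):
    """A5: closed-form (SVD) rotation R' and translation T' aligning M onto P."""
    cm, cp = M.mean(axis=0), P.mean(axis=0)
    H = (M - cm).T @ (P - cp)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cp - R @ cm

def icp_loop_edge(Q, P, eps=1e-6, max_iter=50):
    R, T = np.eye(2), np.zeros(2)       # A2: initial transform
    for _ in range(max_iter):
        M = Q @ R.T + T                 # A4: apply (R, T) to the source cloud Q
        E = np.mean(np.sum((P - M) ** 2, axis=1))   # objective E(R, T)
        if E < eps:                     # A3: converged, loop edge established
            return R, T, E
        dR, dT = best_fit_transform(M, P)           # A5
        R, T = dR @ R, dR @ T + dT                  # A6: update and iterate
    return R, T, E
```

The resulting (R, T) is exactly the relative pose used as the measurement of the laser closed-loop edge between the two revisited poses.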
Further, the specific method for optimizing the vertex-constraint graph with the graph optimization algorithm in step S3 is: adjust the pose vertices in the vertex-constraint graph so as to minimize the error function F(x) of the pose information, obtaining the pose vertices that satisfy the constraints to the greatest extent.
Further, the error function F(x) is:
F(x) = Σ_{<i,j> ∈ C} e_ij(x_i, x_j, z_ij)^T * Ω_ij * e_ij(x_i, x_j, z_ij)
where x_i denotes pose vertex i, x_j denotes pose vertex j, C denotes the set of constrained vertex pairs of the graph, Ω_ij is the information matrix of the observation between x_i and x_j, e_ij(x_i, x_j, z_ij) measures the degree to which x_i and x_j satisfy the constraint z_ij, and z_ij is the actual observation between pose vertex i and pose vertex j acquired by the sensors. The observation z_ij comprises the pose transformation between adjacent pose vertices i and j, the distance between pose vertex i and anchor point j, and the pose transformation between non-adjacent pose vertices i and j.
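The fragment below sketches how F(x) can be evaluated over the graph's constraints; the constraint tuple layout and the example range error are illustrative assumptions (a practical system would delegate the optimization itself to a library such as g2o or Ceres).

```python
# Illustrative evaluation of F(x) = sum over <i,j> in C of e^T * Omega * e.
import numpy as np

def F(x, constraints):
    """x: dict vertex-id -> state; constraints: tuples (i, j, z_ij, Omega_ij, err)."""
    total = 0.0
    for i, j, z, Omega, err in constraints:
        e = err(x[i], x[j], z)          # e_ij(x_i, x_j, z_ij)
        total += float(e @ Omega @ e)   # quadratic form weighted by Omega_ij
    return total

def range_error(xi, xj, z):
    """Example e_ij for a UWB edge: predicted range minus measured range z_ij."""
    return np.array([np.linalg.norm(np.asarray(xi) - np.asarray(xj)) - z])

x = {0: (0.0, 0.0), 1: (3.0, 4.0)}                    # two vertices
print(F(x, [(0, 1, 5.0, np.eye(1), range_error)]))    # -> 0.0, the range fits
```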
Further, step S4 comprises the following substeps:
S4.1, dividing the environment into a number of grid cells according to the environment information;
S4.2, calculating the occupancy probability l_{t,ij} of each grid cell; a grid cell whose occupancy probability l_{t,ij} is 0.8 or more is marked as an obstacle, grids containing an obstacle are rendered in gray and obstacle-free grids in white, yielding the grid map.
Further, the occupancy probability l_{t,ij} of each grid cell is:
l_{t,ij} = l_{t-1,ij} + log( p(m_ij | z_t, x_t) / (1 - p(m_ij | z_t, x_t)) )
where l_{t-1,ij} is the occupancy value of the grid at the previous time step, p(m_ij | z_t, x_t) is the posterior probability that grid m_ij is occupied, m_ij is the grid cell with abscissa i and ordinate j, z_t is the observation at time t, and x_t is the pose of the mobile robot at time t.
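A brief Python sketch of this update, assuming l_{t,ij} is stored in log-odds form and converted back to a probability when applying the 0.8 obstacle threshold of step S4.2:

```python
import math

def update_cell(l_prev, p_post):
    """l_{t,ij} = l_{t-1,ij} + log(p / (1 - p)) for posterior p = p(m_ij|z_t,x_t)."""
    return l_prev + math.log(p_post / (1.0 - p_post))

def to_probability(l):
    """Recover an occupancy probability from the accumulated log-odds."""
    return 1.0 - 1.0 / (1.0 + math.exp(l))

l = 0.0                              # unknown cell: probability 0.5
for _ in range(3):                   # three lidar hits on this cell
    l = update_cell(l, 0.7)
print(to_probability(l) >= 0.8)      # True -> render the cell gray (obstacle)
```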
The invention has the following beneficial effects:
(1) The distance information between the mobile robot and the anchor points, the pose information of the mobile robot, and the environment information are acquired by the UWB, the odometer, and the lidar respectively to construct the map; because multiple sensors contribute data, the constructed map is more accurate.
(2) By constructing a vertex-constraint graph and optimizing it with a graph optimization algorithm, the pose information of the mobile robot becomes more accurate, laying the foundation for building an accurate indoor map.
(3) The odometer supplies accurate short-term pose changes of the robot, the fused UWB positioning information supplies accurate long-term pose changes, and map construction combines both with the lidar measurements, thereby solving the problem of low accuracy when a lidar alone maps a complex indoor environment.
Drawings
Fig. 1 is a flowchart of the multi-sensor-fused indoor map construction method for a mobile robot according to the present invention.
Fig. 2 is a comparison of the experimental results of the present invention.
Detailed Description
The following description of embodiments of the present invention is provided to help those skilled in the art understand the invention, but it should be understood that the invention is not limited to the scope of these embodiments. To those skilled in the art, various changes are possible without departing from the spirit and scope of the invention as defined in the appended claims, and everything produced using the inventive concept is protected.
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
As shown in fig. 1, a method for constructing an indoor map of a mobile robot by fusing multiple sensors comprises the following steps:
S1, acquiring the distance information between the mobile robot and anchor points, the pose information of the mobile robot, and the environment information through UWB, an odometer, and a lidar, respectively;
S2, constructing a vertex-constraint graph from the distance information, the pose information, and the environment information;
S3, optimizing the vertex-constraint graph with a graph optimization algorithm to obtain optimized trajectory data of the mobile robot;
S4, constructing a grid map from the optimized trajectory data and the environment information.
Step S1 comprises the following steps:
S1.1, mounting a UWB (ultra-wideband) tag and a lidar on the mobile robot, the robot being equipped with an odometer and an encoder;
S1.2, acquiring the distance information between the mobile robot and the anchor points through the UWB tag;
S1.3, acquiring the pose data of the robot through the odometer, the odometer acquiring its data from the encoder;
S1.4, obtaining the environment information by lidar scanning.
Step S2 comprises the following steps:
S2.1, forming vertices from the pose information of the mobile robot and constructing odometer-based edges to obtain a first initial graph;
S2.2, adding UWB constraints to the first initial graph and constructing UWB-based edges to obtain a second initial graph;
S2.3, performing closed-loop detection on the environment data and adding laser-based edges to the second initial graph, constructing laser closed-loop edges, to obtain the vertex-constraint graph.
The laser closed-loop edges in step S2.3 are computed as follows:
A1, construct a source point cloud set Q = {q_1, q_2, ..., q_N} and a target point cloud set P = {p_1, p_2, ..., p_N};
A2, construct a rotation matrix R and a translation matrix T for the target point cloud set P, and construct an objective function E(R, T) from R and T;
A3, set a threshold and judge whether E(R, T) is smaller than the threshold; if so, the laser closed loop is deemed closed and construction of the laser closed-loop edge is finished; otherwise, go to step A4;
A4, apply the rotation matrix R and the translation matrix T to the source point cloud set Q to obtain a point set M;
A5, register the point set M against the target point cloud set P to obtain a new rotation matrix R' and a new translation matrix T';
A6, assign R' to R and T' to T, substitute the updated R and T into the objective function E(R, T), and return to step A3.
The objective function E(R, T) is:
E(R, T) = (1/N) * Σ_{i=1}^{N} ||p_i - (R*q_i + T)||^2
where N is the total number of points in the point cloud sets, i = 1, 2, ..., N, q_i is the ith point in the source point cloud set, and p_i is the ith point in the target point cloud set.
The specific method for optimizing the vertex-constraint graph with the graph optimization algorithm in step S3 is: adjust the pose vertices in the vertex-constraint graph so as to minimize the error function F(x) of the pose information, obtaining the pose vertices that satisfy the constraints to the greatest extent.
The error function F(x) is:
F(x) = Σ_{<i,j> ∈ C} e_ij(x_i, x_j, z_ij)^T * Ω_ij * e_ij(x_i, x_j, z_ij)
where x_i denotes pose vertex i, x_j denotes pose vertex j, C denotes the set of constrained vertex pairs of the graph, Ω_ij is the information matrix of the observation between x_i and x_j, e_ij(x_i, x_j, z_ij) measures the degree to which x_i and x_j satisfy the constraint z_ij, and z_ij is the actual observation between pose vertex i and pose vertex j acquired by the sensors. The observation z_ij comprises the pose transformation between adjacent pose vertices i and j, the distance between pose vertex i and anchor point j, and the pose transformation between non-adjacent pose vertices i and j.
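To illustrate the adjustment itself, here is a deliberately naive sketch that nudges the vertex coordinates by numeric gradient descent on F(x); real graph optimizers use Gauss-Newton or Levenberg-Marquardt (e.g., in g2o), so this is only a conceptual stand-in, and the toy constraint is our own example.

```python
import numpy as np

def minimize_F(x0, F, lr=0.01, iters=200, h=1e-5):
    """x0: all vertex coordinates flattened into one array; F: error function."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        grad = np.array([(F(x + h * e) - F(x - h * e)) / (2 * h)
                         for e in np.eye(len(x))])   # central differences
        x -= lr * grad                               # adjust the pose vertices
    return x

# Toy example: pull one coordinate toward an observation z = 1.0.
print(minimize_F([0.0], lambda v: (v[0] - 1.0) ** 2))   # -> close to [1.0]
```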
Step S4 comprises the following substeps:
S4.1, dividing the environment into a number of grid cells according to the environment information;
S4.2, calculating the occupancy probability l_{t,ij} of each grid cell; a grid cell whose occupancy probability l_{t,ij} is 0.8 or more is marked as an obstacle, grids containing an obstacle are rendered in gray and obstacle-free grids in white, yielding the grid map.
The occupancy probability l_{t,ij} of each grid cell is:
l_{t,ij} = l_{t-1,ij} + log( p(m_ij | z_t, x_t) / (1 - p(m_ij | z_t, x_t)) )
where l_{t-1,ij} is the occupancy value of the grid at the previous time step, p(m_ij | z_t, x_t) is the posterior probability that grid m_ij is occupied, m_ij is the grid cell with abscissa i and ordinate j, z_t is the observation at time t, namely the distance and angle between the mobile robot and an obstacle as measured by the lidar, and x_t is the pose of the mobile robot at time t.
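Since z_t here is a lidar (distance, angle) pair, the posterior p(m_ij | z_t, x_t) comes from an inverse sensor model. The sketch below shows one simple assumed model that marks cells along the beam as likely free and the endpoint cell as likely occupied; the 0.4/0.7 posteriors and the cell resolution are illustrative values, not taken from the patent.

```python
import math

def beam_to_cells(x, y, theta, dist, angle, res=0.05):
    """Map one beam from robot pose (x, y, theta) to ((i, j), posterior) pairs."""
    a = theta + angle                        # beam direction in the world frame
    for s in range(int(dist / res)):         # cells crossed before the hit
        cx = x + s * res * math.cos(a)
        cy = y + s * res * math.sin(a)
        yield (round(cx / res), round(cy / res)), 0.4   # probably free
    ex = x + dist * math.cos(a)
    ey = y + dist * math.sin(a)
    yield (round(ex / res), round(ey / res)), 0.7       # probably occupied

for cell, p in beam_to_cells(0.0, 0.0, 0.0, 0.2, 0.0):
    print(cell, p)   # free cells along the ray, then the hit cell at 0.7
```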
In this embodiment, the experimental scenario is a corridor whose two opposite sides are walls.
As shown in fig. 2, map a is constructed from the raw odometer alone, map b from the odometer and UWB together, and map c from UWB, the odometer, and the lidar according to the present invention. Comparative analysis of the results shows that the map constructed by the present method reproduces the walls more accurately and effectively identifies the obstacles.
In the invention, the distance information between the mobile robot and the anchor points, the pose information of the mobile robot, and the environment information are acquired by the UWB, the odometer, and the lidar respectively to construct the map; because multiple sensors contribute data, the constructed map is more accurate.
By constructing a vertex-constraint graph and optimizing it with the graph optimization algorithm, the pose information of the mobile robot becomes more accurate, laying the foundation for building an accurate indoor map.
The odometer supplies accurate short-term pose changes of the robot, the fused UWB positioning information supplies accurate long-term pose changes, and map construction combines both with the lidar measurements, thereby solving the problem of low accuracy when a lidar alone maps a complex indoor environment.

Claims (9)

1. A method for constructing an indoor map of a mobile robot by fusing multiple sensors, characterized by comprising the following steps:
S1, acquiring the distance information between the mobile robot and anchor points, the pose information of the mobile robot, and the environment information through UWB, an odometer, and a lidar, respectively;
S2, constructing a vertex-constraint graph from the distance information, the pose information, and the environment information;
S3, optimizing the vertex-constraint graph with a graph optimization algorithm to obtain optimized trajectory data of the mobile robot;
S4, constructing a grid map from the optimized trajectory data and the environment information.
2. The multi-sensor-fused mobile robot indoor map construction method of claim 1, wherein step S1 comprises the steps of:
S1.1, mounting a UWB (ultra-wideband) tag and a lidar on the mobile robot, the robot being equipped with an odometer and an encoder;
S1.2, acquiring the distance information between the mobile robot and the anchor points through the UWB tag;
S1.3, acquiring the pose data of the robot through the odometer, the odometer acquiring its data from the encoder;
S1.4, obtaining the environment information by lidar scanning.
3. The multi-sensor-fused mobile robot indoor map construction method of claim 1, wherein step S2 comprises the steps of:
S2.1, forming vertices from the pose information of the mobile robot and constructing odometer-based edges to obtain a first initial graph;
S2.2, adding UWB constraints to the first initial graph and constructing UWB-based edges to obtain a second initial graph;
S2.3, performing closed-loop detection on the environment data and adding laser-based edges to the second initial graph, constructing laser closed-loop edges, to obtain the vertex-constraint graph.
4. The method for constructing an indoor map of a mobile robot by fusing multiple sensors of claim 3, wherein the laser closed-loop edges in step S2.3 are constructed as follows:
A1, construct a source point cloud set Q = {q_1, q_2, ..., q_N} and a target point cloud set P = {p_1, p_2, ..., p_N};
A2, construct a rotation matrix R and a translation matrix T for the target point cloud set P, and construct an objective function E(R, T) from R and T;
A3, set a threshold and judge whether E(R, T) is smaller than the threshold; if so, the laser closed loop is deemed closed and construction of the laser closed-loop edge is finished; otherwise, go to step A4;
A4, apply the rotation matrix R and the translation matrix T to the source point cloud set Q to obtain a point set M;
A5, register the point set M against the target point cloud set P to obtain a new rotation matrix R' and a new translation matrix T';
A6, assign R' to R and T' to T, substitute the updated R and T into the objective function E(R, T), and return to step A3.
5. The multi-sensor-fused mobile robot indoor map construction method of claim 4, wherein the objective function E(R, T) is:
E(R, T) = (1/N) * Σ_{i=1}^{N} ||p_i - (R*q_i + T)||^2
where N is the total number of points in the point cloud sets, i = 1, 2, ..., N, q_i is the ith point in the source point cloud set, and p_i is the ith point in the target point cloud set.
6. The method for constructing an indoor map of a mobile robot by fusing multiple sensors of claim 1, wherein the specific method for optimizing the vertex-constraint graph with the graph optimization algorithm in step S3 is: adjust the pose vertices in the vertex-constraint graph so as to minimize the error function F(x) of the pose information, obtaining the pose vertices that satisfy the constraints to the greatest extent.
7. The method for constructing an indoor map of a mobile robot by fusing multiple sensors of claim 6, wherein the error function F(x) is:
F(x) = Σ_{<i,j> ∈ C} e_ij(x_i, x_j, z_ij)^T * Ω_ij * e_ij(x_i, x_j, z_ij)
where x_i denotes pose vertex i, x_j denotes pose vertex j, C denotes the set of constrained vertex pairs of the graph, Ω_ij is the information matrix of the observation between x_i and x_j, e_ij(x_i, x_j, z_ij) measures the degree to which x_i and x_j satisfy the constraint z_ij, and z_ij is the actual observation between pose vertex i and pose vertex j acquired by the sensors. The observation z_ij comprises the pose transformation between adjacent pose vertices i and j, the distance between pose vertex i and anchor point j, and the pose transformation between non-adjacent pose vertices i and j.
8. The method for constructing an indoor map of a mobile robot by fusing multiple sensors of claim 1, wherein step S4 comprises the following substeps:
S4.1, dividing the environment into a number of grid cells according to the environment information;
S4.2, calculating the occupancy probability l_{t,ij} of each grid cell; a grid cell whose occupancy probability l_{t,ij} is 0.8 or more is marked as an obstacle, grids containing an obstacle are rendered in gray and obstacle-free grids in white, yielding the grid map.
9. The multi-sensor-fused mobile robot indoor map construction method of claim 8, wherein the occupancy probability l_{t,ij} of each grid cell is:
l_{t,ij} = l_{t-1,ij} + log( p(m_ij | z_t, x_t) / (1 - p(m_ij | z_t, x_t)) )
where l_{t-1,ij} is the occupancy value of the grid at the previous time step, p(m_ij | z_t, x_t) is the posterior probability that grid m_ij is occupied, m_ij is the grid cell with abscissa i and ordinate j, z_t is the observation at time t, and x_t is the pose of the mobile robot at time t.
CN201910915091.4A 2019-09-26 2019-09-26 Mobile robot indoor map construction method fusing multiple sensors Expired - Fee Related CN110645974B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910915091.4A CN110645974B (en) 2019-09-26 2019-09-26 Mobile robot indoor map construction method fusing multiple sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910915091.4A CN110645974B (en) 2019-09-26 2019-09-26 Mobile robot indoor map construction method fusing multiple sensors

Publications (2)

Publication Number Publication Date
CN110645974A true CN110645974A (en) 2020-01-03
CN110645974B CN110645974B (en) 2020-11-27

Family

ID=68992647

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910915091.4A Expired - Fee Related CN110645974B (en) 2019-09-26 2019-09-26 Mobile robot indoor map construction method fusing multiple sensors

Country Status (1)

Country Link
CN (1) CN110645974B (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070293985A1 (en) * 2006-06-20 2007-12-20 Samsung Electronics Co., Ltd. Method, apparatus, and medium for building grid map in mobile robot and method, apparatus, and medium for cell decomposition that uses grid map
CN106643720A (en) * 2016-09-28 2017-05-10 深圳市普渡科技有限公司 Method for map construction based on UWB indoor locating technology and laser radar
CN107179080A (en) * 2017-06-07 2017-09-19 纳恩博(北京)科技有限公司 The localization method and device of electronic equipment, electronic equipment, electronic positioning system
CN109059942A (en) * 2018-08-22 2018-12-21 中国矿业大学 A kind of high-precision underground navigation map building system and construction method
CN109848996A (en) * 2019-03-19 2019-06-07 西安交通大学 Extensive three-dimensional environment map creating method based on figure optimum theory
CN109974712A (en) * 2019-04-22 2019-07-05 广东亿嘉和科技有限公司 It is a kind of that drawing method is built based on the Intelligent Mobile Robot for scheming optimization

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jens-Steffen Gutmann et al., "Incremental Mapping of Large Cyclic Environments", Proc. Conference on Intelligent Robots & Applications. *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111551184B (en) * 2020-03-27 2021-11-26 上海大学 Map optimization method and system for SLAM of mobile robot
CN111551184A (en) * 2020-03-27 2020-08-18 上海大学 Map optimization method and system for SLAM of mobile robot
CN111427047A (en) * 2020-03-30 2020-07-17 哈尔滨工程大学 Autonomous mobile robot SLAM method in large scene
CN111427047B (en) * 2020-03-30 2023-05-05 哈尔滨工程大学 SLAM method for autonomous mobile robot in large scene
CN111694006A (en) * 2020-05-29 2020-09-22 杭州电子科技大学 Navigation sensing system for indoor unmanned system
CN111862216B (en) * 2020-07-29 2023-05-26 上海高仙自动化科技发展有限公司 Computer equipment positioning method, device, computer equipment and storage medium
CN111862216A (en) * 2020-07-29 2020-10-30 上海高仙自动化科技发展有限公司 Computer equipment positioning method and device, computer equipment and storage medium
CN111862162A (en) * 2020-07-31 2020-10-30 湖北亿咖通科技有限公司 Loop detection method and system, readable storage medium and electronic device
CN111983636A (en) * 2020-08-12 2020-11-24 深圳华芯信息技术股份有限公司 Pose fusion method, pose fusion system, terminal, medium and mobile robot
CN112179353A (en) * 2020-09-30 2021-01-05 深圳市银星智能科技股份有限公司 Positioning method and device of self-moving robot, robot and readable storage medium
CN112325872A (en) * 2020-10-27 2021-02-05 上海懒书智能科技有限公司 Positioning method of mobile equipment based on multi-sensor coupling
CN112325872B (en) * 2020-10-27 2022-09-30 上海懒书智能科技有限公司 Positioning method of mobile equipment based on multi-sensor coupling
CN112230243A (en) * 2020-10-28 2021-01-15 西南科技大学 Indoor map construction method for mobile robot
CN112362045B (en) * 2020-11-19 2022-03-29 佛山科学技术学院 Device for building graph based on laser SLAM and memory optimization method
CN112362045A (en) * 2020-11-19 2021-02-12 佛山科学技术学院 Device for building graph based on laser SLAM and memory optimization method
CN112578798B (en) * 2020-12-18 2024-02-27 珠海格力智能装备有限公司 Robot map acquisition method and device, processor and electronic device
CN112578798A (en) * 2020-12-18 2021-03-30 珠海格力智能装备有限公司 Robot map acquisition method and device, processor and electronic device
CN112833876B (en) * 2020-12-30 2022-02-11 西南科技大学 Multi-robot cooperative positioning method integrating odometer and UWB
CN112833876A (en) * 2020-12-30 2021-05-25 西南科技大学 Multi-robot cooperative positioning method integrating odometer and UWB
CN112965063A (en) * 2021-02-11 2021-06-15 深圳市安泽智能机器人有限公司 Robot mapping and positioning method
CN112965063B (en) * 2021-02-11 2022-04-01 深圳市安泽智能机器人有限公司 Robot mapping and positioning method
CN113111081A (en) * 2021-04-16 2021-07-13 四川阿泰因机器人智能装备有限公司 Mobile robot mapping method under weak characteristic environment
CN113124880A (en) * 2021-05-18 2021-07-16 杭州迦智科技有限公司 Mapping and positioning method and device based on data fusion of two sensors
CN113252042B (en) * 2021-05-18 2023-06-13 杭州迦智科技有限公司 Positioning method and device based on laser and UWB in tunnel
CN113252042A (en) * 2021-05-18 2021-08-13 杭州迦智科技有限公司 Laser and UWB (ultra wide band) based positioning method and device in tunnel
CN113311452A (en) * 2021-05-26 2021-08-27 上海新纪元机器人有限公司 Positioning method and system based on multiple sensors
CN113640738A (en) * 2021-06-24 2021-11-12 西南科技大学 Rotary target positioning method combining IMU and single-group UWB
CN113670290A (en) * 2021-06-30 2021-11-19 西南科技大学 Mobile robot indoor map construction method based on multi-robot cooperation
CN113670290B (en) * 2021-06-30 2023-05-12 西南科技大学 Mobile robot indoor map construction method based on multi-robot cooperation
CN113741473A (en) * 2021-09-13 2021-12-03 深圳本云国际发展有限公司 Photocatalyst mobile robot and map construction method
CN113848912A (en) * 2021-09-28 2021-12-28 北京理工大学重庆创新中心 Indoor map establishing method and device based on autonomous exploration
CN113916232A (en) * 2021-10-18 2022-01-11 济南大学 Map construction method and system for improving map optimization
CN113916232B (en) * 2021-10-18 2023-10-13 济南大学 Map construction method and system for improving map optimization

Also Published As

Publication number Publication date
CN110645974B (en) 2020-11-27


Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20201127)