CN115388893B - Mobile robot mapping method based on genetic filtering SLAM algorithm - Google Patents

Mobile robot mapping method based on genetic filtering SLAM algorithm

Info

Publication number
CN115388893B
CN115388893B (application CN202211027811.1A)
Authority
CN
China
Prior art keywords
particle
particles
mobile robot
time
grid
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211027811.1A
Other languages
Chinese (zh)
Other versions
CN115388893A (en)
Inventor
张静
张雪鹏
申枭
李云松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhu Research Institute of Xidian University
Original Assignee
Wuhu Research Institute of Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhu Research Institute of Xidian University filed Critical Wuhu Research Institute of Xidian University
Priority to CN202211027811.1A
Publication of CN115388893A
Application granted
Publication of CN115388893B
Active legal status: Current
Anticipated expiration legal status


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/28 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
    • G01C21/30 Map- or contour-matching
    • G01C21/32 Structuring or formatting of map data
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a mobile robot mapping method based on a genetic filtering SLAM algorithm, whose implementation steps are: constructing a mobile robot mapping system; dividing the region to be mapped into grids; initializing parameters; collecting sensor data; updating, in the mapping module, the grid state information of each particle at every time step; and obtaining the mapping result of the mobile robot. After the pose of each particle has been updated, the invention applies a genetic crossover-and-mutation method: the corresponding particles are crossed and then mutated, so that the information of each particle is updated. This reduces particle degeneracy and preserves particle diversity, ensuring high-precision map construction of larger and more complex areas. In addition, when scan-matching the radar point cloud data, a point-to-line scan matching method replaces the point-to-point scan matching method of the prior art, which effectively reduces the number of scan-matching iterations and lowers the cost in computing resources.

Description

Mobile robot mapping method based on genetic filtering SLAM algorithm
Technical Field
The invention belongs to the field of robots and relates to a mobile robot mapping method, in particular to a mobile robot mapping method based on a genetic filtering SLAM algorithm, which can be used in fields such as intelligent logistics distribution, intelligent electric power inspection and intelligent port transportation.
Background
With the rapid development of artificial intelligence technology, mobile robots are playing an increasingly important role in civil applications thanks to their excellent performance and reasonable price. In fields such as intelligent logistics distribution, intelligent electric power inspection and intelligent port transportation, a mobile robot can perform accurate and reliable autonomous navigation and path planning only by building a high-precision two-dimensional map, which leads to the problem of Simultaneous Localization and Mapping (SLAM), a popular research topic in the robotics field. For cost reasons the on-board computing capability of a mobile robot is limited, so particle-filter-based methods are generally used to solve the SLAM problem. The patent application "Gmapping map building method of a mobile robot based on sparse pose adjustment" of Beijing University of Architecture (application number CN202010515565.9, filed 2020.06.09, publication number CN111427370A) discloses a Gmapping mapping method for a mobile robot based on sparse pose adjustment, implemented as follows: S1, initializing the pose and distribution of the particles; S2, scan matching; S3, calculating the target distribution of the sampling positions x_j; S4, calculating a Gaussian approximation; S5, updating the weight of the i-th particle; S6, updating the particle map; S3' pose graph construction and S4' closed-loop approximation are carried out in parallel with S3 and S4; and finally a two-dimensional grid map is generated. That invention improves the mapping accuracy of the original particle filter algorithm to a certain extent. However, it has the drawback that it must resample repeatedly to counteract particle degeneracy; as the number of resampling steps grows, the particles that finally remain are the heavy-weight particles and copies descended from them, so particle degeneracy and loss of particle diversity occur, which degrades mapping accuracy. That method also uses a point-to-point scan matching approach with a high computational cost: if the initial match is poor the number of matching iterations increases, and when computing resources are insufficient the matching becomes slow and inaccurate.
Disclosure of Invention
The purpose of the invention is to overcome the above deficiencies of the prior art by providing a mobile robot mapping method based on a genetic filtering SLAM algorithm, which solves the technical problems of particle degeneracy and reduced particle diversity caused by repeated resampling in the prior art (and the resulting loss of mapping accuracy), while reducing the computing-resource cost of mapping.
In order to achieve the above purpose, the technical scheme adopted by the invention comprises the following steps:
(1) Building a mobile robot mapping system:
The mobile robot mapping system comprises a mobile robot, located in the area to be mapped and equipped with a sensor suite, a mapping module and a motion module, and a remote upper computer, located in an office area and comprising a remote control module and a display interface; the sensor suite comprises a laser radar, an inertial measurement unit (IMU) and an odometer;
(2) Dividing the region to be mapped into grids:
Dividing the region to be mapped into C×R uniform grids Δ = {Δ_1, Δ_2, ..., Δ_c, ..., Δ_C}, with Δ_c = {Δ_{c,1}, Δ_{c,2}, ..., Δ_{c,r}, ..., Δ_{c,R}}, where C > 2, R > 2, Δ_c denotes the subset of R grids in row c, and Δ_{c,r} denotes the r-th grid in Δ_c;
(3) Initializing parameters:
Initializing the mapping time t, the mapping end time t_end, and the grid state information of Δ at time t as Odd_t(Δ); initializing the particle set input to the SLAM algorithm, which contains K particles with K ≥ 15, where each particle L_k at time t carries the pose information P_t^k of the mobile robot and its own grid state information; and letting t = 1;
(4) The sensor collects data:
The mobile robot starts to move under the control instructions sent by the remote upper computer. At time t the laser radar, the inertial measurement unit (IMU) and the odometer in the sensor suite respectively acquire the radar point cloud data z_t of I points in the surrounding environment, the acceleration information a_t of the mobile robot, the moving distance information d_t of the mobile robot and its angle θ_t with the x-axis direction, and upload z_t together with the trajectory information u_t = {a_t, d_t, θ_t} composed of a_t, d_t and θ_t to the mapping module, where I ≥ 300 and z_t^i denotes the radar data of the i-th point at time t;
(5) The mapping module updates the grid state information of each particle at time t:
(5a) The mapping module estimates the pose P_t^k of each particle L_k at time t from its pose information P_{t-1}^k at time t-1 and the mobile robot trajectory information u_{t-1};
(5b) The mapping module scan-matches the radar point cloud data z_{t-1} at time t-1 with the radar point cloud data z_t at time t to obtain the relative pose P_re between the two point clouds, and updates P_t^k through P_re, obtaining the particle set L'(t) whose K particles carry the updated poses P_t' = {P_t'^1, P_t'^2, ..., P_t'^k, ..., P_t'^K};
(5c) The mapping module applies the genetic crossover-and-mutation method to each particle in L'(t), performing crossover and then mutation to obtain the particle set L_cv(t); it then judges whether the particle evaluation number N_eff calculated from the weight set corresponding to L_cv(t) and the preset resampling particle-number threshold N_th satisfy N_eff < N_th; if so, L_cv(t) is resampled to obtain K result particles, otherwise the K particles in L_cv(t) are taken as result particles; the N_eff result particles with the largest weights are then taken as the effective particles;
(5d) The mapping module updates the grid state information of each effective particle at time t according to that particle's grid state information and pose information at time t-1 and its pose information P_t''^{n_eff} at time t;
(6) Obtaining the mapping result of the mobile robot:
(6a) The upper computer determines, from the grid state information of each particle at time t, the grid state information Odd_t(Δ_{c,r}) of each grid Δ_{c,r} and, using the preset passable threshold σ and obstacle threshold τ, classifies each grid Δ_{c,r}: grids with Odd_t(Δ_{c,r}) ≤ σ, Odd_t(Δ_{c,r}) > τ and σ < Odd_t(Δ_{c,r}) ≤ τ are determined to be in the passable state, the obstacle state and the unknown state respectively, and the three states are painted in different colors to obtain the grid state diagram at time t;
(6b) Judging whether t > t_end holds; if so, the grid state diagram at time t_end is taken as the two-dimensional grid map of the mobile robot; otherwise, let t = t+1 and return to step (4).
Compared with the prior art, the invention has the following advantages:
1. In the process of acquiring the two-dimensional grid map of the mobile robot based on the genetic filtering SLAM algorithm, the mapping module first applies the genetic crossover-and-mutation method to each particle and updates the information of each particle, thereby reducing particle degeneracy and preserving particle diversity.
2. When scan-matching the radar point cloud data, a point-to-line scan matching method is used in place of the point-to-point scan matching method of the prior art, which effectively reduces the number of scan-matching iterations, lowers the computing-resource cost while preserving mapping accuracy, and thereby reduces the cost of the mobile robot.
Drawings
FIG. 1 is a flow chart of an implementation of the present invention;
FIG. 2 is a schematic diagram of a mobile robot mapping system according to the present invention;
FIG. 3 is a schematic diagram of the structure of the area to be mapped according to the present invention;
FIG. 4 is a comparison of the mapping results of the present invention and the prior art in a small scene;
FIG. 5 is a comparison of the mapping results of the present invention and the prior art in a large scene.
Detailed Description
The invention is described in further detail below with reference to the drawings and the specific examples.
Referring to fig. 1, the present invention includes the steps of:
Step 1) Constructing a mobile robot mapping system:
A mobile robot mapping system as shown in FIG. 2 is constructed, comprising a mobile robot, located in the area to be mapped and equipped with a sensor suite, a mapping module and a motion module, and a remote upper computer, located in an office area and comprising a remote control module and a display interface; the sensor suite comprises a laser radar, an inertial measurement unit (IMU) and an odometer.
The region to be mapped measures 9 m × 5 m (length × width).
Step 2) Dividing the region to be mapped into grids:
The region to be mapped is divided into C×R uniform grids Δ = {Δ_1, Δ_2, ..., Δ_c, ..., Δ_C}, with Δ_c = {Δ_{c,1}, Δ_{c,2}, ..., Δ_{c,r}, ..., Δ_{c,R}}, where C > 2, R > 2, Δ_c denotes the subset of R grids in row c and Δ_{c,r} denotes the r-th grid in Δ_c; in this embodiment C = R = 1200.
The grid division can be adjusted to meet different mapping-resolution requirements while avoiding excessive useless computation.
Step 3) initializing parameters:
The mapping time t, the mapping end time t_end and the grid state information of Δ at time t, Odd_t(Δ), are initialized; the particle set input to the SLAM algorithm is initialized to contain K particles with K ≥ 15, where each particle L_k at time t carries the pose information P_t^k of the mobile robot and its own grid state information; and t is set to 1. In this embodiment K = 30.
The number of particles can be adjusted for different environments so that better mapping accuracy is achieved within the available computing resources.
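To make steps 2) and 3) concrete, the following Python sketch initializes a C×R grid and a particle set with the values used in this embodiment (C = R = 1200, K = 30). The Particle class, its field names, the zero (unknown) grid initialization and the value of t_end are illustrative assumptions, not data structures prescribed by the patent.

```python
import numpy as np
from dataclasses import dataclass

C, R = 1200, 1200        # grid rows and columns (C = R = 1200 in this embodiment)
K = 30                   # number of particles (K >= 15 is required, K = 30 here)

@dataclass
class Particle:
    """One particle L_k: robot pose P_t^k plus its own grid state information."""
    pose: np.ndarray     # (x, y, theta) pose of the mobile robot
    grid: np.ndarray     # C x R per-particle grid state, 0 = unknown (assumed encoding)
    weight: float        # particle weight, initialized uniformly

def init_particles(x0=0.0, y0=0.0, theta0=0.0):
    """Create K identical particles at the initial pose P_0 = (x0, y0, theta0)."""
    return [Particle(pose=np.array([x0, y0, theta0]),
                     grid=np.zeros((C, R)),
                     weight=1.0 / K) for _ in range(K)]

particles = init_particles()
t, t_end = 1, 1000       # mapping start time and an illustrative end time
print(len(particles), particles[0].grid.shape)
```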
Step 4) the sensor collects data:
The mobile robot starts to move under the control instructions sent by the remote upper computer. At time t the laser radar, the inertial measurement unit (IMU) and the odometer in the sensor suite respectively acquire the radar point cloud data z_t of I points in the surrounding environment, the acceleration information a_t of the mobile robot, the moving distance information d_t of the mobile robot and its angle θ_t with the x-axis direction, and upload z_t together with the trajectory information u_t = {a_t, d_t, θ_t} composed of a_t, d_t and θ_t to the mapping module, where I ≥ 300 and z_t^i denotes the radar data of the i-th point at time t; in this embodiment I = 359.
Step 5) The mapping module updates the grid state information of each particle at time t:
(5a) The mapping module estimates the pose P_t^k of each particle L_k at time t from its pose information P_{t-1}^k at time t-1 and the mobile robot trajectory information u_{t-1}, sampling P_t^k ("~", distributed as) from the motion distribution conditioned on P_{t-1}^k and u_{t-1}.
Here the mobile robot trajectory information at time t = 0 is u_0 = {0, 0}.
P_t^k is used as the initial value of the scan matching in step (5b), which effectively reduces both the probability of matching failure and the amount of computation required for scan matching.
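A minimal sketch of this prediction step, assuming a simple odometry-style motion model: the distance d and heading θ carried by u_{t-1} are applied to the previous pose and perturbed with Gaussian noise, so that each particle's new pose is sampled ("~") rather than computed deterministically. The function name and the noise parameters are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict_pose(prev_pose, u_prev, sigma_d=0.02, sigma_theta=0.01):
    """Sample a pose P_t^k from the motion model given P_{t-1}^k and u_{t-1}.

    prev_pose : (x, y, theta) pose of the particle at time t-1
    u_prev    : dict with keys 'a' (acceleration), 'd' (distance moved),
                'theta' (angle with the x-axis), as in u_t = {a_t, d_t, theta_t}
    """
    x, y, _ = prev_pose
    # perturb the odometry increments so different particles spread out
    d     = u_prev["d"]     + rng.normal(0.0, sigma_d)
    alpha = u_prev["theta"] + rng.normal(0.0, sigma_theta)
    return np.array([x + d * np.cos(alpha), y + d * np.sin(alpha), alpha])

# usage: propagate every particle with the same control but independent noise
u_prev = {"a": 0.1, "d": 0.05, "theta": 0.02}
print(predict_pose(np.array([0.0, 0.0, 0.0]), u_prev))
```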
(5b) The mapping module scan-matches the radar point cloud data z_{t-1} at time t-1 with the radar point cloud data z_t at time t to obtain the relative pose P_re between the two point clouds, and updates P_t^k through P_re, obtaining the particle set L'(t) whose K particles carry the updated poses P_t' = {P_t'^1, P_t'^2, ..., P_t'^k, ..., P_t'^K}. The implementation steps are as follows:
(5b1) For each point to be matched in z_{t-1}, the mapping module finds the two points in z_t closest to it and expresses the data of these two points as a vector;
(5b2) The mapping module converts the pose P_t^k contained in each particle into a rotation matrix and a translation matrix, which serve as the initial values of the scan matching; according to these matrices it first rotates and then translates z_{t-1} to match it against z_t, driving the matching error, accumulated from the terms e_i^T e_i, below a set matching threshold δ, where R and T denote the rotation and translation operations applied to z_{t-1}, i.e. the relative pose P_re between the two point clouds, e_i^T denotes the transpose of the vector e_i, and δ = 0.1 in this example;
(5b3) The mapping module updates P_t^k through P_re to obtain the particle set L'(t) whose K particles carry the updated poses P_t' = {P_t'^1, P_t'^2, ..., P_t'^k, ..., P_t'^K}; the update formula for P_t^k is:
P_t'^k = R·P_t^k + T.
Here the initial pose of the mobile robot at time t = 0 is P_0 = (x_0, y_0, θ_0), where x_0 and y_0 denote the initial transverse and longitudinal coordinates of the stationary robot in the region to be mapped and θ_0 denotes the angle between the stationary robot and the x-axis direction; the radar point cloud data at time t = 0 is taken to be the point cloud acquired before the mobile robot starts to move.
In the point-to-point scan matching method of the prior art, the Euclidean distance between points is used as the error measure of the objective function, and accuracy drops when the point cloud contains many outliers or heavy noise; in addition, searching for corresponding points is computationally expensive, the number of iterations grows when the initial match is poor, the computational requirements on the system are high, and matching becomes slow and inaccurate when computing resources are insufficient. Therefore, when solving the scan matching problem, the invention takes the distance from any point in z_{t-1} to the line connecting its two nearest points in z_t as the error function, which increases the probability of a successful match and reduces the computational cost.
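The sketch below illustrates the point-to-line error for one candidate rotation R and translation T: each transformed point of z_{t-1} is scored by its perpendicular distance to the line through its two nearest neighbours in z_t, and the candidate is accepted when the accumulated error is below δ = 0.1. The brute-force nearest-neighbour search and the helper name are assumptions; a full matcher would also iterate on (R, T) to shrink this error.

```python
import numpy as np

def point_to_line_error(z_prev, z_curr, R, T):
    """Sum of squared point-to-line distances after applying (R, T) to z_prev.

    z_prev, z_curr : (N, 2) and (M, 2) arrays of 2-D radar points
    R              : 2x2 rotation matrix, T : length-2 translation vector
    """
    transformed = z_prev @ R.T + T                 # rotate, then translate z_{t-1}
    total = 0.0
    for p in transformed:
        d2 = np.sum((z_curr - p) ** 2, axis=1)     # squared distances to all z_t points
        j1, j2 = np.argsort(d2)[:2]                # indices of the two nearest points
        a, b = z_curr[j1], z_curr[j2]
        ab = b - a
        # perpendicular distance from p to the line through a and b (error term e_i)
        e_i = (ab[0] * (p[1] - a[1]) - ab[1] * (p[0] - a[0])) / (np.linalg.norm(ab) + 1e-12)
        total += e_i * e_i
    return total

delta = 0.1                                        # matching threshold used here
z_prev = np.random.rand(50, 2)
z_curr = z_prev + 0.01                             # toy "next" scan
err = point_to_line_error(z_prev, z_curr, np.eye(2), np.zeros(2))
print(err, err < delta)                            # accept the pose when err < delta
```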
(5c) The mapping module applies the genetic crossover-and-mutation method to each particle in L'(t), performing crossover and then mutation to obtain the particle set L_cv(t); it then judges whether the particle evaluation number N_eff calculated from the weight set corresponding to L_cv(t) and the preset resampling particle-number threshold N_th satisfy N_eff < N_th; if so, L_cv(t) is resampled to obtain K result particles, otherwise the K particles in L_cv(t) are taken as result particles; the N_eff result particles with the largest weights are then taken as the effective particles. The implementation steps are:
(5c1) Based on the grid state information and the pose information P_t'^k of each particle at time t and on the radar point cloud data z_t at time t, the mapping module updates the weight of each particle and normalizes the updated weights (dividing each weight by the sum over all particles) to obtain the normalized weight set;
here the grid state information of each particle L_k at time t = 0 is Odd_0(Δ);
(5c2) The mapping module arranges the K normalized weights in descending order; the L particles whose weights are smaller than the preset particle weight threshold W_th form the small-weight particle set L_l, with L ≥ 7, and the remaining K-L particles form the large-weight particle set L_h; one particle x_l is then drawn at random from L_l and one particle x_h from L_h, and a crossover operation is performed on them to obtain the crossed particle set.
In the crossover, a new particle is generated from x_l and x_h, α is the crossover parameter, and α = 0.35 in this example. α controls how much of its information a small-weight particle passes to a large-weight particle: the larger α is, the more of its own information the small-weight particle retains. In particular, α = 0 means that no information is exchanged between the small-weight and large-weight particles, while α = 1 means that their information is fully interchanged.
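A minimal sketch of the crossover just described, assuming an arithmetic blend of the two drawn poses in which α weights the small-weight particle x_l; the exact blending formula used by the patent is not asserted here, and the weight threshold W_th = 0.02 is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def crossover(poses, weights, w_th=0.02, alpha=0.35):
    """Arithmetic crossover between one small-weight and one large-weight particle.

    poses   : (K, 3) particle poses (x, y, theta)
    weights : (K,) normalized particle weights
    w_th    : particle weight threshold W_th separating the sets L_l and L_h
    alpha   : crossover parameter in [0, 1] (alpha = 0.35 in this embodiment)
    """
    small = np.flatnonzero(weights <  w_th)        # small-weight set L_l
    large = np.flatnonzero(weights >= w_th)        # large-weight set L_h
    out = poses.copy()
    if small.size and large.size:
        l = rng.choice(small)                      # randomly drawn x_l
        h = rng.choice(large)                      # randomly drawn x_h
        # blend: alpha controls how much of x_l's information the new particle keeps
        out[l] = alpha * poses[l] + (1.0 - alpha) * poses[h]
    return out

poses = rng.normal(size=(30, 3))
weights = rng.random(30); weights /= weights.sum()
print(crossover(poses, weights)[0])
```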
(5c3) The mapping module mutates the particles produced by the crossover operation to obtain the particle set L_cv(t) after genetic crossover and mutation, and judges whether the particle evaluation number N_eff calculated from the weight set corresponding to L_cv(t) and the preset resampling particle-number threshold N_th satisfy N_eff < N_th; if so, all particles in L_cv(t) are resampled to obtain K result particles, otherwise all particles in L_cv(t) are taken as result particles; the weights of the result particles are then arranged in descending order and the first N_eff particles are taken as the effective particles.
In the mutation, a new particle is generated from the crossed particle, p_l is a random variable independent of the particle weight and uniformly distributed with p_l ∈ [0, 1], and p_m is the mutation probability with p_m ∈ [0, 1];
p_m determines the mutation probability of the small-weight particles; by mutating the information carried by the small-weight particles, better particle pose and map state information can be obtained, which in turn affects the final mapping result.
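The sketch below strings together an assumed mutation rule (a Gaussian perturbation applied when a uniform draw p_l falls below p_m), the usual effective-sample-size estimate for N_eff, and the resampling decision N_eff < N_th of step (5c3); the perturbation form and the numerical parameters are assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

def mutate(poses, p_m=0.2, sigma=0.05):
    """Mutation: perturb a particle when its uniform draw p_l is below p_m."""
    out = poses.copy()
    for k in range(len(poses)):
        p_l = rng.random()                         # p_l ~ uniform on [0, 1]
        if p_l < p_m:
            out[k] = poses[k] + rng.normal(0.0, sigma, size=poses[k].shape)
    return out

def n_eff(weights):
    """Usual effective-sample-size estimate computed from normalized weights."""
    return 1.0 / np.sum(weights ** 2)

def select_effective(poses, weights, n_th):
    """Resample when N_eff < N_th, then keep the largest-weight particles."""
    K = len(weights)
    if n_eff(weights) < n_th:
        idx = rng.choice(K, size=K, p=weights)     # importance resampling
        poses, weights = poses[idx], np.full(K, 1.0 / K)
    keep = np.argsort(weights)[::-1][: max(1, int(n_eff(weights)))]
    return poses[keep], weights[keep]

poses = rng.normal(size=(30, 3))
weights = rng.random(30); weights /= weights.sum()
eff_poses, eff_w = select_effective(mutate(poses), weights, n_th=15)
print(len(eff_poses))
```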
(5d) The mapping module updates the grid state information of each effective particle at time t according to that particle's grid state information and pose information at time t-1 and its pose information P_t''^{n_eff} at time t.
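As a hedged illustration of this grid update, the sketch below performs a standard log-odds style occupancy update by casting each radar beam from the effective particle's pose: cells crossed by a beam are made more "free", the endpoint cell more "occupied". The cell size and the increments l_occ and l_free are assumptions, and the patent's exact update formula may differ.

```python
import numpy as np

L_OCC, L_FREE = 0.85, -0.4         # illustrative log-odds increments
CELL = 0.05                        # grid resolution in metres (assumed)

def update_grid(grid, pose, scan):
    """Log-odds style occupancy update of one particle's grid from one scan.

    grid : (C, R) per-particle grid state
    pose : (x, y, theta) effective-particle pose at time t
    scan : (I, 2) array of (range, bearing) radar returns z_t
    """
    x, y, theta = pose
    for dist, bearing in scan:
        ex = x + dist * np.cos(theta + bearing)    # beam endpoint in world frame
        ey = y + dist * np.sin(theta + bearing)
        n = max(int(dist / CELL), 2)
        for s in np.linspace(0.0, 1.0, n)[:-1]:    # cells along the beam: free
            c, r = int((x + s * (ex - x)) / CELL), int((y + s * (ey - y)) / CELL)
            if 0 <= c < grid.shape[0] and 0 <= r < grid.shape[1]:
                grid[c, r] += L_FREE
        c, r = int(ex / CELL), int(ey / CELL)      # endpoint cell: occupied
        if 0 <= c < grid.shape[0] and 0 <= r < grid.shape[1]:
            grid[c, r] += L_OCC
    return grid

grid = np.zeros((1200, 1200))
scan = np.column_stack([np.full(359, 2.0), np.linspace(-np.pi, np.pi, 359)])
print(update_grid(grid, (30.0, 30.0, 0.0), scan).max())
```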
Step 6) obtaining a mapping result of the mobile robot:
(6a) The upper computer determines, from the grid state information of each particle at time t, the grid state information Odd_t(Δ_{c,r}) of each grid Δ_{c,r} and, using the preset passable threshold σ and obstacle threshold τ, classifies each grid Δ_{c,r}: grids with Odd_t(Δ_{c,r}) ≤ σ, Odd_t(Δ_{c,r}) > τ and σ < Odd_t(Δ_{c,r}) ≤ τ are determined to be in the passable state, the obstacle state and the unknown state respectively, and the three states are painted in different colors to obtain the grid state diagram at time t; in this example σ = 0.2 and τ = 0.65;
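A small sketch of step (6a): each cell is painted white, black or grey according to the thresholds σ = 0.2 and τ = 0.65. Treating the stored grid state as a log-odds value that is first converted to an occupancy probability, as well as the colour choices, are assumptions of this sketch.

```python
import numpy as np

SIGMA, TAU = 0.2, 0.65             # passable threshold and obstacle threshold

def render(grid_state):
    """Paint each grid cell according to its state and the two thresholds.

    Returns an RGB image: white = passable, black = obstacle, grey = unknown.
    """
    p = 1.0 / (1.0 + np.exp(-grid_state))          # assumed log-odds -> probability
    img = np.full(grid_state.shape + (3,), 128, dtype=np.uint8)   # unknown: grey
    img[p <= SIGMA] = 255                          # passable state: white
    img[p > TAU] = 0                               # obstacle state: black
    return img

grid = np.zeros((1200, 1200))
grid[600, 600] = 3.0                               # one strongly occupied cell
image = render(grid)
print(image[600, 600], image[0, 0])                # black pixel vs grey pixel
```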
(6b) Judging whether t > t_end holds; if so, the grid state diagram at time t_end is taken as the two-dimensional grid map of the mobile robot; otherwise, let t = t+1 and return to step (4).
The technical effects of the present invention will be described below with reference to simulation experiments.
1. Simulation conditions and content:
The remote upper computer hardware platform of the simulation experiment is a Dell Alienware notebook computer with an NVIDIA NX processor. The software platform is the Ubuntu 18.04 operating system, ROS Melodic and C++11. The host computer of the mobile robot is a Raspberry Pi 4B, which supports Bluetooth and WiFi.
The method of the present invention and the existing Gmapping mapping method for mobile robots are each used to perform 10 comparative mapping simulations in a small scene and in a large scene; the results are shown in Table 1, FIG. 4 and FIG. 5.
2. Simulation result analysis:
The simulation experiments use the relative error of the measured map length (and width) to evaluate the mapping estimation error; the relative error is calculated as |measured value - true value| / true value, where | | denotes the absolute-value operation.
TABLE 1
Method          Measured length (m)   Relative error   Measured width (m)   Relative error
Prior art       9.24                  2.667%           5.11                 2.200%
The invention   8.95                  0.556%           5.04                 0.800%
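The percentages in Table 1 are consistent with computing the relative error as |measured value - true value| / true value against the 9 m × 5 m ground truth of the embodiment; the short check below reproduces every entry of the table under that reading (the exact printed form of the patent's formula is inferred from these values).

```python
# Check of Table 1, assuming relative error = |measured - true| / true with the
# ground-truth dimensions 9 m x 5 m of the embodiment.
true_len, true_wid = 9.0, 5.0
for name, length, width in [("prior art", 9.24, 5.11), ("invention", 8.95, 5.04)]:
    e_len = abs(length - true_len) / true_len
    e_wid = abs(width - true_wid) / true_wid
    print(f"{name}: length error {e_len:.3%}, width error {e_wid:.3%}")
# prints 2.667% / 2.200% for the prior art and 0.556% / 0.800% for the invention
```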
Referring to Table 1, the average map dimensions estimated by the invention over the 10 mapping experiments are far closer to the true values than those of the existing method, so the mapping accuracy of the invention is higher and erroneous estimates are reduced.
Referring to FIG. 4, FIG. 4(a) is a map created by the prior art and FIG. 4(b) is a map created by the present invention. The prior-art map is tilted in its overall appearance and the edge contour at the lower right corner is not clear enough and shows an obvious grainy texture, whereas the map produced by the invention shows no obvious tilting or overlapping and displays the real scene of the experimental site more accurately.
Referring to FIG. 5, FIG. 5(a) is a map created by the prior art and FIG. 5(b) is a map created by the present invention. In the larger and more complex environment of FIG. 5, the prior art consumes more computing resources and resamples too many times, so its map is tilted overall and its edges are blurred, whereas the invention presents the real scene of the experimental site more accurately.
The above description is only one specific example of the invention and does not constitute any limitation of the invention. It will be apparent to those skilled in the art that various modifications and changes in form and detail may be made without departing from the principles and structure of the invention, but such modifications and changes based on the idea of the invention still fall within the scope of the claims of the invention.

Claims (5)

1. A mobile robot mapping method based on a genetic filtering SLAM algorithm, characterized by comprising the following steps:
(1) Building a mobile robot mapping system:
The mobile robot mapping system comprises a mobile robot, located in the area to be mapped and equipped with a sensor suite, a mapping module and a motion module, and a remote upper computer, located in an office area and comprising a remote control module and a display interface; the sensor suite comprises a laser radar, an inertial measurement unit (IMU) and an odometer;
(2) Dividing the region to be mapped into grids:
Dividing the region to be mapped into C×R uniform grids Δ = {Δ_1, Δ_2, ..., Δ_c, ..., Δ_C}, with Δ_c = {Δ_{c,1}, Δ_{c,2}, ..., Δ_{c,r}, ..., Δ_{c,R}}, where C > 2, R > 2, Δ_c denotes the subset of R grids in row c, and Δ_{c,r} denotes the r-th grid in Δ_c;
(3) Initializing parameters:
Initializing the mapping time t, the mapping end time t_end, and the grid state information of Δ at time t as Odd_t(Δ); initializing the particle set input to the SLAM algorithm, which contains K particles, where each particle L_k at time t carries the pose information P_t^k of the mobile robot and its own grid state information; and letting t = 1;
(4) The sensor collects data:
The mobile robot starts to move under the control instructions sent by the remote upper computer. At time t the laser radar, the inertial measurement unit (IMU) and the odometer in the sensor suite respectively acquire the radar point cloud data z_t of I points in the surrounding environment, the acceleration information a_t of the mobile robot, the moving distance information d_t of the mobile robot and its angle θ_t with the x-axis direction, and upload z_t together with the trajectory information u_t = {a_t, d_t, θ_t} composed of a_t, d_t and θ_t to the mapping module, where I ≥ 300 and z_t^i denotes the radar data of the i-th point at time t;
(5) The mapping module updates the grid state information of each particle at time t:
(5a) The mapping module estimates the pose P_t^k of each particle L_k at time t from its pose information P_{t-1}^k at time t-1 and the mobile robot trajectory information u_{t-1};
(5b) The mapping module scan-matches the radar point cloud data z_{t-1} at time t-1 with the radar point cloud data z_t at time t to obtain the relative pose P_re between the two point clouds, and updates P_t^k through P_re, obtaining the particle set L'(t) whose K particles carry the updated poses P_t' = {P_t'^1, P_t'^2, ..., P_t'^k, ..., P_t'^K};
(5c) The mapping module applies the genetic crossover-and-mutation method to each particle in L'(t), performing crossover and then mutation to obtain the particle set L_cv(t); it then judges whether the particle evaluation number N_eff calculated from the weight set corresponding to L_cv(t) and the preset resampling particle-number threshold N_th satisfy N_eff < N_th; if so, L_cv(t) is resampled to obtain K result particles, otherwise the K particles in L_cv(t) are taken as result particles; the N_eff result particles with the largest weights are then taken as the effective particles;
(5d) The mapping module updates the grid state information of each effective particle at time t according to that particle's grid state information and pose information at time t-1 and its pose information at time t;
(6) Obtaining the mapping result of the mobile robot:
(6a) The upper computer determines, from the grid state information of each particle at time t, the grid state information Odd_t(Δ_{c,r}) of each grid Δ_{c,r} and, using the preset passable threshold σ and obstacle threshold τ, classifies each grid Δ_{c,r}: grids with Odd_t(Δ_{c,r}) ≤ σ, Odd_t(Δ_{c,r}) > τ and σ < Odd_t(Δ_{c,r}) ≤ τ are determined to be in the passable state, the obstacle state and the unknown state respectively, and the three states are painted in different colors to obtain the grid state diagram at time t;
(6b) Judging whether t > t_end holds; if so, the grid state diagram at time t_end is taken as the two-dimensional grid map of the mobile robot; otherwise, let t = t+1 and return to step (4).
2. The mobile robot mapping method based on the genetic filtering SLAM algorithm of claim 1, wherein in step (5a) the pose P_t^k of each particle L_k at time t is estimated by sampling it ("~", the "is subject to" symbol) from the motion distribution conditioned on P_{t-1}^k and u_{t-1}.
3. The mobile robot mapping method based on the genetic filtering SLAM algorithm of claim 1, wherein in step (5b) the mapping module scan-matches the radar point cloud data z_{t-1} at time t-1 with the radar point cloud data z_t at time t to obtain the relative pose P_re between the two point clouds and updates P_t^k through P_re, implemented by the following steps:
(5b1) For each point to be matched in z_{t-1}, the mapping module finds the two points in z_t closest to it and expresses the data of these two points as a vector;
(5b2) The mapping module converts the pose P_t^k contained in each particle into a rotation matrix and a translation matrix; according to these matrices it first rotates and then translates z_{t-1} to match it against z_t, driving the matching error, accumulated from the terms e_i^T e_i, below a set matching threshold δ, where R and T denote the rotation and translation operations applied to z_{t-1}, i.e. the relative pose P_re between the two point clouds, and e_i^T denotes the transpose of the vector e_i;
(5b3) The mapping module updates P_t^k through P_re to obtain the particle set L'(t) whose K particles carry the updated poses P_t' = {P_t'^1, P_t'^2, ..., P_t'^k, ..., P_t'^K}, wherein the update formula for P_t^k is:
P_t'^k = R·P_t^k + T.
4. The mobile robot mapping method based on the genetic filtering SLAM algorithm of claim 1, wherein the K result particles in step (5c) are obtained by the following steps:
(5c1) Based on the grid state information and the pose information P_t'^k of each particle at time t and on the radar point cloud data z_t at time t, the mapping module updates the weight of each particle and normalizes the updated weights (dividing each weight by the sum over all particles) to obtain the normalized weight set;
(5c2) The mapping module arranges the K normalized weights in descending order; the particles whose weights are smaller than the preset particle weight threshold W_th form the small-weight particle set L_l and the remaining particles form the large-weight particle set L_h; one particle x_l is then drawn at random from L_l and one particle x_h from L_h, and a crossover operation is performed on them to obtain the crossed particle set, wherein the crossover generates a new particle from x_l and x_h and α is the crossover parameter with α ∈ [0, 1];
(5c3) The mapping module mutates the particles produced by the crossover operation to obtain the particle set L_cv(t) after genetic crossover and mutation, and judges whether the particle evaluation number N_eff calculated from the weight set corresponding to L_cv(t) and the preset resampling particle-number threshold N_th satisfy N_eff < N_th; if so, all particles in L_cv(t) are resampled to obtain K result particles, otherwise all particles in L_cv(t) are taken as result particles; the weights of the result particles are then arranged in descending order and the first N_eff particles are taken as the effective particles, wherein in the mutation a new particle is generated from the crossed particle, p_l is a random variable independent of the particle weight and uniformly distributed with p_l ∈ [0, 1], and p_m is the mutation probability with p_m ∈ [0, 1].
5. The mobile robot mapping method based on the genetic filtering SLAM algorithm of claim 1, wherein in step (5d) the grid state information of each effective particle at time t is updated according to that particle's grid state information and pose information at time t-1 and its pose information at time t.
CN202211027811.1A 2022-08-25 2022-08-25 Mobile robot mapping method based on genetic filtering SLAM algorithm Active CN115388893B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211027811.1A CN115388893B (en) 2022-08-25 2022-08-25 Mobile robot mapping method based on genetic filtering SLAM algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211027811.1A CN115388893B (en) 2022-08-25 2022-08-25 Mobile robot mapping method based on genetic filtering SLAM algorithm

Publications (2)

Publication Number Publication Date
CN115388893A CN115388893A (en) 2022-11-25
CN115388893B true CN115388893B (en) 2024-05-14

Family

ID=84123069

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211027811.1A Active CN115388893B (en) 2022-08-25 2022-08-25 Mobile robot mapping method based on genetic filtering SLAM algorithm

Country Status (1)

Country Link
CN (1) CN115388893B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107246873A (en) * 2017-07-03 2017-10-13 哈尔滨工程大学 A kind of method of the mobile robot simultaneous localization and mapping based on improved particle filter
WO2021237667A1 (en) * 2020-05-29 2021-12-02 浙江大学 Dense height map construction method suitable for legged robot planning
CN112882056A (en) * 2021-01-15 2021-06-01 西安理工大学 Mobile robot synchronous positioning and map construction method based on laser radar
CN113050658A (en) * 2021-04-12 2021-06-29 西安科技大学 SLAM algorithm based on lion group algorithm optimization
CN114608585A (en) * 2022-04-01 2022-06-10 常州大学 Method and device for synchronous positioning and mapping of mobile robot

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RBPF algorithm combining annealing optimization and genetic resampling; Sun Yi; Zhang Xiaoxiao; Journal of Xi'an University of Science and Technology; 2020-03-30 (Issue 02); full text *

Also Published As

Publication number Publication date
CN115388893A (en) 2022-11-25

Similar Documents

Publication Publication Date Title
KR102292277B1 (en) LIDAR localization inferring solutions using 3D CNN networks in autonomous vehicles
CN111971574B (en) Deep learning based feature extraction for LIDAR localization of autonomous vehicles
CN109798896B (en) Indoor robot positioning and mapping method and device
CN111427370B (en) Sparse pose adjustment-based Gmapping mapping method for mobile robot
WO2020154973A1 (en) Lidar localization using rnn and lstm for temporal smoothness in autonomous driving vehicles
CN114387319B (en) Point cloud registration method, device, equipment and storage medium
CN112328715B (en) Visual positioning method, training method of related model, related device and equipment
Song et al. Online coverage and inspection planning for 3D modeling
CN112284376A (en) Mobile robot indoor positioning mapping method based on multi-sensor fusion
CN109885046B (en) Mobile robot positioning and accelerating method based on particle filtering
CN111190211A (en) GPS failure position prediction positioning method
Feng et al. Point cloud registration algorithm based on the grey wolf optimizer
Wu et al. A Non-rigid hierarchical discrete grid structure and its application to UAVs conflict detection and path planning
Puligandla et al. A multiresolution approach for large real-world camera placement optimization problems
CN115388893B (en) Mobile robot mapping method based on genetic filtering SLAM algorithm
CN113436223A (en) Point cloud data segmentation method and device, computer equipment and storage medium
CN111761583A (en) Intelligent robot motion positioning method and system
CN115239899B (en) Pose map generation method, high-precision map generation method and device
CN114925627B (en) Helicopter flow field numerical simulation system and method based on graphic processor
CN115655311A (en) Ackerman robot odometer calibration method based on scanning matching
CN116047495A (en) State transformation fusion filtering tracking method for three-coordinate radar
CN112947481B (en) Autonomous positioning control method for home service robot
Wang et al. Research on SLAM road sign observation based on particle filter
CN110849392A (en) Robot mileage counting data correction method and robot
Sun et al. Indoor Li-DAR 3D mapping algorithm with semantic-based registration and optimization

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant