CN114383611A - Multi-machine cooperative laser SLAM method, device and system for mobile robot - Google Patents


Info

Publication number
CN114383611A
Authority
CN
China
Prior art keywords
point cloud
map
module
mobile robots
server
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111680361.1A
Other languages
Chinese (zh)
Inventor
曹永军
李文威
梁佳楠
黄伟溪
姜卫岐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South China Robotics Innovation Research Institute
Institute of Intelligent Manufacturing of Guangdong Academy of Sciences
Original Assignee
South China Robotics Innovation Research Institute
Institute of Intelligent Manufacturing of Guangdong Academy of Sciences
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South China Robotics Innovation Research Institute and Institute of Intelligent Manufacturing of Guangdong Academy of Sciences
Priority to CN202111680361.1A
Publication of CN114383611A
Legal status: Pending

Classifications

    • G: PHYSICS
        • G01: MEASURING; TESTING
            • G01C 21/1652: Navigation by dead reckoning aboard the object being navigated, by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
            • G01S 17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
            • G01S 17/89: Lidar systems specially adapted for mapping or imaging
            • G01S 17/93: Lidar systems specially adapted for anti-collision purposes

Abstract

The invention discloses a multi-machine cooperative laser SLAM method, device, and system for mobile robots. The method comprises the following steps: the central dispatching server starts and controls a plurality of mobile robots; the central dispatching server issues a data acquisition instruction to the robots; the robots collect sensor data and compute rough poses by EKF (extended Kalman filter) fusion; the robots' lidars scan the environment, and the scans are denoised and down-sampled into N local point cloud maps; the cloud server processes the local point cloud maps and rough poses with the PL-ICP and Cartographer algorithms to obtain an optimal global point cloud map and accurate poses; and the optimal global point cloud map is converted into a global grid map used for localization. By establishing a map-information sharing mechanism among multiple mobile robots, the invention improves working efficiency and mapping precision.

Description

Multi-machine cooperative laser SLAM method, device and system for mobile robot
Technical Field
The invention relates to the technical field of SLAM, in particular to a method, a device and a system for multi-machine cooperative laser SLAM of a mobile robot.
Background
SLAM technology enables a mobile robot operating in an unfamiliar environment to simultaneously build a map of that environment and determine its own position within it. As mobile robots are deployed at scale in industry and the service sector, cooperative operation of multiple robots (obstacle avoidance, formation, cruising, and other group behaviors) is increasingly common, calling for a more efficient and more intelligent SLAM method and system. At present, mobile-robot SLAM generally runs in single-machine mode: front-end data acquisition is handled by onboard sensors such as a lidar, an odometer, and an IMU, while back-end data processing runs on an onboard industrial computer. Single-machine SLAM works well in small-scale scenes, but in multi-robot, large-scale applications the narrow field of view of a single robot leads to long computation times, poor mapping quality, and inaccurate localization. Moreover, single-machine SLAM instances are isolated from one another and lack an information-sharing and task-cooperation mechanism, so multiple robots each map the same working scene independently, wasting computing power and lowering efficiency.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides a method, a device and a system for multi-machine cooperative laser SLAM of a mobile robot.
In order to solve the above technical problem, an embodiment of the present invention provides a multi-machine cooperative laser SLAM method for a mobile robot, where the multi-machine cooperative laser SLAM method includes:
s1: calibrating the odometers and IMUs of the plurality of mobile robots, removing the motion distortion of the laser radar, and synchronizing the clocks of the mobile robot systems;
s2: the central dispatching server starts, then controls the plurality of mobile robots to begin operating;
s3: refreshing the mileage of the mobile robot by the central scheduling server;
s4: the central dispatching server issues data acquisition instructions to the mobile robots after a delay Δt;
s5: the odometers and IMUs of the plurality of mobile robots acquire data according to the data acquisition instruction, attaching timestamps and labels to the transmitted data;
s6: the plurality of mobile robots fuse the acquired data with an extended Kalman filter to obtain N rough poses; meanwhile,
s7: the radars of the plurality of mobile robots scan the environment according to the data acquisition instruction, generating respective laser contour point clouds;
s8: the plurality of mobile robots denoise and down-sample the laser contour point clouds to obtain N local point cloud maps, attaching timestamps and labels;
s9: transmitting the N local point cloud maps and the N rough poses to a central dispatching server through a 5G wireless network;
s10: the central scheduling server uploads the N local point cloud maps and the N rough poses to the cloud server;
s11: the cloud server performs fusion calculation on the N local point cloud maps to be spliced into a global point cloud map;
s12: the cloud server performs optimization processing on the global point cloud map and the N rough poses to obtain an optimal global point cloud map and N accurate poses;
s13: and converting the optimal global point cloud map into a global grid map, and marking the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix.
Preferably, the refreshing the mileage of the mobile robot by the central scheduling server includes:
and the central dispatching server refreshes the mileage of the mobile robot according to the accurate pose, and the milemeter returns to zero at the initial time.
Preferably, the data includes mileage, acceleration, and angular velocity.
Preferably, the denoising process includes removing isolated points and noise points; the downsampling is used for compressing the data volume of the laser point cloud.
Preferably, the fusion processing of the acquired data by the plurality of mobile robots based on the extended Kalman filter includes:
each mobile robot converting the travelled distance acquired by its odometer into the robot's overall displacement;
each mobile robot integrating the acceleration and angular velocity acquired by the IMU to compute its linear velocity and heading angle, then fusing linear velocity, displacement, angular velocity, and heading with the extended Kalman filter to obtain the robot's rough pose matrix.
Preferably, the cloud server performs fusion calculation on the N local point cloud maps according to a PL-ICP algorithm, including:
and in the cloud server, performing data matching by adopting a timestamp alignment mode, performing fusion calculation on a plurality of local point cloud maps by adopting a PL-ICP (PL-inductively coupled plasma) algorithm, and splicing into a global point cloud map.
Preferably, the cloud server performs optimization processing on the global point cloud map and the N rough poses, including:
and the cloud server optimizes the global point cloud map according to a Cartogrier algorithm to obtain an optimal global point cloud map, and meanwhile, the N rough pose matrixes generate accurate poses corresponding to the mobile robot according to the Cartogrier algorithm.
Preferably, the marking the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix comprises:
each grid in the grid map has a value range of [0,1], wherein 0 represents that the grid is not occupied, 1 represents that the grid is occupied, 1 represents that whether uncertainty is occupied or not, and values between 0 and 1 represent the probability that the grid is occupied.
Correspondingly, the invention also provides a multi-machine collaborative laser SLAM device of the mobile robot, which is used for realizing the multi-machine collaborative laser SLAM method and comprises a preparation module, a starting module, a data refreshing module, a command module, an acquisition module, a fusion calculation module, a down-sampling module, a scanning module, a denoising module, a first transmission module, a second transmission module, a map splicing module, a map optimization module and a positioning mapping module.
The preparation module is used for calibrating the odometers and IMUs of the plurality of mobile robots, removing the motion distortion of the laser radar and synchronizing the clocks of the mobile robot systems;
the starting module is used for starting the central dispatching server and controlling a plurality of mobile robots to start and operate;
The data refreshing module refreshes the mileage of the mobile robots via the central scheduling server;
the command module issues data acquisition instructions from the central scheduling server to the plurality of mobile robots after a delay Δt;
the acquisition module collects data with the odometers and IMUs of the plurality of mobile robots according to the data acquisition instructions, attaching timestamps and labels to the transmitted data;
the fusion calculation module fuses each mobile robot's acquired data with an extended Kalman filter to obtain N rough poses;
the scanning module scans the environment with the radars of the plurality of mobile robots according to the data acquisition instructions, generating respective laser contour point clouds;
the denoising module denoises each mobile robot's laser contour point cloud to obtain N local point cloud maps with attached timestamps and labels;
the down-sampling module down-samples the N local point cloud maps to compress the data volume of the laser point clouds;
the first transmission module is used for transmitting the N local point cloud maps and the N rough poses to a central scheduling server through a 5G wireless network;
the second transmission module uploads the N local point cloud maps and the N rough poses of the central scheduling server to the cloud server;
the map splicing module performs fusion calculation on the N local point cloud maps through the cloud server to splice the N local point cloud maps into a global point cloud map;
the map optimization module is used for optimizing the global point cloud map and the N rough poses based on the cloud server to obtain an optimal global point cloud map and N precise poses;
and the positioning and mapping module converts the optimal global point cloud map into a global grid map, and marks the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix.
Correspondingly, the invention also provides a multi-machine collaborative laser SLAM system of the mobile robot, the multi-machine collaborative laser SLAM system comprises a plurality of mobile robots, a 5G private network, a 5G switch, a central scheduling server and a plurality of cloud servers, the mobile robots are in wireless connection with the 5G switch based on the 5G private network, the 5G switch is connected with the central scheduling server based on a network cable, and the central scheduling server is connected with the cloud servers based on the network cable;
the multi-machine collaborative laser SLAM system is used for realizing the multi-machine collaborative laser SLAM method.
The embodiment of the invention provides a multi-machine cooperative laser SLAM method, device, and system for mobile robots. The system synchronously scans the surrounding environment with the lidars carried by a plurality of mobile robots and, using a 5G network as relay, fuses the point cloud data on cloud computing resources, so a lightweight global map of a large-scale scene can be generated quickly and the multiple robots can be located accurately, improving working efficiency. The multiple mobile robots are marked on the same grid map M_g and, with synchronously labeled information, jointly perform global mapping and localization, avoiding wasted computing power and raising working efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flow chart of a multi-machine cooperative laser SLAM method in an embodiment of the present invention.
Fig. 2 is a schematic data flow diagram of a multi-machine cooperative laser SLAM method in an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a multi-machine cooperative laser SLAM device in an embodiment of the present invention.
Fig. 4 is a schematic diagram of a hardware topology of a multi-machine cooperative laser SLAM system in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Fig. 1 shows a schematic flow chart of the multi-machine cooperative laser SLAM method in an embodiment of the present invention. SLAM stands for Simultaneous Localization And Mapping, i.e., simultaneously localizing and building a map. The method includes:
s1: calibrating the odometers and IMUs of the plurality of mobile robots, removing the motion distortion of the laser radar, and synchronizing the clocks of the mobile robot systems;
preparation before SLAM operation includes: calibrating odometers and IMUs of a plurality of mobile station robots, removing motion distortion of a laser radar, and synchronizing clocks of all mobile robot systems; and (4) enabling the odometer data of the plurality of mobile robots to be completely zero. The condition of the mobile robot is checked before starting, data of the robot is initialized, calculation is facilitated, and accuracy of data acquisition is guaranteed.
The plurality of mobile robots are identical wheeled mobile robots, each equipped with a 2D lidar, an odometer, an IMU, an EKF module, and a 5G wireless communication module. The IMU (Inertial Measurement Unit) is an inertial sensor used to collect the robot's acceleration and angular velocity; the odometer may be a visual, wheel, or laser odometer and collects the distance travelled by the robot; the EKF (Extended Kalman Filter) computes the robot's pose matrix.
S2: the central dispatching server starts, then controls the plurality of mobile robots to begin operating;
and starting the central dispatching server, and then synchronously starting the plurality of mobile robots by utilizing the central dispatching server to start mapping detection, so that the time of the plurality of mobile robots is synchronized, the mobile robots have identity, and data errors caused by time are reduced.
S3: refreshing the mileage of the mobile robot by the central scheduling server;
the central dispatching server is in 5G special network wireless connection with the 5G wireless communication modules of the plurality of mobile robots based on the 5G exchanger, and the central dispatching server of the SLAM refreshes the odometer data of the plurality of robots according to the accurate pose; at the beginning, the mileage data are all returned to zero, the central dispatching server refreshes the mileage data of a plurality of robots, the mobile robots move, and the mileage records displacement data.
The 5G private network is a local-area communication network with low latency, high reliability, and large bandwidth; it fully covers the robots' range of motion and transmits signals and data quickly, raising transmission rate and efficiency.
S4: the central dispatching server issues data acquisition instructions to the mobile robots after delaying the time delta t;
and the data acquisition instruction generated by the central dispatching server wirelessly transmits the data acquisition instruction to the plurality of mobile robots through the 5G exchanger after the time delay of delta t, the plurality of mobile robots acquire data according to the sampling frequency of 1/delta t, and the data comprise the driving mileage, the acceleration and the angular velocity.
S5: the odometer and the IMU of the plurality of mobile robots acquire data according to the data acquisition instruction, and add timestamps and marks to the acquired data transmission;
each mobile robot receives a data acquisition instruction transmitted by the central dispatching server based on the 5G wireless communication module, acquires the driving mileage, the acceleration and the angular velocity of the mobile robot according to the sampling frequency of 1/delta t, acquires the driving mileage of the mobile robot by the milemeter, and acquires the acceleration and the angular velocity of the mobile robot by the IMU. Each mobile robot distinguishes the collected data by the attached time stamp and label, and transmits the collected data to the EFK. The data of each mobile robot is distinguished by attaching the time stamp and the label, so that the mobile robot corresponding to the data is conveniently identified, the identification rate is increased, and the mobile robot identification system has good convenience.
S6: the method comprises the following steps that a plurality of mobile robots perform fusion processing on acquired data based on an extended Kalman filter to obtain N rough poses;
mobile robot driving with mileage meter collected by each mobile robotThe mileage is converted into the integral displacement of the robot through calculation; respectively integrating the acceleration and the angular velocity acquired by the IMU, calculating the linear velocity and the corner of each mobile robot, and fusing the linear velocity, the displacement, the angular velocity and the corner by using the EFK to obtain a rough pose matrix P 'of the mobile robot'i(x,y,θ)T(i ═ 1 … N), that is, several station moving robots obtain N coarse pose matrices.
S7: the method comprises the following steps that radars of a plurality of mobile robots scan the environment according to data acquisition instructions to generate respective laser contour point clouds;
while S5 is being performed, each mobile robot synchronously scans the surrounding environment based on the 2D lidar, generating respective laser profile point clouds.
S8: the method comprises the following steps that a plurality of mobile robots carry out denoising treatment and down sampling on laser contour point clouds to obtain N local point cloud maps and attach timestamps and marks;
denoising and downsampling the laser contour point cloud by the 2D laser radar of each mobile robot to generate a local map MiDenoising treatment is to remove isolated points and noise points, the down sampling is used for compressing the data volume of the laser point cloud, and each mobile robot is used for respective local map MiWith time stamps and labels, i.e. N partial maps M obtained by several mobile robotsi
Here for the local map MiThe additional timestamp and the label are added for distinguishing, so that the mobile robot corresponding to the data can be conveniently identified, the identification speed is accelerated, and the method has good convenience.
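The denoising (isolated-point removal) and down-sampling steps can be sketched as follows; the radius, neighbor-count, and voxel-size parameters are illustrative assumptions:

```python
import numpy as np

def remove_isolated(points, radius=0.3, min_neighbors=2):
    """Radius outlier removal: drop points with fewer than
    min_neighbors other points within radius. Brute-force O(n^2),
    for illustration only."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    neighbor_counts = (d < radius).sum(axis=1) - 1  # exclude self
    return points[neighbor_counts >= min_neighbors]

def voxel_downsample(points, voxel=0.1):
    """Keep one averaged point per occupied voxel to compress the cloud."""
    keys = np.floor(points / voxel).astype(int)
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    counts = np.bincount(inverse)
    out = np.zeros((inverse.max() + 1, points.shape[1]))
    for dim in range(points.shape[1]):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out
```

A production pipeline would use a spatial index (k-d tree) instead of the pairwise distance matrix, but the effect on the point cloud is the same.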
S9: transmitting the N local point cloud maps and the N rough poses to a central dispatching server through a 5G wireless network;
each mobile robot is wirelessly connected with a 5G exchanger through a 5G wireless communication module, the 5G exchanger is connected with a central dispatching server based on a network cable, and each mobile robot is used for locally generating a point cloud map MiAnd coarse pose P'iUploading to a central scheduling server, i.e. N local maps Mi(i-1 … N) and N coarse pose matrices P'i(i-1 … N) to a central dispatch serviceA device.
S10: the central scheduling server uploads the N local point cloud maps and the N rough poses to the cloud server;
the central scheduling server is connected with the plurality of cloud processors through network cables. The central dispatching server sends N local maps Mi(i-1 … N) and N coarse pose matrices P'i(i-1 … N) to several cloud servers, and at the same time, the central dispatch server sends N local maps Mi(i-1 … N) and N coarse pose matrices P'i(i-1 … N) to a database storing historical operating data of the mobile robot and a historical global map.
S11: the cloud server performs fusion calculation on the N local point cloud maps to be spliced into a global point cloud map;
in the cloud server, data matching is carried out in a timestamp alignment mode, and a PL-ICP algorithm is adopted to carry out data matching on a plurality of local point cloud maps MiPerforming fusion calculation to combine a global point cloud map Mc
PL-ICP stands for Point-to-Line Iterative Closest Point. The PL-ICP procedure here is: first initialize by storing the first radar frame and its timestamp; precompute cos and sin for each angle of the radar data to save computation; convert the radar data into the required format; then invoke PL-ICP for the fusion calculation.
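A compact 2D point-to-line ICP iteration in the spirit of the algorithm described above; this is an illustrative simplification (brute-force nearest-neighbor search, small-angle linearization), not Censi's full PL-ICP or the patent's implementation:

```python
import numpy as np

def pl_icp(source, target, iters=20):
    """Align a 2D source scan to a target scan. Each iteration matches
    every source point to the line through its two nearest target
    points and solves a linearized least-squares problem for the pose
    increment (dx, dy, dtheta)."""
    src = source.copy()
    pose = np.zeros(3)
    for _ in range(iters):
        A, b = [], []
        for p in src:
            d = np.linalg.norm(target - p, axis=1)
            i, j = np.argsort(d)[:2]              # two nearest target points
            t = target[j] - target[i]
            n = np.array([-t[1], t[0]])
            n = n / (np.linalg.norm(n) + 1e-12)   # unit line normal
            e = n @ (p - target[i])               # signed point-to-line error
            # Small-angle Jacobian of e w.r.t. (dx, dy, dtheta)
            A.append([n[0], n[1], n @ np.array([-p[1], p[0]])])
            b.append(-e)
        dx, dy, dth = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)[0]
        c, s = np.cos(dth), np.sin(dth)
        R = np.array([[c, -s], [s, c]])
        src = src @ R.T + np.array([dx, dy])      # apply the increment
        pose[:2] = R @ pose[:2] + [dx, dy]
        pose[2] += dth
    return pose, src
```

Minimizing point-to-line rather than point-to-point distance is what lets the method slide scan points along walls instead of snapping them to individual samples, which is why it suits indoor 2D lidar data.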
S12: the cloud server performs optimization processing on the global point cloud map and the N rough poses to obtain an optimal global point cloud map and N accurate poses;
the cloud server carries out global point cloud map M according to Cartogrier algorithmcOptimizing to obtain an optimal global point cloud map and N rough pose matrixes P'iOptimizing and generating N accurate pose matrixes P according to Cartogrrapher algorithmi(i-1 … N), coarse pose matrix P 'in the optimization process'iThe corresponding accurate pose matrix P can be obtained by matching with the corresponding driving mileage through a Cartogrer algorithmi
S13: converting the optimal global point cloud map into a global grid map, and marking the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix;
map M of global point cloudcConverting to a lightweight global grid probability map MgAnd on the grid map MgBased on the accurate position matrix PiMarking the position of the corresponding mobile robot, and an accurate pose matrix PiThe corresponding mobile robots are distinguished according to the attached time stamp and the label. Each grid value range in the map is [0,1]]Where 0 denotes that the grid is unoccupied, 1 denotes that the grid is occupied, 1 denotes that it is uncertain whether it is occupied, and values between 0 and 1 denote the probability that the grid is occupied. And then returning to S3 to repeat the processes from S3 to S13 continuously, carrying out a new round of global mapping and positioning, and continuously acquiring the moving position of the mobile robot until the grid map MgAnd if all the grids are occupied, the global mapping of the area is completed. After the global map building and the positioning of the plurality of mobile robots are completed, the plurality of mobile robots are all stopped, and the SLAM stops running. Map M with a plurality of mobile robots on same gridgAnd the mobile robots are marked, and a plurality of mobile robots synchronously mark information, build a map and position the whole situation together, and improve the working efficiency.
The embodiment of the invention provides a multi-machine cooperative laser SLAM method for mobile robots: the lidars carried by a plurality of mobile robots synchronously scan the surrounding environment and, with a 5G network as relay, the point cloud data are fused on cloud computing resources, so that a lightweight global map of a large-scale scene is generated quickly while the robots are continuously and accurately localized on it to complete global mapping, improving working efficiency. The multiple mobile robots are marked on the same grid map M_g and, with synchronously labeled information, jointly perform global mapping and localization, avoiding wasted computing power and raising working efficiency.
Example two
Fig. 2 is a schematic data flow diagram of the multi-machine cooperative laser SLAM method in an embodiment of the present invention. Each mobile robot scans the surrounding environment with its lidar to obtain a laser point cloud, which is denoised and down-sampled into a local point cloud map; meanwhile the robot obtains IMU data from the IMU and odometer data from the odometer, which the EKF processes into a rough pose. The PL-ICP algorithm then fuses the local point cloud maps and stitches them into a global point cloud map, which the Cartographer algorithm optimizes into the optimal global point cloud map; in the same step, each rough pose matrix is matched with its corresponding travelled mileage to obtain the accurate pose matrix. Finally, the global point cloud map is converted into a lightweight global grid probability map, on which each robot's position is marked from its accurate pose matrix, completing global mapping and localization. The method keeps the plurality of mobile robots moving and recording data to extend the global map until it is fully filled in and all robots are localized on it, which marks the completion of global mapping.
EXAMPLE III
Fig. 3 shows a schematic structural diagram of a multi-machine cooperative laser SLAM device in an embodiment of the present invention, where the multi-machine cooperative laser SLAM device includes a preparation module 301, a starting module 302, a data refreshing module 303, a command module 304, an acquisition module 305, a fusion calculation module 306, a scanning module 307, a denoising module 308, a first transmission module 309, a second transmission module 310, a down-sampling module, a map stitching module 311, a map optimization module 312, and a positioning map building module 313.
The preparation module 301 is used for calibrating odometers and IMUs of a plurality of mobile robots, removing motion distortion of a laser radar, and synchronizing clocks of all mobile robot systems;
the starting module 302 is used for starting the central scheduling server and controlling a plurality of mobile robots to start and operate;
the data refreshing module 303 is used for refreshing the mileage of the mobile robots through the central scheduling server;
the command module 304 is used for issuing, through the central scheduling server, a data acquisition instruction to the plurality of mobile robots after a delay Δt;
the acquisition module 305 is used for acquiring data through the odometers and IMUs of the plurality of mobile robots according to the data acquisition instruction, and attaching timestamps and marks to the acquired data for transmission;
the fusion calculation module 306 is used for performing, on the plurality of mobile robots, fusion processing on the acquired data based on an extended Kalman filter to obtain N rough poses;
the scanning module 307 is used for scanning the environment through the radars of the plurality of mobile robots according to the data acquisition instruction, generating respective laser contour point clouds;
the denoising module 308 is used for denoising the laser contour point clouds on the plurality of mobile robots to obtain N local point cloud maps with attached timestamps and marks;
the down-sampling module respectively performs down-sampling on the N local point cloud maps to compress the data volume of the laser point cloud;
the first transmission module 309 is configured to transmit the N local point cloud maps and the N rough poses to a central scheduling server through a 5G wireless network;
the second transmission module 310 uploads the N local point cloud maps and the N rough poses from the central scheduling server to the cloud server;
the map stitching module 311 performs fusion calculation on the N local point cloud maps through the cloud server to stitch the N local point cloud maps into a global point cloud map;
the map optimization module 312 performs optimization processing on the global point cloud map and the N rough poses based on the cloud server to obtain an optimal global point cloud map and N precise poses;
the positioning map building module 313 converts the optimal global point cloud map into a global grid map, and marks the positions of the plurality of mobile robots on the global grid map based on the accurate pose matrices.
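As one possible illustration of the down-sampling module's task of compressing the laser point cloud volume, a voxel-grid filter can be sketched as below. The function and voxel size are assumptions for the example; the embodiment does not specify the filter used.

```python
import numpy as np

def voxel_downsample(points, voxel_size=0.05):
    """Down-sample an (N, 2) laser point cloud by averaging all points
    that fall into the same voxel (grid cell). This is a standard way to
    compress point cloud data volume before transmission."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel key and average each group
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    n_voxels = inverse.max() + 1
    sums = np.zeros((n_voxels, points.shape[1]))
    counts = np.zeros(n_voxels)
    np.add.at(sums, inverse, points)   # sum points per voxel
    np.add.at(counts, inverse, 1)      # count points per voxel
    return sums / counts[:, None]      # one centroid per occupied voxel
```

With a 5 cm voxel, two points 1 cm apart collapse into one centroid, while a point in another cell is preserved unchanged.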
Example four
Fig. 4 shows a schematic diagram of the hardware topology of the multi-machine cooperative laser SLAM system in the embodiment of the invention. The multi-machine cooperative laser SLAM system of the mobile robots is used for realizing the multi-machine cooperative laser SLAM method of the mobile robots in the first embodiment. The system comprises a plurality of mobile robots, a 5G private network, a 5G switch, a central scheduling server, a database and a plurality of cloud servers. The mobile robots are wirelessly connected to the 5G switch via the 5G private network; the 5G switch is connected to the central scheduling server by network cable; the central scheduling server is connected to the plurality of cloud servers and to the database by network cable; and the database is used for storing historical operation data and historical global maps of the mobile robots. Specifically, the system acquires data from the plurality of robots, transmits the data over the 5G private network, performs scheduling control through the 5G switch, the central scheduling server and the database, performs cloud computing on the plurality of cloud servers, and completes the map construction from the acquired data. The multi-robot cooperative laser SLAM system synchronously scans the surrounding environment with the laser radars carried by the plurality of mobile robots, uses the 5G network as a relay, and fuses the point cloud data with cloud computing resources; since the 5G network can fully cover the motion range of the mobile robots, a lightweight global map of a large-range scene can be generated quickly and the accurate positioning of the multiple robots completed.
The plurality of mobile robots operate simultaneously within the same area. They are identical wheeled mobile robots, each equipped with a 2D laser radar, an odometer, an IMU, an extended Kalman filter and a 5G wireless communication module. The IMU (Inertial Measurement Unit) is an inertial sensor used to collect the acceleration and angular velocity of the mobile robot; the odometer, which may be a visual, wheeled or laser odometer, collects the driving mileage of the mobile robot; the extended Kalman filter computes the pose matrix of the mobile robot; and the 2D laser radar scans the surrounding environment to generate a local point cloud map. The mobile robots connect wirelessly to the 5G switch through their 5G wireless communication modules over the 5G private network, a local-area communication network with low latency, high reliability and large bandwidth. The 5G private network can fully cover the motion range of the mobile robots, thereby increasing the transmission rate and improving efficiency.
It should be noted that each mobile robot collects its driving mileage through the odometer and its acceleration and angular velocity through the IMU, and distinguishes the collected data by the attached timestamp and label. The mileage, acceleration and angular velocity are processed to obtain a rough pose matrix P′i for each mobile robot: the driving mileage collected by the odometer is converted into the overall displacement of the robot; the acceleration and angular velocity collected by the IMU are integrated respectively to calculate the linear velocity and heading angle of the robot; and the extended Kalman filter fuses the linear velocity, displacement, angular velocity and heading angle to generate the rough pose matrix P′i = (x, y, θ)T (i = 1…N). Each mobile robot scans the surrounding environment to generate its own laser contour point cloud, which is denoised and down-sampled to produce a local map Mi; each local map Mi carries a timestamp and label for distinction.
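The conversion of odometer mileage and IMU angular velocity into the rough pose P′i = (x, y, θ)T can be sketched as simple dead reckoning. This is a minimal illustration under assumed fixed-rate per-step increments; the embodiment's actual conversion and EKF fusion may differ.

```python
import numpy as np

def integrate_rough_pose(mileage_increments, angular_velocities, dt):
    """Dead-reckon a rough pose (x, y, theta) from per-step odometer
    mileage increments and IMU yaw rate. A sketch of the pre-fusion
    integration step; the EKF then refines this estimate."""
    x = y = theta = 0.0
    for ds, omega in zip(mileage_increments, angular_velocities):
        theta += omega * dt       # integrate yaw rate into heading
        x += ds * np.cos(theta)   # project mileage onto the heading
        y += ds * np.sin(theta)
    return np.array([x, y, theta])
```

Ten straight 0.1 m steps with zero yaw rate yield the pose (1.0, 0.0, 0.0), as expected for 1 m of straight-line travel.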
The cloud server is built on a Hadoop architecture and stores the PL-ICP algorithm and the Cartographer algorithm; using cloud computing resources, it fuses the plurality of local maps into a global map of the environment scene. In the cloud server 140, data are matched by timestamp alignment, and the PL-ICP algorithm performs fusion calculation on the plurality of local point cloud maps Mi to stitch them into a global point cloud map Mc. The cloud server then optimizes the global point cloud map Mc with the Cartographer algorithm to obtain the optimal global point cloud map, and the plurality of rough pose matrices P′i are optimized by the Cartographer algorithm to generate the accurate pose matrices Pi of the plurality of mobile robots. Finally, a lightweight global grid probability map Mg is established, on which the positions of the corresponding mobile robots are marked based on the accurate pose matrices Pi; each accurate pose matrix Pi is associated with its mobile robot by the attached timestamp and label, realizing synchronous mapping and positioning. The system is suitable for synchronous mapping and positioning in unknown large-range scenes.
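The timestamp-alignment matching performed in the cloud server before PL-ICP fusion can be sketched as nearest-timestamp grouping across the per-robot data streams. The function name and tolerance value are illustrative assumptions, not the embodiment's exact matching rule.

```python
def align_by_timestamp(streams, tolerance=0.05):
    """Given per-robot lists of (timestamp, payload) records, pick for
    each timestamp of robot 0 the nearest-in-time record from every other
    robot, keeping only groups whose time spread is within `tolerance`
    seconds. A sketch of the matching done before PL-ICP fusion."""
    matched = []
    for t0, p0 in streams[0]:
        group = [p0]
        ok = True
        for other in streams[1:]:
            # nearest record in time from this robot's stream
            t, p = min(other, key=lambda rec: abs(rec[0] - t0))
            if abs(t - t0) > tolerance:
                ok = False   # no sufficiently close record; skip this time
                break
            group.append(p)
        if ok:
            matched.append((t0, group))
    return matched
```

A group is formed only when every robot contributed data within the tolerance window, so stale local maps are never fused against fresh ones.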
PL-ICP stands for Point-to-Line Iterative Closest Point. Its processing flow is as follows: first, initialization, which assigns the first frame of radar data and the first frame time; the cosine and sine of each angle value corresponding to the radar data are then computed once, saving repeated computation; the radar data are then converted into the required format; finally, PL-ICP is called to perform the fusion calculation.
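The preprocessing steps named above, caching the cosine and sine of each radar beam angle and converting the radar data into the required (Cartesian) format before calling PL-ICP, can be sketched with a hypothetical helper. Parameter names follow common laser-scan conventions and are not taken from the embodiment.

```python
import numpy as np

def scan_to_points(ranges, angle_min, angle_increment, range_max=30.0):
    """Compute the cos/sin of each beam angle once (the saved computation
    mentioned in the text) and convert the polar radar ranges into
    Cartesian points, dropping invalid returns."""
    angles = angle_min + angle_increment * np.arange(len(ranges))
    cos_a, sin_a = np.cos(angles), np.sin(angles)  # cached per scan geometry
    r = np.asarray(ranges, dtype=float)
    valid = (r > 0.0) & (r < range_max) & np.isfinite(r)
    return np.column_stack((r[valid] * cos_a[valid],
                            r[valid] * sin_a[valid]))
```

A two-beam scan with ranges 1 m at 0 rad and 2 m at π/2 rad converts to the points (1, 0) and (0, 2) in the robot frame.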
The embodiment of the invention provides a multi-robot cooperative laser SLAM system for mobile robots, which uses the laser radars carried by a plurality of mobile robots to synchronously scan the surrounding environment, uses a 5G network as a relay, and fuses the point cloud data with cloud computing resources, so that a lightweight global map of a large-range scene can be generated quickly and the accurate positioning of the multiple robots completed, improving working efficiency. The plurality of mobile robots are marked on the same grid map Mg, and the plurality of mobile robots synchronize the marking information to jointly map and position the whole scene, which avoids wasted computing power and further improves working efficiency.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by a program instructing the relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include: read-only memory (ROM), random access memory (RAM), magnetic disks, optical disks, and the like.
The above embodiments of the present invention have been described in detail; specific examples are used herein to explain the principle and implementation of the invention, and the above description of the embodiments is only intended to help in understanding the method and its core idea. Meanwhile, a person skilled in the art may, following the idea of the present invention, vary the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present invention.

Claims (10)

1. A multi-machine cooperative laser SLAM method of a mobile robot is characterized by comprising the following steps:
s1: calibrating the odometers and IMUs of the plurality of mobile robots, removing the motion distortion of the laser radar, and synchronizing the clocks of the mobile robot systems;
s2: the central dispatching server starts and controls a plurality of mobile robots to start and operate;
s3: refreshing the mileage of the mobile robot by the central scheduling server;
s4: the central dispatching server issues a data acquisition instruction to the plurality of mobile robots after a delay Δt;
s5: the odometers and IMUs of the plurality of mobile robots acquire data according to the data acquisition instruction, and attach timestamps and marks to the acquired data for transmission;
s6: the plurality of mobile robots perform fusion processing on the acquired data based on an extended Kalman filter to obtain N rough poses; meanwhile,
s7: the radars of the plurality of mobile robots scan the environment according to the data acquisition instruction to generate respective laser contour point clouds;
s8: the plurality of mobile robots denoise and down-sample the laser contour point clouds to obtain N local point cloud maps with attached timestamps and marks;
s9: transmitting the N local point cloud maps and the N rough poses to a central dispatching server through a 5G wireless network;
s10: the central scheduling server uploads the N local point cloud maps and the N rough poses to the cloud server;
s11: the cloud server performs fusion calculation on the N local point cloud maps to be spliced into a global point cloud map;
s12: the cloud server performs optimization processing on the global point cloud map and the N rough poses to obtain an optimal global point cloud map and N accurate poses;
s13: and converting the optimal global point cloud map into a global grid map, and marking the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix.
2. The multi-machine cooperative laser SLAM method as in claim 1, wherein the central dispatch server refreshes mobile robot mileage comprising:
and the central dispatching server refreshes the mileage of the mobile robots according to the accurate poses, the odometer being reset to zero at the initial moment.
3. The multi-machine cooperative laser SLAM method as in claim 1, wherein the data comprises mileage, acceleration, and angular velocity.
4. The multi-machine cooperative laser SLAM method as described in claim 1, wherein said de-noising process comprises removing isolated points and noise points; the downsampling is used for compressing the data volume of the laser point cloud.
5. The multi-machine cooperative laser SLAM method as claimed in claim 1, wherein the plurality of mobile robots performing fusion processing on the acquired data based on the extended Kalman filter comprises:
each mobile robot converts the driving mileage acquired by the odometer into the overall displacement of the robot;
each mobile robot integrates the acceleration and angular velocity acquired by the IMU respectively to calculate the linear velocity and heading angle of the robot, and fuses the linear velocity, displacement, angular velocity and heading angle using the extended Kalman filter to obtain the rough pose matrix of the mobile robot.
6. The multi-machine cooperative laser SLAM method as claimed in claim 1, wherein said cloud server performs fusion calculation on N local point cloud maps according to PL-ICP algorithm, comprising:
and in the cloud server, data matching is performed by timestamp alignment, and the PL-ICP (Point-to-Line Iterative Closest Point) algorithm performs fusion calculation on the plurality of local point cloud maps to stitch them into a global point cloud map.
7. The multi-machine cooperative laser SLAM method as recited in claim 1, wherein the cloud server performs optimization processing on a global point cloud map and N rough poses, comprising:
and the cloud server optimizes the global point cloud map according to the Cartographer algorithm to obtain the optimal global point cloud map; meanwhile, the N rough pose matrices generate the accurate poses of the corresponding mobile robots according to the Cartographer algorithm.
8. The multi-machine cooperative laser SLAM method as described in claim 1, wherein said marking the locations of a number of mobile robots on a global grid map based on a precise pose matrix comprises:
each grid in the grid map has a value in the range [0, 1], where 0 indicates that the grid is not occupied, 1 indicates that the grid is occupied, and a value between 0 and 1 represents the probability that the grid is occupied, i.e. the uncertainty of its occupancy.
9. A multi-machine cooperative laser SLAM device of a mobile robot is characterized in that the multi-machine cooperative laser SLAM device is used for realizing the multi-machine cooperative laser SLAM method of any one of claims 1 to 8, and comprises a preparation module, a starting module, a data refreshing module, a command module, an acquisition module, a fusion calculation module, a down-sampling module, a scanning module, a denoising module, a first transmission module, a second transmission module, a map splicing module, a map optimization module and a positioning mapping module.
The preparation module is used for calibrating the odometers and IMUs of the plurality of mobile robots, removing the motion distortion of the laser radar and synchronizing the clocks of the mobile robot systems;
the starting module is used for starting the central dispatching server and controlling a plurality of mobile robots to start and operate;
the data refreshing module is used for refreshing the mileage of the mobile robots through the central scheduling server;
the command module is used for issuing, through the central scheduling server, a data acquisition instruction to the plurality of mobile robots after a delay Δt;
the acquisition module is used for acquiring data through the odometers and IMUs of the plurality of mobile robots according to the data acquisition instruction, and attaching timestamps and marks to the acquired data for transmission;
the fusion calculation module is used for performing, on the plurality of mobile robots, fusion processing on the acquired data based on an extended Kalman filter to obtain N rough poses;
the scanning module is used for scanning the environment through the radars of the plurality of mobile robots according to the data acquisition instruction, generating respective laser contour point clouds;
the denoising module is used for denoising the laser contour point clouds on the plurality of mobile robots to obtain N local point cloud maps with attached timestamps and marks;
the down-sampling module respectively performs down-sampling on the N local point cloud maps to compress the data volume of the laser point cloud;
the first transmission module is used for transmitting the N local point cloud maps and the N rough poses to a central scheduling server through a 5G wireless network;
the second transmission module uploads the N local point cloud maps and the N rough poses from the central scheduling server to the cloud server;
the map splicing module performs fusion calculation on the N local point cloud maps through the cloud server to splice the N local point cloud maps into a global point cloud map;
the map optimization module is used for optimizing the global point cloud map and the N rough poses based on the cloud server to obtain an optimal global point cloud map and N precise poses;
and the positioning and mapping module converts the optimal global point cloud map into a global grid map, and marks the positions of a plurality of mobile robots on the global grid map based on the accurate pose matrix.
10. A multi-machine collaborative laser SLAM system of mobile robots is characterized by comprising a plurality of mobile robots, a 5G private network, a 5G switch, a central scheduling server and a plurality of cloud servers, wherein the mobile robots are in wireless connection with the 5G switch based on the 5G private network, the 5G switch is connected with the central scheduling server based on a network cable, and the central scheduling server is connected with the cloud servers based on the network cable;
the multi-machine collaborative laser SLAM system is used for realizing the multi-machine collaborative laser SLAM method of any one of claims 1 to 8.
CN202111680361.1A 2021-12-30 2021-12-30 Multi-machine cooperative laser SLAM method, device and system for mobile robot Pending CN114383611A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111680361.1A CN114383611A (en) 2021-12-30 2021-12-30 Multi-machine cooperative laser SLAM method, device and system for mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111680361.1A CN114383611A (en) 2021-12-30 2021-12-30 Multi-machine cooperative laser SLAM method, device and system for mobile robot

Publications (1)

Publication Number Publication Date
CN114383611A true CN114383611A (en) 2022-04-22

Family

ID=81200410

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111680361.1A Pending CN114383611A (en) 2021-12-30 2021-12-30 Multi-machine cooperative laser SLAM method, device and system for mobile robot

Country Status (1)

Country Link
CN (1) CN114383611A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114993285A (en) * 2022-04-27 2022-09-02 大连理工大学 Two-dimensional laser radar mapping method based on four-wheel omnidirectional all-wheel-drive mobile robot
CN115797425A (en) * 2023-01-19 2023-03-14 中国科学技术大学 Laser global positioning method based on point cloud aerial view and rough-to-fine strategy

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013109325A (en) * 2011-11-22 2013-06-06 Korea Electronics Telecommun Map creation method using group of intelligent robots and device for the same
CN109084785A (en) * 2018-07-25 2018-12-25 吉林大学 More vehicle co-locateds and map constructing method, device, equipment and storage medium
CN109283927A (en) * 2018-09-05 2019-01-29 华南智能机器人创新研究院 A kind of multiple mobile robot's cooperative control method and system based on network-control
CN109579843A (en) * 2018-11-29 2019-04-05 浙江工业大学 Multirobot co-located and fusion under a kind of vacant lot multi-angle of view build drawing method
CN110605713A (en) * 2018-06-15 2019-12-24 科沃斯机器人股份有限公司 Robot positioning method, robot, and storage medium
CN111273892A (en) * 2020-02-13 2020-06-12 济南浪潮高新科技投资发展有限公司 Method for realizing intelligent robot based on cloud technology and edge calculation
US20200233436A1 (en) * 2017-02-16 2020-07-23 Indiana University Research And Technology Corporation Cloud based robotic control systems and methods
CN113379910A (en) * 2021-06-09 2021-09-10 山东大学 Mobile robot mine scene reconstruction method and system based on SLAM
CN113701742A (en) * 2021-08-24 2021-11-26 吕太之 Mobile robot SLAM method based on cloud and edge fusion calculation


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114993285A (en) * 2022-04-27 2022-09-02 大连理工大学 Two-dimensional laser radar mapping method based on four-wheel omnidirectional all-wheel-drive mobile robot
CN114993285B (en) * 2022-04-27 2024-04-05 大连理工大学 Two-dimensional laser radar mapping method based on four-wheel omni-directional full-drive mobile robot
CN115797425A (en) * 2023-01-19 2023-03-14 中国科学技术大学 Laser global positioning method based on point cloud aerial view and rough-to-fine strategy

Similar Documents

Publication Publication Date Title
CN111897332B (en) Semantic intelligent substation robot humanoid inspection operation method and system
CN109100730B (en) Multi-vehicle cooperative rapid map building method
EP3629057B1 (en) Method and apparatus for calibrating relative pose and medium
CN109341706B (en) Method for manufacturing multi-feature fusion map for unmanned vehicle
CN114383611A (en) Multi-machine cooperative laser SLAM method, device and system for mobile robot
CN113269878B (en) Multi-sensor-based mapping method and system
CN111260751B (en) Mapping method based on multi-sensor mobile robot
CN110751123B (en) Monocular vision inertial odometer system and method
CN114413887B (en) Sensor external parameter calibration method, device and medium
CN112414415B (en) High-precision point cloud map construction method
CN114459471A (en) Positioning information determination method and device, electronic equipment and storage medium
CN114485619A (en) Multi-robot positioning and navigation method and device based on air-ground cooperation
CN107941167B (en) Space scanning system based on unmanned aerial vehicle carrier and structured light scanning technology and working method thereof
CN115355901A (en) Multi-machine combined graph building method fusing dynamic target perception
CN111208526B (en) Multi-unmanned aerial vehicle cooperative positioning method based on laser radar and positioning vector matching
CN112447058B (en) Parking method, parking device, computer equipment and storage medium
CN111402702A (en) Map construction method, device and system
CN113311452B (en) Positioning method and system based on multiple sensors
CN111025364A (en) Machine vision positioning system and method based on satellite assistance
CN114966793B (en) Three-dimensional measurement system, method and GNSS system
CN116907469A (en) Synchronous positioning and mapping method and system for multi-mode data combined optimization
CN115372947A (en) Calibration method and calibration system for inertial navigation and laser radar of automatic driving vehicle
CN114137953A (en) Power inspection robot system based on three-dimensional laser radar and image building method
CN111563934B (en) Monocular vision odometer scale determination method and device
CN113093759A (en) Robot formation construction method and system based on multi-sensor information fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination