CN107567036B - SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment - Google Patents

Publication number: CN107567036B (application CN201710917049.7A)
Country: China
Inventors: 张承进, 王洪玲, 宋勇, 庞豹
Assignee: Shandong University (application filed by Shandong University)
Other versions: CN107567036A (Chinese)
Legal status: Active (granted)
Classification: Y02P 90/02 — total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract

The invention discloses a SLAM system and method based on a wireless self-organizing (ad hoc) local area network for a robot search and rescue environment. The system comprises a wireless ad hoc local area network formed by at least two follower robots and one leader robot. Each follower robot acts as a mobile network node in the wireless ad hoc local area network and is configured to transmit its own positioning information and the corresponding local map it has constructed to the leader robot. The leader robot acts as the mobile master network node in the wireless ad hoc local area network and is configured to receive the information transmitted by all mobile network nodes, fuse it with its own position information to form a real-time cooperative motion control strategy for the robots and a global map of the search and rescue environment, and transmit this real-time information to each mobile network node so that each node accurately executes its corresponding SLAM task.

Description

SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment
Technical Field
The invention belongs to the field of robot search and rescue, and particularly relates to a SLAM system and method based on a wireless self-organizing local area network for a robot search and rescue environment.
Background
SLAM (simultaneous localization and mapping) uses a mobile robot to build a map of the search and rescue environment while simultaneously determining the robot's own location within that map.
Because the robots localize and build maps in special search and rescue environments (such as coal mines or underground search and rescue sites), GPS (Global Positioning System) signals are unstable on site and may be unavailable altogether. In an environment lacking reliable network connectivity, the mobile robots cannot upload the constructed map information and the acquired search and rescue environment information to a remote control center. Each mobile robot therefore ends up as an isolated node that can only map its own local area; no global map of the search and rescue environment can be obtained, which ultimately reduces the efficiency and accuracy of robot search and rescue.
Disclosure of Invention
To overcome the shortcomings of the prior art, the first object of the invention is to provide a SLAM system based on a wireless ad hoc local area network for a robot search and rescue environment, which uses robots as mobile network nodes to form the wireless ad hoc local area network, ultimately obtains a global map of the search and rescue environment, and improves search and rescue efficiency and accuracy.
The invention provides two technical schemes for SLAM systems based on a wireless ad hoc local area network of a robot search and rescue environment. Wherein:
the first technical scheme of the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment is as follows:
the SLAM system comprises a wireless ad hoc local area network formed by at least two follower robots and one leader robot;
each follower robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the corresponding local map it has constructed to the leader robot;
the leader robot acts as the mobile master network node in the wireless ad hoc local area network and is configured to: receive the information transmitted by all mobile network nodes, fuse it with its own position information, form a real-time cooperative motion control strategy and a global map of the search and rescue environment, and transmit this real-time information to each mobile network node to control the node to accurately execute its corresponding SLAM task.
The second technical scheme of the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment is as follows:
the SLAM system comprises a wireless ad hoc local area network formed by at least two follower robots and one leader robot; the leader robot and each follower robot are preset with corresponding communication distance ranges;
each follower robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the corresponding local map it has constructed to the other follower robots within the preset communication distance range, so that the information is finally uploaded to the leader robot;
the leader robot acts as the mobile master network node in the wireless ad hoc local area network and is configured to: fuse the received information from all mobile network nodes with its own position information to form a real-time cooperative motion control strategy and a global map of the search and rescue environment, transmit them to the follower robots within the preset communication distance range and, finally, through the communication links among the follower robots, deliver them to the corresponding follower robots and control them to accurately execute their corresponding SLAM tasks.
In either of the above SLAM systems based on the wireless ad hoc local area network of the robot search and rescue environment, the leader robot is also connected to a remote control center.
The remote control center has a local area network access function and is connected to the wireless ad hoc local area network.
In either of the SLAM systems, each of the follower robots and the leader robot is provided with a wireless network communication module, and the leader robot and the follower robots communicate with each other through the wireless network communication modules.
In either of the SLAM systems, the wireless network communication module comprises a wireless network card, which is connected respectively to the wireless router and the wireless serial-port communication module.
In either of the SLAM systems, each of the follower robots and the leader robot is provided with a positioning module.
In either of the SLAM systems, the positioning module comprises a gyroscope, a sonar sensor and a laser rangefinder: the gyroscope determines the movement direction of the corresponding robot, the sonar sensor collects the positions of targets in the search and rescue environment, and the laser rangefinder measures the distance from the rangefinder to a target.
In either of the SLAM systems, each of the follower robots and the leader robot is provided with an image vision module, which collects target image information in the search and rescue environment.
The second object of the invention is to provide a working method for a SLAM system based on a wireless ad hoc local area network of a robot search and rescue environment.
The first working method comprises the following steps:
each follower robot serves as a mobile network node in the wireless ad hoc local area network and transmits its own positioning information and the corresponding local map it has constructed to the leader robot;
the leader robot serves as the mobile master control network node in the wireless ad hoc local area network, receives the information transmitted by all mobile network nodes, fuses it with its own position information to form a real-time cooperative motion control strategy and a global map of the search and rescue environment, and sends these to each mobile network node to control the node to accurately execute its corresponding SLAM task.
The second working method comprises the following steps:
each follower robot serves as a mobile network node in the wireless ad hoc local area network and transmits its own positioning information and the corresponding local map it has constructed to the other follower robots within the preset communication distance range, so that the information is finally uploaded to the leader robot;
the leader robot serves as the mobile master control network node in the wireless ad hoc local area network, fuses the received information from all mobile network nodes with its own position information to form a real-time cooperative motion control strategy and a global map of the search and rescue environment, transmits them to the follower robots within the preset communication distance range and, finally, through the communication links among the follower robots, delivers them to the corresponding follower robots and controls them to accurately execute their corresponding SLAM tasks.
Compared with the prior art, the invention has the following beneficial effects:
(1) In the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment, each follower robot acts as a mobile network node and the leader robot acts as the mobile master network node, together forming the wireless ad hoc local area network; the system autonomously explores the search and rescue site, outputs a globally updated map, and provides map information about the hazardous site to search and rescue team members.
(2) The invention can also construct the map quickly, accurately and robustly over the coverage of the explored search and rescue area through the cooperation of multiple mobile robots within the preset communication distance range. Using the mobile robots as the multiple sensor nodes of a flexibly deployed mobile wireless local area network allows the leader robot and the follower robots to execute SLAM cooperatively and autonomously.
(3) The leader robot in the SLAM system also communicates with a remote control center, which monitors and manages the running states of the leader robot and the follower robots in the search and rescue environment.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute an undue limitation to the application.
FIG. 1 is a schematic diagram of the first embodiment of a SLAM system based on a wireless self-organizing local area network in a robot search and rescue environment according to the present invention.
FIG. 2 is a schematic diagram of the second embodiment of a SLAM system based on a wireless self-organizing local area network in a robot search and rescue environment according to the present invention.
FIG. 3 is a schematic diagram of the third embodiment of a SLAM system based on a wireless self-organizing local area network in a robot search and rescue environment according to the present invention.
FIG. 4 is a schematic diagram of the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment according to the present invention.
FIG. 5 is a schematic diagram of the leader robot of the present invention.
FIG. 6 is a schematic diagram of a follower robot of the present invention.
FIG. 7 is a flow chart of the global map fusion process.
FIG. 8 is a process diagram of the SLAM algorithm for a search and rescue environment.
FIG. 9 shows the SLAM information map fused to form the global map.
Detailed Description
It should be noted that the following detailed description is illustrative and is intended to provide further explanation of the present application. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit example embodiments according to the present application. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise; furthermore, it is to be understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.
Example 1
FIG. 1 is a schematic diagram of an embodiment of a SLAM system based on a wireless self-organizing local area network in a robot search and rescue environment according to the present invention.
As shown in fig. 1, the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment of this embodiment comprises a wireless ad hoc local area network composed of at least two follower robots and one leader robot;
each follower robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the corresponding local map it has constructed to the leader robot;
the leader robot acts as the mobile master network node in the wireless ad hoc local area network and is configured to: receive the information transmitted by all mobile network nodes, fuse it with its own position information, form a real-time cooperative motion control strategy and a global map of the search and rescue environment, and transmit this real-time information to each mobile network node to control the node to accurately execute its corresponding SLAM task.
In this embodiment, the leader robot carries an industrial personal computer (IPC), such as a Pioneer LX IPC.
As shown in fig. 5, the IPC runs a mobile-robot control platform, a motion controller, a program running platform (Visual Studio, Matlab) and a software compiling environment (CMake). These are used to process the information acquired from the wireless ad hoc local area network, construct the local maps around the follower robots in the explored search and rescue environment, fuse them into an updated global map, locate the targets of the multiple mobile robots and the surrounding environment using the updated map, and further generate the cooperative motion control strategy of the multiple robots, output instructions, and control the whole mobile-robot wireless ad hoc network system to reliably and accurately execute SLAM tasks.
Wireless network communication modules are arranged on both the leader robot and the follower robots, through which the leader robot and the follower robots communicate with each other.
Each wireless network communication module comprises a wireless network card, which is connected respectively to the wireless router and the wireless serial-port communication module.
The wireless network card may be a Tp-link wireless network card; the wireless router may be a Tp-link wireless router, fixed to the leader robot by a base.
Both the leader robot and the follower robots are provided with positioning modules.
Each positioning module comprises a gyroscope, a sonar sensor and a laser rangefinder: the gyroscope determines the movement direction of the corresponding robot, the sonar sensor collects the positions of targets in the search and rescue environment, and the laser rangefinder measures the distance from the rangefinder to a target.
Image vision modules are arranged on both the leader robot and the follower robots and are used to collect target image information in the search and rescue environment.
Specifically, through the mobile-robot control platform, motion controller, program running platform (Visual Studio, Matlab) and software compiling environment (CMake), the IPC intelligently generates a deployment strategy for the wireless ad hoc sensor mobile nodes. The sensor mobile nodes are arranged in a search and rescue area A, and the output position coordinates and angles of the nodes are expressed by formula (1):

L_A = {A(x_i, y_i, θ_i), n}    (1)

where L_A represents the dynamic locations of the assigned mobile nodes; (x_i, y_i, θ_i) represents the coordinates and angle of the i-th node arrangement; and n represents the number of mobile nodes in the wireless sensor network.
The absolute trajectory error (ATE) of the robot exploration trajectory can be calculated from formula (2):

ATE = sqrt( (1/N) Σ_{i=1}^{N} [ (x_i − x̂_i)² + (y_i − ŷ_i)² ] )    (2)

where N represents the number of measured values; x_i, y_i are the Cartesian coordinates of the absolute position of the robot at step i; and x̂_i, ŷ_i represent the estimated pose of the robot at step i.
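The ATE in formula (2) can be sketched in a few lines; this assumes the common root-mean-square form of the absolute trajectory error, since the original rendering of the formula is partly illegible:

```python
import math

def absolute_trajectory_error(truth, estimate):
    """Root-mean-square Euclidean error between an absolute trajectory
    of (x_i, y_i) points and its estimate, per formula (2)."""
    assert len(truth) == len(estimate)
    n = len(truth)
    sq = sum((x - xh) ** 2 + (y - yh) ** 2
             for (x, y), (xh, yh) in zip(truth, estimate))
    return math.sqrt(sq / n)

truth = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
est = [(0.0, 0.3), (1.0, -0.3), (2.0, 0.3)]
print(absolute_trajectory_error(truth, est))  # ≈ 0.3 for a uniform 0.3 m offset
```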
In the wireless ad hoc local area network, the SLAM performance evaluation index K_Q is proportional to the covered area of the search and rescue environment and inversely proportional to the sum of the exploration distances of the robots, as expressed by formula (3):

K_Q = A / Σ_i l_i    (3)

where A represents the area of the search and rescue region and l_i represents the exploration distance of each robot.
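Formula (3) is a simple ratio of covered area to total travelled distance; a minimal sketch:

```python
def slam_quality(area, exploration_distances):
    """K_Q from formula (3): covered area of the search and rescue
    region divided by the total distance travelled by all robots."""
    return area / sum(exploration_distances)

# 120 m^2 covered by three robots travelling 10 + 15 + 5 = 30 m
print(slam_quality(120.0, [10.0, 15.0, 5.0]))  # 4.0
```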
The image vision module collects image information of targets in the surrounding environment and extracts image features of the targets using a least-squares ellipse curve-fitting algorithm, expressed by formula (4):

F(p, x, y) = x² + 3xy + 4y² + 5x + 6y + 7 = 0    (4)

where x represents the coordinate value along the x-axis; y represents the coordinate value along the y-axis; and p = {1, 3, 4, 5, 6, 7} represents the constant coefficients of the least-squares ellipse curve-fitting algorithm, which determine the image feature extraction radius.
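A generic least-squares conic fit of the form in formula (4) can be obtained from the null space of the point design matrix; the SVD-based algebraic fit below is illustrative, not necessarily the exact procedure used in the patent:

```python
import numpy as np

def fit_conic(points):
    """Least-squares fit of a conic a*x^2 + b*xy + c*y^2 + d*x + e*y + f = 0
    to 2-D points: the coefficient vector is the right singular vector of
    the design matrix with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    _, _, vt = np.linalg.svd(D)
    return vt[-1]  # coefficients (a, b, c, d, e, f), up to scale

# points on the unit circle x^2 + y^2 - 1 = 0
t = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
coeffs = fit_conic(np.column_stack([np.cos(t), np.sin(t)]))
# residual of the fitted conic at a fresh circle point should be ~0
px, py = np.cos(0.123), np.sin(0.123)
residual = coeffs @ np.array([px * px, px * py, py * py, px, py, 1.0])
print(abs(residual) < 1e-9)
```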
The motion controller of each follower robot receives control instructions from the leader robot through the Tp-link wireless router, the wireless RS232/485 module and the Tp-link wireless network card, and explores the designated search and rescue area. During exploration, the forward and backward sonars send and receive ultrasonic signals to calculate the distance between surrounding target positions and the robot; the gyroscope measures the rotation angle, from which the robot pose and the angle of a target relative to the robot are calculated, completing the positioning process, as shown in fig. 6.
The information obtained by each follower robot's mobile node is transmitted to the wireless ad hoc local area network through the wireless RS232/485 module and the Tp-link wireless router, so that the follower robot can communicate and exchange information with the other mobile robots in the network.
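The range-and-bearing positioning described above can be sketched as follows; the function name and the assumption that the sonar reports a bearing relative to the robot's forward axis are illustrative:

```python
import math

def target_global_position(robot_x, robot_y, robot_heading,
                           sonar_range, sonar_bearing):
    """Place a sonar echo in the global frame: the gyroscope gives the
    robot heading, the sonar gives range d and a bearing b relative to
    the robot's forward axis (angles in radians)."""
    angle = robot_heading + sonar_bearing
    return (robot_x + sonar_range * math.cos(angle),
            robot_y + sonar_range * math.sin(angle))

# robot at (1, 2) facing +y (pi/2 rad), target 3 m dead ahead -> (1, 5)
tx, ty = target_global_position(1.0, 2.0, math.pi / 2, 3.0, 0.0)
print(round(tx, 6), round(ty, 6))  # 1.0 5.0
```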
The process of the simultaneous localization and mapping algorithm for the mobile-robot search and rescue environment of the wireless ad hoc local area network SLAM system is as follows. Formulas (5)-(11) define the state variable sets:

S = {S_1, S_2, …, S_n}    (5)

R = {r_1, r_2, …, r_l}    (6)

where S represents the task set for global map construction; S_i represents the region sub-map construction task assigned to each mobile robot; R represents the multi-robot set; and r_j represents each mobile robot. These satisfy:

∪_i S_i = S,  S_i ∩ S_j = ∅ for i ≠ j (i, j = 1, 2, …, l).

The set S is divided into l independent subsets:

L = {s_1, s_2, …, s_l}    (7)

where s_i represents the local sub-map related to each robot r_i.
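The partition of S into disjoint subsets in formulas (5)-(7) can be illustrated with a simple round-robin assignment (the assignment rule itself is a placeholder; the patent does not specify one):

```python
def partition_tasks(tasks, n_robots):
    """Split the global mapping task set S into disjoint subsets
    s_1..s_l, one per robot, by round-robin assignment. Together the
    subsets cover S and no task is assigned twice."""
    return [tasks[i::n_robots] for i in range(n_robots)]

S = [f"S{i}" for i in range(1, 8)]  # 7 sub-map construction tasks
L = partition_tasks(S, 3)
print(L)  # [['S1', 'S4', 'S7'], ['S2', 'S5'], ['S3', 'S6']]

# the subsets are pairwise disjoint and their union is S
flat = [t for sub in L for t in sub]
print(sorted(flat) == sorted(S), len(set(flat)) == len(flat))
```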
The control strategy cost set of the invention is:

C_S = {c_s(1), c_s(2), …, c_s(n)}    (8)

where c_s(i), i = 1, 2, …, n, is the cost series of the set S.

c_m(s) = min_i c_s(i)    (9)

where c_m(s) represents the minimum control-strategy cost of robot r_m and c*_m(S) represents the optimal control strategy of robot r_m.

s_m = arg min_{s ∈ L} c_m(s)    (10)

where s_m ∈ L represents the optimal series set for SLAM task execution.
The robot observation state vector X is given by formula (11):

X = (x, y, θ, ẋ, ẏ)^T    (11)

where x represents the coordinate value along the x-axis; y represents the coordinate value along the y-axis; θ represents the azimuth angle from the x-axis; ẋ and ẏ represent the corresponding velocities; and X is the robot state observation vector defined in a Cartesian coordinate system.
The SLAM algorithm process uses the laser rangefinder and employs an integrated probability-distribution particle filter (distributed particle filter, DP filter) algorithm.

The position and azimuth of the k-th sub-map are expressed as:

M_k = {x_k, y_k, θ_k},  M_k ∈ M, k = 1, 2, …, l    (12)

where x_k and y_k are the Cartesian coordinate values of the k-th sub-map and θ_k is the azimuth of the sub-map.

The set of all sub-maps is represented as:

M = {M_1, M_2, …, M_l}    (13)

where M_k represents the coordinates of the k-th sub-map. The displacement between two consecutive adjacent sub-maps is represented as:

ξ_k = M_k − M_{k−1} = {Δx_{k,k−1}, Δy_{k,k−1}, Δθ_{k,k−1}}^T    (14)

where ξ_k represents the displacement between M_k and M_{k−1}; Δx_{k,k−1} and Δy_{k,k−1} represent the displacements along the x-axis and y-axis; and Δθ_{k,k−1} represents the angle between the two sub-maps.
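Formula (14) amounts to a componentwise pose difference; a minimal sketch, with the heading difference wrapped to [-pi, pi):

```python
import math

def submap_displacement(m_prev, m_curr):
    """xi_k = M_k - M_{k-1} from formula (14): displacement between two
    consecutive sub-map poses (x, y, theta), with the heading
    difference wrapped into [-pi, pi)."""
    dx = m_curr[0] - m_prev[0]
    dy = m_curr[1] - m_prev[1]
    dth = (m_curr[2] - m_prev[2] + math.pi) % (2 * math.pi) - math.pi
    return (dx, dy, dth)

xi = submap_displacement((0.0, 0.0, 3.0), (1.0, 2.0, -3.0))
print(xi)  # the raw heading jump of -6 rad wraps to about +0.283 rad
```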
From the minimum paired square-error distance of the point clouds, the vector of a global scan can be expressed as:

S_t = (x_t, y_t, θ_t)^T    (15)

where x_t and y_t are the coordinate values along the x-axis and y-axis directions, θ_t is the azimuth angle from the x-axis, and S_t is the Cartesian-coordinate vector of the global scan.

The displacement ΔS_t between consecutive laser scans is expressed as:

ΔS_t = (Δx_t, Δy_t, Δθ_t)^T    (16)

where Δx_t = x_10 − x_20, Δy_t = y_10 − y_20 and Δθ_t = θ_10 − θ_20; here x_10, x_20 are two adjacent scan values in the x-coordinate direction, y_10, y_20 are two adjacent scan values in the y-coordinate direction, and θ_10, θ_20 are the angles of the two consecutive scan values from the x-axis. Thus Δx_t and Δy_t represent the distances between two consecutive scan values along the x-axis and y-axis directions, and Δθ_t represents the angle between two consecutive scan values.
The local sensor observations Ω constitute a sub-map, where the algorithmic connection ψ is expressed as:

ψ = {i, j, Δ, C_Δ},  i, j ∈ Ω    (17)

where i, j are the indices connecting two observations, i.e., the i-th observation set is associated with the j-th observation set; Δ is the position-estimate state difference; and C_Δ represents the estimated covariance matrix.
The integrated probability-distribution particle filter (DP filter) estimates the exploration trajectory of the robot by selecting robot-position state particles, each with its own set of extended Kalman filters (EKFs), as shown in figs. 7 and 8. The positions of landmark points are estimated independently according to the following steps:

Step one: the robot exploration trajectory set r is expressed as:

r = [r_1, r_2, …, r_k],  r ∈ R    (18)

where r_i = (x_i, y_i, θ_i) represents the state vector of the robot at each time t_i; the motion trajectory of the robot is thus composed of a series of motion states. The position set s of the targets can be expressed as:

s = [l_1, l_2, …, l_N],  s ∈ L    (19)

Each robot's motion trajectory forms a sub-map, and l_j = (x_j, y_j) represents the location of a target; the target location is associated with the sub-map.
Step two: the measurement function for observation prediction. The filter position state vector is expressed as:

s_i = (l_x, l_y)^T    (20)

where s_i represents the position of the i-th target and Σ_i denotes its corresponding covariance matrix. The observed value z = d represents the distance d with covariance matrix C_z, and the predicted observation is:

P_i(s) = sqrt( (d_x − l_x)² + (d_y − l_y)² )    (21)
Step three: the weighting factor is updated for each state particle based on the observation distance. The error vector e_i and the corresponding covariance matrix C_{e_i} are updated according to formulas (22) and (23):

e_i = z − P_i(s)    (22)

C_{e_i} = ∇P_s Σ_i ∇P_s^T + C_z    (23)

where ∇P_s is the Jacobian matrix:

∇P_s = (∂P_i/∂l_x, ∂P_i/∂l_y)    (24)

where x = d_x − l_x, y = d_y − l_y, J is the Jacobian symbol, and the Jacobian function represents the optimal linear estimate of the distance from the robot to the target.

Setting r = sqrt(x² + y²), formula (24) can be expressed as:

∇P_s = −(x/r, y/r)    (25)
The state and corresponding covariance matrix of each state particle are then updated as follows:

s_{i+1} = s_i + K e_i    (26)

Σ_{i+1} = (I − K ∇P_s) Σ_i    (27)

where K is the Kalman gain

K = Σ_i ∇P_s^T C_{e_i}^{−1}    (28)

with C_{e_i} given by formula (23). In a particular robot motion detection system, the covariance matrix Σ_i is determined; in the derivation according to formula (21), for an observation target in the range z = d, the corresponding covariance matrix C_z is constant, i.e., C_z is a constant matrix.
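Steps two and three amount to a per-landmark EKF range update, executed once per state particle. The sketch below is a textbook reconstruction of the update in formulas (20)-(28), under the assumption that the illegible parts follow the standard EKF form; all names are illustrative:

```python
import numpy as np

def ekf_landmark_update(l_est, sigma, robot_xy, z, c_z):
    """One EKF range-observation update of a landmark position.
    l_est: landmark estimate (l_x, l_y); sigma: its 2x2 covariance;
    z: measured distance from robot_xy; c_z: range noise variance."""
    diff = np.asarray(l_est) - np.asarray(robot_xy)
    r = np.linalg.norm(diff)              # predicted range P_i(s)
    H = (diff / r).reshape(1, 2)          # Jacobian of range w.r.t. landmark
    e = np.array([[z - r]])               # innovation e_i = z - P_i(s)
    C_e = H @ sigma @ H.T + c_z           # innovation covariance, cf. (23)
    K = sigma @ H.T @ np.linalg.inv(C_e)  # Kalman gain, cf. (28)
    l_new = np.asarray(l_est) + (K @ e).ravel()  # cf. (26)
    sigma_new = (np.eye(2) - K @ H) @ sigma      # cf. (27)
    return l_new, sigma_new

l0 = (3.0, 0.5)  # initial landmark guess, robot at the origin
P0 = np.eye(2)
l1, P1 = ekf_landmark_update(l0, P0, robot_xy=(0.0, 0.0), z=4.0, c_z=0.01)
print(np.linalg.norm(l1) > np.linalg.norm(l0))  # pulled outward toward z = 4
print(np.trace(P1) < np.trace(P0))              # uncertainty shrinks
```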
The weight factor for each state particle update is calculated according to formula (29):

w_i = exp( −(1/2) e_i^T C_{e_i}^{−1} e_i ) / sqrt( |2π C_{e_i}| )    (29)

In the full-coverage closed-loop detection process of map building, the weight factors guide the random exploration trajectory of the robot. According to the proposed integrated probability-distribution particle filter algorithm, z = d is set within the wireless-network detection range of the multiple mobile robots. Letting n_d be the index set of z = d observations, the set of map data M_i stored by the leader robot can be expressed as:

M_i = Σ_{j ∈ n_d} m_j    (30)

where m_j represents the j-th position observation vector and M_i represents the sum of the position observation vectors.
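Formula (30) describes the leader robot's stored map as a collection of position observation vectors m_j; a minimal sketch treating the fusion as a duplicate-free union of the sub-map observations (the patent's actual fusion is weighted by the DP-filter particles):

```python
def fuse_global_map(submaps):
    """Merge per-robot sub-map observation lists into one global map,
    keeping the first copy of any observation seen by several robots
    and preserving arrival order."""
    global_map = []
    seen = set()
    for submap in submaps:
        for obs in submap:
            if obs not in seen:
                seen.add(obs)
                global_map.append(obs)
    return global_map

m1 = [(0, 0), (1, 2)]
m2 = [(1, 2), (3, 4)]  # (1, 2) was observed by both robots
print(fuse_global_map([m1, m2]))  # [(0, 0), (1, 2), (3, 4)]
```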
The fused result forms the SLAM information map of the global map, as shown in FIG. 9.
The working method of the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment of this embodiment of the invention comprises the following steps:
Step 1: each follower robot serves as a mobile network node in the wireless ad hoc local area network and transmits its own positioning information and the corresponding local map it has constructed to the leader robot.
Step 2: the leader robot serves as the mobile master control network node in the wireless ad hoc local area network, receives the information transmitted by all mobile network nodes, fuses it with its own position information to form a real-time cooperative motion control strategy and a global map of the search and rescue environment, and sends these to each mobile network node to control the node to accurately execute its corresponding SLAM task.
In the SLAM system of this embodiment, each follower robot acts as a mobile network node and the leader robot acts as the mobile master network node, together forming the wireless ad hoc local area network; the system autonomously explores the search and rescue site, outputs a globally updated map, and provides map information about the hazardous site to search and rescue team members.
Embodiment Two
Fig. 2 is a schematic diagram of a second embodiment of a SLAM system based on a wireless ad hoc local area network in a robot search and rescue environment according to the present invention.
As shown in fig. 2, on the basis of the SLAM system structure of the wireless ad hoc local area network based on the robot search and rescue environment according to the first embodiment, the master robot is further connected to a remote control center.
The remote control center has a local area network surfing function and is connected with the wireless self-organizing local area network.
The working principle of the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of this embodiment is the same as that of the first embodiment.
In the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of this embodiment, each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and the master robot serves as the mobile master control network node, together forming the wireless self-organizing local area network; the system independently explores the search and rescue site, outputs a globally updated map, and provides map information of the harsh site for search and rescue team members.
The dominant robot in the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of the embodiment is also communicated with a remote control center, and the remote control center is utilized to monitor and manage the running states of the dominant robot and the satellite robot in the search and rescue environment.
Embodiment Three
FIG. 3 is a schematic diagram of a third embodiment of a SLAM system based on a wireless ad hoc LAN in a robot search and rescue environment according to the present invention.
As shown in fig. 3, the SLAM system based on the wireless ad hoc local area network of the robot search and rescue environment of the present embodiment includes a wireless ad hoc local area network composed of at least two satellite robots and one dominant robot; the master robot and each slave robot are preset with corresponding communication distance ranges;
each satellite robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the master robot;
the dominant robot acts as a mobile master control network node in the wireless ad hoc local area network and is configured to: fuse the received information transmitted by all mobile network nodes with the position information of the master robot to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmit them to the satellite robots within its preset communication distance range, and finally relay them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute its corresponding SLAM task.
The working method of the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of the embodiment comprises the following steps:
step 1: each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and transmits its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the master robot;
step 2: the master robot, as the mobile master control network node in the wireless self-organizing local area network, fuses the received information transmitted by all mobile network nodes with its own position information to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmits them to the satellite robots within its preset communication distance range, and finally relays them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute its corresponding SLAM task.
In the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of this embodiment, each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and transmits its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the dominant robot; the dominant robot serves as the mobile master control network node in the wireless self-organizing local area network, fuses the received information transmitted by all mobile network nodes with its own position information to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmits them to the satellite robots within its preset communication distance range, and finally relays them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute its corresponding SLAM task.
According to the preset communication distance ranges, the system can also build the map quickly, accurately and robustly while the multiple mobile robots cooperatively cover the search and rescue exploration area. A positive effect is that the mobile robots serve as multiple sensor nodes of the wireless network and flexibly deploy a mobile wireless local area network, so that the master robot and the satellite robots autonomously and cooperatively execute SLAM.
Embodiment Four
FIG. 4 is a schematic diagram of a fourth embodiment of a SLAM system based on a wireless ad hoc local area network in a robot search and rescue environment according to the present invention.
As shown in fig. 4, on the basis of the SLAM system structure of the wireless ad hoc local area network based on the robot search and rescue environment in the third embodiment, the master robot is further connected to a remote control center.
The remote control center has a local area network surfing function and is connected with the wireless self-organizing local area network.
The working principle of the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of this embodiment is the same as that of the third embodiment.
In the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of this embodiment, each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and transmits its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the dominant robot; the dominant robot serves as the mobile master control network node in the wireless self-organizing local area network, fuses the received information transmitted by all mobile network nodes with its own position information to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmits them to the satellite robots within its preset communication distance range, and finally relays them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute its corresponding SLAM task.
According to the preset communication distance ranges, the system can also build the map quickly, accurately and robustly while the multiple mobile robots cooperatively cover the search and rescue exploration area. A positive effect is that the mobile robots serve as multiple sensor nodes of the wireless network and flexibly deploy a mobile wireless local area network, so that the master robot and the satellite robots autonomously and cooperatively execute SLAM.
The dominant robot in the SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment of the embodiment is also communicated with a remote control center, and the remote control center is utilized to monitor and manage the running states of the dominant robot and the satellite robot in the search and rescue environment.
While the foregoing description of the embodiments of the present invention has been presented in conjunction with the drawings, it should be understood that it is not intended to limit the scope of the invention, but rather, it is intended to cover all modifications or variations within the scope of the invention as defined by the claims of the present invention.
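The particle-filter landmark update described in the embodiments (the error, gain, covariance, and weight updates around formulas (22)-(29)) can be sketched numerically. The following is a minimal one-dimensional illustration under stated assumptions, not the patent's implementation: the patent's quantities are 2-D vectors and matrices, and all names here are hypothetical.

```python
import math

def update_particle(s_i, C_i, z, C_z, h):
    """One filter step for a single particle; h(s) predicts the range observation."""
    e_i = z - h(s_i)          # innovation (error vector), cf. formula (22)
    S = C_i + C_z             # innovation covariance (Jacobian = 1 in this 1-D sketch)
    K = C_i / S               # Kalman-style update gain
    s_next = s_i + K * e_i    # state update, cf. formula (26)
    C_next = (1.0 - K) * C_i  # covariance update, cf. formula (27)
    # Gaussian weight factor recomputed from the innovation, cf. formula (29)
    w = math.exp(-0.5 * e_i**2 / S) / math.sqrt(2.0 * math.pi * S)
    return s_next, C_next, w

s, C, w = update_particle(s_i=4.0, C_i=1.0, z=5.0, C_z=1.0, h=lambda s: s)
print(round(s, 3), round(C, 3))  # -> 4.5 0.5
```

With equal prior and observation covariances, the estimate moves halfway toward the observation and the uncertainty halves, which is the expected behavior of this class of update.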

Claims (16)

1. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment is characterized by comprising a wireless self-organizing local area network formed by at least two satellite robots and one leading robot;
each satellite robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the constructed corresponding local map information to the leading robot;
the dominant robot acts as a mobile master control network node in the wireless ad hoc local area network and is configured to: receive the information transmitted by all mobile network nodes, fuse it with the position information of the dominant robot to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, and send them to each mobile network node to control it to accurately execute the corresponding SLAM task;
the master robot is provided with an industrial personal computer, and a mobile robot control platform, a motion controller, a program running platform and a software compiling environment are arranged in the industrial personal computer;
in the simultaneous localization and mapping algorithm process of the mobile robots in the search and rescue environment of the wireless self-organizing local area network SLAM system, formulas (5)-(11) define the state variable sets:
S = {S_1, S_2, …, S_n}    formula (5)

R = {r_1, r_2, …, r_l}    formula (6)

wherein S represents the task set of global map construction;
S_i represents the region sub-map construction task allocated to each mobile robot;
R represents the multi-robot set;
r_j represents each mobile robot;
wherein ∪_{i=1}^{l} S_i = S and S_i ∩ S_j = φ, i ≠ j, (i, j = 1, 2, …, l);
set S is divided into l independent subsets:

L = {s_1, s_2, …, s_l}    formula (7)

wherein s_i represents the local sub-map associated with each robot r_i;
the control strategy cost set is:

C_S = {c_s(1), c_s(2), …, c_s(n)}    formula (8)

wherein c_s(i) is the cost series of set S, i = 1, 2, …, n;
c_m(s) = min_{i=1,…,n} c_s(i)    formula (9)

wherein c_m(s) represents the minimum cost of the control strategy of robot r_m;
c*_m(S) represents the control strategy of robot r_m:

c*_m(S) = arg min_{s_m ∈ L} c_m(s_m)    formula (10)

wherein s_m ∈ L represents the optimal series set of the SLAM task execution process;
the robot observation state vector X is given by formula (11):

X = (x, y, θ, ẋ, ẏ)^T    formula (11)

wherein x represents the coordinate value along the x-axis direction;
y represents the coordinate value along the y-axis direction;
θ represents the azimuth angle from the x-axis;
ẋ and ẏ represent the corresponding velocities;
X is the robot state observation vector defined in a Cartesian coordinate system;
the SLAM algorithm process uses a laser rangefinder and employs an integrated probability distribution particle filter (distributed particle filter, DP filter) algorithm:
the position and azimuth of the k-th sub-map are expressed as:

M_k = {x_k, y_k, θ_k}, M_k ∈ M, k = 1, 2, …, l    formula (12)

wherein x_k and y_k represent the Cartesian coordinate values of the k-th sub-map, and θ_k represents the azimuth angle of the sub-map;
the set of all sub-maps is represented as:

M = {M_1, M_2, …, M_l}    formula (13)

wherein M_k represents the coordinates of the k-th sub-map; the displacement between two consecutive adjacent sub-maps is represented as:

ξ_k = M_k − M_{k−1} = {Δx_{k,k−1}, Δy_{k,k−1}, Δθ_{k,k−1}}^T    formula (14)

wherein ξ_k represents the distance between M_k and M_{k−1};
Δx_{k,k−1} and Δy_{k,k−1} represent the displacements along the x-axis and y-axis, and Δθ_{k,k−1} represents the departure angle between the two sub-maps;
from the minimum square error distance of paired point clouds, the vector of the global scan can be expressed as:

S_t = (x_t, y_t, θ_t)^T    formula (15)

wherein x_t represents the coordinate value along the x-axis direction;
y_t represents the coordinate value along the y-axis direction;
θ_t represents the azimuth angle off the x-axis;
S_t is the Cartesian-coordinate vector of the global scan;
the displacement ΔS_t of continuous laser scanning is represented as:

ΔS_t = (Δx_t, Δy_t, Δθ_t)^T    formula (16)

wherein Δx_t = x_10 − x_20, Δy_t = y_10 − y_20, Δθ_t = θ_10 − θ_20, where x_10, x_20 are two adjacent scan values in the x-coordinate-axis direction, y_10, y_20 are two adjacent scan values in the y-coordinate-axis direction, and θ_10, θ_20 respectively represent the angles by which the two consecutive scan values deviate from the x-axis;
Δx_t represents the distance between two consecutive scan values in the x-axis direction;
Δy_t represents the distance between two consecutive scan values in the y-axis direction;
Δθ_t represents the angle of deviation between the two consecutive scan values;
the local sensor observations Ω constitute a sub-map, where ψ is the algorithmic connection expressed as:

ψ = {i, j, Δ, C_Δ}, i, j ∈ Ω    formula (17)

wherein i, j represent the indices connecting two observations, i.e., the i-th observation set is associated with the j-th observation set; Δ represents the position-estimate state difference, and C_Δ represents the estimated covariance matrix;
the positions of landmark points are independently estimated according to the following steps:
step one: the robot exploration trajectory set r is expressed as:

r = [r_1, r_2, …, r_k], r ∈ R    formula (18)

wherein r_i = (x_i, y_i, θ_i) represents the state vector of the robot at each time t_i, so the motion trajectory of the robot is composed of a series of motion states; the position set s of the targets can be expressed as:

s = [l_1, l_2, …, l_N], s ∈ L    formula (19)

each robot motion trajectory forms a sub-map, and l_j = (x_j, y_j) represents the position of a target, the target position being associated with a sub-map;
step two: the measurement function of observation prediction; the filter position state vector is expressed as:

s_i = (l_x, l_y)^T    formula (20)

wherein s_i represents the position of the i-th target, and its corresponding covariance matrix is denoted C_i; the observed value z = d has a covariance matrix C_z over the distance d, and its estimated value is given by formula (21);
step three: update the weight factor of each state particle based on the observation distance; the updated error vector e_i and its corresponding covariance matrix C_{e_i} are expressed by formulas (22) and (23):

e_i = z − P_i(s)    formula (22)

C_{e_i} = E[e_i e_i^T]    formula (23)
wherein P_i(s) = √(x² + y²) is the observation prediction function, and the Jacobian matrix is:

∇P_i(s) = ∂P_i(s)/∂s    formula (24)

where x = d_x − l_x, y = d_y − l_y; ∇ is the Jacobian symbol, and the Jacobian function represents the optimal linear estimated distance of the robot from the target;
setting ρ = √(x² + y²), formula (24) can then be expressed as:

∇P_i(s) = (x/ρ, y/ρ)    formula (25)
the position and the corresponding covariance matrix of each state particle are updated as follows:

s_{i+1} = s_i + K e_i    formula (26)

C_{i+1} = (I − K ∇P_i(s)) C_i    formula (27)

wherein K = C_i ∇P_i(s)^T C_{e_i}^{−1} is the update gain; formula (23) can then be obtained as:

C_{e_i} = ∇P_i(s) C_i ∇P_i(s)^T + C_z    formula (28)
in a particular robot motion detection system, the covariance matrix C_i is determined; in the derivation according to formula (21), one observation target is in the z = d range, and its corresponding covariance matrix C_z is a constant, i.e., C_z is a constant matrix;
the weight factor of each state particle update is calculated according to formula (29):

w_i = (2π|C_{e_i}|)^{−1/2} exp(−½ e_i^T C_{e_i}^{−1} e_i)    formula (29)
according to the proposed integrated probability distribution particle filter algorithm, z = d is set within the wireless network detection range of the multiple mobile robots, and n_d is the set of observations satisfying z = d; the set of map data M_i stored by the dominant robot can then be expressed as:

M_i = Σ_{j ∈ n_d} m_j    formula (30)

wherein m_j represents the j-th position observation vector, and Σ_{j ∈ n_d} m_j represents the sum of the position observation vectors; the fused result finally forms the SLAM information map of the global map.
2. The SLAM system of claim 1, wherein the master robot is further coupled to a remote control center.
3. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment as set forth in claim 1, wherein each of the slave robots and the master robot is provided with a wireless network communication module, and the master robot and the slave robots communicate with each other through the wireless network communication module.
4. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment as set forth in claim 3, wherein the wireless network communication module comprises a wireless network card, and the wireless network card is respectively connected with the wireless router and the wireless serial communication module.
5. The SLAM system of claim 1, wherein each of the satellite robots and the master robot is equipped with a positioning module.
6. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment as set forth in claim 5, wherein the positioning module comprises a gyroscope, a sonar sensor and a laser range finder, the gyroscope is used for positioning the movement direction of the corresponding robot, the sonar sensor is used for collecting the position of a target in the search and rescue environment, and the laser range finder is used for calculating distance information from the range finder to the target.
7. The SLAM system of claim 1, wherein each of the satellite robots and the master robot is provided with an image vision module for collecting target image information in the search and rescue environment.
8. A method of operating a SLAM system based on a wireless ad hoc local area network in a robotic search and rescue environment as claimed in claim 1, comprising:
each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and transmits self-positioning information and constructed corresponding local map information to the master robot;
the master robot, as the mobile master control network node in the wireless self-organizing local area network, receives the information transmitted by all mobile network nodes, fuses it with the position information of the master robot to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, and sends them to each mobile network node to control it to accurately execute its corresponding SLAM task.
9. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment is characterized by comprising a wireless self-organizing local area network formed by at least two satellite robots and one leading robot; the master robot and each satellite robot are preset with corresponding communication distance ranges;
each satellite robot acts as a mobile network node in the wireless ad hoc local area network and is configured to: transmit its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the master robot;
the dominant robot acts as a mobile master control network node in the wireless ad hoc local area network and is configured to: fuse the received information transmitted by all mobile network nodes with the position information of the master robot to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmit them to the satellite robots within its preset communication distance range, and finally relay them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute the corresponding SLAM task;
the master robot is provided with an industrial personal computer, and a mobile robot control platform, a motion controller, a program running platform and a software compiling environment are arranged in the industrial personal computer;
in the simultaneous localization and mapping algorithm process of the mobile robots in the search and rescue environment of the wireless self-organizing local area network SLAM system, formulas (5)-(11) define the state variable sets:
S = {S_1, S_2, …, S_n}    formula (5)

R = {r_1, r_2, …, r_l}    formula (6)

wherein S represents the task set of global map construction;
S_i represents the region sub-map construction task allocated to each mobile robot;
R represents the multi-robot set;
r_j represents each mobile robot;
wherein ∪_{i=1}^{l} S_i = S and S_i ∩ S_j = φ, i ≠ j, (i, j = 1, 2, …, l);
set S is divided into l independent subsets:

L = {s_1, s_2, …, s_l}    formula (7)

wherein s_i represents the local sub-map associated with each robot r_i;
the control strategy cost set is:

C_S = {c_s(1), c_s(2), …, c_s(n)}    formula (8)

wherein c_s(i) is the cost series of set S, i = 1, 2, …, n;
c_m(s) = min_{i=1,…,n} c_s(i)    formula (9)

wherein c_m(s) represents the minimum cost of the control strategy of robot r_m;
c*_m(S) represents the control strategy of robot r_m:

c*_m(S) = arg min_{s_m ∈ L} c_m(s_m)    formula (10)

wherein s_m ∈ L represents the optimal series set of the SLAM task execution process;
the robot observation state vector X is given by formula (11):

X = (x, y, θ, ẋ, ẏ)^T    formula (11)

wherein x represents the coordinate value along the x-axis direction;
y represents the coordinate value along the y-axis direction;
θ represents the azimuth angle from the x-axis;
ẋ and ẏ represent the corresponding velocities;
X is the robot state observation vector defined in a Cartesian coordinate system;
the SLAM algorithm process uses a laser rangefinder and employs an integrated probability distribution particle filter (distributed particle filter, DP filter) algorithm:
the position and azimuth of the k-th sub-map are expressed as:

M_k = {x_k, y_k, θ_k}, M_k ∈ M, k = 1, 2, …, l    formula (12)

wherein x_k and y_k represent the Cartesian coordinate values of the k-th sub-map, and θ_k represents the azimuth angle of the sub-map;
the set of all sub-maps is represented as:

M = {M_1, M_2, …, M_l}    formula (13)

wherein M_k represents the coordinates of the k-th sub-map; the displacement between two consecutive adjacent sub-maps is represented as:

ξ_k = M_k − M_{k−1} = {Δx_{k,k−1}, Δy_{k,k−1}, Δθ_{k,k−1}}^T    formula (14)

wherein ξ_k represents the distance between M_k and M_{k−1};
Δx_{k,k−1} and Δy_{k,k−1} represent the displacements along the x-axis and y-axis, and Δθ_{k,k−1} represents the departure angle between the two sub-maps;
from the minimum square error distance of paired point clouds, the vector of the global scan can be expressed as:

S_t = (x_t, y_t, θ_t)^T    formula (15)

wherein x_t represents the coordinate value along the x-axis direction;
y_t represents the coordinate value along the y-axis direction;
θ_t represents the azimuth angle off the x-axis;
S_t is the Cartesian-coordinate vector of the global scan;
the displacement ΔS_t of continuous laser scanning is represented as:

ΔS_t = (Δx_t, Δy_t, Δθ_t)^T    formula (16)

wherein Δx_t = x_10 − x_20, Δy_t = y_10 − y_20, Δθ_t = θ_10 − θ_20, where x_10, x_20 are two adjacent scan values in the x-coordinate-axis direction, y_10, y_20 are two adjacent scan values in the y-coordinate-axis direction, and θ_10, θ_20 respectively represent the angles by which the two consecutive scan values deviate from the x-axis;
Δx_t represents the distance between two consecutive scan values in the x-axis direction;
Δy_t represents the distance between two consecutive scan values in the y-axis direction;
Δθ_t represents the angle of deviation between the two consecutive scan values;
the local sensor observations Ω constitute a sub-map, where ψ is the algorithmic connection expressed as:

ψ = {i, j, Δ, C_Δ}, i, j ∈ Ω    formula (17)

wherein i, j represent the indices connecting two observations, i.e., the i-th observation set is associated with the j-th observation set; Δ represents the position-estimate state difference, and C_Δ represents the estimated covariance matrix;
the positions of landmark points are independently estimated according to the following steps:
step one: the robot exploration trajectory set r is expressed as:

r = [r_1, r_2, …, r_k], r ∈ R    formula (18)

wherein r_i = (x_i, y_i, θ_i) represents the state vector of the robot at each time t_i, so the motion trajectory of the robot is composed of a series of motion states; the position set s of the targets can be expressed as:

s = [l_1, l_2, …, l_N], s ∈ L    formula (19)

each robot motion trajectory forms a sub-map, and l_j = (x_j, y_j) represents the position of a target, the target position being associated with a sub-map;
step two: the measurement function of observation prediction; the filter position state vector is expressed as:

s_i = (l_x, l_y)^T    formula (20)

wherein s_i represents the position of the i-th target, and its corresponding covariance matrix is denoted C_i; the observed value z = d has a covariance matrix C_z over the distance d, and its estimated value is given by formula (21);
step three: update the weight factor of each state particle based on the observation distance; the updated error vector e_i and its corresponding covariance matrix C_{e_i} are expressed by formulas (22) and (23):

e_i = z − P_i(s)    formula (22)

C_{e_i} = E[e_i e_i^T]    formula (23)
wherein P_i(s) = √(x² + y²) is the observation prediction function, and the Jacobian matrix is:

∇P_i(s) = ∂P_i(s)/∂s    formula (24)

where x = d_x − l_x, y = d_y − l_y; ∇ is the Jacobian symbol, and the Jacobian function represents the optimal linear estimated distance of the robot from the target;
setting ρ = √(x² + y²), formula (24) can then be expressed as:

∇P_i(s) = (x/ρ, y/ρ)    formula (25)
the position and the corresponding covariance matrix of each state particle are updated as follows:

s_{i+1} = s_i + K e_i    formula (26)

C_{i+1} = (I − K ∇P_i(s)) C_i    formula (27)

wherein K = C_i ∇P_i(s)^T C_{e_i}^{−1} is the update gain; formula (23) can then be obtained as:

C_{e_i} = ∇P_i(s) C_i ∇P_i(s)^T + C_z    formula (28)
in a particular robot motion detection system, the covariance matrix C_i is determined; in the derivation according to formula (21), one observation target is in the z = d range, and its corresponding covariance matrix C_z is a constant, i.e., C_z is a constant matrix;
the weight factor of each state particle update is calculated according to formula (29):

w_i = (2π|C_{e_i}|)^{−1/2} exp(−½ e_i^T C_{e_i}^{−1} e_i)    formula (29)
according to the proposed integrated probability distribution particle filter algorithm, z = d is set within the wireless network detection range of the multiple mobile robots, and n_d is the set of observations satisfying z = d; the set of map data M_i stored by the dominant robot can then be expressed as:

M_i = Σ_{j ∈ n_d} m_j    formula (30)

wherein m_j represents the j-th position observation vector, and Σ_{j ∈ n_d} m_j represents the sum of the position observation vectors; the fused result finally forms the SLAM information map of the global map.
10. The SLAM system of claim 9, wherein the master robot is further coupled to a remote control center.
11. The SLAM system based on the wireless self-organizing local area network of the robot search and rescue environment as set forth in claim 9, wherein each of the slave robots and the master robot is provided with a wireless network communication module, and the master robot and the slave robots communicate with each other through the wireless network communication module.
12. The SLAM system of claim 11, wherein the wireless network communication module comprises a wireless network card, and the wireless network card is connected to the wireless router and the wireless serial communication module respectively.
13. The SLAM system of claim 9, wherein each of the satellite robots and the master robot is equipped with a positioning module.
14. The SLAM system of claim 13, wherein the positioning module comprises a gyroscope for positioning the motion direction of the corresponding robot, a sonar sensor for acquiring the position of the target in the search and rescue environment, and a laser rangefinder for calculating the range information of the rangefinder to the target.
15. The SLAM system of claim 9, wherein each of the satellite robots and the master robot is provided with an image vision module for collecting target image information in the search and rescue environment.
16. A method of operating a SLAM system based on a wireless ad hoc local area network in a robotic search and rescue environment as claimed in claim 9, comprising:
each satellite robot serves as a mobile network node in the wireless self-organizing local area network, and transmits its own positioning information and the constructed corresponding local map information to other satellite robots within its preset communication distance range, until the information is finally uploaded to the master robot;
the master robot, as the mobile master control network node in the wireless self-organizing local area network, fuses the received information transmitted by all mobile network nodes with its own position information to form a real-time robot cooperative motion control strategy and a global map of the search and rescue environment, transmits them to the satellite robots within its preset communication distance range, and finally relays them to the corresponding satellite robots through the communication links among the satellite robots, controlling each to accurately execute the corresponding SLAM task.
CN201710917049.7A 2017-09-30 2017-09-30 SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment Active CN107567036B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710917049.7A CN107567036B (en) 2017-09-30 2017-09-30 SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710917049.7A CN107567036B (en) 2017-09-30 2017-09-30 SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment

Publications (2)

Publication Number Publication Date
CN107567036A CN107567036A (en) 2018-01-09
CN107567036B true CN107567036B (en) 2023-07-04

Family

ID=60984852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710917049.7A Active CN107567036B (en) 2017-09-30 2017-09-30 SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment

Country Status (1)

Country Link
CN (1) CN107567036B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108507578B (en) * 2018-04-03 2021-04-30 珠海市一微半导体有限公司 Navigation method of robot
CN110087220A (en) * 2019-05-29 2019-08-02 上海驰盈机电自动化技术有限公司 A kind of Communication of Muti-robot System and tele-control system
CN110658833B (en) * 2019-09-18 2022-06-14 沈阳航空航天大学 Multi-AUV real-time rescue task allocation algorithm in underwater environment
CN111556593B (en) * 2020-04-29 2021-02-26 深圳市迩立信息科技有限公司 Ad hoc network terminal communication system

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104184781A (en) * 2013-05-28 2014-12-03 东北大学 Unknown environment exploration-oriented mobile robot self-deploying sensing network
CN106679661A (en) * 2017-03-24 2017-05-17 山东大学 Simultaneous localization and mapping system and method assisted by search and rescue robot arms

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101739996B1 (en) * 2010-11-03 2017-05-25 삼성전자주식회사 Moving robot and simultaneous localization and map-buliding method thereof
US10062010B2 (en) * 2015-06-26 2018-08-28 Intel Corporation System for building a map and subsequent localization
US20170021497A1 (en) * 2015-07-24 2017-01-26 Brandon Tseng Collaborative human-robot swarm


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Björn Gernert et al., "An Interdisciplinary Approach to Autonomous Team-based Exploration in Disaster Scenarios", 2014 IEEE International Symposium on Safety, Security, and Rescue Robotics, 2015, main text, chapters 3-4. *
Robert Reid et al., "Cooperative Multi-Robot Navigation, Exploration, Mapping and Object Detection with ROS", 2013 IEEE Intelligent Vehicles Symposium (IV), 2013, main text, chapters 1-4. *

Also Published As

Publication number Publication date
CN107567036A (en) 2018-01-09

Similar Documents

Publication Publication Date Title
CN107567036B (en) SLAM system and method based on wireless self-organizing local area network of robot search and rescue environment
Nguyen et al. Integrated uwb-vision approach for autonomous docking of uavs in gps-denied environments
US9563528B2 (en) Mobile apparatus and localization method thereof
Langelaan State estimation for autonomous flight in cluttered environments
CN106406338A (en) Omnidirectional mobile robot autonomous navigation apparatus and method based on laser range finder
CN111123925A (en) Mobile robot navigation system and method
Liu et al. A survey of computer vision applied in aerial robotic vehicles
Xianjia et al. Cooperative UWB-based localization for outdoors positioning and navigation of UAVs aided by ground robots
CN111982114A (en) Rescue robot for estimating three-dimensional pose by adopting IMU data fusion
Güler et al. Infrastructure-free multi-robot localization with ultrawideband sensors
CN112925000B (en) Vehicle positioning method in tunnel environment based on visible light communication and inertial navigation
Liu et al. Vision aided unmanned aerial vehicle autonomy: An overview
Sohn et al. Localization system for mobile robot using wireless communication with IR landmark
Güler et al. Real time onboard ultrawideband localization scheme for an autonomous two-robot system
US20230213946A1 (en) Vehicle Navigation Positioning Method and Apparatus, and Base Station, System and Readable Storage Medium
Miraglia et al. Comparison of two sensor data fusion methods in a tightly coupled UWB/IMU 3-D localization system
Cantelli et al. UAV/UGV cooperation to improve navigation capabilities of a mobile robot in unstructured environments
KR20110035258A (en) Device for control of moving robot, moving robot system having the same and method for control of moving robot
Gao et al. Localization of mobile robot based on multi-sensor fusion
Xiang et al. Localization and mapping algorithm for the indoor mobile robot based on LIDAR
Bergantin et al. Estimation of the distance from a surface based on local optic flow divergence
Zali et al. Localization of an indoor mobile robot using decentralized data fusion
Agarwal et al. Monocular vision based navigation and localisation in indoor environments
CN114415655A (en) Inspection robot navigation control method based on improved SLAM
Shu et al. An imu/sonar-based extended kalman filter for mini-uav localization in indoor environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant