CN113701742B - Mobile robot SLAM method based on cloud and edge fusion calculation - Google Patents

Mobile robot SLAM method based on cloud and edge fusion calculation

Info

Publication number
CN113701742B
CN113701742B (application CN202110974304.8A)
Authority
CN
China
Prior art keywords
cloud
mobile robot
pose
slam
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110974304.8A
Other languages
Chinese (zh)
Other versions
CN113701742A (en)
Inventor
吕太之
张娟
何陈勇
游学军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Shenye Intelligent Technology Co ltd
Original Assignee
Jiangsu Maritime Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Maritime Institute
Priority to CN202110974304.8A
Publication of CN113701742A
Application granted
Publication of CN113701742B
Legal status: Active


Classifications

    • G01C21/005: Navigation; navigational instruments, with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/1652: Dead reckoning by integrating acceleration or speed (inertial navigation), combined with non-inertial navigation instruments having ranging devices, e.g. LIDAR or RADAR
    • G01C21/20: Instruments for performing navigational calculations
    • G01C21/3841: Creation or updating of map data from two or more sources, e.g. probe vehicles
    • G01C21/3848: Creation or updating of map data from both position sensors and additional sensors
    • G01C21/3893: Transmission of map data from distributed sources, e.g. from roadside stations
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89: Lidar systems specially adapted for mapping or imaging
    • G06F17/15: Correlation function computation including computation of convolution operations
    • G06F17/16: Matrix or vector computation, e.g. matrix-matrix or matrix-vector multiplication, matrix factorization
    • G06F17/18: Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis


Abstract

The invention discloses a mobile robot SLAM method based on cloud-edge fusion computing. The method comprises: data acquisition and preprocessing; an edge SLAM computing module that estimates the robot pose and local map information from the preprocessed data with a square-root unscented Kalman filter (SR-UKF); a cloud that, in parallel, performs loop detection, distributed parallel SLAM computation based on the particle filter SR-UPF, and correction of prior knowledge on the preprocessed data, and finally returns the pose of the particle with the highest posterior probability to the edge as the robot pose and local map at the current moment; fusion of the edge and cloud results according to their posterior probability densities; storage of the fused robot pose and local map information in the edge local-map storage module for controlling the mobile robot; and cloud particle resampling and global-map storage. By exploiting cloud-edge fusion, the invention compensates for the limited onboard computing capacity and excessive computational load in the SLAM process of a mobile robot.

Description

Mobile robot SLAM method based on cloud and edge fusion calculation
Technical Field
The invention relates to the technical field of autonomous navigation of mobile robots, and in particular to a mobile robot SLAM method based on cloud-edge fusion computing.
Background
Mobile robots are widely used in military, aerospace, deep-sea, medical, disaster-relief, service and other fields, and the requirements on the accuracy and reliability of autonomous navigation keep rising. SLAM (Simultaneous Localization and Mapping) is the key to autonomous navigation of mobile robots because of its important theoretical and practical value. The idea of SLAM is the following: a robot starts from an unknown place in an unknown environment, localizes its own position and attitude through repeatedly observed landmarks (such as walls, corners and pillars) during motion, and then builds a map incrementally from its own position, thereby achieving simultaneous localization and mapping.
Simultaneous localization and mapping (SLAM) is the key problem a mobile robot must solve to achieve autonomous navigation. It is a typical computation-intensive task and places high demands on onboard equipment. Traditional SLAM methods rely entirely on the computing resources of the edge for real-time solving, with low execution speed and low accuracy. The invention constructs a real-time SLAM computing framework through cloud-edge fusion: the cloud improves SLAM accuracy by solving the global map in a distributed, parallel fashion, while the edge builds a local map through fast SLAM computation to meet the real-time requirement. Cloud-edge fusion improves SLAM estimation accuracy and efficiency and enhances the environmental adaptability of the robot: the edge obtains a result by fast computation, the cloud performs the complex operations that guarantee accuracy and global consistency, and the edge therefore does not need excessive computing and storage capacity. The method thus has urgent demand and broad application prospects in the field of autonomous navigation.
Disclosure of Invention
1. Technical problems to be solved:
In view of the above technical problems, the invention provides a mobile robot SLAM system and method based on cloud-edge fusion computing, which compensate for the limited onboard capacity and excessive load in the SLAM process of a mobile robot, guarantee the real-time performance of SLAM, and improve the accuracy and efficiency of mobile robot SLAM.
2. The technical scheme is as follows:
A mobile robot SLAM method based on cloud-edge fusion computing is characterized in that it comprises the following steps:
Step one: data acquisition. A lidar and an inertial navigation sensor are mounted on the mobile robot; the collected lidar data and inertial navigation data are preprocessed and then sent simultaneously to the cloud and the edge for SLAM computation.
Step two: edge SLAM processing. The edge SLAM computing module estimates the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF).
Step three: cloud SLAM processing. The cloud performs, in parallel on the received preprocessed data, loop detection, distributed parallel SLAM computation based on the particle filter SR-UPF, and correction of prior knowledge; finally, the pose of the particle with the highest posterior probability is returned to the edge as the robot pose and local map.
Step four: cloud-edge data fusion. Compute the posterior probability of the edge SLAM result and of the cloud SLAM result, and fuse the edge and cloud results according to their posterior probability densities; store the fused robot pose and local map information in the edge local-map storage module for controlling the mobile robot.
Step five: cloud particle resampling and global-map storage. The cloud distributed computing module computes the effective particle number in the particle-filter SLAM computation and judges whether particle resampling is needed; if resampling is needed, it is carried out and the global map, the sensor information and all particle information are stored in the cloud global-map distributed database. The judgment is made as follows: if the effective sample size is smaller than a preset threshold T, genetic resampling with roulette-wheel selection and fast MH mutation is applied; if the effective sample size is greater than or equal to the preset threshold T, the particles are not resampled, and the global map, the sensor information and all particle information are stored directly in the cloud global-map distributed database.
Further, the inertial navigation data u_t comprises the motion speed v_t and the acceleration α_t at the center of the rear axle of the mobile robot, and the motion model of the mobile robot at time t is g(x_{t-1}, u_t), where x_{t-1} denotes the pose at time t-1. The lidar sensor observes landmark information in the environment and uses it to correct the estimate of the system state; the landmark information comprises the distance l_t(m_j) and the bearing β_t(m_j) of each landmark. The observation model of the mobile robot at time t is:
z_t(m_j) = [l_t(m_j), β_t(m_j)]^T = h(x_t, m_j) + δ (1-1)
where δ is the observation error. The bearing β_t ranges over [0, π] rad, and the bearing of a landmark directly in front of the mobile robot is preset to π/2 rad. The map consists of a set of environmental landmarks m; the j-th landmark is represented by its position in the global coordinate system, m_j = (m_{j,x}, m_{j,y}).
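As a concrete illustration of (1-1), the noise-free part h(x_t, m_j) of the range-bearing observation can be sketched as follows in Python; the pose layout (x, y, θ) and the bearing wrap are assumptions of this example, not details fixed by the patent.

```python
import numpy as np

def observe_landmark(pose, landmark):
    """Range-bearing observation h(x_t, m_j) of equation (1-1), without the
    noise term delta. The pose layout (x, y, theta) is an assumption of
    this sketch."""
    x, y, theta = pose
    dx, dy = landmark[0] - x, landmark[1] - y
    distance = np.hypot(dx, dy)                       # l_t(m_j)
    # Bearing relative to the robot heading, shifted so that a landmark
    # directly ahead of the robot gives beta = pi/2, as the text presets.
    bearing = (np.arctan2(dy, dx) - theta + np.pi / 2) % (2 * np.pi)
    return np.array([distance, bearing])              # beta_t(m_j)

# Only landmarks whose bearing falls in [0, pi] (the forward half-plane)
# lie within the stated observation range.
```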
Further, the process in step two of estimating the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF) specifically comprises the following steps:
Step 21: initialize the system state X_0 = (0, 0), the square root of the system covariance matrix, and the weights W_i^m and W_i^c of the i-th sigma point. The square root of the covariance and the weights are initialized by equations (2-1), (2-2) and (2-3):
S_0 = chol(Q) (2-1)
W_0^m = λ/(L+λ); W_i^m = 1/(2(L+λ)), i = 1, …, 2L (2-2)
W_0^c = λ/(L+λ) + (1 - α² + β); W_i^c = 1/(2(L+λ)), i = 1, …, 2L (2-3)
where S_0 is the square root of the covariance at initialization; L is the dimension of the system state; λ is the scale adjustment factor; Q is the noise covariance matrix of the mobile robot motion model; W_i^m is the weight of the i-th sigma-point mean and W_i^c the weight of the i-th sigma-point variance.
Step 22: compute the sigma points of the robot pose at the previous time t-1 as follows:
X_{t-1} = [x_{t-1}, x_{t-1} + ηS_{t-1}, x_{t-1} - ηS_{t-1}] (2-4)
where x_{t-1} and S_{t-1} in (2-4) are the mean and the square root of the covariance of the system state at time t-1, and the scale constant η is computed as:
η = sqrt(L + λ) (2-5)
where λ = α²(L + κ) - L is the scale regulator, with 1e-4 ≤ α ≤ 1; κ is a secondary scale adjustment factor, preset to 0; L is the dimension of the x_{t-1} vector.
Step 23: data association, i.e. associating the currently observed landmarks with the landmarks already present in the local map.
Step 24: sigma point of robot pose at current time t is predicted according to mobile robot Sigma point X t-1
Order the
(2-6) Wherein the function g is a mobile robot motion model; u t is inertial navigation data of the robot;
weighted summation to obtain predicted value of robot position state Namely:
Predicting square root matrices by QR decomposition
Square root matrix to ensure covarianceIs semi-positive using Cholesky first order update/>
Updating Sigma points according to the predicted robot pose and covariance matrix:
Step 25: recompute the sigma points at the current time according to the observation model of the mobile robot at the current time t and the lidar data z_t. The specific process is as follows.
Let
Z_t = h(X̄_t) (2-11)
where in (2-11) the function h is the mobile robot observation model, and X̄_t are the sigma points of the predicted robot pose and covariance matrix at time t from equation (2-10).
Weighted summation gives the predicted observation of the robot:
ẑ_t = Σ_{i=0}^{2L} W_i^m Z_{i,t} (2-12)
Compute the square root of the observation covariance matrix by QR decomposition:
S_{z,t} = qr{ [sqrt(W_1^c) (Z_{1:2L,t} - ẑ_t), sqrt(R)] } (2-13)
where R in (2-13) denotes the covariance matrix of the observation-model noise.
First-order update with Cholesky:
S_{z,t} = cholupdate{ S_{z,t}, Z_{0,t} - ẑ_t, W_0^c } (2-14)
To update the predicted robot pose and the square root of the covariance matrix with the observation, compute the cross-covariance matrix and the Kalman gain by equations (2-15) and (2-16):
P_{xz,t} = Σ_{i=0}^{2L} W_i^c (X̄_{i,t} - x̄_t)(Z_{i,t} - ẑ_t)^T (2-15)
K_t = (P_{xz,t} / S_{z,t}^T) / S_{z,t} (2-16)
Finally, update the pose and map information x_t and the square root of the covariance matrix through the Kalman gain:
x_t = x̄_t + K_t (z_t - ẑ_t) (2-17)
U = K_t S_{z,t}; S_t = cholupdate{ S̄_t, U, -1 } (2-18)
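For reference, the following minimal Python sketch walks through one SR-UKF predict/update cycle in the shape of steps 21 to 25, keeping S as an upper-triangular factor with P = SᵀS. The cholupdate helper, the default unscented-transform parameters, and the placeholder models g and h are assumptions of this sketch, not details taken from the patent.

```python
import numpy as np
from scipy.linalg import qr, solve_triangular

def cholupdate(R, v, sign=1.0):
    """Rank-one Cholesky update/downdate: returns upper-triangular R' with
    R'.T @ R' = R.T @ R + sign * outer(v, v)."""
    R, v = R.copy(), v.copy()
    for k in range(len(v)):
        r = np.sqrt(max(R[k, k] ** 2 + sign * v[k] ** 2, 0.0))
        c, s = r / R[k, k], v[k] / R[k, k]
        R[k, k] = r
        R[k, k + 1:] = (R[k, k + 1:] + sign * s * v[k + 1:]) / c
        v[k + 1:] = c * v[k + 1:] - s * R[k, k + 1:]
    return R

def ut_weights(L, alpha=1.0, beta=2.0, kappa=0.0):
    """Weights (2-2)/(2-3) and scale constant eta (2-5); alpha, beta, kappa
    are illustrative defaults within the ranges stated in the text."""
    lam = alpha ** 2 * (L + kappa) - L
    wm = np.full(2 * L + 1, 0.5 / (L + lam))
    wc = wm.copy()
    wm[0] = lam / (L + lam)
    wc[0] = lam / (L + lam) + (1.0 - alpha ** 2 + beta)
    return wm, wc, np.sqrt(L + lam)

def sigma_points(x, S, eta):
    """Sigma points (2-4)/(2-10); S is upper-triangular with P = S.T @ S,
    so the square-root columns are the rows of S."""
    n = len(x)
    return np.column_stack([x] + [x + eta * S[i] for i in range(n)]
                               + [x - eta * S[i] for i in range(n)])

def srukf_step(x, S, u, z, g, h, sqrtQ, sqrtR, wm, wc, eta):
    """One SR-UKF cycle: predict (step 24, eqs. 2-6..2-10), then
    observation update (step 25, eqs. 2-11..2-18)."""
    n = len(x)
    # --- predict ---
    Xp = np.column_stack([g(c, u) for c in sigma_points(x, S, eta).T])  # (2-6)
    x_pred = Xp @ wm                                                    # (2-7)
    A = np.hstack([np.sqrt(wc[1]) * (Xp[:, 1:] - x_pred[:, None]), sqrtQ])
    S_pred = qr(A.T, mode='economic')[1]                                # (2-8)
    S_pred = cholupdate(S_pred, np.sqrt(abs(wc[0])) * (Xp[:, 0] - x_pred),
                        np.sign(wc[0]))                                 # (2-9)
    Xs = sigma_points(x_pred, S_pred, eta)                              # (2-10)
    # --- update ---
    Z = np.column_stack([h(c) for c in Xs.T])                           # (2-11)
    z_pred = Z @ wm                                                     # (2-12)
    B = np.hstack([np.sqrt(wc[1]) * (Z[:, 1:] - z_pred[:, None]), sqrtR])
    Sz = qr(B.T, mode='economic')[1]                                    # (2-13)
    Sz = cholupdate(Sz, np.sqrt(abs(wc[0])) * (Z[:, 0] - z_pred),
                    np.sign(wc[0]))                                     # (2-14)
    Pxz = sum(wc[i] * np.outer(Xs[:, i] - x_pred, Z[:, i] - z_pred)
              for i in range(2 * n + 1))                                # (2-15)
    K = solve_triangular(Sz, solve_triangular(Sz, Pxz.T, trans='T')).T  # (2-16)
    x_new = x_pred + K @ (z - z_pred)                                   # (2-17)
    S_new = S_pred
    for col in (K @ Sz.T).T:                                            # (2-18)
        S_new = cholupdate(S_new, col, -1.0)
    return x_new, S_new
```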
Further, the loop detection, the distributed parallel SLAM computation based on the particle filter SR-UPF, and the correction of prior knowledge in step three are specifically:
Step 31: loop detection. Compare the map information of all periods of the mobile robot stored in the cloud and judge whether it is similar to the map information at some historical moment; if so, a closed loop is formed at the detected position, i.e. the local map is updated with the historical map.
Step 32: distributed parallel SLAM computation based on the particle filter SR-UPF. Each particle estimates the SLAM posterior probability density function of the mobile robot with the SR-UPF method and outputs the computed robot pose, the particle weight, and the mean and covariance matrix of each landmark.
Step 33: adjustment of prior knowledge. The prior knowledge is adjusted according to a mobile robot fitness function and comprises the motion-error covariance matrix Q and the observation-error covariance matrix R. This prevents erroneous prior knowledge from degrading the estimation accuracy of the SLAM algorithm and, through dynamic adjustment, keeps the prior knowledge consistent with the real situation.
Step 34: return the pose and the environmental features computed by the particle with the largest weight to the mobile robot as the pose and environmental-feature map.
Further, the SR-UPF method adopted in step 32 specifically comprises the following steps:
S321: as in steps 21 to 25, predict the new pose at time t from the pose and motion model of the mobile robot at time t-1, using the square-root unscented Kalman filter (SR-UKF) method.
S322: data association: associate the observed landmarks with the landmarks already present in the map.
S323: mobile robot pose update: following the SR-UKF method, update the pose predicted in step S321 according to the mobile robot observation model and the lidar data.
S324: landmark update: update the landmark information with an extended Kalman filter (EKF).
S325: particle weight calculation: the ith particle weight formula is:
Further, step four specifically comprises:
Step 41: judge whether the cloud has returned a result in real time; if not, take the edge SLAM computation result as the local map information; otherwise, go to step 42.
Step 42: compute the posterior probability density of the edge SLAM result and of the cloud SLAM result, and fuse the edge and cloud results according to these probability densities.
The edge posterior probability density function is computed as:
y_lt = p(x_lt, Θ_lt | z_t, u_t) ∝ exp{ -(1/2)(x_lt - x̂_t)^T Q_t^{-1} (x_lt - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_lj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_lj)) } (4-1)
where x_lt is the edge estimate of the robot pose; Θ_lt is the edge estimate of the map, comprising the estimates m_lj of the N landmarks m_j; x_{t-1} is the estimate of the mobile robot at time t-1; R_t(m_j) is the covariance matrix of the j-th landmark estimate; Q_t is the covariance matrix of the robot pose estimate; z_t(m_j) is the actual observation of the j-th landmark; ẑ_t(m_lj) is the predicted observation based on the landmark estimate; and x̂_t is the pose of the mobile robot predicted by the motion model.
The cloud posterior probability density function has the same form:
y_ct = p(x_ct, Θ_ct | z_t, u_t) ∝ exp{ -(1/2)(x_ct - x̂_t)^T Q_t^{-1} (x_ct - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_cj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_cj)) } (4-2)
where x_ct is the cloud estimate of the robot pose; Θ_ct is the cloud estimate of the map, comprising the estimates m_cj of the N landmarks m_j; and ẑ_t(m_cj) is the predicted observation based on the cloud landmark estimate.
The edge and cloud estimates of the mobile robot pose and map are then fused as the posterior-density-weighted combination:
x_t = (y_lt · x_lt + y_ct · x_ct) / (y_lt + y_ct); Θ_t = (y_lt · Θ_lt + y_ct · Θ_ct) / (y_lt + y_ct) (4-3)
where y_lt is the edge SLAM posterior probability density and y_ct is the cloud SLAM posterior probability density.
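Reading the fusion of step 42 as a combination weighted by the posterior probability densities y_lt and y_ct (the explicit formula is an unextracted image, so the normalized weighted average below is this sketch's assumption), a minimal sketch is:

```python
import numpy as np

def fuse_estimates(x_edge, y_edge, x_cloud, y_cloud):
    """Fuse the edge and cloud pose estimates, weighting each by its SLAM
    posterior probability density (y_lt and y_ct)."""
    total = y_edge + y_cloud
    return (y_edge * np.asarray(x_edge) + y_cloud * np.asarray(x_cloud)) / total
```

The same weighting would apply component-wise to the fused landmark estimates of the map.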
Further, the computation of the effective particle number and the judgment of whether particle resampling is needed in step five specifically comprise the following steps:
Step 51: compute the effective particle number on the cloud computing nodes and judge whether particle resampling is needed.
The effective particle number N_eff is computed as:
N_eff = 1 / Σ_{i=1}^{M} (w̃_t^i)² (5-1)
where M in (5-1) denotes the number of particles and w̃_t^i the normalized weight of each particle.
If the effective particle number is smaller than the preset threshold, go to step 52 and resample the particles.
Step 52: particle resampling; genetic resampling with roulette-wheel selection and fast MH mutation.
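Equation (5-1) and the resampling decision of step 51 translate directly into code; the example threshold T = M/2 mentioned in the comment is illustrative only, since the patent leaves T as a preset parameter.

```python
import numpy as np

def effective_particle_number(weights):
    """N_eff = 1 / sum(w_i^2) over normalized weights, equation (5-1)."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                  # ensure the weights are normalized
    return 1.0 / np.sum(w ** 2)

def needs_resampling(weights, threshold):
    """Step 51: resample only when N_eff falls below the preset threshold T."""
    return effective_particle_number(weights) < threshold

# e.g. with M particles, a common illustrative choice is T = M / 2.
```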
3. Beneficial effects:
(1) In the invention, the robot transmits the acquired inertial navigation and lidar data to the edge and the cloud respectively. The edge builds a local map through fast SLAM computation, which meets the real-time requirement; the cloud builds the global SLAM map through distributed parallel processing, which improves SLAM accuracy. Through cloud-edge fusion the method is efficient, accurate and easy to implement in software and hardware, and realizes a real-time SLAM process for the mobile robot.
(2) In the invention, the edge uses the square-root unscented Kalman filter (SR-UKF) to update the robot pose and the local map (i.e. the observed landmarks) simultaneously, realizing local map construction. The cloud first updates only the pose of the mobile robot with SR-UKF, rather than the full system state comprising the robot pose and the local landmarks, and then, once the pose is updated, updates the map (each individual landmark) with an extended Kalman filter (EKF). The strong computing and storage capacity of the cloud allows the global map and trajectory information to be saved and guarantees map accuracy, while the edge keeps only the local map information, guaranteeing fast response and real-time performance.
In conclusion, by exploiting cloud-edge fusion, the method compensates for the limited onboard capacity and excessive load in the SLAM process of a mobile robot; it guarantees the real-time performance of SLAM and improves the accuracy and efficiency of mobile robot SLAM.
Drawings
FIG. 1 is a block diagram of a mobile robot SLAM system based on edge cloud computing;
FIG. 2 is an overall flowchart of a mobile robot SLAM method based on edge cloud computing;
FIG. 3 is a diagram of a mobile robot model in accordance with the present invention;
FIG. 4 is an observation model of a mobile robot in the present invention;
FIG. 5 is a flow chart of the SLAM computation at the edge in the present invention;
FIG. 6 is a schematic representation of cloud end particle set format in the present invention;
FIG. 7 is a schematic diagram of a cloud-end distributed parallel SLAM computing process according to the present invention;
Fig. 8 is a schematic diagram of a cloud particle resampling process according to the present invention.
Detailed Description
The present invention will be described in detail with reference to the accompanying drawings.
As shown in fig. 1, the overall structure of the mobile robot SLAM system based on cloud-edge fusion computing comprises, at the edge on the mobile robot, a message client module, a data acquisition module, an edge SLAM computing module, a local storage module and a cloud-edge fusion module; and, at the cloud, a cloud distributed SLAM computing module, a cloud distributed database, a coordination and monitoring cluster module and a message cluster module.
In fig. 1, the cloud distributed SLAM computing module is implemented on the large-scale distributed stream-computing platform Flink running on a distributed parallel computing cluster, and provides the distributed parallel SLAM computing capability. The cloud distributed database HDFS provides a distributed file system for SLAM and stores the global map, sensing data, path information, particle information and the like. The coordination and monitoring cluster module realizes application coordination service through ZooKeeper and provides a consistency service for the distributed SLAM computation. In the Flink platform, the master node is implemented by the JobManager and resource scheduling by the ResourceManager; the worker nodes are responsible for executing computing tasks, each running a TaskManager monitoring process that receives tasks and starts and stops the worker processes under its management.
The mobile robot at the edge deploys ROS (Robot Operating System), which encapsulates the robot hardware and provides a uniform interface to upper-layer applications. Over a 5G communication network, the mobile robot communicates with the cloud through a Kafka client: the data acquired by the sensors are transmitted to the cloud, and after computation by the stream-computing cluster the cloud returns the result to the mobile robot through the Kafka component.
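As an illustration of the edge-to-cloud path, a minimal kafka-python producer that ships one preprocessed sensor record to the stream-computing cluster might look as follows; the broker address, topic name and record fields are hypothetical, not taken from the patent.

```python
import json
from kafka import KafkaProducer  # kafka-python client

# Broker address and topic are placeholders for the 5G-connected cluster.
producer = KafkaProducer(
    bootstrap_servers="cloud-broker:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

def publish_scan(timestamp, v, a, ranges, bearings):
    """Send one preprocessed inertial + lidar record to the cloud;
    the field names are this sketch's assumption, not the patent's schema."""
    producer.send("slam-sensor-data", {
        "t": timestamp,
        "velocity": v,          # v_t at the rear-axle center
        "acceleration": a,      # alpha_t
        "ranges": ranges,       # l_t(m_j) per detected landmark
        "bearings": bearings,   # beta_t(m_j) per detected landmark
    })
    producer.flush()
```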
As shown in fig. 2, the mobile robot SLAM method based on cloud-edge fusion computing comprises:
Step one: data acquisition. An inertial navigation sensor and a lidar are mounted on the mobile robot; the collected lidar data and inertial navigation data are preprocessed and then sent simultaneously to the cloud and the edge for SLAM computation.
Step two: edge SLAM processing. The edge SLAM computing module estimates the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF), as shown in fig. 5. The data association involved in this step adopts the multi-hypothesis tracking (MHT) algorithm (Zhou W., E S., Cao Z., et al. Review of SLAM data association study [C]. Proc. of the 6th International Conference on Sensor Network and Computer Engineering, 2016: 14-19).
Step three: cloud SLAM processing. The cloud performs, in parallel on the received preprocessed data, loop detection, distributed parallel SLAM computation based on the particle filter SR-UPF, and correction of prior knowledge, and finally returns the pose of the particle with the highest posterior probability to the edge as the robot pose and local map at the current moment.
The distributed parallel SLAM computation based on the particle filter SR-UPF approximates the posterior distribution of the SLAM estimate by particle filtering. Each particle represents one sample of the path and is independently associated with N landmark estimates. The particle-set format at time t is shown in fig. 6; the computation of each particle is managed by the TaskManager of Flink and executed in a TaskSlot container, and the execution results of all particles are finally merged by one CombineTask. The computation flow of the particles is shown in fig. 7.
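The exact particle-set format of fig. 6 is not recoverable from the text, but a FastSLAM-style particle carrying one path sample, an importance weight and N independent landmark Gaussians, as this paragraph describes, could be represented as below; all field names are assumptions of this sketch.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class LandmarkEstimate:
    mean: np.ndarray        # (2,) landmark position estimate
    sqrt_cov: np.ndarray    # (2, 2) square root of its covariance

@dataclass
class Particle:
    pose: np.ndarray                      # (3,) robot pose sample (x, y, theta)
    weight: float                         # importance weight
    landmarks: List[LandmarkEstimate] = field(default_factory=list)
```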
Step four: cloud-edge data fusion. Compute the posterior probability of the edge SLAM result and of the cloud SLAM result, and fuse the edge and cloud results according to their posterior probability densities; store the fused robot pose and local map information in the edge local-map storage module for controlling the mobile robot. The edge local map information is saved to a local file through the ROS file system.
Step five: cloud particle resampling and global-map storage. The cloud distributed computing module computes the effective particle number in the particle-filter SLAM computation and judges whether particle resampling is needed; if so, resampling is carried out and the global map, the sensor information and all particle information are stored in the cloud global-map distributed database. The judgment is made as follows: if the effective sample size is smaller than the threshold T, the particles are resampled by genetic resampling with roulette-wheel selection and fast MH mutation; if the effective sample size is greater than or equal to T, no resampling is performed, and the global map, the sensor information and all particle information are stored in the cloud global-map distributed database. The particle resampling flow is shown in fig. 8. In the data storage step, the global map, the sensor information and all particle information are stored in the distributed database HBase.
As shown in fig. 3 and fig. 4, the inertial navigation data in the invention comprises the motion speed v_t and the acceleration α_t at the center of the rear axle of the mobile robot; the motion model of the mobile robot at time t is then g(x_{t-1}, u_t), where x_{t-1} denotes the pose at time t-1. The lidar sensor observes landmark information in the environment and uses it to correct the estimate of the system state; the landmark information comprises the distance l_t(m_j) and the bearing β_t(m_j) of each landmark. The observation model of the mobile robot at time t is:
z_t(m_j) = [l_t(m_j), β_t(m_j)]^T = h(x_t, m_j) + δ (1-1)
where δ is the observation error. The bearing β_t ranges over [0, π] rad, and the bearing of a landmark directly in front of the mobile robot is preset to π/2 rad. The map consists of a set of environmental landmarks m; the j-th landmark is represented by its position in the global coordinate system, m_j = (m_{j,x}, m_{j,y}). Fig. 4 shows the observation model of the mobile robot. The inertial navigation data and the radar data are formatted into a fixed record structure, and the processed data are sent simultaneously to the cloud and the edge for SLAM computation.
As shown in fig. 5, the process of estimating the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF) specifically comprises the following steps:
Step 21: initialize the system state X_0 = (0, 0), the square root of the system covariance matrix, and the weights W_i^m and W_i^c of the i-th sigma point. The square root of the covariance and the weights are initialized by equations (2-1), (2-2) and (2-3):
S_0 = chol(Q) (2-1)
W_0^m = λ/(L+λ); W_i^m = 1/(2(L+λ)), i = 1, …, 2L (2-2)
W_0^c = λ/(L+λ) + (1 - α² + β); W_i^c = 1/(2(L+λ)), i = 1, …, 2L (2-3)
where S_0 is the square root of the covariance at initialization; L is the dimension of the system state; λ is the scale adjustment factor; Q is the covariance matrix of the motion-model noise; W_i^m is the weight of the i-th sigma-point mean and W_i^c the weight of the i-th sigma-point variance.
Step 22: compute the sigma points of the robot pose at the previous time t-1 as follows:
X_{t-1} = [x_{t-1}, x_{t-1} + ηS_{t-1}, x_{t-1} - ηS_{t-1}] (2-4)
where x_{t-1} and S_{t-1} in (2-4) are the mean and the square root of the covariance of the system state at time t-1, and the scale constant η is computed as:
η = sqrt(L + λ) (2-5)
where λ = α²(L + κ) - L is the scale regulator, with 1e-4 ≤ α ≤ 1; κ is a secondary scale adjustment factor, preset to 0; L is the dimension of the x_{t-1} vector.
Step 23: data association, i.e. associating the currently observed landmarks with the landmarks already present in the local map.
Step 24: sigma point of robot pose at current time t is predicted according to mobile robot Sigma point X t-1
Order the
(2-6) Wherein the function g is a mobile robot motion model; u t is inertial navigation data of the robot;
weighted summation to obtain predicted value of robot position state
Predicting square root matrices by QR decomposition
Square root matrix to ensure covarianceIs semi-positive using Cholesky first order update/>
Updating Sigma points according to the predicted robot pose and covariance matrix:
Step 25: recompute the sigma points at the current time according to the observation model of the mobile robot at the current time t and the lidar data z_t. The specific process is as follows.
Let
Z_t = h(X̄_t) (2-11)
where in (2-11) the function h is the mobile robot observation model, and X̄_t are the sigma points of the predicted robot pose and covariance matrix at time t from equation (2-10).
Weighted summation gives the predicted observation of the robot:
ẑ_t = Σ_{i=0}^{2L} W_i^m Z_{i,t} (2-12)
Compute the square root of the observation covariance matrix by QR decomposition:
S_{z,t} = qr{ [sqrt(W_1^c) (Z_{1:2L,t} - ẑ_t), sqrt(R)] } (2-13)
where R in (2-13) denotes the covariance matrix of the observation-model noise.
First-order update with Cholesky:
S_{z,t} = cholupdate{ S_{z,t}, Z_{0,t} - ẑ_t, W_0^c } (2-14)
To update the predicted robot pose and the square root of the covariance matrix with the observation, compute the cross-covariance matrix and the Kalman gain by equations (2-15) and (2-16):
P_{xz,t} = Σ_{i=0}^{2L} W_i^c (X̄_{i,t} - x̄_t)(Z_{i,t} - ẑ_t)^T (2-15)
K_t = (P_{xz,t} / S_{z,t}^T) / S_{z,t} (2-16)
Finally, update the pose and map information x_t and the square root of the covariance matrix through the Kalman gain:
x_t = x̄_t + K_t (z_t - ẑ_t) (2-17)
U = K_t S_{z,t}; S_t = cholupdate{ S̄_t, U, -1 } (2-18)
As shown in fig. 7, the loop detection, the distributed parallel SLAM computation based on the particle filter SR-UPF, and the correction of prior knowledge in step three are specifically:
Step 31: loop detection. Compare the map information of all periods of the mobile robot stored in the cloud and judge whether it is similar to the map information at some historical moment; if so, a closed loop is formed at the detected position, i.e. the local map is updated with the historical map.
Step 32: distributed parallel SLAM computation based on the particle filter SR-UPF. Each particle estimates the SLAM posterior probability density function of the mobile robot with the SR-UPF method and outputs the computed robot pose, the particle weight, and the mean and covariance matrix of each landmark, as shown in fig. 6. The specific process comprises S321 to S325:
S321: as in steps 21 to 25, predict the new pose at time t from the pose and motion model of the mobile robot at time t-1, using the square-root unscented Kalman filter (SR-UKF) method.
S322: data association: associate the observed landmarks with the landmarks already present in the map.
S323: mobile robot pose update: following the SR-UKF method, update the pose predicted in step S321 according to the mobile robot observation model and the lidar data.
S324: landmark update: update the landmark information with an extended Kalman filter (EKF).
S325: particle weight calculation: the ith particle weight formula is:
Step 33: adjustment of prior knowledge. The prior knowledge is adjusted according to a mobile robot fitness function and comprises the motion-error covariance matrix Q and the observation-error covariance matrix R. This prevents erroneous prior knowledge from degrading the estimation accuracy of the SLAM algorithm and, through dynamic adjustment, keeps the prior knowledge consistent with the real situation.
Step 34: return the pose and the environmental features computed by the particle with the largest weight to the mobile robot as the pose and environmental-feature map.
At the cloud, the particle set at time t is as shown in fig. 6; the computation of each particle is managed by the TaskManager of Flink and runs in a TaskSlot container, and the running results of all particles are finally merged by one CombineTask.
Further, step four specifically comprises:
Step 41: judge whether the cloud has returned a result in real time; if not, take the edge SLAM result as the local map information; otherwise, go to step 42.
Step 42: compute the posterior probability of the edge SLAM result and of the cloud SLAM result, and fuse the edge and cloud results according to these probabilities.
The edge posterior probability density function is computed as:
y_lt = p(x_lt, Θ_lt | z_t, u_t) ∝ exp{ -(1/2)(x_lt - x̂_t)^T Q_t^{-1} (x_lt - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_lj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_lj)) } (4-1)
where x_lt is the edge estimate of the robot pose; Θ_lt is the edge estimate of the map, comprising the estimates m_lj of the N landmarks m_j; x_{t-1} is the estimate of the mobile robot at time t-1; R_t(m_j) is the covariance matrix of the j-th landmark estimate; Q_t is the covariance matrix of the robot pose estimate; z_t(m_j) is the actual observation of the j-th landmark; ẑ_t(m_lj) is the predicted observation based on the landmark estimate; and x̂_t is the pose of the mobile robot predicted by the motion model.
The cloud posterior probability density function has the same form:
y_ct = p(x_ct, Θ_ct | z_t, u_t) ∝ exp{ -(1/2)(x_ct - x̂_t)^T Q_t^{-1} (x_ct - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_cj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_cj)) } (4-2)
where x_ct is the cloud estimate of the robot pose; Θ_ct is the cloud estimate of the map, comprising the estimates m_cj of the N landmarks m_j; and ẑ_t(m_cj) is the predicted observation based on the cloud landmark estimate.
The edge and cloud estimates of the mobile robot pose and map are then fused as the posterior-density-weighted combination:
x_t = (y_lt · x_lt + y_ct · x_ct) / (y_lt + y_ct); Θ_t = (y_lt · Θ_lt + y_ct · Θ_ct) / (y_lt + y_ct) (4-3)
Further, the computation of the effective particle number and the judgment of whether particle resampling is needed in step five specifically comprise the following steps:
Step 51: compute the effective particle number on the cloud computing nodes and judge whether particle resampling is needed.
The effective particle number N_eff is computed as:
N_eff = 1 / Σ_{i=1}^{M} (w̃_t^i)² (5-1)
where M in (5-1) denotes the number of particles and w̃_t^i the normalized weight of each particle.
If the effective particle number is smaller than the preset threshold, go to step 52 and resample the particles.
Step 52: particle resampling, i.e. genetic resampling with roulette-wheel selection and fast MH mutation. Fig. 8 is a schematic diagram of the cloud particle resampling flow.
The genetic resampling with roulette-wheel selection and fast MH mutation specifically comprises: first, M1 (M1 = N × P_s) parent particles are selected from the N initial particles by roulette-wheel selection; then M2 particles are generated from the M1 parents by fast MH mutation, and M3 particles are generated from the M1 parents by a crossover operation. Finally, the M1, M2 and M3 particles are combined into the new particle set, and the weights of the particles are initialized to 1/M.
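A sketch of this select/mutate/crossover scheme is given below. Interpreting "fast MH mutation" as a Metropolis acceptance test against a caller-supplied posterior density is this sketch's assumption, as are the parameter defaults ps and noise.

```python
import numpy as np

def genetic_resample(poses, weights, loglik, ps=0.5, noise=0.05, rng=None):
    """Genetic resampling sketch: roulette-wheel selection of M1 = N * Ps
    parents, fast MH mutation (M2), crossover (M3), then uniform weights.
    `loglik` scores a pose under the SLAM posterior."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(weights)
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    m1 = max(1, int(n * ps))

    # 1) roulette-wheel selection of M1 parent particles
    parents = poses[rng.choice(n, size=m1, p=w)]

    # 2) fast MH mutation: perturb each parent and accept with the
    #    Metropolis ratio under the supplied target density
    mutated = np.empty_like(parents)
    for i, p in enumerate(parents):
        cand = p + rng.normal(0.0, noise, size=p.shape)
        accept = np.log(rng.random()) < loglik(cand) - loglik(p)
        mutated[i] = cand if accept else p

    # 3) crossover: blend random pairs of parents
    pairs = rng.integers(0, m1, size=(m1, 2))
    crossed = 0.5 * (parents[pairs[:, 0]] + parents[pairs[:, 1]])

    # combine the M1 + M2 + M3 particles; weights re-initialized to 1/M
    new = np.vstack([parents, mutated, crossed])
    return new, np.full(len(new), 1.0 / len(new))
```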
While the application has been described with reference to the preferred embodiments, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the application, and it is intended that the scope of the application shall be defined by the appended claims.

Claims (7)

1. A mobile robot SLAM method based on cloud-edge fusion computing, characterized in that it comprises the following steps:
Step one: data acquisition; a lidar and an inertial navigation sensor are mounted on the mobile robot; the collected lidar data and inertial navigation data are preprocessed and then sent simultaneously to the cloud and the edge for SLAM computation;
Step two: edge SLAM processing; the edge SLAM computing module estimates the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF);
Step three: cloud SLAM processing; the cloud performs, in parallel on the received preprocessed data, loop detection, distributed parallel SLAM computation based on the particle filter SR-UPF, and correction of prior knowledge; finally, the pose of the particle with the highest posterior probability is returned to the edge as the robot pose and local map;
Step four: cloud-edge data fusion; the posterior probability of the edge SLAM result and of the cloud SLAM result are computed, and the edge and cloud results are fused according to their posterior probability densities; the fused robot pose and local map information are stored in the edge local-map storage module for controlling the mobile robot;
Step five: cloud particle resampling and global-map storage; the cloud distributed computing module computes the effective particle number in the particle-filter SLAM computation and judges whether particle resampling is needed; if resampling is needed, it is carried out and the global map, the sensor information and all particle information are stored in the cloud global-map distributed database; the judgment is made as follows: if the effective sample size is smaller than a preset threshold T, genetic resampling with roulette-wheel selection and fast MH mutation is applied; if the effective sample size is greater than or equal to the preset threshold T, the particles are not resampled, and the global map, the sensor information and all particle information are stored in the cloud global-map distributed database.
2. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 1, characterized in that: the inertial navigation data u_t comprises the motion speed v_t and the acceleration α_t at the center of the rear axle of the mobile robot, and the motion model of the mobile robot at time t is g(x_{t-1}, u_t), where x_{t-1} denotes the pose at time t-1; the lidar sensor observes landmark information in the environment and uses it to correct the estimate of the system state, the landmark information comprising the distance l_t(m_j) and the bearing β_t(m_j) of each landmark; the observation model of the mobile robot at time t is:
z_t(m_j) = [l_t(m_j), β_t(m_j)]^T = h(x_t, m_j) + δ (1-1)
where δ is the observation error; the bearing β_t ranges over [0, π] rad, and the bearing of a landmark directly in front of the mobile robot is preset to π/2 rad; the map consists of a set of environmental landmarks m, the j-th landmark being represented by its position in the global coordinate system, m_j = (m_{j,x}, m_{j,y}).
3. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 2, characterized in that: in step two, the process of estimating the robot pose and local map information with the square-root unscented Kalman filter (SR-UKF) specifically comprises the following steps:
Step 21: initialize the system state X_0 = (0, 0), the square root of the system covariance matrix, and the weights W_i^m and W_i^c of the i-th sigma point; the square root of the covariance and the weights are initialized by equations (2-1), (2-2) and (2-3):
S_0 = chol(Q) (2-1)
W_0^m = λ/(L+λ); W_i^m = 1/(2(L+λ)), i = 1, …, 2L (2-2)
W_0^c = λ/(L+λ) + (1 - α² + β); W_i^c = 1/(2(L+λ)), i = 1, …, 2L (2-3)
where S_0 is the square root of the covariance at initialization; L is the dimension of the system state; λ is the scale adjustment factor; Q is the noise covariance matrix of the mobile robot motion model; W_i^m is the weight of the i-th sigma-point mean and W_i^c the weight of the i-th sigma-point variance;
Step 22: compute the sigma points of the robot pose at the previous time t-1 as follows:
X_{t-1} = [x_{t-1}, x_{t-1} + ηS_{t-1}, x_{t-1} - ηS_{t-1}] (2-4)
where x_{t-1} and S_{t-1} in (2-4) are the mean and the square root of the covariance of the system state at time t-1, and the scale constant η is computed as:
η = sqrt(L + λ) (2-5)
where λ = α²(L + κ) - L is the scale regulator, with 1e-4 ≤ α ≤ 1; κ is a secondary scale adjustment factor, preset to 0; L is the dimension of the x_{t-1} vector;
Step 23: data association, i.e. associating the currently observed landmarks with the landmarks already present in the local map;
Step 24: sigma point of robot pose at current time t is predicted according to mobile robot Sigma point X t-1 Order the
(2-6) Wherein the function g is a mobile robot motion model; u t is inertial navigation data of the robot;
weighted summation to obtain predicted value of robot position state Namely:
Predicting square root matrices by QR decomposition
Square root matrix to ensure covarianceIs semi-positive using Cholesky first order update/>
Updating Sigma points according to the predicted robot pose and covariance matrix:
Step 25: recompute the sigma points at the current time according to the observation model of the mobile robot at the current time t and the lidar data z_t; the specific process is as follows: let
Z_t = h(X̄_t) (2-11)
where in (2-11) the function h is the mobile robot observation model, and X̄_t are the sigma points of the predicted robot pose and covariance matrix at time t from equation (2-10);
weighted summation gives the predicted observation of the robot:
ẑ_t = Σ_{i=0}^{2L} W_i^m Z_{i,t} (2-12)
compute the square root of the observation covariance matrix by QR decomposition:
S_{z,t} = qr{ [sqrt(W_1^c) (Z_{1:2L,t} - ẑ_t), sqrt(R)] } (2-13)
where R in (2-13) denotes the covariance matrix of the observation-model noise;
first-order update with Cholesky:
S_{z,t} = cholupdate{ S_{z,t}, Z_{0,t} - ẑ_t, W_0^c } (2-14)
to update the predicted robot pose and the square root of the covariance matrix with the observation, compute the cross-covariance matrix and the Kalman gain by equations (2-15) and (2-16):
P_{xz,t} = Σ_{i=0}^{2L} W_i^c (X̄_{i,t} - x̄_t)(Z_{i,t} - ẑ_t)^T (2-15)
K_t = (P_{xz,t} / S_{z,t}^T) / S_{z,t} (2-16)
finally, update the pose and map information x_t and the square root of the covariance matrix through the Kalman gain:
x_t = x̄_t + K_t (z_t - ẑ_t) (2-17)
U = K_t S_{z,t}; S_t = cholupdate{ S̄_t, U, -1 } (2-18)
4. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 3, characterized in that: the loop detection, the distributed parallel SLAM computation based on the particle filter SR-UPF, and the correction of prior knowledge in step three are specifically:
Step 31: loop detection; compare the map information of all periods of the mobile robot stored in the cloud and judge whether it is similar to the map information at some historical moment; if so, a closed loop is formed at the detected position, i.e. the local map is updated with the historical map;
Step 32: distributed parallel SLAM computation based on the particle filter SR-UPF; each particle estimates the SLAM posterior probability density function of the mobile robot with the SR-UPF method and outputs the computed robot pose, the particle weight, and the mean and covariance matrix of each landmark;
Step 33: adjustment of prior knowledge; the prior knowledge is adjusted according to a mobile robot fitness function and comprises the motion-error covariance matrix Q and the observation-error covariance matrix R; this prevents erroneous prior knowledge from degrading the estimation accuracy of the SLAM algorithm and, through dynamic adjustment, keeps the prior knowledge consistent with the real situation;
Step 34: return the pose and the environmental features computed by the particle with the largest weight to the mobile robot as the pose and environmental-feature map.
5. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 4, characterized in that: the SR-UPF method adopted in step 32 specifically comprises the following steps:
S321: as in steps 21 to 25, predict the new pose at time t from the pose and motion model of the mobile robot at time t-1, using the square-root unscented Kalman filter (SR-UKF) method;
S322: data association: associate the observed landmarks with the landmarks already present in the map;
S323: mobile robot pose update: following the SR-UKF method, update the pose predicted in step S321 according to the mobile robot observation model and the lidar data;
S324: landmark update: update the landmark information with an extended Kalman filter (EKF);
S325: particle weight calculation: the ith particle weight formula is:
6. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 1, characterized in that: step four specifically comprises:
Step 41: judge whether the cloud has returned a result in real time; if not, take the edge SLAM computation result as the local map information; otherwise, go to step 42;
Step 42: compute the posterior probability density of the edge SLAM result and of the cloud SLAM result, and fuse the edge and cloud results according to these probability densities;
the edge posterior probability density function is computed as:
y_lt = p(x_lt, Θ_lt | z_t, u_t) ∝ exp{ -(1/2)(x_lt - x̂_t)^T Q_t^{-1} (x_lt - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_lj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_lj)) } (4-1)
where x_lt is the edge estimate of the robot pose; Θ_lt is the edge estimate of the map, comprising the estimates m_lj of the N landmarks m_j; x_{t-1} is the estimate of the mobile robot at time t-1; R_t(m_j) is the covariance matrix of the j-th landmark estimate; Q_t is the covariance matrix of the robot pose estimate; z_t(m_j) is the actual observation of the j-th landmark; ẑ_t(m_lj) is the predicted observation based on the landmark estimate; and x̂_t is the pose of the mobile robot predicted by the motion model;
the cloud posterior probability density function has the same form:
y_ct = p(x_ct, Θ_ct | z_t, u_t) ∝ exp{ -(1/2)(x_ct - x̂_t)^T Q_t^{-1} (x_ct - x̂_t) } · Π_{j=1}^{N} exp{ -(1/2)(z_t(m_j) - ẑ_t(m_cj))^T R_t(m_j)^{-1} (z_t(m_j) - ẑ_t(m_cj)) } (4-2)
where x_ct is the cloud estimate of the robot pose; Θ_ct is the cloud estimate of the map, comprising the estimates m_cj of the N landmarks m_j; and ẑ_t(m_cj) is the predicted observation based on the cloud landmark estimate;
the edge and cloud estimates of the mobile robot pose and map are fused as the posterior-density-weighted combination:
x_t = (y_lt · x_lt + y_ct · x_ct) / (y_lt + y_ct); Θ_t = (y_lt · Θ_lt + y_ct · Θ_ct) / (y_lt + y_ct) (4-3)
where y_lt is the edge SLAM posterior probability density and y_ct is the cloud SLAM posterior probability density.
7. The mobile robot SLAM method based on cloud-edge fusion computing according to claim 1, characterized in that: the computation of the effective particle number and the judgment of whether particle resampling is needed in step five specifically comprise the following steps:
Step 51: compute the effective particle number on the cloud computing nodes and judge whether particle resampling is needed;
the effective particle number N_eff is computed as:
N_eff = 1 / Σ_{i=1}^{M} (w̃_t^i)² (5-1)
where M in (5-1) denotes the number of particles and w̃_t^i the normalized weight of each particle;
if the effective particle number is smaller than the preset threshold, go to step 52 and resample the particles;
Step 52: particle resampling; genetic resampling with roulette-wheel selection and fast MH mutation.
CN202110974304.8A 2021-08-24 2021-08-24 Mobile robot SLAM method based on cloud and edge fusion calculation Active CN113701742B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110974304.8A CN113701742B (en) 2021-08-24 2021-08-24 Mobile robot SLAM method based on cloud and edge fusion calculation


Publications (2)

Publication Number Publication Date
CN113701742A CN113701742A (en) 2021-11-26
CN113701742B true CN113701742B (en) 2024-04-26

Family

ID=78654330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110974304.8A Active CN113701742B (en) 2021-08-24 2021-08-24 Mobile robot SLAM method based on cloud and edge fusion calculation

Country Status (1)

Country Link
CN (1) CN113701742B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114383611A (en) * 2021-12-30 2022-04-22 华南智能机器人创新研究院 Multi-machine cooperative laser SLAM method, device and system for mobile robot
CN115022348B (en) * 2022-05-27 2023-04-28 江南大学 Intelligent factory cloud-level architecture data storage method for high-end battery
CN115371664B (en) * 2022-09-06 2024-08-13 重庆邮电大学 Dynamic maintenance method and system for mobile robot map based on geometric scale density and information entropy


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104330083A (en) * 2014-10-27 2015-02-04 南京理工大学 Multi-robot cooperative positioning algorithm based on square root unscented kalman filter
CN109459033A (en) * 2018-12-21 2019-03-12 哈尔滨工程大学 A kind of robot of the Multiple fading factor positions without mark Fast synchronization and builds drawing method
CN109916410A (en) * 2019-03-25 2019-06-21 南京理工大学 A kind of indoor orientation method based on improvement square root Unscented kalman filtering
CN110763245A (en) * 2019-10-25 2020-02-07 江苏海事职业技术学院 Map creating method and system based on stream computing
CN112347840A (en) * 2020-08-25 2021-02-09 天津大学 Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN112581590A (en) * 2020-12-28 2021-03-30 广东工业大学 Unmanned aerial vehicle cloud edge terminal cooperative control method for 5G security rescue networking

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
An Improved FastSLAM Algorithm Based on Revised Genetic Resampling and SR-UPF; Tai-Zhi Lv; International Journal of Automation and Computing; pp. 325-334 *

Also Published As

Publication number Publication date
CN113701742A (en) 2021-11-26

Similar Documents

Publication Publication Date Title
CN113701742B (en) Mobile robot SLAM method based on cloud and edge fusion calculation
CN112859859B (en) Dynamic grid map updating method based on three-dimensional obstacle object pixel object mapping
CN112639502B (en) Robot pose estimation
Tisdale et al. Autonomous UAV path planning and estimation
CN108896047B (en) Distributed sensor network collaborative fusion and sensor position correction method
CN111240319A (en) Outdoor multi-robot cooperative operation system and method thereof
CN107315171B (en) Radar networking target state and system error joint estimation algorithm
WO2021077769A1 (en) Streaming computing-based map creation method and system therefor
CN114398455B (en) Heterogeneous multi-robot collaborative SLAM map fusion method
CN116929338B (en) Map construction method, device and storage medium
CN103901891A (en) Dynamic particle tree SLAM algorithm based on hierarchical structure
CN109901108A (en) A kind of formation unmanned plane co-located method based on posteriority linearisation belief propagation
CN110515382A (en) A kind of smart machine and its localization method
CN114877883A (en) Vehicle positioning method and system considering communication delay under cooperative vehicle and road environment
CN117685953A (en) UWB and vision fusion positioning method and system for multi-unmanned aerial vehicle co-positioning
CN117007040A (en) Relative navigation method and device based on multi-source information fusion
WO2021109166A1 (en) Three-dimensional laser positioning method and system
CN117739978B (en) Multi-AUV parallel collaborative navigation positioning method and system based on factor graph
Song et al. Cooperative Positioning Algorithm Based on Manifold Gradient Filtering in UAV-WSN
CN114339595B (en) Ultra-wide band dynamic inversion positioning method based on multi-model prediction
CN116793366A (en) Multi-AUV (autonomous Underwater vehicle) co-location method and system based on factor graph under dynamic topology
Lv et al. A SLAM Algorithm Based on Edge‐Cloud Collaborative Computing
CN116381724A (en) Multi-machine 3D laser SLAM method and system for dynamically combining connected components
JP2023173688A (en) Moving object, advance map generation method, position estimation system, position estimation method, advance map generation method, and program
CN114993341A (en) Carrier rocket trajectory estimation method and device based on space-based measurement

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20240329

Address after: No. 309, Jiangning District, Jiangning District, Nanjing, Jiangsu

Applicant after: JIANGSU MARITIME INSTITUTE

Country or region after: China

Address before: Room 602, No. 5, wen'anli, Nanjing, Jiangsu 210001

Applicant before: Lv Taizhi

Country or region before: China

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20241022

Address after: Building C, 8th Floor, Deji Building, No. 188 Changjiang Road, Xuanwu District, Nanjing City, Jiangsu Province, China 210018

Patentee after: Nanjing Shenye Intelligent Technology Co.,Ltd.

Country or region after: China

Address before: No. 309, Jiangning District, Jiangning District, Nanjing, Jiangsu

Patentee before: JIANGSU MARITIME INSTITUTE

Country or region before: China