CN115388899A - Mobile robot vision inertia fusion SLAM method based on variational Bayes - Google Patents
Mobile robot vision inertia fusion SLAM method based on variational Bayes
- Publication number
- CN115388899A (application CN202211125734.3A)
- Authority
- CN
- China
- Prior art keywords
- mobile robot
- iteration
- representing
- distribution
- state vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Inertial navigation combined with non-inertial navigation instruments
- G01C21/1656—Inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras
- G01C21/18—Stabilised platforms, e.g. by gyroscope
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
The invention provides a mobile robot visual-inertial fusion SLAM method based on variational Bayes, which comprises the following steps: S1, obtaining a predicted state vector and covariance matrix of the mobile robot at time k; S2, establishing a model of the observation noise at time k conforming to the gamma-distribution-based Student's t mixture, and obtaining the likelihood function distribution in hierarchical Gaussian form based on the visual information, the observation noise model, and the predicted state vector of the mobile robot at time k; and S3, constructing a joint posterior probability density function based on the likelihood function distribution, and jointly solving it by the variational Bayes method to obtain the state vector of the mobile robot at time k. A new distribution is provided for modeling the observation noise so as to reduce the influence of non-stationary heavy-tailed noise on the accuracy of the visual-inertial data fusion result, and variational Bayes is used to realize joint estimation of the mobile robot state and improve the accuracy of the finally output state vector.
Description
Technical Field
The invention relates to the technical field of intelligent robots, and in particular to a mobile robot visual-inertial fusion SLAM method based on variational Bayes.
Background
Simultaneous Localization and Mapping (SLAM) refers to the process by which a mobile robot, while operating in an unknown environment, extracts and combines useful information, determines its own pose, and gradually builds a map of the surrounding environment. SLAM has been successfully applied to underwater unmanned vehicles, aerial unmanned vehicles, and the like, and has received increasing attention.
Filtering based on Bayesian estimation is widely applied to SLAM back-end optimization: the state vector at the current time is obtained from the state vector of the mobile robot at the previous time and the current measurement. The earliest SLAM algorithm was based on the Extended Kalman Filter (EKF), which must linearize the motion model and observation model with a first-order Taylor expansion before estimating the robot pose and map feature positions; this linearization introduces errors into both models, and computing the Jacobian matrices imposes a heavy computational burden on the algorithm. The Unscented Kalman Filter (UKF) followed; patent CN109141436A uses it to substitute a set of weighted sampling points into the motion and observation models of the SLAM algorithm, avoiding linearization when computing the state and observation information of the mobile robot. However, as the system dimension grows, a non-local sampling phenomenon occurs in the UKF, which directly makes the SLAM result unstable or even divergent. SLAM algorithms based on the Cubature Kalman Filter (CKF) then appeared, substituting a set of equally weighted sampling points into the nonlinear model; however, this method omits part of the approximation error, so the filtering accuracy is reduced. Patent CN113971752A simulates non-Gaussian noise with outliers using a Student's t distribution to improve estimation accuracy, approximating the posterior probability density function by a Student's t probability density function with a fixed degree-of-freedom parameter.
However, in practice heavy-tailed noise also exhibits non-stationary characteristics, and the estimation accuracy of Kalman filtering degrades considerably when this non-stationarity is ignored.
Disclosure of Invention
The invention aims to solve the problem in the prior art that the accuracy of the finally fused data is reduced by the influence of noise non-stationarity on Kalman filtering, and provides a mobile robot visual-inertial fusion SLAM method based on variational Bayes.
In order to achieve the above object, according to a first aspect of the present invention, there is provided a mobile robot visual-inertial fusion SLAM method based on variational Bayes, comprising: S1, obtaining a predicted state vector and a one-step prediction error covariance matrix of the mobile robot at time k based on the state vector and covariance matrix of the mobile robot at time k-1, and the IMU data and system noise covariance matrix of the mobile robot at time k; S2, establishing a model of the observation noise at time k conforming to the gamma-distribution-based Student's t mixture, and obtaining the likelihood function distribution in hierarchical Gaussian form based on the visual information, the observation noise model, and the predicted state vector of the mobile robot at time k; and S3, constructing a joint posterior probability density function based on the likelihood function distribution, and jointly solving it by the variational Bayes method to obtain the state vector of the mobile robot at time k.
To achieve the above objects, according to the same inventive concept, the present invention provides a computer-readable storage medium storing at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the variational-Bayes-based mobile robot visual-inertial fusion SLAM method according to the first aspect of the invention.
In order to achieve the above object, according to the same inventive concept, the invention provides a mobile robot, which includes a mobile robot body, and a vision sensor, an IMU sensor, and a processor arranged on the mobile robot body, wherein the processor is connected with the vision sensor and the IMU sensor respectively, and the processor executes the steps of the variational-Bayes-based mobile robot visual-inertial fusion SLAM method according to the first aspect of the invention.
The technical effects are as follows: a new distribution (namely, the Student's t mixture based on the gamma distribution) is proposed to model the observation noise. The distribution takes the gamma distribution as the mixing density function and performs scale mixing of the Student's t distribution to cope with the non-stationary behavior of the noise, which reduces the influence of non-stationary heavy-tailed noise on the accuracy of the visual-inertial data fusion result and further improves the estimation accuracy of the Kalman filter. Iterative information updating based on variational Bayes is introduced into the fusion of IMU data and vision sensor measurements, and variational Bayes is used to realize joint estimation of the mobile robot state, further improving the accuracy of the finally obtained state vector.
Drawings
Fig. 1 is a schematic flowchart of the variational-Bayes-based mobile robot visual-inertial fusion SLAM method in embodiment 1 of the present invention;
fig. 2 is a schematic diagram of a data fusion process in an application scenario in embodiment 1 of the present invention.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used merely for convenience of description and for simplicity of description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention.
In the description of the present invention, unless otherwise specified and limited, it is to be noted that the terms "mounted," "connected," and "connected" are to be interpreted broadly, and may be, for example, a mechanical connection or an electrical connection, a communication between two elements, a direct connection, or an indirect connection via an intermediate medium, and specific meanings of the terms may be understood by those skilled in the art according to specific situations.
Example 1
The embodiment discloses a mobile robot visual-inertial fusion SLAM method based on variational Bayes, which, as shown in fig. 1, comprises the following steps:
Step S1: based on the state vector X_{k-1|k-1} and covariance matrix P_{k-1|k-1} of the mobile robot at time k-1, and the IMU data and system noise covariance matrix Q_k of the mobile robot at time k, obtain the predicted state vector X_{k|k-1} and the one-step prediction error covariance matrix P_{k|k-1} of the mobile robot at time k.
The IMU data is provided by an IMU sensor comprising an accelerometer, which provides the acceleration of the mobile robot during movement, and a gyroscope, which provides the angular velocity of the mobile robot during movement. After the IMU data is obtained, it is pre-integrated: the acceleration measured by the accelerometer is de-noised and integrated once to obtain velocity and twice to obtain relative displacement; the angular velocity measured by the gyroscope is de-noised and integrated once to obtain the angle change of the mobile robot during motion.
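The pre-integration described above can be sketched as follows — a minimal illustration assuming discrete IMU samples at a fixed timestep and known constant biases; all names are hypothetical, not from the patent:

```python
import numpy as np

def preintegrate_imu(accels, gyros, dt, accel_bias, gyro_bias):
    """Integrate bias-corrected IMU samples between two camera frames.

    accels, gyros: (N, 3) arrays of accelerometer / gyroscope readings.
    Returns the velocity change, relative displacement, and angle change.
    """
    delta_v = np.zeros(3)      # velocity change (single integration of acceleration)
    delta_p = np.zeros(3)      # relative displacement (double integration)
    delta_theta = np.zeros(3)  # angle change (single integration of angular rate)
    for a, w in zip(accels, gyros):
        a = a - accel_bias     # remove accelerometer bias/noise before integrating
        w = w - gyro_bias      # remove gyroscope bias/noise before integrating
        delta_p += delta_v * dt + 0.5 * a * dt**2
        delta_v += a * dt
        delta_theta += w * dt
    return delta_v, delta_p, delta_theta
```

A real pre-integration would rotate each sample into a common frame via the accumulated attitude; this scalar-per-axis sketch only shows the single/double integration structure the paragraph describes.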
In step S1, the predicted state vector X_{k|k-1} of the mobile robot at time k is obtained using the IMU data at time k and the constructed cubature points, specifically: X_{k|k-1} = F_{k-1} X_{k-1|k-1}, where F_{k-1} represents the state-vector transition relationship between times k-1 and k obtained from the IMU data, preferably obtained by pre-integrating the IMU data; X_{k-1|k-1} represents the state vector of the mobile robot at time k-1; X_{k|k-1} = [P_bk, V_bk, Q_bk], where P_bk and V_bk are respectively the three-dimensional pose and velocity information of the mobile robot, and Q_bk represents the quaternion from the IMU coordinate system to the world coordinate system.
In step S1, the one-step prediction error covariance matrix of the mobile robot at time k is calculated from the predicted state vector X_{k|k-1} and the system noise covariance matrix Q_k, specifically:
P_{k|k-1} = F_{k-1} P_{k-1|k-1} F_{k-1}^T + Q_k
where P_{k-1|k-1} represents the covariance matrix of the mobile robot at time k-1 and Q_k represents the system noise covariance matrix at time k. The prior probability density function is then: p(X_k | Y_{k-1}) = N(X_k; X_{k|k-1}, P_{k|k-1}).
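This prediction step is the standard Kalman time update and can be sketched as follows (the concrete form of F_{k-1} obtained from pre-integration is abstracted away; names are illustrative):

```python
import numpy as np

def predict(x_prev, P_prev, F, Q):
    """One-step Kalman prediction:
    X_{k|k-1} = F_{k-1} X_{k-1|k-1}
    P_{k|k-1} = F_{k-1} P_{k-1|k-1} F_{k-1}^T + Q_k
    """
    x_pred = F @ x_prev                # propagate the state through the transition matrix
    P_pred = F @ P_prev @ F.T + Q      # propagate uncertainty and add process noise
    return x_pred, P_pred
```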
Step S2: establish a model of the observation noise at time k conforming to the gamma-distribution-based Student's t mixture, and obtain the likelihood function distribution in hierarchical Gaussian form based on the visual information, the observation noise model, and the predicted state vector of the mobile robot at time k.
Establish the model of the observation noise at time k conforming to the gamma-distribution-based Student's t mixture:
v_k ~ GaSTM(v_k; v_k^0, β_k, R_k, u_k, δ_k)
where v_k represents the observation noise of the mobile robot at time k and v_k^0 denotes its initial value; β_k denotes the shape parameter, with distribution function p(β_k) = N(β_k; β_k^0, γ I_m), where β_k^0 represents a first constant, γ a second constant, and I_m the m-order identity matrix; R_k represents the scale matrix; u_k denotes the first degree of freedom, with distribution function p(u_k) = G(u_k; a_k, b_k), where a_k and b_k are respectively the first and second real parameters of the gamma distribution of u_k; δ_k represents the second degree of freedom; λ_k denotes the first auxiliary parameter, with distribution function p(λ_k) = G(λ_k; δ_k/2, δ_k/2); ξ_k represents the second auxiliary parameter, with distribution function p(ξ_k) = G(ξ_k; u_k/2, u_k/2); N represents a Gaussian distribution; G denotes a gamma distribution; GaSTM denotes the Student's t mixture distribution based on the gamma distribution.
A new distribution, the gamma-distribution-based Student's t mixture GaSTM(v_k; v_k^0, β_k, R_k, u_k, δ_k), is proposed. The distribution has a location parameter v_k^0, a shape parameter β_k, a scale matrix R_k, and two distinct degree-of-freedom parameters u_k and δ_k; it takes the gamma distribution as the mixing density function and performs stage-wise scale mixing of the Student's t distribution to cope with the non-stationary characteristics of the noise.
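To illustrate the scale-mixture idea, the sketch below draws heavy-tailed noise samples by first drawing the two gamma-distributed mixing variables and then a conditionally Gaussian sample. This is a simplified scalar illustration of gamma-mixed Student's t noise, not the patent's exact parameterization:

```python
import numpy as np

def sample_gastm_noise(n, r=1.0, u=3.0, delta=3.0, rng=None):
    """Draw n scalar noise samples from a gamma-mixed Student-t hierarchy.

    xi ~ Gamma(u/2, rate=u/2) and lam ~ Gamma(delta/2, rate=delta/2)
    act as mixing variables; conditionally on them the sample is Gaussian
    with variance r/(xi*lam), so small draws inflate the variance and
    produce heavy tails.
    """
    rng = np.random.default_rng(rng)
    xi = rng.gamma(shape=u / 2, scale=2 / u, size=n)         # second auxiliary variable
    lam = rng.gamma(shape=delta / 2, scale=2 / delta, size=n)  # first auxiliary variable
    return rng.normal(0.0, np.sqrt(r / (xi * lam)), size=n)
```

With a single mixing variable this construction reduces to the classical Student's t scale mixture; the second gamma stage is what the patent leverages to track non-stationary tail behavior.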
In step S2, the step of obtaining the likelihood function distribution in hierarchical Gaussian form based on the visual information of the mobile robot at time k, the observation noise model, and the predicted state vector includes:
Step S21: based on the visual information of the mobile robot at time k, obtain the visual state information Y_k of the mobile robot at time k, where Y_k satisfies Y_k = H_k X_{k|k-1} + v_k; H_k is the observation matrix representing the mapping relationship between the visual state information Y_k and the predicted state vector X_{k|k-1}. Specifically, ORB (Oriented FAST and Rotated BRIEF) features are extracted from the visual information of the robot at time k, and the visual state information Y_k is obtained from the ORB features; the process of obtaining visual state information is prior art and is not repeated here. The visual state information Y_k comprises the three-dimensional pose and velocity information of the mobile robot;
Step S22: establish the likelihood function distribution as follows:
p(Y_k | X_{k|k-1}) = GaSTM(Y_k; H_k X_{k|k-1}, β_k, R_k, u_k, δ_k);
the likelihood function satisfies the GaSTM distribution, and the likelihood function distribution is converted into a hierarchical Gaussian form:
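The display formula for the hierarchical Gaussian form did not survive extraction. For gamma-mixed Student's t models of this kind, the conditional form is typically the following — a hedged reconstruction consistent with the residual terms (Y_k − H_k X_k − λ_k β_k) used in the later update steps, not a verbatim copy of the patent's formula:

```latex
p(Y_k \mid X_k, \lambda_k, \xi_k, \beta_k) = \mathrm{N}\!\left(Y_k;\ H_k X_k + \lambda_k \beta_k,\ R_k/\xi_k\right),
\qquad
p(\lambda_k) = \mathrm{G}\!\left(\lambda_k;\ \tfrac{\delta_k}{2}, \tfrac{\delta_k}{2}\right),
\qquad
p(\xi_k \mid u_k) = \mathrm{G}\!\left(\xi_k;\ \tfrac{u_k}{2}, \tfrac{u_k}{2}\right).
```

Conditioning on the auxiliary parameters λ_k and ξ_k turns the heavy-tailed likelihood into a Gaussian whose mean is shifted by λ_k β_k and whose covariance is scaled by 1/ξ_k, which is what makes the closed-form variational updates of step S3 possible.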
and S3, constructing a combined posterior probability density function based on the likelihood function distribution, and performing combined solution on the posterior combined probability density function by using a variational Bayes method to obtain a state vector of the mobile robot at the time k.
According to the prior probability density function p(X_k | Y_{k-1}) = N(X_k; X_{k|k-1}, P_{k|k-1}) and the likelihood function distribution in hierarchical Gaussian form, the joint posterior probability density function p(X_k, λ_k, ξ_k, u_k, β_k | Y_k) is obtained and factorized as:
p(X_k, λ_k, ξ_k, u_k, β_k | Y_k) ≈ q(X_k) q(λ_k) q(ξ_k) q(u_k) q(β_k); (1)
where q(X_k) represents the approximate probability density function of the state vector X_k of the mobile robot at time k; q(λ_k) that of the first auxiliary parameter λ_k; q(ξ_k) that of the second auxiliary parameter ξ_k; q(u_k) that of the first degree of freedom u_k; and q(β_k) that of the shape parameter β_k. Therefore, only q(X_k), q(λ_k), q(ξ_k), q(u_k), and q(β_k) need to be solved to obtain the finally fused state vector X_k of the robot at time k.
When the joint posterior probability density function is solved by the variational Bayes method, X_k, λ_k, ξ_k, u_k, and β_k satisfy:
log q(θ) = E_{Θ_k^(−θ)}[log p(X_k, λ_k, ξ_k, u_k, β_k, Y_k)] + c_θ; (2)
where the set Θ_k = {X_k, λ_k, ξ_k, u_k, β_k}; θ represents any element of the set Θ_k; Θ_k^(−θ) represents the subset of Θ_k containing all elements except θ; c_θ represents a constant independent of the element θ; and E[·] denotes the expected value.
In step S3, the step of jointly solving the joint posterior probability density function by the variational Bayes method to obtain the fused state vector of the mobile robot at time k includes:
step S31, parameter initialization:
Step S32: iteratively update the joint posterior probability density function by the variational Bayes method, solving the expected values of X_k, λ_k, ξ_k, u_k, and β_k after each iteration; when the iteration stop condition is reached, output the state vector X_k and covariance matrix P_{k|k} of the mobile robot at time k obtained in the last iteration.
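The iterate-until-convergence structure of step S32 can be sketched as a generic coordinate-ascent loop. The per-factor update functions are placeholders for the closed-form updates of steps S321–S325; all names are illustrative:

```python
import numpy as np

def vb_iterate(init_moments, update_factors, eps=1e-6, max_iter=50):
    """Coordinate-ascent variational inference loop.

    init_moments: dict of initial expected values (state, lambda, xi, u, beta).
    update_factors: function mapping the current moments to updated moments;
    it must return a dict containing the state estimate under key 'x'.
    Stops when the state estimate changes by no more than eps.
    """
    moments = dict(init_moments)
    for _ in range(max_iter):
        x_old = np.asarray(moments["x"], dtype=float)
        # one sweep over q(X_k), q(lambda_k), q(xi_k), q(u_k), q(beta_k)
        moments = update_factors(moments)
        if np.linalg.norm(np.asarray(moments["x"], dtype=float) - x_old) <= eps:
            break
    return moments
```

The convergence test mirrors step S327: iteration stops once the change in the state estimate falls below the preset threshold ε, and the last iterate is returned as the fused state.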
Further preferably, in step S32, for the (i + 1) th iteration:
Step S321: substituting Equation (1) into Equation (2) with θ = X_k, it can be deduced that q^(i+1)(X_k) obeys a Gaussian distribution: q^(i+1)(X_k) = N(X_k; X_{k|k}^(i+1), P_{k|k}^(i+1)), where q^(i+1)(X_k) represents the approximate probability density function of X_k in the (i+1)-th iteration; X_{k|k}^(i+1) represents the state vector of the mobile robot at time k in the (i+1)-th iteration; X_{k|k-1} the predicted state vector at time k; P_{k|k-1} the one-step prediction error covariance matrix at time k; K^(i+1) the Kalman gain of the (i+1)-th iteration; E^(i)[λ_k], E^(i)[β_k], and E^(i)[ξ_k] the expected values of λ_k, β_k, and ξ_k at the i-th iteration; and P_{k|k}^(i+1) the covariance matrix of the mobile robot at time k in the (i+1)-th iteration.
Step S322, substituting equation 1 into equation 2, let θ = λ k It is possible to obtain:
the above formula can analyze that q cannot be paired (i+1) (λ k ) Direct analysis is carried out, approximate point processing is carried out, and the optimal solution can be obtained as follows:
wherein, the first and the second end of the pipe are connected with each other,E (i) [ξ k ]is representative of xi k In the expected value of the ith iteration, tr { } represents the trace of the matrix; n is a radical of an alkyl radical 1 、n 2 、n 3 Respectively a third auxiliary parameter, a fourth auxiliary parameter and a fifth auxiliary parameter; e (i) [(Y k -H k X k )(Y k -H k X k ) T ]Is represented by (Y) k -H k X k )(Y k -H k X k ) T Expected value at the ith iteration;to representExpected value at the ith iteration; m represents an identity matrix I m The order of (a).
Step S323, substituting formula 1 into formula 2, and making theta = xi k It is possible to obtain:
from the above formula, q can be deduced (i+1) (ξ k ) Obey the gamma distribution, hence ξ k The approximate probability density function in the (i + 1) th iteration is:
q (i+1) (ξ k )=G(ξ k ;A (i+1) ,B (i+1) );
wherein the content of the first and second substances,A (i+1) represents the value of the sixth auxiliary parameter A in the (i + 1) th iteration, B (i+1) Represents the value of the seventh auxiliary parameter B in the (i + 1) th iteration, C (i+1) Denotes the value of the eighth auxiliary parameter C in the (i + 1) th iteration, E (i) [u k ]Represents u k Expected value at i-th iteration, E (i) [(Y k -H k X k -λ k β k )(Y k -H k X k -λ k β k ) T ]Is represented by (Y) k -H k X k -λ k β k )(Y k -H k X k -λ k β k ) T Expected value at the i-th iteration.
Step S324, substituting equation 1 into equation 2, and letting θ = u k It is possible to obtain:
from the above formula, q can be deduced (i+1) (u k ) Obeying the gamma distribution to obtain u k Approximate probability Density in the i +1 th iterationThe function is:
wherein, the first and the second end of the pipe are connected with each other,a k (0) denotes a k Is set to the initial value of (a),denotes a k Value in the (i + 1) th iteration, b k (0) Denotes b k Is set to the initial value of (a),denotes b k Value in the (i + 1) th iteration.
Step S325, substituting formula 1 into formula 2, let θ = β k It is possible to obtain:
from the above formula, q can be deduced (i+1) (β k ) Obeying Gaussian distribution to obtain beta k The approximate probability density function in the (i + 1) th iteration is:
wherein the content of the first and second substances, is represented by beta k The value in the (i + 1) th iteration,denotes the eighthAuxiliary parameter K β The value in the (i + 1) th iteration,denotes a ninth auxiliary parameter P β Value in the (i + 1) th iteration.
Step S326, obtaining expected values of each parameter in the (i + 1) th iteration, where the expected values may be used as input values of the next iteration or output values after the iteration stop condition is met, and each expected value is:
E (i+1) [ξ k ]=A (i+1) /B (i+1) ;
E (i+1) [u k ]=a (i+1) /b (i+1) ;
Step S327: judge whether the condition ‖X_{k|k}^(i+1) − X_{k|k}^(i)‖ ≤ ε is satisfied, where ε represents the preset iteration stop threshold. If so, output X_{k|k}^(i+1) as the fused state vector X_k of the mobile robot at time k and P_{k|k}^(i+1) as the covariance matrix P_{k|k} of the mobile robot at time k; if not, set i = i + 1 and return to execute steps S321 to S327.
This embodiment provides a gamma-distribution-based Student's t mixture Kalman filtering SLAM algorithm, aimed at the problem identified in the background art that Kalman filtering ignores the non-stationary characteristics of noise, which reduces the accuracy of the finally fused data. First, a new distribution, the Student's t mixture based on the gamma distribution (GaSTM), is proposed: the distribution has a location parameter, a shape parameter β_k, a scale matrix R_k, and two distinct degree-of-freedom parameters u_k and δ_k; it takes the gamma distribution as the mixing density function and performs stage-wise scale mixing of the Student's t distribution to cope with the non-stationary characteristics of the noise.
In this embodiment, the main procedure of the SLAM method can be divided into three parts. First, using the state vector X_{k-1|k-1} at time k-1 and the IMU data, predict the state vector X_{k|k-1} and the one-step prediction error covariance matrix P_{k|k-1} at time k. Second, model the observation noise v_k with the GaSTM and derive the likelihood function distribution in hierarchical Gaussian form using the two auxiliary parameters. Third, jointly solve the joint posterior probability density function by the variational Bayes method, obtaining the expected values of the state vector, the auxiliary parameters, the first degree of freedom, and the shape parameter by sequential iteration; this ensures the accuracy of the fusion of the IMU and vision sensor measurement data and improves the positioning accuracy of SLAM. The method introduces variational-Bayes-based iterative information updating into the fusion of IMU and vision sensor measurements, uses the GaSTM to model the observation noise so as to reduce the influence of non-stationary heavy-tailed noise on the accuracy of the fusion result, and uses variational Bayes to realize joint estimation of the mobile robot state, thereby improving the accuracy of the finally obtained state information.
In an application scenario of this embodiment, fig. 2 is a schematic diagram of the data fusion process: first, prediction of the information at the next time is completed from the IMU data and the information at the current time (i.e., the information at time k is predicted using the information at time k-1); then the observations are obtained from the vision sensor data and the observation noise is modeled with the GaSTM; finally, the solution of the final result is completed by variational Bayes.
Example 2
This embodiment, based on the same inventive concept as embodiment 1, discloses a computer-readable storage medium in which at least one instruction, at least one program, a code set, or an instruction set is stored, which is loaded and executed by a processor to implement the variational-Bayes-based mobile robot visual-inertial fusion SLAM method provided in embodiment 1.
Example 3
Based on the same inventive concept as embodiment 1, this embodiment discloses a mobile robot, which includes a mobile robot body, and a vision sensor, an IMU sensor, and a processor arranged on the mobile robot body, where the processor is connected with the vision sensor and the IMU sensor respectively, and the processor executes the steps of the variational-Bayes-based mobile robot visual-inertial fusion SLAM method provided in embodiment 1. The vision sensor is preferably, but not limited to, a camera.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.
Claims (10)
1. A mobile robot vision inertia fusion SLAM method based on variational Bayes is characterized by comprising the following steps:
s1, obtaining a predicted state vector and a one-step prediction error covariance matrix of the mobile robot at time k based on the state vector and covariance matrix of the mobile robot at time k-1, and the IMU data and system noise covariance matrix of the mobile robot at time k;
s2, establishing a model of the observation noise at time k conforming to the gamma-distribution-based Student's t mixture, and obtaining the likelihood function distribution in hierarchical Gaussian form based on the visual information, the observation noise model, and the predicted state vector of the mobile robot at time k;
and s3, constructing a joint posterior probability density function based on the likelihood function distribution, and jointly solving it by the variational Bayes method to obtain the state vector of the mobile robot at time k.
2. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 1, characterized in that in step S1, the predicted state vector of the mobile robot at time k is:
X_{k|k-1} = F_{k-1} X_{k-1|k-1}, where F_{k-1} denotes the state-transition matrix between time k-1 and time k obtained from the IMU data, and X_{k-1|k-1} denotes the state vector of the mobile robot at time k-1;
and the one-step prediction error covariance matrix of the mobile robot at time k is:
P_{k|k-1} = F_{k-1} P_{k-1|k-1} F_{k-1}^T + Q_{k-1}, where P_{k-1|k-1} denotes the covariance matrix of the mobile robot at time k-1 and Q_{k-1} denotes the system noise covariance matrix.
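As a reading aid outside the claim language, the prediction of claims 1–2 is the standard Kalman time update: propagate the state with the IMU-derived transition matrix and inflate the covariance by the process noise. A minimal NumPy sketch (the variable names F, Q, x, P and the tiny constant-velocity example are illustrative, not taken from the patent):

```python
import numpy as np

def predict(F, x_prev, P_prev, Q):
    """One-step Kalman prediction: X_{k|k-1} = F X_{k-1|k-1},
    P_{k|k-1} = F P_{k-1|k-1} F^T + Q."""
    x_pred = F @ x_prev
    P_pred = F @ P_prev @ F.T + Q
    return x_pred, P_pred

# Tiny 2-state constant-velocity example (dt = 1) for illustration
F = np.array([[1.0, 1.0], [0.0, 1.0]])
x, P = np.array([0.0, 1.0]), np.eye(2)
Q = 0.01 * np.eye(2)
x_pred, P_pred = predict(F, x, P, Q)
print(x_pred)  # [1. 1.]
```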
3. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 1 or 2, characterized in that in step S2, the observation noise model of the mobile robot at time k is:
where v_k denotes the observation noise of the mobile robot at time k, with a specified initial value; β_k denotes the shape parameter, whose distribution function is a Gaussian parameterized by a first constant and a second constant γ, with I_m denoting the m-order identity matrix; R_k denotes the scale matrix; u_k denotes the first degree of freedom, with distribution function p(u_k) = G(u_k; a_k, b_k), where a_k and b_k are the first and second real parameters of the Gamma distribution of u_k, respectively; δ_k denotes the second degree of freedom; λ_k denotes the first auxiliary parameter, with distribution function p(λ_k) = G(λ_k; δ_k/2, δ_k/2); ξ_k denotes the second auxiliary parameter, with distribution function p(ξ_k) = G(ξ_k; u_k/2, u_k/2); N denotes a Gaussian distribution; G denotes a Gamma distribution; and GaSTM denotes the Gamma-distribution-based Student's t mixture distribution.
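Again outside the claims: one plausible reading of the claim 3 hierarchy is a Gaussian scale mixture, in which the Gamma-distributed auxiliary variables λ_k and ξ_k modulate the mean and covariance of a conditional Gaussian, producing heavy tails. The exact conditional form in the patent is given by an equation image not reproduced in this text, so the N(λβ, R/ξ) conditional below is an assumption made for illustration only:

```python
import numpy as np

def sample_gastm(beta, R, u, delta, n, rng):
    """Sample n draws from an assumed Gamma-based Student's t mixture:
    lam ~ Gamma(delta/2, rate=delta/2), xi ~ Gamma(u/2, rate=u/2),
    v | lam, xi ~ N(lam * beta, R / xi).  Small xi inflates the
    covariance, which is what gives the mixture its heavy tails."""
    m = len(beta)
    # numpy's gamma takes (shape, scale); scale = 1/rate
    lam = rng.gamma(delta / 2, 2.0 / delta, size=n)
    xi = rng.gamma(u / 2, 2.0 / u, size=n)
    eps = rng.multivariate_normal(np.zeros(m), R, size=n)
    return lam[:, None] * beta + eps / np.sqrt(xi)[:, None]

rng = np.random.default_rng(0)
v = sample_gastm(np.zeros(2), np.eye(2), u=4.0, delta=4.0, n=1000, rng=rng)
print(v.shape)  # (1000, 2)
```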
4. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 3, wherein in step S2, the step of obtaining the likelihood function distribution in hierarchical Gaussian form based on the visual information, the observation noise model, and the predicted state vector of the mobile robot at time k comprises:
acquiring the visual state information Y_k of the mobile robot at time k based on the visual information at time k, where Y_k satisfies Y_k = H_k X_{k|k-1} + v_k, and H_k denotes the observation matrix describing the mapping between the visual state information Y_k and the predicted state vector X_{k|k-1};
establishing the likelihood function distribution as:
p(Y_k | X_{k|k-1}) = GaSTM(Y_k; H_k X_{k|k-1}, β_k, R_k, u_k, δ_k);
and converting the likelihood function distribution into hierarchical Gaussian form.
5. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 4, wherein in step S3, the joint posterior probability density function is:
p(X_k, λ_k, ξ_k, u_k, β_k | Y_k) ≈ q(X_k) q(λ_k) q(ξ_k) q(u_k) q(β_k);
where q(X_k) denotes the approximate probability density function of the state vector X_k of the mobile robot at time k; q(λ_k) denotes the approximate probability density function of the first auxiliary parameter λ_k; q(ξ_k) denotes the approximate probability density function of the second auxiliary parameter ξ_k; q(u_k) denotes the approximate probability density function of the first degree of freedom u_k; and q(β_k) denotes the approximate probability density function of the shape parameter β_k.
6. The method according to claim 5, wherein in step S3, when the joint posterior probability density function is jointly solved by the variational Bayes method, the optimal solution for each factor θ in {X_k, λ_k, ξ_k, u_k, β_k} satisfies the mean-field fixed point:
log q(θ) = E_{q(all factors except θ)}[log p(X_k, λ_k, ξ_k, u_k, β_k, Y_k)] + c_θ,
where the expectation is taken over all factors other than θ and c_θ is a normalizing constant.
7. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 5 or 6, wherein in step S3, the step of jointly solving the joint posterior probability density function by the variational Bayes method to obtain the fusion state vector of the mobile robot at time k comprises:
step S31, initializing the parameters;
step S32, iteratively updating the joint posterior probability density function by the variational Bayes method, solving the expected values of X_k, λ_k, ξ_k, u_k, and β_k after each iteration, and, once the iteration stop condition is reached, outputting the state vector X_k and covariance matrix P_{k|k} of the mobile robot at time k obtained from the last iteration.
8. The variational Bayes-based mobile robot visual-inertial fusion SLAM method according to claim 7, wherein in step S32, for the (i+1)-th iteration:
step S321, solving, based on the variational Bayes method, the approximate probability density function of X_k in the (i+1)-th iteration, a Gaussian whose mean X_k^{(i+1)} and covariance P_{k|k}^{(i+1)} are obtained by a Kalman-type correction;
where X_k^{(i+1)} denotes the state vector of the mobile robot at time k in the (i+1)-th iteration, X_{k|k-1} denotes the predicted state vector of the mobile robot at time k, P_{k|k-1} denotes the one-step prediction error covariance matrix at time k, K^{(i+1)} denotes the Kalman gain in the (i+1)-th iteration, E^{(i)}[λ_k], E^{(i)}[β_k], and E^{(i)}[ξ_k] denote the expected values of λ_k, β_k, and ξ_k at the i-th iteration, respectively, and P_{k|k}^{(i+1)} denotes the covariance matrix of the mobile robot at time k in the (i+1)-th iteration;
step S322, calculating, based on the variational Bayes method, the optimal solution of λ_k in the (i+1)-th iteration;
where E^{(i)}[ξ_k] denotes the expected value of ξ_k at the i-th iteration; tr{·} denotes the trace of a matrix; n_1, n_2, and n_3 denote the third, fourth, and fifth auxiliary parameters, respectively; E^{(i)}[(Y_k - H_k X_k)(Y_k - H_k X_k)^T] denotes the expected value of (Y_k - H_k X_k)(Y_k - H_k X_k)^T at the i-th iteration; and m denotes the order of the identity matrix I_m;
step S323, solving, based on the variational Bayes method, the approximate probability density function of ξ_k in the (i+1)-th iteration as:
q^{(i+1)}(ξ_k) = G(ξ_k; A^{(i+1)}, B^{(i+1)});
where A^{(i+1)}, B^{(i+1)}, and C^{(i+1)} denote the values of the sixth auxiliary parameter A, the seventh auxiliary parameter B, and the eighth auxiliary parameter C in the (i+1)-th iteration, respectively; E^{(i)}[u_k] denotes the expected value of u_k at the i-th iteration; and E^{(i)}[(Y_k - H_k X_k - λ_k β_k)(Y_k - H_k X_k - λ_k β_k)^T] denotes the expected value of (Y_k - H_k X_k - λ_k β_k)(Y_k - H_k X_k - λ_k β_k)^T at the i-th iteration;
step S324, solving, based on the variational Bayes method, the approximate probability density function of u_k in the (i+1)-th iteration as:
q^{(i+1)}(u_k) = G(u_k; a_k^{(i+1)}, b_k^{(i+1)});
where a_k^{(0)} denotes the initial value of a_k, a_k^{(i+1)} denotes the value of a_k in the (i+1)-th iteration, b_k^{(0)} denotes the initial value of b_k, and b_k^{(i+1)} denotes the value of b_k in the (i+1)-th iteration;
step S325, solving, based on the variational Bayes method, the approximate probability density function of β_k in the (i+1)-th iteration;
where β_k^{(i+1)} denotes the value of β_k in the (i+1)-th iteration, K_β^{(i+1)} denotes the value of the auxiliary parameter K_β in the (i+1)-th iteration, and P_β^{(i+1)} denotes the value of the auxiliary parameter P_β in the (i+1)-th iteration;
step S326, obtaining the expected values:
E^{(i+1)}[ξ_k] = A^{(i+1)} / B^{(i+1)};
E^{(i+1)}[u_k] = a_k^{(i+1)} / b_k^{(i+1)};
step S327, determining whether the stopping condition ||X_k^{(i+1)} - X_k^{(i)}|| ≤ ε is satisfied, where ε denotes the iteration stop threshold; if so, outputting X_k^{(i+1)} as the fusion state vector X_k of the mobile robot at time k and outputting P_{k|k}^{(i+1)} as the covariance matrix P_{k|k} of the mobile robot at time k; if not, letting i = i + 1 and returning to steps S321 to S327.
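Outside the claim language, the iteration of claims 7–8 is a mean-field coordinate ascent: each pass refreshes one approximate posterior and the expectations the others depend on, and iteration stops once the state estimate moves less than ε. Since the closed-form update equations of steps S321–S326 appear only as images in this text, the sketch below shows the control flow only: a plain Kalman correction stands in for the patent's reweighted update, and the commented S322–S326 hook is a placeholder, not the patented formulas:

```python
import numpy as np

def vb_update(x_pred, P_pred, H, Y, R, eps=1e-6, max_iter=50):
    """Skeleton of steps S321-S327: alternate the q(X_k) update with the
    auxiliary expectations until ||x_new - x_old|| <= eps."""
    x = x_pred.copy()
    for _ in range(max_iter):
        # S321: Gaussian update of q(X_k) (stand-in for the reweighted form,
        # where R would be scaled by the current E[lambda], E[xi], E[beta])
        S = H @ P_pred @ H.T + R
        K = P_pred @ H.T @ np.linalg.inv(S)
        x_new = x_pred + K @ (Y - H @ x_pred)
        # S322-S326: the expectations E[lam], E[xi], E[u], E[beta] would be
        # refreshed here from the closed-form Gamma/Gaussian updates.
        # S327: stop once the state estimate has converged
        if np.linalg.norm(x_new - x) <= eps:
            x = x_new
            break
        x = x_new
    P = (np.eye(len(x)) - K @ H) @ P_pred
    return x, P

x, P = vb_update(np.zeros(2), np.eye(2), np.eye(2),
                 np.array([1.0, 2.0]), 0.1 * np.eye(2))
print(x)
```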
9. A computer-readable storage medium having stored therein at least one instruction, at least one program, a code set, or an instruction set, which is loaded and executed by a processor to implement the variational Bayes-based mobile robot visual-inertial fusion SLAM method according to any one of claims 1 to 8.
10. A mobile robot, comprising a mobile robot body, and a vision sensor, an IMU sensor, and a processor arranged on the mobile robot body, wherein the processor is connected to the vision sensor and the IMU sensor respectively, and the processor executes the steps of the variational Bayes-based mobile robot visual-inertial fusion SLAM method according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211125734.3A CN115388899A (en) | 2022-09-15 | 2022-09-15 | Mobile robot vision inertia fusion SLAM method based on variational Bayes |
Publications (1)
Publication Number | Publication Date |
---|---|
CN115388899A true CN115388899A (en) | 2022-11-25 |
Family
ID=84127279
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211125734.3A Pending CN115388899A (en) | 2022-09-15 | 2022-09-15 | Mobile robot vision inertia fusion SLAM method based on variational Bayes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN115388899A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115930971A (en) * | 2023-02-01 | 2023-04-07 | 七腾机器人有限公司 | Data fusion processing method for robot positioning and mapping |
CN115930971B (en) * | 2023-02-01 | 2023-09-19 | 七腾机器人有限公司 | Data fusion processing method for robot positioning and map building |
Legal Events

Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |