CN109443354B - Visual-inertial tight coupling combined navigation method based on firefly group optimized PF - Google Patents

Visual-inertial tight coupling combined navigation method based on firefly group optimized PF

Info

Publication number
CN109443354B
CN109443354B (application CN201811590654.9A)
Authority
CN
China
Prior art keywords
navigation system
inertial
representing
visual
iteration
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811590654.9A
Other languages
Chinese (zh)
Other versions
CN109443354A (en)
Inventor
李秀源
高文学
张加书
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
North University of China
Original Assignee
North University of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by North University of China filed Critical North University of China
Priority to CN201811590654.9A priority Critical patent/CN109443354B/en
Publication of CN109443354A publication Critical patent/CN109443354A/en
Application granted granted Critical
Publication of CN109443354B publication Critical patent/CN109443354B/en

Classifications

    • G01C21/18: Navigation by integrating acceleration or speed (dead reckoning, inertial navigation) executed aboard the object being navigated; stabilised platforms, e.g. by gyroscope
    • G01C21/165: Inertial navigation combined with non-inertial navigation instruments
    • G01C21/20: Instruments for performing navigational calculations
    (all under G: Physics; G01: Measuring, testing; G01C: Measuring distances, levels or bearings; surveying; navigation; gyroscopic instruments; photogrammetry or videogrammetry)

Abstract

The invention relates to a visual-inertial tight coupling combined navigation method, in particular to a visual-inertial tight coupling combined navigation method based on firefly group optimized PF. The invention improves the real-time performance and navigation precision of the visual-inertial tight coupling integrated navigation method. A visual-inertial tight coupling combined navigation method based on firefly group optimized PF is realized by adopting the following steps: step S1: a strapdown inertial navigation system and a binocular vision odometer are arranged on the carrier; step S2: establishing a linear state equation; establishing a nonlinear measurement equation; step S3: performing nonlinear filtering on the visual-inertial tightly-coupled integrated navigation system by using the firefly group optimization-based PF to realize data fusion of the visual-inertial tightly-coupled integrated navigation system; step S4: and correcting the calculation result of the strapdown inertial navigation system according to the nonlinear filtering result of the step S3. The invention is suitable for vision-inertia tight coupling combined navigation.

Description

Visual-inertial tight coupling combined navigation method based on firefly group optimized PF
Technical Field
The invention relates to a visual-inertial tight coupling combined navigation method, in particular to a visual-inertial tight coupling combined navigation method based on firefly group optimized PF.
Background
In recent years, individual navigation systems based on various principles have been developed and their performance steadily improved. However, no single navigation device or sub-system can fully meet ever-increasing navigation requirements, so integrated navigation technology, which realizes complementary advantages among sub-systems, is finding expanding application and receiving more and more attention.
The strapdown inertial navigation system has the advantages of low cost, small volume, full autonomy, good concealment, and high sampling frequency, but its error diverges over time. The visual odometer based on a visual sensor is a newly emerging navigation device with the advantages of low price, low energy consumption, and rich information content, so it has rapidly gained wide attention and application. Because the advantages of the strapdown inertial navigation system and the vision-based visual odometer are complementary, their combination effectively overcomes the long-term divergence of the strapdown inertial navigation system and improves the precision of the whole navigation system; integrated navigation based on the strapdown inertial navigation system/visual odometer is therefore an important development direction in the integrated navigation field. The combined navigation of the strapdown inertial navigation system/visual odometer is divided into loose coupling and tight coupling. In loose coupling, the strapdown inertial navigation system and the visual odometer are coupled after each completes its own motion calculation; the computational load of the combined navigation filter is small, but the original image information of the visual odometer is not utilized. Tight coupling couples the original image-matching coordinate information of the visual odometer with the strapdown inertial navigation system and can obtain a combined navigation result of higher precision than loose coupling. The invention belongs to visual-inertial tight coupling combined navigation.
Because the observation equation of visual-inertial tightly coupled combined navigation is strongly nonlinear, the efficient realization of high-precision nonlinear combined filtering is one of its key technologies. The particle filter (PF) can optimize its importance density function through an iterated cubature Kalman filter, thereby obtaining tightly coupled combined filtering results of higher accuracy; however, because the iterated cubature Kalman filter computes cubature points and weighted sums in both the time-update and measurement-update stages, its computational load is large and its real-time performance poor. Meanwhile, particle filtering suffers from particle depletion caused by resampling and requires a large number of particles for state estimation. Aiming at these defects, and exploiting the fact that the state equation of visual-inertial tightly coupled combined navigation is linear while the observation equation is strongly nonlinear, the invention provides a complete technical scheme for realizing visual-inertial tightly coupled combined navigation. In the time-update stage of the iterated cubature Kalman filter, the mean and variance of the state quantity are updated by a linear KF method instead of the cubature Kalman filter method, which avoids the cubature-point calculation and weighted summation and reduces the computational load. In the measurement-update stage, measurement iteration is realized by the Levenberg-Marquardt method, yielding an optimized importance density function. Finally, a firefly-swarm-optimized PF, which reduces the number of required particles at equivalent precision, is adopted to realize the tightly coupled combined navigation, alleviating to a certain extent the particle-depletion and heavy-computation problems of the PF. The invention thus better solves the nonlinear filtering problem of the vision-inertia tightly coupled integrated navigation system, improving its real-time performance and, at the same time, its navigation precision to a certain extent.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a visual-inertial tightly coupled integrated navigation method based on a firefly-swarm-optimized PF (particle filter), improving the real-time performance and navigation accuracy of visual-inertial tightly coupled integrated navigation.
The invention is realized by adopting the following technical scheme:
a visual-inertial tight coupling combined navigation method based on firefly group optimized PF is realized by adopting the following steps:
Step S1: a strapdown inertial navigation system and a binocular vision odometer are mounted on the carrier and together form the vision-inertia tightly coupled integrated navigation system; the strapdown inertial navigation system calculates the nominal motion information of the carrier from its own acquired data; feature matching is performed on the image sequence acquired by the binocular vision odometer using the SURF algorithm, and the pixel coordinates of the matching points between two consecutive image frames are solved;
Step S2: a linear state equation of the vision-inertia tightly coupled integrated navigation system is established according to the error characteristics of the strapdown inertial navigation system; a nonlinear measurement equation of the system is established by taking the pixel coordinates of the matching points of two consecutive image frames as the measurement value of the system;
Step S3: nonlinear filtering is performed on the vision-inertia tightly coupled integrated navigation system using the firefly-swarm-optimized PF to realize data fusion of the system;
Step S4: the solution result of the strapdown inertial navigation system is corrected according to the nonlinear filtering result of step S3, thereby obtaining the navigation result of the vision-inertia tightly coupled integrated navigation system.
The invention has the following beneficial effects. First, because the cubature Kalman filter method is not adopted in the time-update stage of the iterated cubature Kalman filter and the update is instead completed with a linear KF method, exploiting the linearity of the state equation of the vision-inertia tightly coupled integrated navigation system, the computational load is reduced; meanwhile, in the particle filtering, the firefly algorithm is adopted to iteratively optimize the particles, so the required number of particles is reduced while the filtering precision remains equivalent, further reducing the computational load and improving the real-time performance of the vision-inertia tightly coupled integrated navigation system. Second, the linear KF state-update process adopted in the time-update stage of the iterated cubature Kalman filter is an optimal estimate, whereas the standard cubature Kalman filter state update is suboptimal, so the method obtains an importance density function of higher precision and more reasonable distribution, thereby effectively improving the navigation precision of the vision-inertia tightly coupled combined navigation system.
The effect of the invention is verified by the following experiment:
The performance of the present invention was verified using open-source vehicle-mounted experimental data from the KITTI benchmark of the Karlsruhe Institute of Technology and the Toyota Technological Institute. In the KITTI experiments, an autonomous vehicle named Annieway was used as the platform, with various sensors installed to collect data on the surrounding environment: the gyro drift was 36°/h, the accelerometer drift was 1.0204 × 10⁻³ g, the data acquisition frequency of the strapdown inertial navigation system was 100 Hz, the resolution of the binocular vision odometer was 1226 × 370 pixels, the baseline distance was 54 cm, the mounting height was 1.65 m, and the image acquisition frequency was 10 Hz. FIG. 2 shows the true motion track of the carrier in the experiment. The data were processed with both the traditional method and the present invention to realize navigation positioning of the vision-inertia tightly coupled integrated navigation system; the results are shown in FIGS. 3-4. The particle numbers of the traditional method and of the present invention were 2500 and 600, respectively. As FIGS. 3-4 show, compared with the traditional method, the motion track obtained by the invention is closer to the true motion track of the carrier: the maximum positioning error of the invention is about 15.52 m, while that of the traditional method is about 18.29 m. The peaks of the position error in FIGS. 3-4 are caused by transient loss of the GPS signal that provides the true track information and are not true position error information; the stationary-phase position error information is the true tightly coupled combined navigation position error information. In order to eliminate the influence of different computers on the filtering time, the experiment normalized the average single-step filtering time against that of the traditional method as the reference. As can be seen from FIG. 5, the relative average single-step filtering time of the present invention is reduced to 31.22% of that of the traditional method. In conclusion, the invention effectively improves navigation real-time performance.
The invention improves the real-time performance and navigation precision of the vision-inertia tight coupling integrated navigation method, and is suitable for the vision-inertia tight coupling integrated navigation.
Drawings
FIG. 1 is a schematic flow chart of steps S3-S4 according to the present invention.
FIG. 2 shows the true trajectory of the vehicle in the experiment.
FIG. 3 is a schematic diagram of X-direction position error comparison of the present invention and conventional methods in an experiment.
FIG. 4 is a schematic diagram showing the comparison of the Y-direction position errors of the present invention and the conventional method in the experiment.
FIG. 5 is a graph showing a comparison of the relative average single-step filtering times of the present invention and the conventional method during the experiment.
Detailed Description
A visual-inertial tight coupling combined navigation method based on firefly group optimized PF is realized by adopting the following steps:
Step S1: a strapdown inertial navigation system and a binocular vision odometer are mounted on the carrier and together form the vision-inertia tightly coupled integrated navigation system; the strapdown inertial navigation system calculates the nominal motion information of the carrier from its own acquired data; feature matching is performed on the image sequence acquired by the binocular vision odometer using the SURF algorithm, and the pixel coordinates of the matching points between two consecutive image frames are solved;
Step S2: a linear state equation of the vision-inertia tightly coupled integrated navigation system is established according to the error characteristics of the strapdown inertial navigation system; a nonlinear measurement equation of the system is established by taking the pixel coordinates of the matching points of two consecutive image frames as the measurement value of the system;
Step S3: nonlinear filtering is performed on the vision-inertia tightly coupled integrated navigation system using the firefly-swarm-optimized PF to realize data fusion of the system;
Step S4: the solution result of the strapdown inertial navigation system is corrected according to the nonlinear filtering result of step S3, thereby obtaining the navigation result of the vision-inertia tightly coupled integrated navigation system.
The step S3 includes the following steps:
Step S3.1: from the prior distribution p(x_0), extract M initialized state particles A_{j,0} (j = 1, 2, ..., M), and set the initial value w_{j,0} of the weight of the j-th particle; the setting formula is as follows:
w_{j,0} = 1/M;
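As an illustration of step S3.1, the following minimal sketch (not the patented implementation; the prior p(x_0) is assumed Gaussian here, and the function name is hypothetical) draws the particles and sets the uniform initial weights:

```python
import numpy as np

def init_particles(x0_mean, P0, M, rng=None):
    """Step S3.1 (sketch): draw M state particles from an assumed Gaussian
    prior p(x0) and set uniform initial weights w_{j,0} = 1/M."""
    rng = np.random.default_rng() if rng is None else rng
    particles = rng.multivariate_normal(x0_mean, P0, size=M)  # shape (M, D)
    weights = np.full(M, 1.0 / M)
    return particles, weights
```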
Step S3.2: perform iterative cubature Kalman filtering on each particle j to obtain an optimized importance density function:
Step S3.2.1: for the j-th particle, calculate the time-updated combined navigation filter state quantity mean x̂_{j,k/k-1} and the time-updated combined navigation filter state quantity variance P_{j,k/k-1}; the calculation formulas are as follows:
x̂_{j,k/k-1} = Φ_{k/k-1} x̂_{j,k-1};
P_{j,k/k-1} = Φ_{k/k-1} P_{j,k-1} Φ_{k/k-1}^T + Q_k;
in the formulas: Φ_{k/k-1} represents the discretized state transition matrix of the visual-inertial tightly-coupled integrated navigation system; Q_k represents the covariance matrix of the process noise of the visual-inertial tightly-coupled integrated navigation system;
Step S3.2.2: for the j-th particle, set the iteration initial value x̂_{j,k}^{(0)} of the combined navigation filter state quantity mean and the iteration initial value P_{j,k}^{(0)} of the combined navigation filter state quantity variance for the iteration strategy based on the LM algorithm; the setting formulas are as follows:
x̂_{j,k}^{(0)} = x̂_{j,k/k-1};
P_{j,k}^{(0)} = (P_{j,k/k-1}^{-1} + μI)^{-1};
in the formulas: I represents an identity matrix; μ represents an adjustable parameter;
Step S3.2.3: for the j-th particle, calculate the filter gain K_{j,k}^{(i)} of the iteration strategy based on the LM algorithm at the i-th iteration and the combined navigation filter state quantity mean x̂_{j,k}^{(i)} at the i-th iteration; the calculation formulas are as follows:
[the three calculation formulas are rendered only as images in the source];
in the formulas: f(·) represents the nonlinear mapping relation generated by the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system; R_k represents the covariance matrix of the measurement noise of the visual-inertial tightly-coupled integrated navigation system; z_k represents the measurement value of the visual-inertial tightly-coupled integrated navigation system;
Step S3.2.4: for the j-th particle, when the number of iterations of the iteration strategy based on the LM algorithm reaches N, the iteration is terminated, and the iteration final value x̂_{j,k}^{(N)} of the combined navigation filter state quantity mean and the iteration final value P_{j,k}^{(N)} of the combined navigation filter state quantity variance are set, thereby obtaining the optimized normal distribution N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)}); the setting formulas are as follows:
[the setting formulas are rendered only as images in the source];
in the formulas: x̂_{j,k}^{(N)} represents the combined navigation filter state quantity mean at the N-th iteration of the iteration strategy based on the LM algorithm;
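Since the exact LM formulas of steps S3.2.2-S3.2.4 survive only as images in the source, the following sketch shows one common Levenberg-Marquardt-damped iterated measurement update, using a numerical Jacobian of the measurement function f(·) as a stand-in for the cubature-point statistics; all names and the damping form are assumptions:

```python
import numpy as np

def lm_iterated_update(x_pred, P_pred, zk, f, Rk, mu=1e-2, N=5, eps=1e-6):
    """Sketch of an LM-damped iterated measurement update (steps
    S3.2.2-S3.2.4). Returns the mean and covariance of the optimized
    importance density N(x_N, P_N)."""
    D, m = x_pred.size, zk.size
    # assumed LM initialization: P^(0) = (P^-1 + mu*I)^-1
    P0 = np.linalg.inv(np.linalg.inv(P_pred) + mu * np.eye(D))
    x_i = x_pred.copy()
    for _ in range(N):
        # numerical Jacobian of the measurement function at the iterate
        F = np.empty((m, D))
        for p in range(D):
            dx = np.zeros(D)
            dx[p] = eps
            F[:, p] = (f(x_i + dx) - f(x_i - dx)) / (2.0 * eps)
        S = F @ P0 @ F.T + Rk                # innovation covariance
        K = P0 @ F.T @ np.linalg.inv(S)      # filter gain K^(i)
        x_i = x_pred + K @ (zk - f(x_i) - F @ (x_pred - x_i))
    P_N = (np.eye(D) - K @ F) @ P0           # final iterated covariance
    return x_i, P_N
```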
Step S3.3: for each particle j, perform importance sampling using the optimized normal distribution N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)}):
A_{j,k} ~ N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)});
in the formula: A_{j,k} represents the position of the j-th particle;
Step S3.4: calculate the weight w̃_{j,k} of each particle j; the calculation formula is as follows:
w̃_{j,k} = w_{j,k-1} · p(z_k|A_{j,k}) p(A_{j,k}|A_{j,k-1}) / q(A_{j,k}|A_{j,k-1}, z_k);
in the formula: p(z_k|A_{j,k}) represents the likelihood function; p(A_{j,k}|A_{j,k-1}) represents the prior probability density; q(A_{j,k}|A_{j,k-1}, z_k) represents the importance density function;
Step S3.5: regard each particle as a firefly in the firefly swarm optimization algorithm and the weight of each particle as the firefly brightness; obtain the global optimum particle by sorting the firefly brightnesses to find the brightness optimum; then calculate the attraction β_{j,k}(κ) between each particle j and the global optimum particle at the κ-th iteration of the firefly swarm optimization algorithm, and the Cartesian distance r_{j,k}(κ) between the j-th particle and the global optimum particle at the κ-th iteration; the calculation formulas are as follows:
β_{j,k}(κ) = β_0 e^{-γ r_{j,k}^2(κ)};
r_{j,k}(κ) = ||A_{j,k}(κ) - A_{max,k}(κ)|| = sqrt( Σ_{p=1}^{D} (A_{j,k,(p)}(κ) - A_{max,k,(p)}(κ))^2 );
[the formula for the attenuation factor τ(κ) is rendered only as an image in the source];
in the formulas: β_0 represents the maximum attraction and takes the constant value 1; γ represents the absorption rate of light in air, with γ ∈ [0.1, 10]; τ(κ) represents the attenuation factor of the firefly swarm optimization algorithm at the κ-th iteration; ψ represents a set constant; A_{j,k}(κ) represents the position of the j-th particle at the κ-th iteration of the firefly swarm optimization algorithm; A_{max,k}(κ) represents the position of the global optimum particle at the κ-th iteration; A_{j,k,(p)}(κ) represents the p-th component of the spatial coordinates of the j-th particle at the κ-th iteration; A_{max,k,(p)}(κ) represents the p-th component of the spatial coordinates of the global optimum particle at the κ-th iteration; D represents the state dimension of the visual-inertial tightly-coupled integrated navigation system;
Step S3.6: update the position of each particle j; the update formula is as follows:
A_{j,k}(κ+1) = e^{-u(κ)/u(κ-1)} A_{j,k}(κ) + β_{j,k}(κ)(A_{j,k}(κ) - A_{max,k}(κ)) + α(rand - 0.5);
[the formula defining u(κ) is rendered only as an image in the source];
wherein: α represents a random step size, with α ∈ [0, 1]; rand represents a random number between 0 and 1; w(A_{j,k}(κ)) represents the weight of the j-th particle at the κ-th iteration of the firefly swarm optimization algorithm; w(A_{max,k}(κ)) represents the weight of the global optimum particle at the κ-th iteration of the firefly swarm optimization algorithm;
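A sketch of one firefly iteration over the whole particle set (steps S3.5-S3.6) follows. It uses the standard firefly algorithm, moving each particle toward the brightest one; the patent's damping term e^{-u(κ)/u(κ-1)} is omitted because its u(κ) definition survives only as an image:

```python
import numpy as np

def firefly_move(A, w, beta0=1.0, gamma=1.0, alpha=0.2, rng=None):
    """Steps S3.5-S3.6 (sketch): brightness = particle weight; every
    particle is attracted toward the brightest (global optimum) one."""
    rng = np.random.default_rng() if rng is None else rng
    j_max = int(np.argmax(w))                # brightest particle
    A_max = A[j_max]
    r2 = np.sum((A - A_max) ** 2, axis=1)    # squared Cartesian distances
    beta = beta0 * np.exp(-gamma * r2)       # attraction beta_{j,k}
    A_new = A + beta[:, None] * (A_max - A) + alpha * (rng.random(A.shape) - 0.5)
    A_new[j_max] = A_max                     # keep the best particle fixed
    return A_new
```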
Step S3.7: judge whether the iteration termination condition of the firefly swarm optimization algorithm is reached: when the weight w(A_{max,k}(κ)) of the global optimum particle at the κ-th iteration reaches a set threshold, or the number of iterations of the firefly swarm optimization algorithm reaches the set maximum number of iterations, the iteration of the firefly swarm optimization algorithm is terminated, and the position of the j-th particle at that moment is taken as the optimized position A*_{j,k} of the j-th particle; otherwise, return to step S3.5 to continue the optimization iteration of the firefly swarm optimization algorithm;
Step S3.8: calculate the compensated weight and the normalized weight of each particle: according to the optimized position A*_{j,k} of the j-th particle, calculate the compensated weight w*_{j,k} of the j-th particle and the normalized weight w_{j,k} of the j-th particle; the calculation formulas are as follows:
w*_{j,k} = w_{j,k-1} · p(z_k|A*_{j,k}) p(A*_{j,k}|A_{j,k-1}) / q(A*_{j,k}|A_{j,k-1}, z_k);
w_{j,k} = w*_{j,k} / Σ_{j=1}^{M} w*_{j,k};
in the formulas: q(·) represents the importance function; p(z_k|A*_{j,k}) represents the likelihood function;
Step S3.9: calculate the combined navigation filter state quantity mean x̂_k according to the optimized position and the normalized weight of each particle; the calculation formula is as follows:
x̂_k = Σ_{j=1}^{M} w_{j,k} A*_{j,k}.
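After the firefly moves, steps S3.8-S3.9 reduce to re-normalizing the compensated weights and taking the weighted mean; a sketch:

```python
import numpy as np

def estimate_state(A_opt, w_comp):
    """Steps S3.8-S3.9 (sketch): normalize the compensated weights and take
    the weighted mean of the optimized particle positions as x̂_k."""
    w = w_comp / np.sum(w_comp)              # normalized weights w_{j,k}
    x_hat = w @ A_opt                        # sum_j w_{j,k} * A*_{j,k}
    return x_hat, w
```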
the step S4 includes the following steps:
step S4.1: from calculated mean values of combined navigation filter state quantities
Figure BDA0001920173830000096
Error estimation of strapdown inertial navigation system obtained by intermediate extraction
Figure BDA0001920173830000097
Step S4.2: error estimation from strapdown inertial navigation system
Figure BDA0001920173830000098
Resolving result of strapdown inertial navigation system
Figure BDA0001920173830000099
Correcting to obtain navigation result T of the visual-inertial tightly-coupled integrated navigation systemk(ii) a The correction formula is as follows:
Figure BDA00019201738300000910
Figure BDA00019201738300000911
in the formula:
Figure BDA00019201738300000912
representing an angle error of the strapdown inertial navigation system; vE、VN、VURepresenting a speed error of the strapdown inertial navigation system; rE、RN、RUAnd representing the position error of the strapdown inertial navigation system.
In step S2, the process of establishing the linear state equation of the visual-inertial tightly-coupled integrated navigation system is as follows:
First, the state quantity X of the combined navigation filter is defined as:
X = [φ_E, φ_N, φ_U, V_E, V_N, V_U, R_E, R_N, R_U, ε_bx, ε_by, ε_bz, ζ_x, ζ_y, ζ_z, p_x, p_y, p_z, θ_x, θ_y, θ_z]^T;
in the formula: φ_E, φ_N, φ_U represent the angle errors of the strapdown inertial navigation system; V_E, V_N, V_U represent the velocity errors of the strapdown inertial navigation system; R_E, R_N, R_U represent the position errors of the strapdown inertial navigation system; ε_bx, ε_by, ε_bz represent the random drift errors of the gyroscope along the three directions of the carrier coordinate system; ζ_x, ζ_y, ζ_z represent the random drift errors of the accelerometer; p_x, p_y, p_z represent the nominal position errors during the acquisition of the previous image; θ_x, θ_y, θ_z represent the nominal angle errors during the acquisition of the previous image; E, N, U denote the three directions of the geographical coordinate system;
Then, according to the error characteristics of the strapdown inertial navigation system, the linear state equation of the visual-inertial tightly-coupled integrated navigation system is obtained:
Ẋ(t) = F(t)X(t) + G(t)W(t);
[the block definitions of F(t) and G(t) are rendered only as images in the source];
W(t) = [w_x, w_y, w_z, w_ζx, w_ζy, w_ζz]^T;
in the formula: F(t) represents the state transition matrix of the combined navigation filter; G(t) represents the noise transformation matrix of the combined navigation filter; W(t) represents the noise vector of the combined navigation filter; F_N(t) represents the classical 9×9 inertial navigation system error matrix; F_S(t) represents the transformation matrix between the navigation parameters and the inertial element errors; C_b^n represents the attitude matrix; w_x, w_y, w_z represent the noise terms of the gyroscope; w_ζx, w_ζy, w_ζz represent the noise terms of the accelerometer.
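For use in the discrete-time filter of step S3, the continuous model Ẋ(t) = F(t)X(t) + G(t)W(t) must be discretized into Φ_{k/k-1} and Q_k; a common first-order approximation (a sketch, not the patent's prescription):

```python
import numpy as np

def discretize(F, G, W_psd, dt):
    """First-order discretization of dX/dt = F X + G W:
    Phi ~ I + F*dt,  Q_k ~ G W_psd G^T dt (a common approximation)."""
    n = F.shape[0]
    Phi = np.eye(n) + F * dt
    Qk = G @ W_psd @ G.T * dt
    return Phi, Qk
```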
In step S2, the process of establishing the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is as follows:
First, according to the state quantity X of the combined navigation filter and the nominal state X_n of the strapdown inertial navigation system, calculate the rotation R_R of the right camera at the later moment relative to the previous moment, the rotation R_L of the left camera at the later moment relative to the previous moment, the displacement t_R of the right camera at the later moment relative to the previous moment, and the displacement t_L of the left camera at the later moment relative to the previous moment; [the four calculation formulas are rendered only as images in the source];
in the formulas: the true state information of the strapdown inertial navigation system is obtained by applying the state quantity of the combined navigation filter to the nominal state of the strapdown inertial navigation system; R and t respectively represent the functional relations that obtain the rotation amount and the displacement amount from the true state information of the strapdown inertial navigation system;
Then, the coordinate system of the right camera at the previous moment is aligned with the world coordinate system, i.e. the projection matrix P_{R,k} of the right camera at the previous moment is:
P_{R,k} = K_R [I | 0];
in the formula: K_R represents the calibration matrix of the right camera;
The relative pose [R_C | t_C] of the left camera at the previous moment is obtained by calibration, so the projection matrix P_{L,k} of the left camera at the previous moment is expressed as:
P_{L,k} = K_L [R_C | t_C];
in the formula: K_L represents the calibration matrix of the left camera; R_C represents the rotation of the left camera relative to the right camera; t_C represents the displacement of the left camera relative to the right camera;
The projection matrix P_{R,k+1} of the right camera at the later moment is expressed as:
P_{R,k+1} = K_R [R_R | t_R];
The projection matrix P_{L,k+1} of the left camera at the later moment is expressed as:
P_{L,k+1} = K_L [R_L | t_L];
Then, calculate the trifocal tensor T_R of the two consecutive image frames acquired by the right camera and the trifocal tensor T_L of the two consecutive image frames acquired by the left camera; the calculation formulas are as follows:
T_R = T(K_R, K_L, R_C, t_C, R_R, t_R);
T_L = T(K_R, K_L, R_C, t_C, R_L, t_L);
in the formulas: T represents the calculation formula of the trifocal tensor;
Then, according to the nonlinear transfer function H of the trifocal tensor, the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is obtained:
x_{R,k+1} = H(T_R, x_{R,k}, x_{L,k});
x_{L,k+1} = H(T_L, x_{R,k}, x_{L,k});
in the formulas: x_{R,k+1} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the right camera at the later moment; x_{L,k+1} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the left camera at the later moment; x_{R,k} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the right camera at the previous moment; x_{L,k} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the left camera at the previous moment;
Then, letting x_{k+1} = [x_{R,k+1}, x_{L,k+1}]^T, the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is expressed as:
x_{k+1} = [H(T_R, x_{R,k}, x_{L,k}), H(T_L, x_{R,k}, x_{L,k})]^T;
According to the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system, the nonlinear mapping relation from the state quantity X of the combined navigation filter to the pixel-point coordinate information is generated:
x_{k+1} = f(X).
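The trifocal-tensor function T(·) above is standard multi-view geometry; for reference, a sketch computing it from the three projection matrices via the Hartley-Zisserman slice formula T_i = a_i b_4^T - a_4 b_i^T, after reducing the first camera to canonical form [I | 0] (an assumed, not patent-specified, implementation):

```python
import numpy as np

def trifocal_tensor(P1, P2, P3):
    """Trifocal tensor of three views (sketch of the T(...) call above).
    P1 is reduced to canonical form [I|0] by a change of world coordinates
    (valid when P1 = K[I|0]); then T_i = a_i b4^T - a4 b_i^T, with a_i, b_i
    the columns of the transformed P2, P3."""
    H = np.eye(4)
    H[:3, :3] = np.linalg.inv(P1[:, :3])     # makes P1 @ H = [I | 0]
    A, B = P2 @ H, P3 @ H
    T = np.stack([np.outer(A[:, i], B[:, 3]) - np.outer(A[:, 3], B[:, i])
                  for i in range(3)])        # shape (3, 3, 3)
    return T
```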

Claims (3)

1. A visual-inertial tight coupling combined navigation method based on firefly group optimized PF, characterized in that the method is realized by adopting the following steps:
Step S1: a strapdown inertial navigation system and a binocular vision odometer are mounted on the carrier and together form the vision-inertia tightly coupled integrated navigation system; the strapdown inertial navigation system calculates the nominal motion information of the carrier from its own acquired data; feature matching is performed on the image sequence acquired by the binocular vision odometer using the SURF algorithm, and the pixel coordinates of the matching points between two consecutive image frames are solved;
Step S2: a linear state equation of the vision-inertia tightly coupled integrated navigation system is established according to the error characteristics of the strapdown inertial navigation system; a nonlinear measurement equation of the system is established by taking the pixel coordinates of the matching points of two consecutive image frames as the measurement value of the system;
Step S3: nonlinear filtering is performed on the vision-inertia tightly coupled integrated navigation system using the firefly-swarm-optimized PF to realize data fusion of the system;
Step S4: the solution result of the strapdown inertial navigation system is corrected according to the nonlinear filtering result of step S3, thereby obtaining the navigation result of the vision-inertia tightly coupled integrated navigation system;
the step S3 includes the following steps:
Step S3.1: from the prior distribution p(x_0), extract M initialized state particles A_{j,0} (j = 1, 2, ..., M), and set the initial value w_{j,0} of the weight of the j-th particle; the setting formula is as follows:
w_{j,0} = 1/M;
Step S3.2: perform iterative cubature Kalman filtering on each particle j to obtain an optimized importance density function:
Step S3.2.1: for the j-th particle, calculate the time-updated combined navigation filter state quantity mean x̂_{j,k/k-1} and the time-updated combined navigation filter state quantity variance P_{j,k/k-1}; the calculation formulas are as follows:
x̂_{j,k/k-1} = Φ_{k/k-1} x̂_{j,k-1};
P_{j,k/k-1} = Φ_{k/k-1} P_{j,k-1} Φ_{k/k-1}^T + Q_k;
in the formulas: Φ_{k/k-1} represents the discretized state transition matrix of the visual-inertial tightly-coupled integrated navigation system; Q_k represents the covariance matrix of the process noise of the visual-inertial tightly-coupled integrated navigation system;
Step S3.2.2: for the j-th particle, set the iteration initial value x̂_{j,k}^{(0)} of the combined navigation filter state quantity mean and the iteration initial value P_{j,k}^{(0)} of the combined navigation filter state quantity variance for the iteration strategy based on the LM algorithm; the setting formulas are as follows:
x̂_{j,k}^{(0)} = x̂_{j,k/k-1};
P_{j,k}^{(0)} = (P_{j,k/k-1}^{-1} + μI)^{-1};
in the formulas: I represents an identity matrix; μ represents an adjustable parameter;
Step S3.2.3: for the j-th particle, calculate the filter gain K_{j,k}^{(i)} of the iteration strategy based on the LM algorithm at the i-th iteration and the combined navigation filter state quantity mean x̂_{j,k}^{(i)} at the i-th iteration; the calculation formulas are as follows:
[the three calculation formulas are rendered only as images in the source];
in the formulas: f(·) represents the nonlinear mapping relation generated by the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system; R_k represents the covariance matrix of the measurement noise of the visual-inertial tightly-coupled integrated navigation system; z_k represents the measurement value of the visual-inertial tightly-coupled integrated navigation system;
Step S3.2.4: for the j-th particle, when the number of iterations of the iteration strategy based on the LM algorithm reaches N, the iteration is terminated, and the iteration final value x̂_{j,k}^{(N)} of the combined navigation filter state quantity mean and the iteration final value P_{j,k}^{(N)} of the combined navigation filter state quantity variance are set, thereby obtaining the optimized normal distribution N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)}); the setting formulas are as follows:
[the setting formulas are rendered only as images in the source];
in the formulas: x̂_{j,k}^{(N)} represents the combined navigation filter state quantity mean at the N-th iteration of the iteration strategy based on the LM algorithm;
Step S3.3: for each particle j, perform importance sampling using the optimized normal distribution N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)}):
A_{j,k} ~ N(x̂_{j,k}^{(N)}, P_{j,k}^{(N)});
in the formula: A_{j,k} represents the position of the j-th particle;
Step S3.4: calculate the weight w̃_{j,k} of each particle j; the calculation formula is as follows:
w̃_{j,k} = w_{j,k-1} · p(z_k|A_{j,k}) p(A_{j,k}|A_{j,k-1}) / q(A_{j,k}|A_{j,k-1}, z_k);
in the formula: p(z_k|A_{j,k}) represents the likelihood function; p(A_{j,k}|A_{j,k-1}) represents the prior probability density; q(A_{j,k}|A_{j,k-1}, z_k) represents the importance density function;
Step S3.5: regard each particle as a firefly in the firefly swarm optimization algorithm and the weight of each particle as the firefly brightness; obtain the global optimum particle by sorting the firefly brightnesses to find the brightness optimum; then calculate the attraction β_{j,k}(κ) between each particle j and the global optimum particle at the κ-th iteration of the firefly swarm optimization algorithm, and the Cartesian distance r_{j,k}(κ) between the j-th particle and the global optimum particle at the κ-th iteration; the calculation formulas are as follows:
β_{j,k}(κ) = β_0 e^{-γ r_{j,k}^2(κ)};
r_{j,k}(κ) = ||A_{j,k}(κ) - A_{max,k}(κ)|| = sqrt( Σ_{p=1}^{D} (A_{j,k,(p)}(κ) - A_{max,k,(p)}(κ))^2 );
[the formula for the attenuation factor τ(κ) is rendered only as an image in the source];
in the formulas: β_0 represents the maximum attraction and takes the constant value 1; γ represents the absorption rate of light in air, with γ ∈ [0.1, 10]; τ(κ) represents the attenuation factor of the firefly swarm optimization algorithm at the κ-th iteration; ψ represents a set constant; A_{j,k}(κ) represents the position of the j-th particle at the κ-th iteration of the firefly swarm optimization algorithm; A_{max,k}(κ) represents the position of the global optimum particle at the κ-th iteration; A_{j,k,(p)}(κ) represents the p-th component of the spatial coordinates of the j-th particle at the κ-th iteration; A_{max,k,(p)}(κ) represents the p-th component of the spatial coordinates of the global optimum particle at the κ-th iteration; D represents the state dimension of the visual-inertial tightly-coupled integrated navigation system;
Step S3.6: update the position of each particle j; the update formula is as follows:
A_{j,k}(κ+1) = e^{-u(κ)/u(κ-1)} A_{j,k}(κ) + β_{j,k}(κ)(A_{j,k}(κ) - A_{max,k}(κ)) + α(rand - 0.5);
[the formula defining u(κ) is rendered only as an image in the source];
wherein: α represents a random step size, with α ∈ [0, 1]; rand represents a random number between 0 and 1; w(A_{j,k}(κ)) represents the weight of the j-th particle at the κ-th iteration of the firefly swarm optimization algorithm; w(A_{max,k}(κ)) represents the weight of the global optimum particle at the κ-th iteration of the firefly swarm optimization algorithm;
Step S3.7: judge whether the iteration termination condition of the firefly swarm optimization algorithm is reached: when the weight w(A_{max,k}(κ)) of the global optimum particle at the κ-th iteration reaches a set threshold, or the number of iterations of the firefly swarm optimization algorithm reaches the set maximum number of iterations, the iteration of the firefly swarm optimization algorithm is terminated, and the position of the j-th particle at that moment is taken as the optimized position A*_{j,k} of the j-th particle; otherwise, return to step S3.5 to continue the optimization iteration of the firefly swarm optimization algorithm;
Step S3.8: calculate the compensated weight and the normalized weight of each particle: according to the optimized position A*_{j,k} of the j-th particle, calculate the compensated weight w*_{j,k} of the j-th particle and the normalized weight w_{j,k} of the j-th particle; the calculation formulas are as follows:
w*_{j,k} = w_{j,k-1} · p(z_k|A*_{j,k}) p(A*_{j,k}|A_{j,k-1}) / q(A*_{j,k}|A_{j,k-1}, z_k);
w_{j,k} = w*_{j,k} / Σ_{j=1}^{M} w*_{j,k};
in the formulas: q(·) represents the importance function; p(z_k|A*_{j,k}) represents the likelihood function;
Step S3.9: calculate the combined navigation filter state quantity mean x̂_k according to the optimized position and the normalized weight of each particle; the calculation formula is as follows:
x̂_k = Σ_{j=1}^{M} w_{j,k} A*_{j,k};
the step S4 includes the following steps:
Step S4.1: extract the error estimate of the strapdown inertial navigation system from the calculated combined navigation filter state quantity mean x̂_k;
Step S4.2: correct the solution result of the strapdown inertial navigation system according to the error estimate of the strapdown inertial navigation system, thereby obtaining the navigation result T_k of the visual-inertial tightly-coupled integrated navigation system; the correction formulas are as follows:
[the correction formulas are rendered only as images in the source];
in the formulas: φ_E, φ_N, φ_U represent the angle errors of the strapdown inertial navigation system; V_E, V_N, V_U represent the velocity errors of the strapdown inertial navigation system; R_E, R_N, R_U represent the position errors of the strapdown inertial navigation system.
2. The firefly population optimized PF-based visual-inertial tightly-coupled integrated navigation method of claim 1, wherein: in step S2, the process of establishing the linear state equation of the visual-inertial tightly-coupled integrated navigation system is as follows:
First, the state quantity X of the combined navigation filter is defined as:
X = [φ_E, φ_N, φ_U, V_E, V_N, V_U, R_E, R_N, R_U, ε_bx, ε_by, ε_bz, ζ_x, ζ_y, ζ_z, p_x, p_y, p_z, θ_x, θ_y, θ_z]^T;
in the formula: φ_E, φ_N, φ_U represent the angle errors of the strapdown inertial navigation system; V_E, V_N, V_U represent the velocity errors of the strapdown inertial navigation system; R_E, R_N, R_U represent the position errors of the strapdown inertial navigation system; ε_bx, ε_by, ε_bz represent the random drift errors of the gyroscope along the three directions of the carrier coordinate system; ζ_x, ζ_y, ζ_z represent the random drift errors of the accelerometer; p_x, p_y, p_z represent the nominal position errors during the acquisition of the previous image; θ_x, θ_y, θ_z represent the nominal angle errors during the acquisition of the previous image; E, N, U denote the three directions of the geographical coordinate system;
Then, according to the error characteristics of the strapdown inertial navigation system, the linear state equation of the visual-inertial tightly-coupled integrated navigation system is obtained:
Ẋ(t) = F(t)X(t) + G(t)W(t);
[the block definitions of F(t) and G(t) are rendered only as images in the source];
W(t) = [w_x, w_y, w_z, w_ζx, w_ζy, w_ζz]^T;
in the formula: F(t) represents the state transition matrix of the combined navigation filter; G(t) represents the noise transformation matrix of the combined navigation filter; W(t) represents the noise vector of the combined navigation filter; F_N(t) represents the classical 9×9 inertial navigation system error matrix; F_S(t) represents the transformation matrix between the navigation parameters and the inertial element errors; C_b^n represents the attitude matrix; w_x, w_y, w_z represent the noise terms of the gyroscope; w_ζx, w_ζy, w_ζz represent the noise terms of the accelerometer.
3. The firefly population optimized PF-based visual-inertial tightly-coupled integrated navigation method of claim 1, wherein: in step S2, the process of establishing the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is as follows:
First, according to the state quantity X of the combined navigation filter and the nominal state X_n of the strapdown inertial navigation system, calculate the rotation R_R of the right camera at the later moment relative to the previous moment, the rotation R_L of the left camera at the later moment relative to the previous moment, the displacement t_R of the right camera at the later moment relative to the previous moment, and the displacement t_L of the left camera at the later moment relative to the previous moment; [the four calculation formulas are rendered only as images in the source];
in the formulas: the true state information of the strapdown inertial navigation system is obtained by applying the state quantity of the combined navigation filter to the nominal state of the strapdown inertial navigation system; R and t respectively represent the functional relations that obtain the rotation amount and the displacement amount from the true state information of the strapdown inertial navigation system;
Then, the coordinate system of the right camera at the previous moment is aligned with the world coordinate system, i.e. the projection matrix P_{R,k} of the right camera at the previous moment is:
P_{R,k} = K_R [I | 0];
in the formula: K_R represents the calibration matrix of the right camera;
The relative pose [R_C | t_C] of the left camera at the previous moment is obtained by calibration, so the projection matrix P_{L,k} of the left camera at the previous moment is expressed as:
P_{L,k} = K_L [R_C | t_C];
in the formula: K_L represents the calibration matrix of the left camera; R_C represents the rotation of the left camera relative to the right camera; t_C represents the displacement of the left camera relative to the right camera;
The projection matrix P_{R,k+1} of the right camera at the later moment is expressed as:
P_{R,k+1} = K_R [R_R | t_R];
The projection matrix P_{L,k+1} of the left camera at the later moment is expressed as:
P_{L,k+1} = K_L [R_L | t_L];
Then, calculate the trifocal tensor T_R of the two consecutive image frames acquired by the right camera and the trifocal tensor T_L of the two consecutive image frames acquired by the left camera; the calculation formulas are as follows:
T_R = T(K_R, K_L, R_C, t_C, R_R, t_R);
T_L = T(K_R, K_L, R_C, t_C, R_L, t_L);
in the formulas: T represents the calculation formula of the trifocal tensor;
Then, according to the nonlinear transfer function H of the trifocal tensor, the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is obtained:
x_{R,k+1} = H(T_R, x_{R,k}, x_{L,k});
x_{L,k+1} = H(T_L, x_{R,k}, x_{L,k});
in the formulas: x_{R,k+1} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the right camera at the later moment; x_{L,k+1} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the left camera at the later moment; x_{R,k} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the right camera at the previous moment; x_{L,k} represents the pixel coordinates of the matching points of the two consecutive image frames acquired by the left camera at the previous moment;
Then, letting x_{k+1} = [x_{R,k+1}, x_{L,k+1}]^T, the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system is expressed as:
x_{k+1} = [H(T_R, x_{R,k}, x_{L,k}), H(T_L, x_{R,k}, x_{L,k})]^T;
According to the nonlinear measurement equation of the visual-inertial tightly-coupled integrated navigation system, the nonlinear mapping relation from the state quantity X of the combined navigation filter to the pixel-point coordinate information is generated:
x_{k+1} = f(X).
CN201811590654.9A 2018-12-25 2018-12-25 Visual-inertial tight coupling combined navigation method based on firefly group optimized PF Active CN109443354B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811590654.9A CN109443354B (en) 2018-12-25 2018-12-25 Visual-inertial tight coupling combined navigation method based on firefly group optimized PF

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811590654.9A CN109443354B (en) 2018-12-25 2018-12-25 Visual-inertial tight coupling combined navigation method based on firefly group optimized PF

Publications (2)

Publication Number Publication Date
CN109443354A CN109443354A (en) 2019-03-08
CN109443354B (en) 2020-08-14

Family

ID=65535597

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811590654.9A Active CN109443354B (en) 2018-12-25 2018-12-25 Visual-inertial tight coupling combined navigation method based on firefly group optimized PF

Country Status (1)

Country Link
CN (1) CN109443354B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110260885B (en) * 2019-04-15 2020-06-30 南京航空航天大学 Satellite/inertia/vision combined navigation system integrity evaluation method
CN110263905B (en) * 2019-05-31 2021-03-02 上海电力学院 Robot positioning and mapping method and device based on firefly optimized particle filtering
CN110187308B (en) * 2019-06-20 2023-06-16 华南师范大学 Indoor positioning method, device, equipment and medium based on signal fingerprint library
CN114543793B (en) * 2020-11-24 2024-02-09 上海汽车集团股份有限公司 Multi-sensor fusion positioning method of vehicle navigation system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8447517B2 (en) * 2008-11-06 2013-05-21 Texas Instruments Incorporated Tightly-coupled GNSS/IMU integration filter having speed scale-factor and heading bias calibration
JP2013024845A (en) * 2011-07-26 2013-02-04 Furuno Electric Co Ltd Tightly coupled gps and dead reckoning navigation for vehicle
CN103033189B (en) * 2012-12-26 2015-05-20 北京航空航天大学 Inertia/vision integrated navigation method for deep-space detection patrolling device
US20150219767A1 (en) * 2014-02-03 2015-08-06 Board Of Regents, The University Of Texas System System and method for using global navigation satellite system (gnss) navigation and visual navigation to recover absolute position and attitude without any prior association of visual features with known coordinates
CN107796391A (en) * 2017-10-27 2018-03-13 哈尔滨工程大学 A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN108731670B (en) * 2018-05-18 2021-06-22 南京航空航天大学 Inertial/visual odometer integrated navigation positioning method based on measurement model optimization

Also Published As

Publication number Publication date
CN109443354A (en) 2019-03-08

Similar Documents

Publication Publication Date Title
CN109443354B (en) Visual-inertial tight coupling combined navigation method based on firefly group optimized PF
CN108731670B (en) Inertial/visual odometer integrated navigation positioning method based on measurement model optimization
CN107564061B (en) Binocular vision mileage calculation method based on image gradient joint optimization
CN111156984B (en) Monocular vision inertia SLAM method oriented to dynamic scene
CN109102525B (en) Mobile robot following control method based on self-adaptive posture estimation
CN107796391A (en) A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN110595466B (en) Lightweight inertial-assisted visual odometer implementation method based on deep learning
CN114001733B (en) Map-based consistent efficient visual inertial positioning algorithm
CN113108771B (en) Movement pose estimation method based on closed-loop direct sparse visual odometer
CN105719314A (en) Homography estimation and extended Kalman filter based localization method for unmanned aerial vehicle (UAV)
CN109443355B (en) Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF
CN111536970B (en) Infrared inertial integrated navigation method for low-visibility large-scale scene
CN109443353B (en) Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF
CN112767546B (en) Binocular image-based visual map generation method for mobile robot
Cattaneo et al. Cmrnet++: Map and camera agnostic monocular visual localization in lidar maps
CN111025229B (en) Underwater robot pure orientation target estimation method
CN116429116A (en) Robot positioning method and equipment
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
CN113345032B (en) Initialization map building method and system based on wide-angle camera large distortion map
CN113552585A (en) Mobile robot positioning method based on satellite map and laser radar information
CN112945233A (en) Global drift-free autonomous robot simultaneous positioning and map building method
CN112444245A (en) Insect-imitated vision integrated navigation method based on polarized light, optical flow vector and binocular vision sensor
CN113554705B (en) Laser radar robust positioning method under changing scene
CN113483769B (en) Vehicle self-positioning method, system, equipment and medium based on particle filter
CN114964276A (en) Dynamic vision SLAM method fusing inertial navigation

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant