CN110006423B - Self-adaptive inertial navigation and visual combined navigation method - Google Patents


Publication number
CN110006423B
Authority
CN
China
Prior art keywords
matrix
covariance matrix
navigation
error
state
Legal status: Active
Application number
CN201910270894.9A
Other languages: Chinese (zh)
Other versions: CN110006423A (en)
Inventor
王美玲
翟朝阳
杨毅
沈凯
岳裕丰
Current Assignee: Beijing Institute of Technology BIT
Original Assignee: Beijing Institute of Technology BIT
Application filed by Beijing Institute of Technology BIT
Publication of CN110006423A
Application granted
Publication of CN110006423B

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 ... by using measurements of speed or acceleration
    • G01C21/12 ... executed aboard the object being navigated; Dead reckoning
    • G01C21/16 ... by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 ... combined with non-inertial navigation instruments

Abstract

The invention discloses an adaptive inertial navigation and vision integrated navigation method, which designs an adaptive unscented Kalman filter to combine inertial navigation and vision on the basis of existing inertial/visual integrated navigation methods. The method considers the noise caused by camera vibration when the vehicle drives over a bumpy road surface: an adaptive factor reflecting the camera vibration noise level is designed, and the Kalman filter gain and the error covariance matrix of the integrated navigation algorithm are adaptively adjusted according to the noise level, ensuring the reliability of the integrated navigation system's positioning result; the method is therefore a robust positioning method. Because errors caused by vibration noise require neither prior modeling and compensation nor closed-loop trajectory correction, the method adapts well to the environment. On the basis of the 15-dimensional inertial navigation error state, the invention augments the filter state with the camera poses, realizing the tight coupling of inertial navigation and vision and achieving higher positioning accuracy.

Description

Self-adaptive inertial navigation and visual combined navigation method
Technical Field
The invention belongs to the technical field of automatic control, and particularly relates to a self-adaptive inertial navigation and visual combined navigation method.
Background
Compared with traditional vehicles, the unmanned vehicle, an intelligent vehicle built on technologies such as environment perception, navigation and positioning, motion control, and path planning, has developed rapidly in recent years and attracted wide attention from society. Among these technologies, navigation and positioning is the key and the foundation for normal operation of the whole system.
At present, the satellite/inertial integrated navigation systems widely used on unmanned vehicles can obtain accurate vehicle positioning in open environments. In satellite-denied environments such as urban canyons, however, signal blockage and multipath effects prevent the satellite navigation system from effectively correcting the accumulated errors of inertial navigation, so the positioning accuracy of the integrated system degrades rapidly and positioning may even fail. A combination of inertial navigation and vision can instead be adopted, exploiting the complementary characteristics of the two sensors to achieve navigation and positioning in satellite-denied environments. However, current mainstream methods do not consider the measurement noise that road bumpiness may introduce, which easily causes the positioning error of the navigation system to diverge.
Disclosure of Invention
In view of this, the present invention provides an adaptive inertial navigation and visual integrated navigation method that guarantees the reliability of the integrated navigation system's positioning result and offers good robustness.
A self-adaptive inertial navigation and vision combined navigation method comprises the following steps:
Step 0: initialize the system nominal state matrix $\hat{x}$, the system error state matrix $\delta\hat{x}$, and the initial error covariance matrix $P_{0|0}$.
Step 1: solve the strapdown inertial navigation equations to obtain the system nominal state matrix $\hat{x}$; meanwhile, calculate the system transition matrix $F_d$ and the system process noise covariance matrix $Q_d$ according to the kinematic and dynamic relations.
Step 2: perform one-step prediction of the system error state using the system transition matrix $F_d$ and the system process noise covariance matrix $Q_d$.
Step 3: perform feature extraction and feature matching on the visual images acquired by the camera, and remove mismatched points with the single-point RANSAC method to obtain feature-matching pairs across three consecutive frames of images.
Step 4: compute the 2n+1 sigma points, where n denotes the system state dimension.
Step 5: correct the system nominal state $\hat{x}$ with the 2n+1 sigma points and predict the observation by combining the feature-matching result.
Step 6: update the filter's predicted observation covariance matrix $P_{zz}$ and the cross-covariance matrix $P_{xz}$ of observation and state for the current, i-th navigation solution.
Step 7, acquiring fact covariance matrix P of observed quantity of ith navigation solutionf,iAnd the prediction covariance matrix Pc,iWherein the fact covariance matrix Pf,iPass filter innovation ΔiSo as to obtain the compound with the characteristics of,
Figure BDA0002018340450000023
prediction covariance matrix Pc,iUpdating the prediction covariance matrix of the filter observations in step 6:
then, a diagonal matrix of the two matrixes is obtained through a diagonal operator:
Df,i=diag(Pf,i),Dc,i=diag(Pc,i)
will Df,iAnd Dc,iDividing diagonal elements corresponding to the two matrixes, and then taking the average value to obtain the ith navigation resolving self-adaptive factor alphai
Figure BDA0002018340450000024
Here $\alpha$ represents the degree of dispersion of the sigma points, and $\beta$ describes the distribution information of the state quantity.
Step 8: obtain the mean $m_\alpha$ of the adaptive factor and judge:

When $\alpha_i \le m_\alpha$, compute the filter's Kalman gain $K_k$ and update the error covariance matrix $P_{k|k}$:

$K_k = P_{xz} P_{zz}^{-1}, \quad P_{k|k} = P_{k|k-1} - K_k P_{zz} K_k^T$

where $P_{k|k-1}$ is the error covariance matrix of the error state obtained in step 2.

When $m_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha$, adjust the error covariance matrix.

When $m_\alpha + \sigma_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha^2$, adjust both the error covariance matrix and the Kalman filter gain.

When $\alpha_i > m_\alpha + \sigma_\alpha^2$, rapidly adjust the error covariance matrix.

Finally, estimate the system error state:

$\delta\hat{x}_{k|k} = \delta\hat{x}_{k|k-1} + K_k\left(z_k - \hat{z}_{k|k-1}\right)$
Step 9: use the system error state $\delta\hat{x}_{k|k}$ obtained in step 8 to correct the system nominal state $\hat{x}$ and obtain the positioning result; repeat steps 1 to 9 to obtain continuous vehicle navigation and positioning information.
Further, in step 0 the camera poses at the latest two moments are added to the system error state matrix; in step 9, after the positioning result is obtained, the historical camera poses in the system error state matrix are updated with the camera poses at the latest two moments, and then the error covariance matrix is updated; the method then returns to step 1 for the next positioning solution.
The invention has the following beneficial effects:
On the basis of existing inertial/visual integrated navigation methods, the invention designs an adaptive unscented Kalman filter to combine inertial navigation and vision. The method considers the noise caused by camera vibration when the vehicle drives over a bumpy road surface: an adaptive factor reflecting the camera vibration noise level is designed, and the Kalman filter gain and the error covariance matrix of the integrated navigation algorithm are adaptively adjusted according to the noise level to ensure the reliability of the integrated navigation system's positioning result, making this a robust positioning method. The method requires neither prior modeling and compensation of the errors caused by vibration noise nor closed-loop trajectory correction, so it adapts well to the environment.
On the basis of the 15-dimensional inertial navigation error state, the invention augments the filter state with the camera poses, realizing the tight coupling of inertial navigation and vision; the method is therefore a positioning method of higher accuracy.
Drawings
FIG. 1 is a flow chart of a method for adaptive inertial navigation and visual integrated navigation according to the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
Aiming at the problem that camera vibration noise in existing inertial/visual integrated navigation algorithms easily degrades the positioning accuracy or even prevents positioning altogether, an adaptive algorithm can be designed to adaptively adjust the error covariance matrix and the Kalman filter gain in the filter. The invention provides an adaptive inertial navigation and vision integrated navigation method; the specific process, shown in FIG. 1, is as follows:
Step 1: initialize the system nominal state matrix $\hat{x}$, the system error state matrix $\delta\hat{x}$, and the initial error covariance matrix $P_{0|0}$. Meanwhile, compute the weights of the 2n+1 sigma points by the formulas:
$W_0^{(m)} = \frac{\lambda}{n+\lambda}$

$W_0^{(c)} = \frac{\lambda}{n+\lambda} + \left(1 - \alpha^2 + \beta\right)$

$W_i^{(m)} = W_i^{(c)} = \frac{1}{2(n+\lambda)}, \quad i = 1, \dots, 2n$
where $\lambda = \alpha^2(n + k) - n$; $\alpha$ determines the degree of dispersion of the sigma points and usually takes a small positive value; $k$ is chosen as $3 - n$; $\beta$ describes the distribution information of the state quantity, with optimal value 2; and $n$ is the system state dimension.
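For illustration, a minimal numpy sketch of this weight computation (function and variable names are ours, not the patent's):

```python
import numpy as np

def ut_weights(n, alpha=1e-3, kappa=None, beta=2.0):
    """Weights for the 2n+1 sigma points of the unscented transform.

    n: state dimension; alpha: sigma-point spread (small positive value);
    kappa: secondary scaling, 3 - n per the text; beta: distribution
    parameter, 2 being the optimal value given above.
    """
    if kappa is None:
        kappa = 3.0 - n
    lam = alpha**2 * (n + kappa) - n                   # lambda = alpha^2 (n + k) - n
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))   # mean weights
    wc = wm.copy()                                     # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    return wm, wc
```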
Step 2: solve the strapdown inertial navigation equations to obtain the system nominal state matrix $\hat{x}$; meanwhile, calculate the system transition matrix $F_d$ and the system process noise covariance matrix $Q_d$ according to the kinematic and dynamic relations.
Step 3: perform the filter time update, i.e., use the system transition matrix $F_d$ and the process noise covariance matrix $Q_d$ from step 2 to perform one-step prediction of the system error state by the formulas:
$\delta\hat{x}_{k|k-1} = F_d\,\delta\hat{x}_{k-1|k-1}$

$P_{k|k-1} = F_d P_{k-1|k-1} F_d^T + Q_d$
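A corresponding sketch of this time update, assuming the error state and matrices are numpy arrays (a generic Kalman prediction matching the two formulas above):

```python
def predict_error_state(dx, P, Fd, Qd):
    """One-step prediction of the error state and error covariance."""
    dx_pred = Fd @ dx             # delta_x_{k|k-1} = Fd delta_x_{k-1|k-1}
    P_pred = Fd @ P @ Fd.T + Qd   # P_{k|k-1} = Fd P_{k-1|k-1} Fd^T + Qd
    return dx_pred, P_pred
```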
Step 4: perform feature extraction and feature matching on the visual images acquired by the camera, and remove mismatched points with the single-point RANSAC method to obtain the feature-matching pairs $(p_1, p_2, p_3)_i$ across three consecutive frames of images.
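The patent's single-point RANSAC over three frames is not spelled out here. As a rough stand-in, the sketch below (assuming OpenCV) shows generic ORB matching between two frames with a ratio test and RANSAC-based essential-matrix outlier rejection; the intrinsic matrix K_cam is an illustrative assumption:

```python
import cv2
import numpy as np

def match_and_filter(img1, img2, ratio=0.75):
    """Generic ORB matching + ratio test with RANSAC outlier rejection via
    the essential matrix (a stand-in for the patent's single-point RANSAC)."""
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    K_cam = np.array([[718.856, 0.0, 607.1928],   # KITTI-like intrinsics,
                      [0.0, 718.856, 185.2157],   # illustrative values only
                      [0.0, 0.0, 1.0]])
    _, mask = cv2.findEssentialMat(pts1, pts2, K_cam, method=cv2.RANSAC)
    inliers = mask.ravel().astype(bool)
    return pts1[inliers], pts2[inliers]
```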
Step 5: combining the weights obtained in step 1 with the one-step error-state prediction from step 3, compute the 2n+1 sigma points via the unscented transform:
$\chi_{k|k-1}^{(0)} = \delta\hat{x}_{k|k-1}; \quad \chi_{k|k-1}^{(i)} = \delta\hat{x}_{k|k-1} + \left(\sqrt{(n+\lambda)P_{k|k-1}}\right)_i, \ i = 1,\dots,n; \quad \chi_{k|k-1}^{(i)} = \delta\hat{x}_{k|k-1} - \left(\sqrt{(n+\lambda)P_{k|k-1}}\right)_{i-n}, \ i = n+1,\dots,2n$
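A minimal sketch of the sigma-point generation, using a Cholesky factor as the matrix square root (a common choice; the patent does not fix the factorization):

```python
import numpy as np

def sigma_points(dx, P, lam):
    """2n+1 sigma points: dx and dx +/- the columns of sqrt((n+lam) P)."""
    n = dx.size
    L = np.linalg.cholesky((n + lam) * P)   # lower-triangular square root
    pts = np.tile(dx, (2 * n + 1, 1))
    pts[1:n + 1] += L.T                     # rows get +column_i of L
    pts[n + 1:] -= L.T                      # rows get -column_i of L
    return pts                              # shape (2n+1, n)
```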
Step 6: correct the system nominal state $\hat{x}$ with the 2n+1 sigma points obtained in step 5, and further combine the feature-matching result from step 4 to predict the observation:

$Z_{k|k-1}^{(i)} = h\!\left(\hat{x}_k, \chi_{k|k-1}^{(i)}, (p_1, p_2, p_3)\right), \quad i = 0, \dots, 2n$

$\hat{z}_{k|k-1} = \sum_{i=0}^{2n} W_i^{(m)} Z_{k|k-1}^{(i)}$
where $h(\cdot)$ is the transition function from the system state to the observation, a nonlinear function expressing the epipolar geometric constraint and the trifocal tensor constraint between the three consecutive frames of images.
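The patent gives $h(\cdot)$ only as an equation image. As a hedged illustration of its epipolar part only (the trifocal-tensor terms are omitted), one scalar constraint per feature pair can be written as $x_2^T E x_1$ with $E = [t]_\times R$ built from the predicted relative camera pose:

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]_x of a 3-vector."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residual(R, t, x1, x2):
    """Epipolar constraint x2^T E x1 for normalized homogeneous image points
    (illustrative; the patent's h(.) also stacks trifocal tensor constraints
    over three frames)."""
    E = skew(t) @ R              # essential matrix from the relative pose
    return float(x2 @ E @ x1)    # zero for a noise-free match
```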
Step 7: using the results of steps 5 and 6, update the filter's predicted observation covariance matrix $P_{zz}$ and the cross-covariance matrix $P_{xz}$ of observation and state by the formulas:
$P_{zz} = \sum_{i=0}^{2n} W_i^{(c)} \left(Z_{k|k-1}^{(i)} - \hat{z}_{k|k-1}\right)\left(Z_{k|k-1}^{(i)} - \hat{z}_{k|k-1}\right)^T + R_k$

$P_{xz} = \sum_{i=0}^{2n} W_i^{(c)} \left(\chi_{k|k-1}^{(i)} - \delta\hat{x}_{k|k-1}\right)\left(Z_{k|k-1}^{(i)} - \hat{z}_{k|k-1}\right)^T$

where $R_k$ is the measurement noise covariance matrix.
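A vectorized sketch of these two weighted sums (shapes and names are ours):

```python
import numpy as np

def observation_covariances(Z, z_hat, chi, dx_pred, wc, R):
    """Weighted UT covariances. Z: (2n+1, m) predicted observations,
    chi: (2n+1, n) sigma points, wc: covariance weights, R: measurement
    noise covariance."""
    dZ = Z - z_hat                          # observation deviations
    dX = chi - dx_pred                      # state deviations
    P_zz = dZ.T @ (wc[:, None] * dZ) + R    # predicted observation covariance
    P_xz = dX.T @ (wc[:, None] * dZ)        # state-observation cross covariance
    return P_zz, P_xz
```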
Step 8: compute the filter's Kalman gain $K_k$ from the covariance matrices obtained in step 7 and update the error covariance matrix $P_{k|k}$ to obtain the optimal filter estimate $\delta\hat{x}_{k|k}$ by the formulas:
$K_k = P_{xz} P_{zz}^{-1}$

$\delta\hat{x}_{k|k} = \delta\hat{x}_{k|k-1} + K_k\left(z_k - \hat{z}_{k|k-1}\right)$

$P_{k|k} = P_{k|k-1} - K_k P_{zz} K_k^T$
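The same three formulas as a sketch (the standard, non-adaptive unscented Kalman update):

```python
import numpy as np

def ukf_update(dx_pred, P_pred, P_zz, P_xz, z, z_hat):
    """Standard unscented Kalman update (the non-adaptive step 8)."""
    K = P_xz @ np.linalg.inv(P_zz)      # Kalman gain
    dx = dx_pred + K @ (z - z_hat)      # corrected error state
    P = P_pred - K @ P_zz @ K.T         # updated error covariance
    return dx, P, K
```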
Step 9: use the system error state $\delta\hat{x}_{k|k}$ obtained in step 8 to correct the system nominal state $\hat{x}$ and obtain the positioning result. Repeat steps 2 to 9 to obtain continuous vehicle navigation and positioning information.
The above process constitutes the conventional steps of an existing unscented-Kalman-filter-based inertial/visual integrated navigation method. On this basis, the invention designs an adaptive factor reflecting the camera vibration noise level and adjusts the filter's Kalman gain and error covariance matrix with different strategies to obtain an adaptive, robust navigation and positioning result. Concretely, step 8 is changed into the following two steps:
Step 8(a): obtain the fact covariance matrix $P_{f,i}$ and the prediction covariance matrix $P_{c,i}$ of the observation, where the fact covariance matrix is obtained through the filter innovation $\Delta_i = z_k - \hat{z}_{k|k-1}$ and the prediction covariance matrix comes from step 7:

$P_{f,i} = \Delta_i \Delta_i^T$

$P_{c,i} = P_{zz}$
Then obtain the diagonal matrices of the two through the diagonal operator:

$D_{f,i} = \mathrm{diag}(P_{f,i}), \quad D_{c,i} = \mathrm{diag}(P_{c,i})$
Further, divide the corresponding diagonal elements of $D_{f,i}$ by those of $D_{c,i}$ and take the mean to obtain the adaptive factor $\alpha_i$:

$\alpha_i = \frac{1}{m} \sum_{j=1}^{m} \frac{D_{f,i}(j,j)}{D_{c,i}(j,j)}$

where $m$ is the observation dimension.
In the ideal case the actual statistics of the observation coincide with the predicted ones, so the mean $m_\alpha$ of the adaptive factor is known to be 1; its variance $\sigma_\alpha^2$ is taken as 2 to 5.
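A sketch of the adaptive factor, under the assumption (consistent with the text, which derives the fact covariance from the filter innovation) that the single-epoch fact covariance $\Delta_i \Delta_i^T$ is used:

```python
import numpy as np

def adaptive_factor(innovation, P_zz):
    """alpha_i = mean over observation channels of
    diag(fact covariance) / diag(predicted covariance).
    The single-epoch fact covariance innovation*innovation^T is an
    assumption; the patent only states it comes from the innovation."""
    P_f = np.outer(innovation, innovation)
    return float(np.mean(np.diag(P_f) / np.diag(P_zz)))
```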
Step 8(b): according to the different vibration levels of the camera, use the adaptive factor obtained in step 8(a) to adaptively adjust the filter's Kalman gain $K_k$ and the error covariance matrix. Both the Kalman gain and the error covariance matrix influence the filter estimate, and their influence is significant, so the adaptive adjustment strategy is as follows: under slight vibration, only the error covariance matrix is adjusted; under moderate vibration, the Kalman gain is reduced to prevent error divergence and the error covariance matrix is adjusted as well; under severe vibration, the observation is unreliable and adjusting the Kalman gain would strongly affect the result, so only the error covariance matrix is adjusted, and a fast adjustment coefficient $\beta_i$ is introduced to adjust it rapidly, with $\beta_i$ ranging from 1.5 to 3 according to tuning experience. With these adjustments, the optimal filter estimate $\delta\hat{x}_{k|k}$ is obtained.
It is formulated as follows.

Normal case, $\alpha_i \le m_\alpha$: no adjustment is made; the Kalman gain and error covariance matrix are computed as in the standard step 8.

Slight vibration, $m_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha$: only the error covariance matrix is adjusted.

Moderate vibration, $m_\alpha + \sigma_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha^2$: both the error covariance matrix and the Kalman filter gain are adjusted.

Severe vibration, $\alpha_i > m_\alpha + \sigma_\alpha^2$: only the error covariance matrix is adjusted, with the fast adjustment coefficient $\beta_i$ introduced to achieve rapid adjustment.

Each adjustment is driven by $\eta_i = \alpha_i - m_\alpha$; finally, the system error state is estimated:

$\delta\hat{x}_{k|k} = \delta\hat{x}_{k|k-1} + K_k\left(z_k - \hat{z}_{k|k-1}\right)$
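The per-branch adjustment formulas appear in the patent only as equation images (the claims show they use $\eta_i = \alpha_i - m_\alpha$). The sketch below reproduces the branch structure with illustrative placeholder scalings, not the patent's exact formulas:

```python
import numpy as np

def adaptive_adjust(alpha_i, m_a, sigma2_a, K, P_pred, P_zz, beta_i=2.0):
    """Branch structure of step 8(b). The (1 + eta) scalings are
    placeholders standing in for the patent's equation images."""
    eta = alpha_i - m_a
    sigma_a = np.sqrt(sigma2_a)
    if alpha_i <= m_a:                          # normal: no adjustment
        pass
    elif alpha_i <= m_a + sigma_a:              # slight vibration
        P_pred = (1.0 + eta) * P_pred           # inflate covariance only
    elif alpha_i <= m_a + sigma2_a:             # moderate vibration
        P_pred = (1.0 + eta) * P_pred           # inflate covariance
        K = K / (1.0 + eta)                     # and damp the gain
    else:                                       # severe vibration
        P_pred = beta_i * (1.0 + eta) * P_pred  # fast inflation only
    P = P_pred - K @ P_zz @ K.T                 # updated error covariance
    return K, P
```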
In particular, when the filter is initialized, the camera poses at the latest two moments are added on the basis of the 15-dimensional inertial navigation error state to form a 27-dimensional filter state, realizing the tight coupling of inertial navigation and vision. Meanwhile, when a single solution finishes in step 9, the system error state and the error covariance matrix are updated so that, across iterations, the filter keeps only the camera poses at the latest two moments.
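For orientation, an assumed index layout of the 27-dimensional state is sketched below (the patent fixes only the composition, 15 inertial error states plus two camera poses, not the ordering):

```python
# Assumed ordering of the 27-dimensional filter error state:
#   0:3    attitude error           3:6    velocity error
#   6:9    position error           9:12   gyroscope bias
#   12:15  accelerometer bias
#   15:21  camera pose error at the second-latest moment (rot + trans)
#   21:27  camera pose error at the latest moment (rot + trans)
N_INS, N_POSE, N_STATE = 15, 6, 27
```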
A simulation experiment was performed with the KITTI dataset '2011_10_03_drive_0027'. The vehicle trajectory in the selected data is 540 meters long, the total elapsed time is 78 seconds, and the average speed is about 25 km/h. The sequence contains typical scenes: one severe vibration, one long moderate vibration, and several slight vibrations. The navigation and positioning results of the existing method and of the proposed adaptive inertial navigation and visual integrated navigation method are as follows:
Algorithm          Root mean square error    End-point error
The invention      2.352 m                   1.513 m
Existing method    4.002 m                   6.448 m
The simulation results show that, compared with the existing method, the proposed adaptive method improves the navigation positioning accuracy, reducing the root-mean-square error to 2.352 meters. The comparison of the end-point errors also shows that the proposed method does not suffer from error divergence.
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (2)

1. A self-adaptive inertial navigation and vision combined navigation method comprises the following steps:
step 0, initializing a system nominal state matrix $\hat{x}$, a system error state matrix $\delta\hat{x}$, and an initial error covariance matrix $P_{0|0}$;
Step 1, resolving by a strapdown inertial navigation system to obtain a system nominal state matrix
Figure FDA0002645105410000013
Meanwhile, a system transfer matrix F is calculated according to the kinematic dynamics relationdSum system process noise variance matrix Qd
Step 2, utilizing a system transfer matrix FdSum system process noise variance matrix QdCarrying out one-step prediction on the error state of the system;
step 3, performing feature extraction and feature matching on the visual images acquired by the camera, and removing mismatched points with the single-point RANSAC method to obtain feature-matching pairs across three consecutive frames of images;
step 4, calculating the 2n+1 sigma points, where n denotes the system state dimension;
step 5, correcting the system nominal state matrix $\hat{x}$ with the 2n+1 sigma points and predicting the observation by combining the feature-matching result;
step 6, updating the filter's predicted observation covariance matrix $P_{zz}$ and the cross-covariance matrix $P_{xz}$ of observation and state for the current, i-th navigation solution;
characterized by further comprising the following steps:
step 7, acquiring the fact covariance matrix $P_{f,i}$ of the filter observation and the prediction covariance matrix $P_{c,i}$ for the i-th navigation solution, wherein the fact covariance matrix $P_{f,i}$ is obtained through the filter innovation $\Delta_i$:

$P_{f,i} = \Delta_i \Delta_i^T$

and the prediction covariance matrix $P_{c,i}$ is the predicted observation covariance of the filter updated in step 6; then obtaining the diagonal matrices of the fact covariance matrix $P_{f,i}$ and the prediction covariance matrix $P_{c,i}$ through the diagonal operator:

$D_{f,i} = \mathrm{diag}(P_{f,i}), \quad D_{c,i} = \mathrm{diag}(P_{c,i})$

dividing the corresponding diagonal elements of $D_{f,i}$ by those of $D_{c,i}$ and taking the mean to obtain the adaptive factor $\alpha_i$ of the i-th navigation solution:

$\alpha_i = \frac{1}{m} \sum_{j=1}^{m} \frac{D_{f,i}(j,j)}{D_{c,i}(j,j)}$
wherein $\alpha$ represents the degree of dispersion of the sigma points and $\beta$ describes the distribution information of the state quantity;
step 8, obtaining the mean $m_\alpha$ of the adaptive factor and judging:

when $\alpha_i \le m_\alpha$, computing the filter's Kalman gain $K_k$ and updating the error covariance matrix $P_{k|k}$:

$K_k = P_{xz} P_{zz}^{-1}, \quad P_{k|k} = P_{k|k-1} - K_k P_{zz} K_k^T$

wherein $P_{k|k-1}$ represents the error covariance matrix of the error state obtained in step 2;

when $m_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha$, adjusting the error covariance matrix using

$\eta_i = \alpha_i - m_\alpha$

wherein $\sigma_\alpha^2$ represents the variance of the adaptive factor;

when $m_\alpha + \sigma_\alpha < \alpha_i \le m_\alpha + \sigma_\alpha^2$, adjusting the error covariance matrix and the Kalman filter gain using $\eta_i = \alpha_i - m_\alpha$;

when $\alpha_i > m_\alpha + \sigma_\alpha^2$, rapidly adjusting the error covariance matrix using $\eta_i = \alpha_i - m_\alpha$;

and finally estimating the system error state:

$\delta\hat{x}_{k|k} = \delta\hat{x}_{k|k-1} + K_k\left(z_k - \hat{z}_{k|k-1}\right)$
step 9, using the system error state $\delta\hat{x}_{k|k}$ obtained in step 8 to correct the system nominal state matrix $\hat{x}$ and obtain the positioning result; and repeating steps 1 to 9 to obtain continuous vehicle navigation and positioning information.
2. The self-adaptive inertial navigation and vision combined navigation method according to claim 1, wherein in step 0 the camera poses at the latest two moments are added to the system error state matrix; in step 9, after the positioning result is obtained, the historical camera poses in the system error state matrix are updated with the camera poses at the latest two moments, and then the error covariance matrix is updated; and the method then returns to step 1 to enter the next positioning solution.
CN201910270894.9A 2019-04-04 2019-04-04 Self-adaptive inertial navigation and visual combined navigation method Active CN110006423B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910270894.9A CN110006423B (en) 2019-04-04 2019-04-04 Self-adaptive inertial navigation and visual combined navigation method


Publications (2)

Publication Number Publication Date
CN110006423A CN110006423A (en) 2019-07-12
CN110006423B true CN110006423B (en) 2020-11-06

Family

ID=67170029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910270894.9A Active CN110006423B (en) 2019-04-04 2019-04-04 Self-adaptive inertial navigation and visual combined navigation method

Country Status (1)

Country Link
CN (1) CN110006423B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113532455B (en) * 2020-04-17 2023-03-31 北京三快在线科技有限公司 Visual fusion positioning method and device, electronic equipment and readable storage medium
CN113916222B (en) * 2021-09-15 2023-10-13 北京自动化控制设备研究所 Combined navigation method based on Kalman filtering estimation variance constraint
CN114485574B (en) * 2021-12-21 2023-03-21 武汉大学 Three-linear array image POS auxiliary ground positioning method based on Kalman filtering model
CN114777745A (en) * 2022-04-08 2022-07-22 南京信息工程大学 Inclined evidence obtaining modeling method based on unscented Kalman filtering

Citations (3)

Publication number Priority date Publication date Assignee Title
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN103162687A (en) * 2013-03-07 2013-06-19 中国人民解放军国防科学技术大学 Image/inertial navigation combination navigation method based on information credibility
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US20080195316A1 (en) * 2007-02-12 2008-08-14 Honeywell International Inc. System and method for motion estimation using vision sensors
CN101598556B (en) * 2009-07-15 2011-05-04 北京航空航天大学 Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment
US10012504B2 (en) * 2014-06-19 2018-07-03 Regents Of The University Of Minnesota Efficient vision-aided inertial navigation using a rolling-shutter camera with inaccurate timestamps
CN106708066B (en) * 2015-12-20 2019-07-26 中国电子科技集团公司第二十研究所 View-based access control model/inertial navigation unmanned plane independent landing method
CN106885569A (en) * 2017-02-24 2017-06-23 南京理工大学 A kind of missile-borne deep combination ARCKF filtering methods under strong maneuvering condition
CN107014371A (en) * 2017-04-14 2017-08-04 东南大学 UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
CN109443353B (en) * 2018-12-25 2020-11-06 中北大学 Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF


Also Published As

Publication number Publication date
CN110006423A (en) 2019-07-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant