CN108731670B - Inertial/visual odometer integrated navigation positioning method based on measurement model optimization - Google Patents
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
Abstract
The invention discloses an inertial/visual odometer integrated navigation positioning method based on measurement model optimization. The method first establishes the state equation of the integrated navigation system and augments the inertial-sensor errors into the system state variables, which comprise the random constant drift of the gyroscope, the first-order Markov process drift of the gyroscope and the first-order Markov process drift of the accelerometer. The visual odometer is then used as an angular-velocity, linear-velocity and position sensor to obtain measurement data and construct the measurement equation. Finally, real-time feedback correction of the navigation errors during carrier motion yields the error-corrected navigation result of the inertial navigation system. The method can effectively utilize the angular-velocity, linear-velocity and position information of the visual odometer during carrier motion, achieves effective fusion with inertial navigation, improves the accuracy and reliability of the integrated navigation system, and is suitable for engineering application.
Description
Technical Field
The invention relates to an inertial/visual odometer integrated navigation positioning method based on measurement model optimization, and belongs to the technical field of inertial/visual odometer integrated navigation.
Background
In recent years, with the rapid development of intelligent moving bodies such as unmanned aerial vehicles, unmanned ground vehicles and robots, higher requirements have been placed on the performance of their navigation systems.
In the prior art, the navigation system widely applied to intelligent moving bodies is mostly an inertial measurement unit (IMU: accelerometers and gyroscopes)/satellite integrated navigation system. However, in environments such as cities, jungles and indoor spaces where satellite signals are jammed or unavailable, an IMU/satellite integrated navigation system has difficulty providing accurate navigation information. Visual odometry is a six-degree-of-freedom motion estimation technique based on the processing of related image sequences; because it relies only on a camera, a passive sensor, it has the advantages of good autonomy, good concealment, simple structure and low cost, and is therefore bound to become an important information unit of intelligent moving-body navigation systems in the future.
The error of an inertial navigation system mainly arises from the measurement errors of its inertial sensors (IMU: accelerometers and gyroscopes) and accumulates over time, but its measurement accuracy is not affected by sudden maneuvers. A visual odometer solves the pose recursively, so it inevitably accumulates error as well; like an odometer based on a photoelectric encoder, this accumulated error is correlated in space rather than in time. Moreover, when the moving body maneuvers too aggressively, the image scene changes too much between frames, matching fails, and positioning accuracy suffers. Depending on how the image information is used, visual odometers are mainly classified into feature-point methods, which track and match feature points and are represented by open-source algorithms such as LIBVISO and PTAM, and direct methods, which match pixel gray-level information directly and are represented by open-source algorithms such as SVO and LSD-SLAM. The motion-estimation approaches of these representative algorithms differ, and the relationship between the maneuvering of the moving body and the noise of the output pose is difficult to model accurately. Considering the complementarity of inertial navigation and visual odometry, integrated navigation based on the two has attracted wide attention; however, most existing IMU/visual-odometer integrations directly combine attitude and position, ignoring the facts that visual-odometer positioning error accumulates and is maneuver-related. How to effectively fuse the navigation information of inertial navigation and the visual odometer is therefore of important research significance.
Disclosure of Invention
In order to solve the defects of the prior art, the invention aims to provide an inertial/visual odometer integrated navigation positioning method based on measurement model optimization, which effectively utilizes the angular velocity, linear velocity and position information of a visual odometer in the motion process of a carrier, realizes the optimal fusion with inertial navigation and improves the precision of an integrated navigation system.
In order to achieve the above object, the present invention adopts the following technical solutions:
an inertial/visual odometer integrated navigation positioning method based on measurement model optimization is characterized in that,
step 1) establishing a state equation of the integrated navigation system, and expanding errors of the inertial sensor into system state variables, wherein the system state variables comprise random constant drift of a gyroscope, first-order Markov process drift of the gyroscope and first-order Markov process drift of an accelerometer;
step 2) obtaining, from the relative pose output by the visual odometer, the transformation from the relative pose to the attitude and position in the navigation frame;
step 3) selecting measurement variables of the integrated navigation system, and constructing an inertia/visual odometer integrated Kalman filtering measurement equation;
and 4) discretizing the system state equation and the measurement equation, and performing feedback correction on the system state quantity by adopting Kalman filtering.
The integrated navigation and positioning method of the inertial/visual odometer based on measurement model optimization is characterized in that the state equation of the system in the step 1) is as follows:
wherein X is the state vector:
X = [φ^T, (δv^n)^T, (δp^n)^T, (ε^b)^T, (ε_r^b)^T, (∇^b)^T]^T
φ = [φ_x, φ_y, φ_z]^T is the mathematical-platform error angle of the strapdown inertial navigation system, φ_x, φ_y and φ_z being the platform error angles about the x, y and z axes; δv^n = [δv_x, δv_y, δv_z]^T is the velocity error of the carrier, with components along the x, y and z axes of the geographic frame; δp^n = [δL, δλ, δh]^T is the position error of the carrier, δL, δλ and δh being the latitude, longitude and height errors respectively; ε^b = [ε_x, ε_y, ε_z]^T is the random constant drift of the gyroscopes about the x, y and z axes; ε_r^b = [ε_rx, ε_ry, ε_rz]^T is the first-order Markov process drift of the gyroscopes about the x, y and z axes; ∇^b = [∇_x, ∇_y, ∇_z]^T is the first-order Markov process drift of the accelerometers along the x, y and z axes;
the superscript n represents a geographical system, the superscript b represents a loading system, the subscript I represents an inertial navigation system calculation value, and the subscript V represents a visual mileage calculation value;
dφ/dt = −ω_in^n × φ + δω_in^n − C_b^n (ε^b + ε_r^b)
d(δv^n)/dt = f^n × φ − (2ω_ie^n + ω_en^n) × δv^n − (2δω_ie^n + δω_en^n) × v^n + C_b^n ∇^b + δg^n
d(δL)/dt = δv_y/(R_m + h)
d(δλ)/dt = δv_x sec L/(R_n + h) + v_x sec L tan L · δL/(R_n + h)
d(δh)/dt = δv_z
dε^b/dt = 0
dε_r^b/dt = −ε_r^b/T_g + w_r
d∇^b/dt = −∇^b/T_a + w_a
wherein δω_in^n = δω_ie^n + δω_en^n is the angular-rate error of the geographic frame relative to the inertial frame; ω_in^n = ω_ie^n + ω_en^n is the angular velocity of the geographic frame relative to the inertial frame; C_b^n denotes the coordinate transformation matrix from the body frame to the geographic frame; f^n is the specific force output by the accelerometer expressed in the geographic frame; δω_ie^n is the angular-rate error of the Earth frame relative to the inertial frame; δω_en^n is the angular-rate error of the geographic frame relative to the Earth frame; ω_ie^n is the angular velocity of the Earth frame relative to the inertial frame; ω_en^n is the angular velocity of the geographic frame relative to the Earth frame; δg^n is the gravity-acceleration error; v^n is the velocity of the carrier in the geographic frame; v_x is the velocity of the carrier along the x axis in the geographic frame; R_m is the meridian radius of curvature and R_n the transverse (prime-vertical) radius of curvature; h is the local altitude; L is the local latitude; T_g and T_a are respectively the correlation times of the first-order Markov processes; and w_r and w_a are respectively the corresponding driving white noises.
The above-mentioned inertial/visual odometer integrated navigation positioning method based on measurement model optimization is characterized in that the transformation from the relative pose in step 2) to the attitude and position in the navigation frame is:
C_b^n(k+1) = C_b^n(k) C_c^b R_{k,k+1} (C_c^b)^T
p^n(k+1) = p^n(k) + C_b^n(k) C_c^b t_{k,k+1}
wherein C_b^n(k) is the coordinate transformation matrix from the body frame to the geographic frame at the time of image frame k; C_c^b is the coordinate transformation matrix from the camera frame to the body frame; R_{k,k+1} and t_{k,k+1} are the rotation matrix and translation vector of image-frame time k+1 relative to image-frame time k as measured by the visual odometer; p^n(k) is the position of the carrier in the geographic frame at the time of image frame k; the superscript T denotes the matrix transpose; and the subscripts k and k+1 denote physical quantities at the corresponding frame times.
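This composition of the visual-odometer relative pose into the navigation frame can be sketched numerically as follows. This is a minimal illustration, not code from the patent; the function name and the convention that R_{k,k+1} maps camera-frame vectors at frame k+1 into frame k are assumptions of the sketch:

```python
import numpy as np

def propagate_pose(C_bn_k, p_n_k, C_cb, R_rel, t_rel):
    """Compose a visual-odometer relative pose (camera frame, k -> k+1)
    into the navigation frame, following the transformation in the text."""
    # Attitude: rotate the relative rotation from camera axes to body axes,
    # then chain it onto the previous body-to-navigation DCM.
    C_bn_k1 = C_bn_k @ C_cb @ R_rel @ C_cb.T
    # Position: express the camera-frame translation in the navigation frame
    # and accumulate it onto the previous position.
    p_n_k1 = p_n_k + C_bn_k @ C_cb @ t_rel
    return C_bn_k1, p_n_k1
```

With identity extrinsics and no rotation, a pure translation of the camera simply shifts the navigation-frame position by the same vector.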
The above-mentioned inertial/visual odometer integrated navigation positioning method based on measurement model optimization is characterized in that the measurement equation in step 3) is:
Z = [(ω_nb^b)_I − (ω_nb^b)_V; (v^n)_I − (v^n)_V; (p^n)_I − (p^n)_V] = HX + V
wherein ω_nb^b is the angular velocity of the carrier relative to the geographic frame, so the first block of Z is its error δω_nb^b; δv^n is the velocity error of the carrier and δp^n the position error of the carrier; the subscript I denotes a value solved by the inertial navigation system and the subscript V a value computed by the visual odometer; H is the measurement coefficient matrix of the system and V is the measurement noise matrix of the system;
wherein C_n^b denotes the coordinate transformation matrix from the geographic frame to the body frame; M_v and M_p are respectively the velocity- and position-related coefficient matrices; v_x and v_y are respectively the components of the carrier velocity along the x and y axes; |·| denotes the modulus of the corresponding vector; and I is the identity matrix.
The integrated navigation and positioning method of the inertial/visual odometer based on measurement model optimization is characterized in that the specific process of the step 4) is as follows:
401) discretizing a system state equation and a measurement equation:
X_k = Φ_{k,k-1} X_{k-1} + Γ_{k,k-1} W_{k-1},  Z_k = H_k X_k + V_k
wherein X_k is the system state at time t_k; X_{k-1} is the system state at time t_{k-1}; Φ_{k,k-1} is the state transition matrix of the system from t_{k-1} to t_k; Γ_{k,k-1} is the noise-driving matrix of the system from t_{k-1} to t_k; W_{k-1} is the system noise at t_{k-1}; Z_k is the measurement vector at t_k; H_k is the measurement coefficient matrix at t_k; and V_k is the measurement-noise vector at t_k;
402) adding a control term U_{k-1} to the discretized equation and correcting the inertial navigation error using the closed-loop-corrected system state equation:
X_k = Φ_{k,k-1} X_{k-1} + B_{k,k-1} U_{k-1} + Γ_{k,k-1} W_{k-1}
wherein B_{k,k-1} is the control coefficient matrix of the system from t_{k-1} to t_k;
403) obtaining a linearized Kalman filter equation of the system:
X̂_{k,k-1} = Φ_{k,k-1} X̂_{k-1}
X̂_k = X̂_{k,k-1} + K_k (Z_k − H_k X̂_{k,k-1})
K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}
P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Γ_{k-1} Q_{k-1} Γ_{k-1}^T
P_k = (I − K_k H_k) P_{k,k-1}
wherein X̂_{k,k-1} is the one-step state prediction from t_{k-1} to t_k; X̂_{k-1} is the filtered state estimate at t_{k-1}; X̂_k is the filtered state estimate at t_k; K_k is the filter gain matrix at t_k; Z_k − H_k X̂_{k,k-1} is the innovation sequence at t_k; P_{k,k-1} is the one-step prediction covariance matrix from t_{k-1} to t_k; P_{k-1} is the filtered-estimate covariance matrix at t_{k-1}; Φ_{k,k-1}^T is the transpose of Φ_{k,k-1}; Γ_{k-1} is the system noise coefficient matrix at t_{k-1}; Q_{k-1} is the system (process) noise covariance matrix at t_{k-1}; Γ_{k-1}^T is the transpose of Γ_{k-1}; H_k^T is the transpose of H_k; R_k is the measurement-noise covariance matrix at t_k; P_k is the filtered-estimate covariance matrix at t_k; and I is the identity matrix;
404) estimating the system navigation error values, including the attitude, position and velocity errors, from the linearized Kalman filter equations obtained in step 403), and subtracting them from the navigation parameters solved by the inertial navigation system to obtain the measurement-model-optimized inertial/visual-odometer integrated navigation correction values.
The invention achieves the following beneficial effects: the method uses the visual odometer as an angular-velocity, linear-velocity and position sensor to acquire measurement data and construct the measurement equation; this conforms to the noise characteristics of the visual-odometer output, ensures effective use of the visual-odometer output information, and has wide applicability. By applying the Kalman filtering method, the method achieves optimal estimation of the state quantities of the inertial/visual-odometer integrated navigation system, effectively improves the accuracy of the integrated navigation system by exploiting the complementarity of inertial navigation and visual odometry, and is suitable for engineering application.
Drawings
FIG. 1 is an architectural diagram of the present invention;
FIG. 2 is a schematic diagram of the actual motion trajectory of a KITTI data set vehicle used in the present invention;
fig. 3(a) - (c) are graphs comparing navigation results of the present invention with navigation results using attitude position combinations, fig. 3(a) is a position error comparison curve, fig. 3(b) is an attitude error comparison curve, and fig. 3(c) is a velocity error comparison curve.
Detailed Description
The invention is further described below with reference to the accompanying drawings. The following examples are only for illustrating the technical solutions of the present invention more clearly, and the protection scope of the present invention is not limited thereby.
As shown in fig. 1, the principle of the inertial/visual-odometer integrated navigation method based on measurement-model optimization is as follows: the visual-odometer module corrects distortion in the original images output by the binocular camera, performs feature extraction and stereo matching on the image sequence, and tracks feature points, thereby estimating the relative pose. The Kalman-filter module establishes the Kalman filtering state and measurement equations and solves the strapdown inertial navigation using the raw outputs of the gyroscopes and accelerometers; when each filtering period is reached, the time update and measurement update are carried out, achieving optimal estimation of the state quantities of the integrated navigation system and improving its performance.
The specific embodiment of the invention is as follows:
step 1) establishing a state equation of an inertia/visual odometer combined navigation algorithm:
firstly, defining the coordinate systems: the east-north-up geographic frame is selected as the navigation frame (n), and the right-front-up frame is selected as the carrier (body) frame (b). The state quantities to be estimated are as follows:
wherein X is the state vector:
X = [φ^T, (δv^n)^T, (δp^n)^T, (ε^b)^T, (ε_r^b)^T, (∇^b)^T]^T
φ = [φ_x, φ_y, φ_z]^T is the mathematical-platform error angle of the strapdown inertial navigation system, φ_x, φ_y and φ_z being the platform error angles about the x, y and z axes; δv^n = [δv_x, δv_y, δv_z]^T is the velocity error of the carrier, with components along the x, y and z axes of the geographic frame; δp^n = [δL, δλ, δh]^T is the position error of the carrier, δL, δλ and δh being the latitude, longitude and height errors respectively; ε^b = [ε_x, ε_y, ε_z]^T is the random constant drift of the gyroscopes about the x, y and z axes; ε_r^b = [ε_rx, ε_ry, ε_rz]^T is the first-order Markov process drift of the gyroscopes about the x, y and z axes; ∇^b = [∇_x, ∇_y, ∇_z]^T is the first-order Markov process drift of the accelerometers along the x, y and z axes. The superscript n denotes the geographic frame and the superscript b the body (carrier) frame;
the measurements of the IMU are taken as input variables to a state equation expressed as:
in the formula (I), the compound is shown in the specification,the angular rate error of the geographic system relative to the inertial system,is the angular velocity of the geographic system relative to the inertial system,representing a coordinate transformation matrix from a carrier system to a geographical system, fnIn order to output specific force of the accelerometer under geographic conditions,is the angular rate error of the earth's system relative to the inertial system,is the angular rate error of the geographic system relative to the earth system,is the angular velocity of the earth's system relative to the inertial system,angular velocity of the geographic system relative to the Earth's system, δ gnIs the gravity acceleration error, vnIs the speed of the carrier under the geographic system,speed of the carrier on the x-axis under geographic system, RmRadius of a unit of fourth quarternRadius of meridian, h is the local altitude, L is the local altitude, TgAnd TaRespectively, the correlation time, w, of the first order Markov processrAnd waRespectively, associated driving white noise.
Step 2) establishing a measurement equation of an inertial/visual odometer combined navigation algorithm based on measurement model optimization
The visual odometer estimates the motion of the camera between adjacent images by matching and tracking sequence-image information; its output is the relative pose between two successive sampling instants, expressed by a rotation matrix R_{k,k+1} and a translation vector t_{k,k+1}, where k and k+1 denote image frame numbers.
The following equations express the transformation from the relative pose to the attitude and position in the navigation frame:
C_b^n(k+1) = C_b^n(k) C_c^b R_{k,k+1} (C_c^b)^T
p^n(k+1) = p^n(k) + C_b^n(k) C_c^b t_{k,k+1}
In these formulas, C_c^b denotes the coordinate transformation matrix from the camera frame to the body frame; it is generally considered constant, being calibrated before the experiment and remaining unchanged within a single experiment. The subscripts k and k+1 denote the values of the corresponding physical quantities of the carrier at the times of the corresponding image frames.
Because visual odometers based on different solving principles and methods have different noise characteristics, the visual odometer is, without loss of generality, regarded as a black box. The noise of the directly observed relative pose does not accumulate and is approximately white, so the noise is modeled on the attitude-angle increments and the translation vector, namely:
Δγ̃ = Δγ + w_γ,  Δθ̃ = Δθ + w_θ,  Δφ̃ = Δφ + w_φ,  t̃ = t + w_t
wherein Δγ, Δθ and Δφ denote the changes of the roll, pitch and heading angles from time k to time k+1; Δγ̃, Δθ̃ and Δφ̃ denote the roll, pitch and heading measurements containing errors; and w_γ, w_θ, w_φ and w_t denote the white noises of the roll angle, pitch angle, heading angle and translation vector respectively.
Based on the analyzed noise characteristics of the visual odometer, the differences between the angular-velocity errors, linear-velocity errors and position errors solved by the inertial navigation system and by the visual odometer are selected as the observations of the integrated system, as shown below:
Z = [(ω_nb^b)_I − (ω_nb^b)_V; (v^n)_I − (v^n)_V; (p^n)_I − (p^n)_V] = HX + V
wherein δω_nb^b represents the angular-velocity error of the carrier relative to the geographic frame, δv^n the velocity error of the carrier, and δp^n the position error of the carrier. The subscript I denotes a value solved by the inertial navigation system and the subscript V a value computed by the visual odometer. H is the measurement matrix of the system.
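The observation vector above stacks the angular-velocity, velocity and position differences between the inertial solution and the visual-odometer solution. It can be assembled as in the following minimal sketch; the function and argument names are illustrative, not from the patent:

```python
import numpy as np

def build_measurement(omega_ins, omega_vo, v_ins, v_vo, p_ins, p_vo):
    """9-dimensional measurement vector Z: angular-rate, velocity and
    position differences between the INS solution (subscript I) and the
    visual-odometer solution (subscript V)."""
    return np.concatenate([omega_ins - omega_vo,   # angular-velocity residual
                           v_ins - v_vo,           # linear-velocity residual
                           p_ins - p_vo])          # position residual
```

Each residual block is driven by the corresponding error state, which is what the measurement matrix H encodes.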
(ω_nb^b)_I = ω_ib^b − C_n^b (ω_ie^n + ω_en^n)
wherein ω_ib^b is the output of the gyroscope, ω_ie^n is the Earth-rotation angular rate, and ω_en^n is the angular velocity caused by the motion of the carrier.
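On the visual-odometer side, the mean carrier angular rate over one frame interval can also be recovered directly from the relative rotation matrix via the SO(3) logarithm; this is an equivalent alternative, named plainly here, to the Euler-angle differential-equation conversion the text describes next. A hedged numpy sketch (function name and frame interval are illustrative):

```python
import numpy as np

def angular_rate_from_relative_rotation(R_rel, dt):
    """Mean body angular rate over one frame interval, recovered from a
    relative rotation matrix via the rotation-vector (log map) / dt."""
    cos_angle = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    angle = np.arccos(cos_angle)
    if np.isclose(angle, 0.0):
        return np.zeros(3)                       # no rotation between frames
    # Rotation axis from the skew-symmetric part of R_rel
    axis = np.array([R_rel[2, 1] - R_rel[1, 2],
                     R_rel[0, 2] - R_rel[2, 0],
                     R_rel[1, 0] - R_rel[0, 1]]) / (2.0 * np.sin(angle))
    return axis * angle / dt
```

For a rotation of 0.1 rad about the z axis over a 0.1 s frame interval this yields an angular rate of 1 rad/s about z.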
The value of (ω_nb^b)_V can be obtained by converting the attitude-change rates according to the Euler-angle differential equation, ω_nb^b = M(θ, γ) [γ', θ', φ']^T, where M(θ, γ) is the transformation matrix determined by the Euler rotation order and the angle rates are approximated by the attitude increments computed from the visual pose estimate between times k and k+1 divided by the frame interval; the remaining quantities are state quantities of the system. According to equations (5) - (7), the measurement matrix H of the system is as follows:
where the symbol × denotes the skew-symmetric (cross-product) matrix of the corresponding vector; M_v and M_p are respectively the velocity- and position-related coefficient matrices; |·| denotes the modulus of the corresponding vector; and I is the identity matrix.
Step 3) Establishing the inertial/visual-odometer integrated-navigation Kalman filtering model based on measurement-model optimization
31) Discretizing a system state equation and a measurement equation:
X_k = Φ_{k,k-1} X_{k-1} + Γ_{k,k-1} W_{k-1} (9)
Z_k = H_k X_k + V_k (10)
wherein X_k is the system state at time t_k; X_{k-1} is the system state at time t_{k-1}; Φ_{k,k-1} is the state transition matrix of the system from t_{k-1} to t_k; Γ_{k,k-1} is the noise-driving matrix of the system from t_{k-1} to t_k; W_{k-1} is the system noise at t_{k-1}; Z_k is the measurement vector at t_k; H_k is the measurement coefficient matrix at t_k; and V_k is the measurement-noise vector at t_k.
32) Adding a control term U_{k-1} to the discretized equation and correcting the inertial navigation error using the closed-loop-corrected system state equation:
X_k = Φ_{k,k-1} X_{k-1} + B_{k,k-1} U_{k-1} + Γ_{k,k-1} W_{k-1}
wherein B_{k,k-1} is the control coefficient matrix of the system from t_{k-1} to t_k.
33) Obtaining a linearized Kalman filter equation of the system:
X̂_{k,k-1} = Φ_{k,k-1} X̂_{k-1}
X̂_k = X̂_{k,k-1} + K_k (Z_k − H_k X̂_{k,k-1})
K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1}
P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Γ_{k-1} Q_{k-1} Γ_{k-1}^T
P_k = (I − K_k H_k) P_{k,k-1}
wherein X̂_{k,k-1} is the one-step state prediction from t_{k-1} to t_k; X̂_{k-1} is the filtered state estimate at t_{k-1}; X̂_k is the filtered state estimate at t_k; K_k is the filter gain matrix at t_k; Z_k − H_k X̂_{k,k-1} is the innovation sequence at t_k; P_{k,k-1} is the one-step prediction covariance matrix from t_{k-1} to t_k; P_{k-1} is the filtered-estimate covariance matrix at t_{k-1}; Φ_{k,k-1}^T is the transpose of Φ_{k,k-1}; Γ_{k-1} is the system noise coefficient matrix at t_{k-1}; Q_{k-1} is the system (process) noise covariance matrix at t_{k-1}; Γ_{k-1}^T is the transpose of Γ_{k-1}; H_k^T is the transpose of H_k; R_k is the measurement-noise covariance matrix at t_k; P_k is the filtered-estimate covariance matrix at t_k; and I is the identity matrix.
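The time-update and measurement-update relations of step 33), together with the closed-loop control term of step 32), amount to one standard Kalman predict/update cycle. The following compact numpy sketch illustrates that cycle; the function and argument names are illustrative, not from the patent:

```python
import numpy as np

def kalman_step(x_prev, P_prev, Phi, Gamma, Q, H, R, z, B=None, u=None):
    """One predict/update cycle of the linearized Kalman filter, with an
    optional control term B @ u implementing the closed-loop correction."""
    # Time update (one-step prediction)
    x_pred = Phi @ x_prev
    if B is not None and u is not None:
        x_pred = x_pred + B @ u                  # closed-loop control input
    P_pred = Phi @ P_prev @ Phi.T + Gamma @ Q @ Gamma.T
    # Measurement update
    S = H @ P_pred @ H.T + R                     # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)          # filter gain
    x_new = x_pred + K @ (z - H @ x_pred)        # innovation-weighted estimate
    P_new = (np.eye(len(x_prev)) - K @ H) @ P_pred
    return x_new, P_new
```

In the full method the estimated error state is then fed back to correct the strapdown solution, as described in step 404).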
In order to verify the correctness and effectiveness of the proposed inertial/visual-odometer integrated navigation method based on measurement-model optimization, a model was established by the method described above and semi-physical simulation verification was performed using the KITTI data set. The actual trajectory of the test vehicle is shown in figure 2.
Based on the selected data set, the proposed method was verified and compared with Kalman filtering that directly combines attitude and position; the navigation-error comparison curves are shown in figures 3(a) to 3(c).
In figs. 3(a) - 3(c), the solid line is the error curve of the present invention and the dotted line is the error curve of the directly-combined attitude-position Kalman filter. As the navigation-error comparison curves show, with the proposed inertial/visual-odometer integrated navigation method based on measurement-model optimization, the accuracy of the inertial navigation system is markedly improved over Kalman filtering with the direct attitude-position combination, which demonstrates its beneficial engineering application value.
The above description is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, several modifications and variations can be made without departing from the technical principle of the present invention, and these modifications and variations should also be regarded as the protection scope of the present invention.
Claims (5)
1. An inertial/visual odometer integrated navigation positioning method based on measurement model optimization is characterized in that,
step 1) establishing a state equation of the integrated navigation system, and expanding errors of the inertial sensor into system state variables, wherein the system state variables comprise random constant drift of a gyroscope, first-order Markov process drift of the gyroscope and first-order Markov process drift of an accelerometer;
step 2) obtaining, from the relative pose output by the visual odometer, the transformation from the relative pose to the attitude and position in the navigation frame;
step 3) selecting measurement variables of the integrated navigation system, and constructing an inertia/visual odometer integrated Kalman filtering measurement equation;
step 4) discretizing a system state equation and a measurement equation, and performing feedback correction on the system state quantity by adopting Kalman filtering;
the measurement equation of the step 3) is as follows:
Z = [(ω_nb^b)_I − (ω_nb^b)_V; (v^n)_I − (v^n)_V; (p^n)_I − (p^n)_V] = HX + V
wherein ω_nb^b is the angular velocity of the carrier relative to the geographic frame, so the first block of Z is its error δω_nb^b; δv^n is the velocity error of the carrier and δp^n the position error of the carrier; the subscript I denotes a value solved by the inertial navigation system and the subscript V a value computed by the visual odometer; H is the measurement coefficient matrix of the system and V is the measurement noise matrix of the system.
2. The integrated navigation and positioning method for inertial/visual odometer based on metrology model optimization as claimed in claim 1, wherein the state equation of the system in step 1) is:
wherein X is the state vector:
X = [φ^T, (δv^n)^T, (δp^n)^T, (ε^b)^T, (ε_r^b)^T, (∇^b)^T]^T
φ = [φ_x, φ_y, φ_z]^T is the mathematical-platform error angle of the strapdown inertial navigation system, φ_x, φ_y and φ_z being the platform error angles about the x, y and z axes; δv^n = [δv_x, δv_y, δv_z]^T is the velocity error of the carrier, with components along the x, y and z axes of the geographic frame; δp^n = [δL, δλ, δh]^T is the position error of the carrier, δL, δλ and δh being the latitude, longitude and height errors respectively; ε^b = [ε_x, ε_y, ε_z]^T is the random constant drift of the gyroscopes about the x, y and z axes; ε_r^b = [ε_rx, ε_ry, ε_rz]^T is the first-order Markov process drift of the gyroscopes about the x, y and z axes; ∇^b = [∇_x, ∇_y, ∇_z]^T is the first-order Markov process drift of the accelerometers along the x, y and z axes;
the superscript n represents a geographical system, the superscript b represents a loading system, the subscript I represents an inertial navigation system calculation value, and the subscript V represents a visual mileage calculation value;
dφ/dt = −ω_in^n × φ + δω_in^n − C_b^n (ε^b + ε_r^b)
d(δv^n)/dt = f^n × φ − (2ω_ie^n + ω_en^n) × δv^n − (2δω_ie^n + δω_en^n) × v^n + C_b^n ∇^b + δg^n
d(δL)/dt = δv_y/(R_m + h)
d(δλ)/dt = δv_x sec L/(R_n + h) + v_x sec L tan L · δL/(R_n + h)
d(δh)/dt = δv_z
dε^b/dt = 0
dε_r^b/dt = −ε_r^b/T_g + w_r
d∇^b/dt = −∇^b/T_a + w_a
wherein δω_in^n = δω_ie^n + δω_en^n is the angular-rate error of the geographic frame relative to the inertial frame; ω_in^n = ω_ie^n + ω_en^n is the angular velocity of the geographic frame relative to the inertial frame; C_b^n denotes the coordinate transformation matrix from the body frame to the geographic frame; f^n is the specific force output by the accelerometer expressed in the geographic frame; δω_ie^n is the angular-rate error of the Earth frame relative to the inertial frame; δω_en^n is the angular-rate error of the geographic frame relative to the Earth frame; ω_ie^n is the angular velocity of the Earth frame relative to the inertial frame; ω_en^n is the angular velocity of the geographic frame relative to the Earth frame; δg^n is the gravity-acceleration error; v^n is the velocity of the carrier in the geographic frame; v_x is the velocity of the carrier along the x axis in the geographic frame; R_m is the meridian radius of curvature and R_n the transverse (prime-vertical) radius of curvature; h is the local altitude; L is the local latitude; T_g and T_a are respectively the correlation times of the first-order Markov processes; and w_r and w_a are respectively the corresponding driving white noises.
3. The integrated navigation and positioning method for the inertial/visual odometer based on the metrology model optimization as claimed in claim 1, wherein the transformation from the relative pose in step 2) to the attitude and position in the navigation frame is:
C_b^n(k+1) = C_b^n(k) C_c^b R_{k,k+1} (C_c^b)^T
p^n(k+1) = p^n(k) + C_b^n(k) C_c^b t_{k,k+1}
wherein C_b^n(k) is the coordinate transformation matrix from the body frame to the geographic frame at the time of image frame k; C_c^b is the coordinate transformation matrix from the camera frame to the body frame; R_{k,k+1} and t_{k,k+1} are the rotation matrix and translation vector of image-frame time k+1 relative to image-frame time k as measured by the visual odometer; p^n(k) is the position of the carrier in the geographic frame at the time of image frame k; the superscript T denotes the matrix transpose; and the subscripts k and k+1 denote physical quantities at the corresponding frame times.
4. The integrated inertial/visual odometer navigation positioning method based on metrology model optimization of claim 2,
wherein C_n^b denotes the coordinate transformation matrix from the geographic frame to the body frame; M_v and M_p are respectively the velocity- and position-related coefficient matrices; v_x and v_y are respectively the components of the carrier velocity along the x and y axes; |·| denotes the modulus of the corresponding vector; and I is the identity matrix.
5. The inertial/visual odometer integrated navigation positioning method based on measurement model optimization as claimed in claim 1, wherein the specific process of step 4) is:
401) discretizing a system state equation and a measurement equation:
X_k = Φ_{k,k-1} X_{k-1} + Γ_{k,k-1} W_{k-1},  Z_k = H_k X_k + V_k, wherein X_k is the system state at time t_k, X_{k-1} is the system state at time t_{k-1}, Φ_{k,k-1} is the state transition matrix of the system from time t_{k-1} to time t_k, Γ_{k,k-1} is the noise-driving matrix of the system from time t_{k-1} to time t_k, W_{k-1} is the system noise at time t_{k-1}, Z_k is the attitude measurement vector at time t_k, H_k is the attitude measurement coefficient matrix at time t_k, and V_k is the attitude measurement noise at time t_k;
402) adding a control term U_{k-1} to the discretized formula and adopting closed-loop correction of the system state equation to correct the inertial navigation system error: X_k = Φ_{k,k-1} X_{k-1} + B_{k,k-1} U_{k-1} + Γ_{k,k-1} W_{k-1}, wherein B_{k,k-1} is the control-term coefficient matrix of the system from time t_{k-1} to time t_k;
403) obtaining the linearized Kalman filter equations of the system:
X̂_{k,k-1} = Φ_{k,k-1} X̂_{k-1},  Z̃_k = Z_k − H_k X̂_{k,k-1},  X̂_k = X̂_{k,k-1} + K_k Z̃_k,  K_k = P_{k,k-1} H_k^T (H_k P_{k,k-1} H_k^T + R_k)^{-1},  P_{k,k-1} = Φ_{k,k-1} P_{k-1} Φ_{k,k-1}^T + Γ_{k-1} Q_{k-1} Γ_{k-1}^T,  P_k = (I − K_k H_k) P_{k,k-1}, wherein X̂_{k,k-1} is the one-step state prediction from time t_{k-1} to time t_k; X̂_{k-1} is the filter state estimate at time t_{k-1}; X̂_k is the filter state estimate at time t_k; K_k is the filter gain matrix at time t_k; Z̃_k is the innovation sequence at time t_k; P_{k,k-1} is the one-step prediction covariance matrix from time t_{k-1} to time t_k; P_{k-1} is the filter state estimation covariance matrix at time t_{k-1}; Φ_{k,k-1}^T is the transpose of Φ_{k,k-1}; Γ_{k-1} is the system noise coefficient matrix at time t_{k-1}; Q_{k-1} is the system noise covariance matrix at time t_{k-1}; Γ_{k-1}^T is the transpose of Γ_{k-1}; H_k^T is the transpose of H_k; R_k is the measurement noise covariance matrix at time t_k; P_k is the filter state estimation covariance matrix at time t_k; and I is the identity matrix;
404) estimating the system navigation error values, including the attitude, position and velocity errors, according to the linearized Kalman filter equations obtained in step 403), and subtracting these error values from the navigation parameters calculated by the inertial navigation system to obtain the inertial/visual odometer integrated navigation correction values optimized based on the measurement model.
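The cycle of steps 401)-404) can be sketched as a single closed-loop Kalman filter step. This is an illustrative sketch only: matrix shapes, the optional control term, and the noise statistics are placeholders, not values specified in the patent.

```python
import numpy as np

def kalman_step(x_prev, P_prev, Phi, Gamma, Q, H, R, z, B=None, u=None):
    """One cycle of the linearized Kalman filter of steps 401)-403),
    with an optional closed-loop control term B @ u as in step 402).

    Matrices follow the discretized model
        X_k = Phi X_{k-1} + B U_{k-1} + Gamma W_{k-1},   Z_k = H X_k + V_k.
    """
    # One-step state prediction (with closed-loop correction term, if any)
    x_pred = Phi @ x_prev
    if B is not None and u is not None:
        x_pred = x_pred + B @ u
    # One-step prediction covariance
    P_pred = Phi @ P_prev @ Phi.T + Gamma @ Q @ Gamma.T
    # Filter gain
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    # Innovation and state update
    innovation = z - H @ x_pred
    x_est = x_pred + K @ innovation
    # Covariance update
    P_est = (np.eye(len(x_prev)) - K @ H) @ P_pred
    return x_est, P_est
```

In the method of step 404), the estimated error state `x_est` would then be subtracted from the inertial navigation solution to yield the corrected navigation output.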
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810477871.0A CN108731670B (en) | 2018-05-18 | 2018-05-18 | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108731670A CN108731670A (en) | 2018-11-02 |
CN108731670B true CN108731670B (en) | 2021-06-22 |
Family
ID=63938504
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810477871.0A Active CN108731670B (en) | 2018-05-18 | 2018-05-18 | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108731670B (en) |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109341724B (en) * | 2018-12-04 | 2023-05-05 | 中国航空工业集团公司西安航空计算技术研究所 | On-line calibration method for relative pose of airborne camera-inertial measurement unit |
CN109544696B (en) * | 2018-12-04 | 2022-12-20 | 中国航空工业集团公司西安航空计算技术研究所 | Accurate registration method for airborne enhanced synthetic visual virtual and real images based on visual inertial combination |
CN109655058A (en) * | 2018-12-24 | 2019-04-19 | 中国电子科技集团公司第二十研究所 | An inertial/visual intelligent integrated navigation method |
CN109443355B (en) * | 2018-12-25 | 2020-10-27 | 中北大学 | Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF |
CN109443353B (en) * | 2018-12-25 | 2020-11-06 | 中北大学 | Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF |
CN109443354B (en) * | 2018-12-25 | 2020-08-14 | 中北大学 | Visual-inertial tight coupling combined navigation method based on firefly group optimized PF |
CN109631875A (en) * | 2019-01-11 | 2019-04-16 | 京东方科技集团股份有限公司 | Method and system for optimizing a sensor attitude fusion measurement method |
CN109813311B (en) * | 2019-03-18 | 2020-09-15 | 南京航空航天大学 | Unmanned aerial vehicle formation collaborative navigation method |
CN110221333B (en) * | 2019-04-11 | 2023-02-10 | 同济大学 | Measurement error compensation method of vehicle-mounted INS/OD integrated navigation system |
FR3097045B1 (en) * | 2019-06-06 | 2021-05-14 | Safran Electronics & Defense | Method and device for resetting an inertial unit of a means of transport from information delivered by a viewfinder of the means of transport |
CN110455275B (en) * | 2019-08-07 | 2023-07-07 | 天津理工大学 | Positioning navigation system and method for large spherical storage tank wall climbing robot |
CN110728716B (en) * | 2019-09-04 | 2023-11-17 | 深圳市道通智能航空技术股份有限公司 | Calibration method and device and aircraft |
CN113074751B (en) * | 2019-12-17 | 2023-02-07 | 杭州海康威视数字技术股份有限公司 | Visual positioning error detection method and device |
CN110986939B (en) * | 2020-01-02 | 2022-06-28 | 东南大学 | Visual inertia odometer method based on IMU (inertial measurement Unit) pre-integration |
WO2021160098A1 (en) * | 2020-02-13 | 2021-08-19 | Guangdong Oppo Mobile Telecommunications Corp., Ltd. | Error state kalman filter for visual slam by dynamically tuning measurement noise covariance |
CN111595337B (en) * | 2020-04-13 | 2023-09-26 | 浙江深寻科技有限公司 | Inertial positioning self-correction method based on visual modeling |
CN111708377B (en) * | 2020-06-21 | 2022-10-25 | 西北工业大学 | Flight control method based on inertial navigation/flight control system information fusion |
CN112066983B (en) * | 2020-09-08 | 2022-09-23 | 中国人民解放军国防科技大学 | Inertial/odometer combined navigation filtering method, electronic equipment and storage medium |
CN112050809B (en) * | 2020-10-08 | 2022-06-17 | 吉林大学 | Wheel type odometer and gyroscope information fusion unmanned vehicle directional positioning method |
CN112325878A (en) * | 2020-10-30 | 2021-02-05 | 南京航空航天大学 | Ground carrier combined navigation method based on UKF and air unmanned aerial vehicle node assistance |
CN112461237B (en) * | 2020-11-26 | 2023-03-14 | 浙江同善人工智能技术有限公司 | Multi-sensor fusion positioning method applied to dynamic change scene |
CN112683263B (en) * | 2020-12-12 | 2022-11-11 | 西北工业大学 | UWB/IMU/ODOM multi-sensor data fusion mobile robot positioning method based on improved model |
CN112729283A (en) * | 2020-12-21 | 2021-04-30 | 西北工业大学 | Navigation method based on depth camera/MEMS inertial navigation/odometer combination |
CN112577493B (en) * | 2021-03-01 | 2021-05-04 | 中国人民解放军国防科技大学 | Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance |
CN113155154B (en) * | 2021-04-07 | 2022-09-20 | 扬州大学 | Error correction method based on attitude and mileage of sensor and camera |
CN113362366B (en) * | 2021-05-21 | 2023-07-04 | 上海奥视达智能科技有限公司 | Sphere rotation speed determining method and device, terminal and storage medium |
CN113375667B (en) * | 2021-07-15 | 2022-02-22 | 北京百度网讯科技有限公司 | Navigation method, device, equipment and storage medium |
CN114184209B (en) * | 2021-10-29 | 2023-10-13 | 北京自动化控制设备研究所 | Inertial error suppression method for low-speed detection platform system |
CN114383609A (en) * | 2021-12-22 | 2022-04-22 | 中国煤炭科工集团太原研究院有限公司 | Combined navigation mine positioning method and device based on strapdown inertial navigation and odometer |
CN114323076A (en) * | 2021-12-31 | 2022-04-12 | 深圳市优必选科技股份有限公司 | Odometer calibration method, device, robot and readable storage medium |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102435188A (en) * | 2011-09-15 | 2012-05-02 | 南京航空航天大学 | Monocular vision/inertia autonomous navigation method for indoor environment |
CN102538781A (en) * | 2011-12-14 | 2012-07-04 | 浙江大学 | Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method |
CN103292804A (en) * | 2013-05-27 | 2013-09-11 | 浙江大学 | Monocular natural vision landmark assisted mobile robot positioning method |
CN105371840A (en) * | 2015-10-30 | 2016-03-02 | 北京自动化控制设备研究所 | Method for combined navigation of inertia/visual odometer/laser radar |
Non-Patent Citations (3)
Title |
---|
A Design of Odometer-aided Visual Inertial Integrated Navigation Algorithm Based on Multiple View Geometry Constraints; Deqiang Tian et al.; 2017 9th International Conference on Intelligent Human-Machine Systems and Cybernetics; 2017-09-21; pp. 161-166 * |
Loosely Coupled Kalman Filtering for Fusion of Visual Odometry and Inertial Navigation; Salim et al.; Proceedings of the 16th International Conference on Information Fusion; 2013-10-21; pp. 219-226 * |
Robot localization based on inertial sensors and visual odometry; Xia Lingnan et al.; Chinese Journal of Scientific Instrument; 2013-01-31; Vol. 34, No. 1; pp. 166-172 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108731670B (en) | Inertial/visual odometer integrated navigation positioning method based on measurement model optimization | |
US20220018962A1 (en) | Positioning method and device based on multi-sensor fusion | |
CN110243358B (en) | Multi-source fusion unmanned vehicle indoor and outdoor positioning method and system | |
CN106871928B (en) | Strap-down inertial navigation initial alignment method based on lie group filtering | |
CN108387236B (en) | Polarized light SLAM method based on extended Kalman filtering | |
CN108362288B (en) | Polarized light SLAM method based on unscented Kalman filtering | |
CN104180818A (en) | Monocular vision mileage calculating device | |
CN113252033A (en) | Positioning method, positioning system and robot based on multi-sensor fusion | |
CN114001733A (en) | Map-based consistency efficient visual inertial positioning algorithm | |
CN112577493A (en) | Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance | |
CN111189474A (en) | Autonomous calibration method of MARG sensor based on MEMS | |
CN114693754B (en) | Unmanned aerial vehicle autonomous positioning method and system based on monocular vision inertial navigation fusion | |
CN115200578A (en) | Polynomial optimization-based inertial-based navigation information fusion method and system | |
CN113483755B (en) | Multi-sensor combination positioning method and system based on non-global consistent map | |
CN109443353B (en) | Visual-inertial tight coupling combined navigation method based on fuzzy self-adaptive ICKF | |
CN109443355B (en) | Visual-inertial tight coupling combined navigation method based on self-adaptive Gaussian PF | |
CN111257853A (en) | Automatic driving system laser radar online calibration method based on IMU pre-integration | |
CN113008229B (en) | Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor | |
CN112325878A (en) | Ground carrier combined navigation method based on UKF and air unmanned aerial vehicle node assistance | |
CN110375773B (en) | Attitude initialization method for MEMS inertial navigation system | |
CN111220151B (en) | Inertia and milemeter combined navigation method considering temperature model under load system | |
CN111982126A (en) | Design method of full-source BeiDou/SINS elastic state observer model | |
CN114459474B (en) | Inertial/polarization/radar/optical-fluidic combined navigation method based on factor graph | |
Zhou et al. | Integrated INS/GPS system for an autonomous mobile vehicle | |
CN115540854A (en) | Active positioning method, equipment and medium based on UWB assistance |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||