CN115793001A - Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing - Google Patents
Abstract
The invention discloses a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing. First, the three sensors (inertial navigation, a binocular camera and a satellite antenna) are rigidly fixed to the same carrier, and the inertial navigation zero offset, the camera parameters and the transformation matrices among the three sensors are calibrated. The inertial navigation and visual data are then matched by timestamp and fused in a graph-optimized visual SLAM framework, yielding a vision-inertial navigation primary positioning result at the same frequency as the visual data. The inertial navigation and satellite navigation data are likewise matched by timestamp and fused by extended Kalman filtering, yielding an inertial navigation-satellite navigation primary positioning result at the same frequency as the inertial navigation data. Finally, the two primary positioning results undergo a secondary fusion by graph optimization to obtain the final pose result. By multiplexing the inertial navigation data, the invention raises the pose output frequency of satellite navigation, reduces the time-matching errors among the inertial navigation, vision and satellite navigation data, and improves pose accuracy.
Description
Technical Field
The invention belongs to the technical field of multi-source fusion positioning, and particularly relates to a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing.
Background
In the traditional vision, inertial navigation and satellite navigation multi-source fusion positioning technology, vision and inertial navigation data are fused to obtain a positioning result, this result is then fused with the satellite navigation data, and the final positioning result is output. Given known inertial navigation and satellite navigation data, fusing these two first can raise the positioning output frequency before they participate in the subsequent fusion. The existing vision, inertial navigation and satellite navigation multi-source fusion positioning methods have at least the following technical problems:
firstly, the multi-source data have different frequencies and the satellite navigation data frequency is low, so the time error incurred during multi-source data matching is large;
secondly, satellite signals are strongly affected by the external environment, so positioning accuracy varies across environments; when satellite signals participate directly in multi-source fusion positioning, they strongly affect the final positioning accuracy;
thirdly, the satellite navigation data among the multi-source data are used only as a residual term in the global optimization and are not fully exploited for positioning calculation.
Therefore, the prior-art methods suffer from large time errors when matching multi-source data of different frequencies, from environmental degradation of satellite navigation accuracy that propagates into the final positioning accuracy, and from under-utilization of the multi-source data.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, which solves the prior-art problems of large time errors when matching multi-source data of different frequencies, environmental influence on satellite navigation accuracy that propagates into the final positioning accuracy, and under-utilization of the multi-source data.
In order to achieve the above object, the technical scheme provided by the invention is a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, which comprises the following steps:
step 1, acquiring the inertial navigation zero offset, the intrinsic matrix of the binocular camera, and the transformation matrices among the three sensors (inertial navigation, the binocular camera and the satellite antenna);
step 2, acquiring the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ together with the binocular vision data; matching the inertial navigation acceleration and angular velocity with the binocular vision data by timestamp; fusing the inertial navigation and vision data in a graph-optimized visual SLAM framework; and computing a vision-inertial navigation primary positioning result at the same frequency as the binocular vision data;
Step 2.1, according to the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$, computing the integration result between frame $k$ and frame $k+1$ to obtain the position, velocity and attitude of frame $k+1$;
step 2.2, realizing mutual constraint of vision and inertial navigation according to the pre-integration information calculated in the step 2.1, and finishing initialization;
step 2.3, constructing an objective function, wherein the objective function comprises three parts of marginalization prior information, an inertial measurement residual error and a visual re-projection residual error, optimizing the state vector of each frame to enable the objective function to reach the minimum value, and solving the final pose result of each frame;
step 3, acquiring the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ together with the satellite navigation data; matching the inertial navigation angular velocity and acceleration with the satellite navigation data by timestamp; fusing the inertial navigation and satellite navigation data by extended Kalman filtering; and computing an inertial navigation-satellite navigation primary positioning result at the same frequency as the inertial navigation data;
Step 4, matching the two primary positioning results by timestamp, constructing an objective function from the two primary positioning results, and performing global optimization to obtain the final positioning result.
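The timestamp matching used in steps 2 to 4 can be sketched as a nearest-neighbour search between two sorted streams; the function name `match_by_timestamp` and the 5 ms tolerance below are illustrative assumptions, not values fixed by the invention.

```python
import bisect

def match_by_timestamp(ref_stamps, query_stamps, max_dt=0.005):
    """For each reference timestamp, return the index of the closest
    query timestamp within max_dt seconds (None if none qualifies).
    Both lists must be sorted in ascending order."""
    matches = []
    for t in ref_stamps:
        i = bisect.bisect_left(query_stamps, t)
        best = None
        for j in (i - 1, i):  # neighbours straddling the insertion point
            if 0 <= j < len(query_stamps):
                if best is None or abs(query_stamps[j] - t) < abs(query_stamps[best] - t):
                    best = j
        if best is not None and abs(query_stamps[best] - t) <= max_dt:
            matches.append(best)
        else:
            matches.append(None)
    return matches

# e.g. 20 Hz camera stamps matched against 200 Hz inertial stamps
cam = [0.00, 0.05, 0.10]
imu = [i * 0.005 for i in range(30)]
print(match_by_timestamp(cam, imu))  # indices of the IMU samples nearest each camera frame
```

Because both streams are sorted, each lookup is logarithmic, which matters when the inertial stream runs at hundreds of hertz.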
Moreover, in step 1 the three sensors (inertial navigation, the binocular camera and the satellite antenna) are rigidly fixed to the same carrier; the inertial navigation zero offset is calibrated statically; the intrinsic matrix of the binocular camera and the transformation matrix between the inertial navigation and the binocular camera are calibrated jointly with a calibration plate; the transformation matrices from the satellite antenna to the inertial navigation and from the satellite antenna to the binocular camera are measured; thereby the inertial navigation zero offset, the intrinsic matrix of the binocular camera, and the transformation matrices among the three sensors are obtained.
Furthermore, in step 2.1 the position, velocity and attitude of frame $k+1$ are obtained from the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ by integrating between frame $k$ and frame $k+1$:

$$p^w_{k+1} = p^w_k + v^w_k\,\Delta t_k + \iint_{t\in[k,k+1]} \left( R^w_t\,(\hat a_t - b_{a_t}) - g^w \right) \mathrm{d}t^2$$

$$v^w_{k+1} = v^w_k + \int_{t\in[k,k+1]} \left( R^w_t\,(\hat a_t - b_{a_t}) - g^w \right) \mathrm{d}t$$

$$q^w_{k+1} = q^w_k \otimes \int_{t\in[k,k+1]} \tfrac12\,\Omega(\hat\omega_t - b_{w_t})\,q^k_t\,\mathrm{d}t$$

where $p^w_{k+1}$ is the position of frame-$(k+1)$ inertial navigation in the world coordinate system, $p^w_k$ is the position of frame-$k$ inertial navigation in the world coordinate system, $v^w_k$ is the velocity of frame-$k$ inertial navigation in the world coordinate system, $\Delta t_k$ is the time interval between the two keyframes, $\hat a_t$ is the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $R^w_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system, $b_{a_t}$ is the zero offset of the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $g^w$ is the gravity vector in the world coordinate system, $v^w_{k+1}$ is the velocity of frame-$(k+1)$ inertial navigation, $q^w_{k+1}$ is the attitude of frame-$(k+1)$ inertial navigation, $q^w_k$ is the attitude of frame-$k$ inertial navigation, $\otimes$ is the quaternion multiplication symbol, $\Omega(\cdot)$ is the angular velocity transformation function, $\hat\omega_t$ is the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, $b_{w_t}$ is the zero offset of the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, and $q^k_t$ is the attitude change of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system. Because the inertial navigation data frequency is higher than the visual data frequency, $t$ indexes the inertial navigation data frames between frame $k$ and frame $k+1$ of the visual data. The world coordinate system is the right-hand system constructed with the position of frame-0 inertial navigation as origin and axes along the gravity, north and east directions; the inertial navigation coordinate system is the right-hand system constructed with the position of the current inertial navigation frame as origin and axes along the gyroscope and accelerometer directions of the inertial navigation hardware.
The frame-$k$ state variables are separated out of the integral term between frame $k$ and frame $k+1$, giving:

$$\alpha^k_{k+1} = \iint_{t\in[k,k+1]} R^k_t\,(\hat a_t - b_{a_t})\,\mathrm{d}t^2$$

$$\beta^k_{k+1} = \int_{t\in[k,k+1]} R^k_t\,(\hat a_t - b_{a_t})\,\mathrm{d}t$$

$$\gamma^k_{k+1} = \int_{t\in[k,k+1]} \tfrac12\,\Omega(\hat\omega_t - b_{w_t})\,\gamma^k_t\,\mathrm{d}t$$

where $\alpha^k_{k+1}$ is the position change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\beta^k_{k+1}$ is the velocity change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\gamma^k_{k+1}$ is the attitude change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\gamma^k_t$ is the attitude change of frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $R^k_t$ is the rotation matrix from the frame-$t$ to the frame-$k$ inertial navigation coordinate system, $\hat a_t$ is the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $b_{a_t}$ is the zero offset of the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $\Omega(\cdot)$ is the angular velocity transformation function, $\hat\omega_t$ is the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, and $b_{w_t}$ is the zero offset of the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system.
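As a concrete illustration, the separated pre-integration terms can be accumulated by simple Euler integration over the inertial samples between two keyframes. This is a minimal sketch under assumed helper names and a [w, x, y, z] quaternion convention, not the exact discretization prescribed by the invention.

```python
import numpy as np

def quat_mul(q, p):
    """Hamilton product of quaternions stored as [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_rot(q):
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def preintegrate(acc, gyr, dt, ba, bg):
    """Euler integration of the pre-integration terms between two
    keyframes: alpha (position), beta (velocity), gamma (rotation),
    all expressed in the body frame of keyframe k."""
    alpha, beta = np.zeros(3), np.zeros(3)
    gamma = np.array([1.0, 0.0, 0.0, 0.0])    # identity quaternion
    for a, w in zip(acc, gyr):
        R = quat_to_rot(gamma)                # R^{k}_{t}
        a_corr = R @ (a - ba)                 # bias-corrected, rotated accel
        alpha += beta * dt + 0.5 * a_corr * dt * dt
        beta += a_corr * dt
        half = 0.5 * (w - bg) * dt            # small-angle quaternion update
        gamma = quat_mul(gamma, np.array([1.0, *half]))
        gamma /= np.linalg.norm(gamma)
    return alpha, beta, gamma
```

With zero angular velocity and constant unit acceleration, `beta` grows linearly and `alpha` quadratically, matching the single and double integrals above.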
In step 2.2, the zero offset of the inertial navigation data is constrained by the visual data, while the inertial navigation data provide scale information for the visual data. An objective function is constructed from the rotations of the visual and inertial navigation data, and the variable $\delta b_w$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset. The objective function is:

$$\min_{\delta b_w} \sum_{k\in\mathcal B} \left\| \left(q^{c_0}_{k+1}\right)^{-1} \otimes q^{c_0}_k \otimes \gamma^k_{k+1} \right\|^2$$

where $q^{c_0}_{k+1}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system, $q^{c_0}_k$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system, $\gamma^k_{k+1}$ is the rotation from the frame-$(k+1)$ to the frame-$k$ inertial navigation coordinate system, and $\otimes$ is the quaternion multiplication symbol. The camera coordinate system is the right-hand system constructed with the position of the left eye of the current camera frame as origin and axes along the lateral, longitudinal and forward directions of the image.

Here

$$\gamma^k_{k+1} \approx \hat\gamma^k_{k+1} \otimes \begin{bmatrix} 1 \\ \tfrac12\,J^\gamma_{b_w}\,\delta b_w \end{bmatrix}$$

where $\hat\gamma^k_{k+1}$ is the estimate of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integrating the angular velocity observations, $\otimes$ is the quaternion multiplication symbol, $J^\gamma_{b_w}$ is the Jacobian matrix of the rotation increment with respect to the angular velocity zero offset, and $\delta b_w$ is the increment of the angular velocity zero offset.
Then an objective function is constructed from the translations and rotations of vision and inertial navigation, and the optimization variable $\mathcal X_I$ is adjusted so that the objective function reaches its minimum, thereby initializing velocity, gravity and scale. The objective function is:

$$\min_{\mathcal X_I} \sum_{k\in\mathcal B} \left\| \begin{bmatrix} \delta\alpha^k_{k+1} \\ \delta\beta^k_{k+1} \end{bmatrix} \right\|^2$$

$$\delta\alpha^k_{k+1} = \hat\alpha^k_{k+1} - R^k_{c_0}\left( s\,(\bar p^{c_0}_{k+1} - \bar p^{c_0}_k) + \tfrac12\,g^{c_0}\Delta t_k^2 - R^{c_0}_k\,v^k\,\Delta t_k \right)$$

$$\delta\beta^k_{k+1} = \hat\beta^k_{k+1} - R^k_{c_0}\left( R^{c_0}_{k+1}\,v^{k+1} + g^{c_0}\Delta t_k - R^{c_0}_k\,v^k \right)$$

where $\mathcal X_I$ denotes the optimization variable, the minimization denotes adjusting $\mathcal X_I$ so that the inertial measurement residual reaches its minimum, $\delta\alpha^k_{k+1}$ is the pre-integration position increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\delta\beta^k_{k+1}$ is the pre-integration velocity increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\hat\alpha^k_{k+1}$ is the pre-integration position increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\hat\beta^k_{k+1}$ is the pre-integration velocity increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $R^k_{c_0}$ is the rotation matrix from the frame-0 camera coordinate system to the frame-$k$ inertial navigation coordinate system, $s$ is the scale, $R^{c_0}_k$ is the rotation matrix from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system, $v^{k+1}$ is the three-axis velocity in the frame-$(k+1)$ inertial navigation coordinate system, $v^k$ is the three-axis velocity in the frame-$k$ inertial navigation coordinate system, $R^{c_0}_{k+1}$ is the rotation matrix from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system, $\bar p^{c_0}_k$ is the relative position of frame-$k$ inertial navigation in the frame-0 camera coordinate system, $\bar p^{c_0}_{k+1}$ is the relative position of frame-$(k+1)$ inertial navigation in the frame-0 camera coordinate system, $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system, and $\Delta t_k$ is the time interval between the two keyframes.
where:

$$\mathcal X_I = \left[ v^0,\, v^1,\, \ldots,\, v^n,\, g^{c_0},\, s \right]^{\mathsf T} \in \mathbb R^{\,3(n+1)+3+1}$$

Here $\mathcal X_I$ is the optimization variable containing $3(n+1)+3+1$ unknowns: $3(n+1)$ refers to the three-axis velocity unknowns of frames 0 through $n$, 3 refers to the gravity vector unknowns expressed by $g^{c_0}$, and 1 refers to the scale unknown expressed by $s$; $v^0,\ldots,v^n$ are the three-axis velocities of frames 0 through $n$ in their inertial navigation coordinate systems, $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system, and $s$ is the scale.
Moreover, the objective function in step 2.3 is:

$$\min_{\mathcal X} \left\{ \left\| r_p - J_p\,\mathcal X \right\|^2 + \sum_{k\in\mathcal B} \left\| r_B\!\left(\hat z^k_{k+1},\,\mathcal X\right) \right\|^2 + \sum_{(m,j)\in\mathcal C} \left\| r_C\!\left(\hat z^{c_j}_m,\,\mathcal X\right) \right\|^2 \right\}$$

where $r_p$ is the marginalization prior information, $J_p$ is the Jacobian matrix computed from the prior information, $r_B(\hat z^k_{k+1},\mathcal X)$ is the inertial measurement residual constructed from the optimization variable $\mathcal X$, and $r_C(\hat z^{c_j}_m,\mathcal X)$ is the visual reprojection residual constructed from the optimization variable $\mathcal X$.
where:

$$\mathcal X = \left[ x_0,\, x_1,\, \ldots,\, x_k,\, x^b_c,\, \lambda_0,\, \lambda_1,\, \ldots,\, \lambda_m \right]$$

$$x_k = \left[ p^w_k,\, v^w_k,\, q^w_k,\, b_a,\, b_w \right], \qquad x^b_c = \left[ p^b_c,\, q^b_c \right]$$

In the optimization variable $\mathcal X$, $x_k$ is the state vector of frame $k$ to be optimized, $x^b_c$ is the transformation from the camera coordinate system to the inertial navigation coordinate system, and $\lambda_m$ is the inverse depth (the reciprocal of the depth) of the $m$-th feature point. $x_0$ through $x_k$ are the state vectors of frames 0 through $k$ to be optimized; $p^w_k$, $v^w_k$ and $q^w_k$ are the position, velocity and attitude of frame-$k$ inertial navigation in the world coordinate system; $b_a$ and $b_w$ are the zero offsets of the three-axis acceleration and three-axis angular velocity in the frame-$k$ inertial navigation coordinate system; $p^b_c$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system; and $q^b_c$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system.
The optimization variable $\mathcal X$ is adjusted by the Gauss-Newton method so that the objective function reaches its minimum; the state vectors obtained at this point constitute the vision-inertial navigation primary positioning result.
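The Gauss-Newton adjustment used here repeatedly linearizes the residual and solves the normal equations. A minimal generic sketch on a toy one-parameter curve-fit residual (not the full visual-inertial objective) is:

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=10):
    """Minimize ||residual(x)||^2 by Gauss-Newton iteration:
    at each step solve (J^T J) dx = -J^T r and update x."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + dx
    return x

# toy example: fit y = exp(a * t) to noiseless samples generated with a = 0.3
t = np.linspace(0.0, 1.0, 20)
y = np.exp(0.3 * t)
res = lambda x: np.exp(x[0] * t) - y
jac = lambda x: (t * np.exp(x[0] * t)).reshape(-1, 1)
a = gauss_newton(res, jac, [0.0])
```

In the actual method the residual stacks the marginalization prior, the inertial measurement residuals and the visual reprojection residuals, but the iteration has the same shape.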
Furthermore, in step 3 the state transition equation and observation equation of the Kalman filter are constructed from the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ together with the satellite navigation data:

$$x_k = f(x_{k-1}) + w_k, \qquad z_k = h(x_k) + v_k$$

where $z_k$ is the observation of frame $k$, $w_k$ is the process noise of frame $k$, $v_k$ is the observation noise of frame $k$, and $x_k$ is the inertial navigation state of frame $k$, consisting of the position, velocity, attitude, and gyroscope and accelerometer zero-offset parameters, expressed as:

$$x_k = \left[ p_k,\, v_k,\, q_k,\, b_w,\, b_a \right]$$
estimation of previous frame using Taylor expansionAnd a state prediction value of the current frameAnd (3) unfolding:
in the formula,is a function ofIn thatJacobian moment ofThe number of the arrays is determined,is a function ofIn thatThe jacobian matrix of (a).
The prediction equations are:

$$\hat x_k^- = f(\hat x_{k-1}), \qquad P_k^- = F_k\,P_{k-1}\,F_k^{\mathsf T} + Q$$

where $\hat x_k^-$ is the predicted inertial navigation state of frame $k$, $P_k^-$ is the covariance matrix of the predicted inertial navigation state of frame $k$, $F_k$ is the Jacobian matrix of $f$ at $\hat x_{k-1}$, $P_{k-1}$ is the covariance matrix of the frame-$(k-1)$ inertial navigation state, and $Q$ is the covariance matrix of the process noise.
The correction equations are:

$$K_k = P_k^-\,H_k^{\mathsf T}\left( H_k\,P_k^-\,H_k^{\mathsf T} + R \right)^{-1}$$

$$\hat x_k = \hat x_k^- + K_k\left( z_k - h(\hat x_k^-) \right), \qquad P_k = \left( I - K_k H_k \right) P_k^-$$

where $K_k$ is the Kalman gain of frame $k$, $P_k^-$ is the covariance matrix of the predicted inertial navigation state of frame $k$, $H_k$ is the Jacobian matrix of $h$ at $\hat x_k^-$, $R$ is the covariance matrix of the observation noise, $\hat x_k$ is the output inertial navigation state of frame $k$, $z_k$ is the observation of frame $k$, $\hat x_k^-$ is the predicted inertial navigation state of frame $k$, and $I$ is the identity matrix.
The state estimate computed by the above extended Kalman filter algorithm is the inertial navigation-satellite navigation primary positioning result.
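The predict and correct equations above follow the standard Kalman form. A compact linear special case (a constant-velocity state with observed position, an illustrative stand-in for the 15-dimensional inertial state) shows the same loop:

```python
import numpy as np

def ekf_step(x, P, z, F, H, Q, R):
    """One Kalman predict + correct step. Linear f and h are shown
    for brevity; the extended form replaces F and H by the Jacobians
    of f and h at the current estimate."""
    # prediction
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # correction
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# constant-velocity model: state x = [position, velocity], position observed
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])
Q = 1e-4 * np.eye(2)
R = np.array([[1e-2]])
x, P = np.zeros(2), np.eye(2)
for k in range(50):
    z = np.array([k * dt])       # true unit velocity, noiseless measurements
    x, P = ekf_step(x, P, z, F, H, Q, R)
```

After a few steps the velocity estimate converges toward the true unit velocity even though only position is observed.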
Furthermore, in step 4 the vision-inertial navigation primary positioning result and the inertial navigation-satellite navigation primary positioning result are matched by timestamp, and an objective function is constructed:

$$\min_{T^{c_0}_{k,k+1},\;T_{EC}} \sum_k \left( \left\| \left( T^{c_0}_{k,k+1} \right)^{-1} T_{EC}\, T^{E}_{k,k+1}\, T_{EC}^{-1} \right\|^2 + \left\| \left( T^{c_0}_{k,k+1} \right)^{-1} \widetilde T^{c_0}_{k,k+1} \right\|^2 \right)$$

where $T^{E}_{k,k+1}$ is the transformation matrix between two frames of the inertial navigation-satellite navigation primary positioning result in the longitude-latitude-height coordinate system, $T_{EC}$ is the transformation matrix from the inertial navigation-satellite navigation primary positioning result to the vision-inertial navigation primary positioning result, $T^{c_0}_{k,k+1}$ is the transformation matrix of the pose between the two frames in the frame-0 camera coordinate system, $\widetilde T^{c_0}_{k,k+1}$ is the transformation matrix between the two frames of the vision-inertial navigation primary positioning result in the frame-0 camera coordinate system, and $T^{c_0}_{k,k+1}$ and $T_{EC}$ are the pose variables to be optimized.
The optimization variables are adjusted by the Gauss-Newton method so that the objective function over the matched frames reaches its minimum, and the final optimization variables are output as the final pose result.
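The secondary fusion compares relative poses of the two primary results after mapping them into a common frame. A sketch of one such relative-pose consistency check with 4x4 homogeneous matrices (the names `T_vio_rel`, `T_ins_rel` and `T_align` are illustrative assumptions, not the patent's symbols):

```python
import numpy as np

def relative(T_a, T_b):
    """Relative transform taking frame a to frame b."""
    return np.linalg.inv(T_a) @ T_b

def pose_residual(T_vio_rel, T_ins_rel, T_align):
    """Map the INS/GNSS relative pose through the alignment transform
    and compare it against the VIO relative pose; the returned 4x4
    error transform is the identity when the two results agree."""
    mapped = T_align @ T_ins_rel @ np.linalg.inv(T_align)
    return np.linalg.inv(T_vio_rel) @ mapped

# when both primary results report the same relative motion,
# the residual transform is the identity
T = np.eye(4)
T[:3, 3] = [1.0, 2.0, 3.0]
err = pose_residual(T, T, np.eye(4))
```

A graph optimizer would minimize the norm of the rotation and translation parts of this error transform over all matched frame pairs.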
Compared with the prior art, the invention has the following advantages:
Without adding new observation data, fusing inertial navigation with satellite navigation raises the output frequency and positioning accuracy of satellite navigation, effectively reduces the time-matching error of the multi-source data, and improves the positioning accuracy.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Detailed Description
The invention provides a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, and the technical scheme of the invention is further explained by combining the attached drawings and an embodiment.
As shown in fig. 1, the process of the embodiment of the present invention includes the following steps:
step 1, rigidly fixing the three sensors (inertial navigation, a binocular camera and a satellite antenna) to the same carrier; calibrating the inertial navigation zero offset statically; jointly calibrating the intrinsic matrix of the binocular camera and the transformation matrix between the inertial navigation and the binocular camera with a calibration plate; measuring the transformation matrix from the satellite antenna to the inertial navigation and the transformation matrix from the satellite antenna to the binocular camera; and thereby obtaining the inertial navigation zero offset, the intrinsic matrix of the binocular camera, and the transformation matrices among the three sensors.
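The pairwise transforms calibrated in step 1 must be mutually consistent: with 4x4 homogeneous matrices, the antenna-to-camera transform can be recovered from (or checked against) the other two by composition. The numeric values below are illustrative, not calibration results from the invention.

```python
import numpy as np

def make_T(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# illustrative calibration results (identity rotations for simplicity)
T_imu_cam = make_T(np.eye(3), [0.05, 0.0, 0.0])   # camera pose in the IMU frame
T_imu_ant = make_T(np.eye(3), [0.0, 0.0, 0.30])   # antenna pose in the IMU frame

# antenna pose expressed in the camera frame, by composing the chain
T_cam_ant = np.linalg.inv(T_imu_cam) @ T_imu_ant
```

If the measured antenna-to-camera transform disagrees with this composed value, at least one of the three calibrations is inconsistent.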
Step 2, acquiring the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ together with the binocular vision data; matching the inertial navigation acceleration and angular velocity with the binocular vision data by timestamp; fusing the inertial navigation and vision data in a graph-optimized visual SLAM framework; and computing the vision-inertial navigation primary positioning result at the same frequency as the binocular vision data.
Step 2.1, according to the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$, computing the integration result between frame $k$ and frame $k+1$ to obtain the position, velocity and attitude of frame $k+1$:

$$p^w_{k+1} = p^w_k + v^w_k\,\Delta t_k + \iint_{t\in[k,k+1]} \left( R^w_t\,(\hat a_t - b_{a_t}) - g^w \right) \mathrm{d}t^2$$

$$v^w_{k+1} = v^w_k + \int_{t\in[k,k+1]} \left( R^w_t\,(\hat a_t - b_{a_t}) - g^w \right) \mathrm{d}t$$

$$q^w_{k+1} = q^w_k \otimes \int_{t\in[k,k+1]} \tfrac12\,\Omega(\hat\omega_t - b_{w_t})\,q^k_t\,\mathrm{d}t$$

where $p^w_{k+1}$ is the position of frame-$(k+1)$ inertial navigation in the world coordinate system (usually the right-hand system constructed with the position of frame-0 inertial navigation as origin and axes along the gravity, north and east directions), $p^w_k$ is the position of frame-$k$ inertial navigation in the world coordinate system, $v^w_k$ is the velocity of frame-$k$ inertial navigation in the world coordinate system, $\Delta t_k$ is the time interval between the two keyframes, $\hat a_t$ is the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system (the inertial navigation coordinate system, i.e. the body system, is the right-hand system constructed with the position of the current inertial navigation frame as origin and axes along the gyroscope and accelerometer directions of the inertial navigation hardware, generally front-right-down), $R^w_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system, $b_{a_t}$ is the zero offset of the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $g^w$ is the gravity vector in the world coordinate system, $v^w_{k+1}$ is the velocity of frame-$(k+1)$ inertial navigation, $q^w_{k+1}$ is the attitude of frame-$(k+1)$ inertial navigation, $q^w_k$ is the attitude of frame-$k$ inertial navigation, $\otimes$ is the quaternion multiplication symbol, $\Omega(\cdot)$ is the angular velocity transformation function, $\hat\omega_t$ is the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, $b_{w_t}$ is the zero offset of the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, and $q^k_t$ is the attitude change of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system. Because the inertial navigation data frequency is higher than the visual data frequency, $t$ indexes the inertial navigation data frames between frame $k$ and frame $k+1$ of the visual data.
Since the integration of the inertial navigation data depends on the velocity and attitude of frame $k$, and the back-end nonlinear optimization iterates and updates these frame-$k$ states, the integral would otherwise have to be recomputed with the new values after every iteration, at great computational cost. The frame-$k$ state variables are therefore separated out of the integral term between frame $k$ and frame $k+1$:

$$\alpha^k_{k+1} = \iint_{t\in[k,k+1]} R^k_t\,(\hat a_t - b_{a_t})\,\mathrm{d}t^2$$

$$\beta^k_{k+1} = \int_{t\in[k,k+1]} R^k_t\,(\hat a_t - b_{a_t})\,\mathrm{d}t$$

$$\gamma^k_{k+1} = \int_{t\in[k,k+1]} \tfrac12\,\Omega(\hat\omega_t - b_{w_t})\,\gamma^k_t\,\mathrm{d}t$$

where $\alpha^k_{k+1}$ is the position change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\beta^k_{k+1}$ is the velocity change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\gamma^k_{k+1}$ is the attitude change of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\gamma^k_t$ is the attitude change of frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $R^k_t$ is the rotation matrix from the frame-$t$ to the frame-$k$ inertial navigation coordinate system, $\hat a_t$ is the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $b_{a_t}$ is the zero offset of the inertial navigation three-axis acceleration in the frame-$t$ inertial navigation coordinate system, $\Omega(\cdot)$ is the angular velocity transformation function, $\hat\omega_t$ is the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system, and $b_{w_t}$ is the zero offset of the inertial navigation three-axis angular velocity in the frame-$t$ inertial navigation coordinate system.
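The angular velocity transformation function $\Omega(\cdot)$ that drives the quaternion kinematics $\dot q = \tfrac12\,\Omega(\omega)\,q$ can be written out explicitly. The sketch below assumes a [w, x, y, z] quaternion storage order, which the patent text does not fix:

```python
import numpy as np

def omega_matrix(w):
    """4x4 matrix Omega(w) such that q_dot = 0.5 * Omega(w) @ q for
    quaternions stored as [w, x, y, z] with body-frame angular rate w."""
    wx, wy, wz = w
    return np.array([
        [0.0, -wx, -wy, -wz],
        [wx,  0.0,  wz, -wy],
        [wy, -wz,  0.0,  wx],
        [wz,  wy, -wx,  0.0]])
```

Because $\Omega(\omega)$ is skew-symmetric, the continuous-time kinematics preserve the unit norm of the quaternion.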
And 2.2, realizing mutual constraint of vision and inertial navigation according to the pre-integration information calculated in the step 2.1, and finishing initialization.
The zero offset of the inertial navigation data is constrained by the visual data, while the inertial navigation data provide scale information for the visual data, facilitating the subsequent solution of the final per-frame pose.
First, an objective function is constructed from the rotations of vision and inertial navigation, and the variable $\delta b_w$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset.

The objective function is:

$$\min_{\delta b_w} \sum_{k\in\mathcal B} \left\| \left(q^{c_0}_{k+1}\right)^{-1} \otimes q^{c_0}_k \otimes \gamma^k_{k+1} \right\|^2$$

where $q^{c_0}_{k+1}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system (the camera coordinate system is the right-hand system constructed with the position of the left eye of the current camera frame as origin and axes along the lateral, longitudinal and forward directions of the image), $q^{c_0}_k$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system, $\gamma^k_{k+1}$ is the rotation from the frame-$(k+1)$ to the frame-$k$ inertial navigation coordinate system, and $\otimes$ is the quaternion multiplication symbol.
where:

$$\gamma^k_{k+1} \approx \hat\gamma^k_{k+1} \otimes \begin{bmatrix} 1 \\ \tfrac12\,J^\gamma_{b_w}\,\delta b_w \end{bmatrix}$$

Here $\hat\gamma^k_{k+1}$ is the estimate of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integrating the angular velocity observations, $\otimes$ is the quaternion multiplication symbol, $J^\gamma_{b_w}$ is the Jacobian matrix of the rotation increment with respect to the angular velocity zero offset, and $\delta b_w$ is the increment of the angular velocity zero offset.
Then an objective function is constructed from the translations and rotations of vision and inertial navigation, and the optimization variable $\mathcal X_I$ is adjusted so that the objective function reaches its minimum, thereby initializing velocity, gravity and scale.

The objective function is:

$$\min_{\mathcal X_I} \sum_{k\in\mathcal B} \left\| \begin{bmatrix} \delta\alpha^k_{k+1} \\ \delta\beta^k_{k+1} \end{bmatrix} \right\|^2$$

$$\delta\alpha^k_{k+1} = \hat\alpha^k_{k+1} - R^k_{c_0}\left( s\,(\bar p^{c_0}_{k+1} - \bar p^{c_0}_k) + \tfrac12\,g^{c_0}\Delta t_k^2 - R^{c_0}_k\,v^k\,\Delta t_k \right)$$

$$\delta\beta^k_{k+1} = \hat\beta^k_{k+1} - R^k_{c_0}\left( R^{c_0}_{k+1}\,v^{k+1} + g^{c_0}\Delta t_k - R^{c_0}_k\,v^k \right)$$

where $\mathcal X_I$ denotes the optimization variable, the minimization denotes adjusting $\mathcal X_I$ so that the inertial measurement residual reaches its minimum, $\delta\alpha^k_{k+1}$ is the pre-integration position increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\delta\beta^k_{k+1}$ is the pre-integration velocity increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\hat\alpha^k_{k+1}$ is the pre-integration position increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $\hat\beta^k_{k+1}$ is the pre-integration velocity increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system, $R^k_{c_0}$ is the rotation matrix from the frame-0 camera coordinate system to the frame-$k$ inertial navigation coordinate system, $s$ is the scale, $R^{c_0}_k$ is the rotation matrix from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system, $v^{k+1}$ is the three-axis velocity in the frame-$(k+1)$ inertial navigation coordinate system, $v^k$ is the three-axis velocity in the frame-$k$ inertial navigation coordinate system, $R^{c_0}_{k+1}$ is the rotation matrix from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system, $\bar p^{c_0}_k$ is the relative position of frame-$k$ inertial navigation in the frame-0 camera coordinate system, $\bar p^{c_0}_{k+1}$ is the relative position of frame-$(k+1)$ inertial navigation in the frame-0 camera coordinate system, $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system, and $\Delta t_k$ is the time interval between the two keyframes.
Wherein:
in the formula,to compriseThe number of the optimization variables of the unknowns,refer to the 0 th frame through the 0 th framenFrame sharingVelocity unknowns for three axes of the frame, 3 refers toThe expressed gravity vector unknowns, 1 refers tosThe scale that is represented is an unknown quantity,is the 0 th frame to the 0 th framenThe three-axis velocity of the frame in the inertial navigation coordinate system,is the gravity vector in the frame 0 camera coordinate system,sis a scale.
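The initialization above reduces to one stacked linear least-squares problem in the unknowns (per-frame velocities, gravity, scale). The following is a minimal numerical sketch of that solve, assuming the per-keyframe-pair coefficient blocks and measurement vectors have already been assembled from the pre-integration terms; the names (`H_blocks`, `z_blocks`) are illustrative, not from the patent.

```python
import numpy as np

def solve_initialization(H_blocks, z_blocks):
    """Stack per-keyframe-pair linear constraints and solve for the
    initialization state chi = [v_0 .. v_n, g_c0, s] in least squares."""
    H = np.vstack(H_blocks)            # rows: 6 per keyframe pair
    z = np.concatenate(z_blocks)       # stacked measurement vector
    chi, *_ = np.linalg.lstsq(H, z, rcond=None)
    return chi

# Toy check: n = 1 keyframe interval -> state dim 3*2 + 3 + 1 = 10.
rng = np.random.default_rng(0)
chi_true = rng.normal(size=10)
H_blocks = [rng.normal(size=(6, 10)) for _ in range(3)]
z_blocks = [Hk @ chi_true for Hk in H_blocks]   # noise-free synthetic data
chi = solve_initialization(H_blocks, z_blocks)
```

With noise-free synthetic data the recovered state matches `chi_true`; in practice the system is overdetermined and solved in the least-squares sense.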
And 2.3, constructing an objective function, wherein the objective function comprises three parts of marginalized prior information, inertial measurement residual errors and visual re-projection residual errors, optimizing variables of each frame to enable the objective function to reach a minimum value, and solving a final pose result of each frame.
The objective function is:

$$\min_{\chi}\left\{\left\|r_p-H_p\chi\right\|^{2}+\sum_{k\in\mathcal{B}}\left\|r_B\!\left(\hat{z}^{\,b_k}_{b_{k+1}},\chi\right)\right\|^{2}+\sum_{(m,j)\in\mathcal{C}}\left\|r_C\!\left(\hat{z}^{\,c_j}_{m},\chi\right)\right\|^{2}\right\}$$

In the formula, $r_p$ is the marginalized prior information, $H_p$ is the Jacobian matrix calculated from the prior information, $r_B(\hat{z}^{\,b_k}_{b_{k+1}},\chi)$ is the inertial measurement residual constructed from the optimization variables $\chi$, and $r_C(\hat{z}^{\,c_j}_{m},\chi)$ is the visual re-projection residual constructed from the optimization variables $\chi$.
Wherein:

$$\chi=\left[x_0,\,x_1,\,\cdots,\,x_k,\,x^{b}_{c},\,\lambda_0,\,\lambda_1,\,\cdots,\,\lambda_m\right]$$
$$x_k=\left[p^{w}_{b_k},\,v^{w}_{b_k},\,q^{w}_{b_k},\,b_a,\,b_g\right],\qquad x^{b}_{c}=\left[p^{b}_{c},\,q^{b}_{c}\right]$$

Among the optimization variables $\chi$: $x_k$ is the state vector to be optimized of frame $k$; $x^{b}_{c}$ is the transformation from the camera coordinate system to the inertial navigation coordinate system; $\lambda_m$ is the inverse depth of the $m$-th feature point, i.e., the reciprocal of the depth.
In the formula, $x_0,\cdots,x_k$ are the state vectors to be optimized of frames 0 to $k$; $p^{w}_{b_k}$ is the position of frame-$k$ inertial navigation in the world coordinate system; $v^{w}_{b_k}$ is the velocity of frame-$k$ inertial navigation in the world coordinate system; $q^{w}_{b_k}$ is the attitude of frame-$k$ inertial navigation in the world coordinate system; $b_a$ is the zero offset of the three-axis acceleration of inertial navigation in the frame-$k$ inertial navigation coordinate system; $b_g$ is the zero offset of the three-axis angular velocity of inertial navigation in the frame-$k$ inertial navigation coordinate system; $p^{b}_{c}$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system; and $q^{b}_{c}$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system.
The optimization variables $\chi$ (equation 12) are adjusted using the Gauss-Newton method so that the objective function (equation 11) reaches its minimum; the state vector $x_k$ obtained at this time is the vision-inertial navigation first-level positioning result $P_{VI}$.
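As a concrete illustration of the Gauss-Newton adjustment used here, the sketch below implements the generic iteration (solve the normal equations for an update, apply it, repeat) on a small curve-fitting problem; the objective and all names are illustrative, not the patent's actual residuals.

```python
import numpy as np

def gauss_newton(residual, jacobian, x0, iters=20, tol=1e-10):
    """Generic Gauss-Newton: repeatedly solve J^T J dx = -J^T r."""
    x = x0.astype(float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)   # normal equations
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Toy nonlinear least squares: fit y = exp(a*t) + b to samples.
t = np.linspace(0.0, 1.0, 30)
a_true, b_true = 0.7, -0.3
y = np.exp(a_true * t) + b_true

res = lambda x: np.exp(x[0] * t) + x[1] - y
jac = lambda x: np.column_stack([t * np.exp(x[0] * t), np.ones_like(t)])
x_hat = gauss_newton(res, jac, np.array([0.0, 0.0]))
```

Real visual-inertial solvers add robust loss functions and exploit the sparsity of the normal equations, which this sketch omits.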
Step 3, obtaining the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation and the satellite navigation data, matching the angular velocity and acceleration of inertial navigation with the satellite navigation data through the timestamp, fusing the inertial navigation and satellite navigation data based on extended Kalman filtering, and calculating the inertial navigation-satellite navigation first-level positioning result $P_{GI}$ with the same frequency as the inertial navigation data.
Based on the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation and the satellite navigation data, the state transition equation and observation equation of Kalman filtering are constructed:

$$x_k=f\!\left(x_{k-1},u_{k-1}\right)+w_{k-1},\qquad z_k=h\!\left(x_k\right)+\nu_k$$

In the formula, $z_k$ is the observed value of frame $k$; $w_k$ is the process noise of frame $k$; $\nu_k$ is the observation noise of frame $k$; $x_k$ is the inertial navigation state of frame $k$, consisting of the position, velocity, attitude, gyroscope zero-offset and accelerometer zero-offset parameters, and is expressed as:

$$x_k=\left[p_k,\,v_k,\,q_k,\,b_{g,k},\,b_{a,k}\right]$$

The nonlinear functions are expanded by Taylor expansion about the estimate of the previous frame $\hat{x}_{k-1}$ and the state prediction value of the current frame $\hat{x}^{-}_{k}$:

$$f\!\left(x_{k-1},u_{k-1}\right)\approx f\!\left(\hat{x}_{k-1},u_{k-1}\right)+F_{k-1}\left(x_{k-1}-\hat{x}_{k-1}\right)$$
$$h\!\left(x_k\right)\approx h\!\left(\hat{x}^{-}_{k}\right)+H_k\left(x_k-\hat{x}^{-}_{k}\right)$$

In the formula, $F_{k-1}$ is the Jacobian matrix of the function $f$ at $\hat{x}_{k-1}$, and $H_k$ is the Jacobian matrix of the function $h$ at $\hat{x}^{-}_{k}$.
The prediction equation is:

$$\hat{x}^{-}_{k}=f\!\left(\hat{x}_{k-1},u_{k-1}\right),\qquad P^{-}_{k}=F_{k-1}P_{k-1}F^{T}_{k-1}+Q$$

In the formula, $\hat{x}^{-}_{k}$ is the predicted value of the frame-$k$ inertial navigation state; $P^{-}_{k}$ is the covariance matrix of the predicted value of the frame-$k$ inertial navigation state; $F_{k-1}$ is the Jacobian matrix of the function $f$ at $\hat{x}_{k-1}$; $P_{k-1}$ is the covariance matrix of the frame-$(k-1)$ inertial navigation state; and $Q$ is the covariance matrix of the process noise.
The correction equation is:

$$K_k=P^{-}_{k}H^{T}_{k}\left(H_kP^{-}_{k}H^{T}_{k}+R\right)^{-1}$$
$$\hat{x}_k=\hat{x}^{-}_{k}+K_k\left(z_k-h\!\left(\hat{x}^{-}_{k}\right)\right)$$
$$P_k=\left(I-K_kH_k\right)P^{-}_{k}$$

In the formula, $K_k$ is the Kalman gain of frame $k$; $P^{-}_{k}$ is the covariance matrix of the predicted value of the frame-$k$ inertial navigation state; $H_k$ is the Jacobian matrix of the function $h$ at $\hat{x}^{-}_{k}$; $R$ is the covariance matrix of the observation noise; $\hat{x}_k$ is the output value of the frame-$k$ inertial navigation state; $z_k$ is the observed value of frame $k$; $\hat{x}^{-}_{k}$ is the predicted value of the frame-$k$ inertial navigation state; and $I$ is an identity matrix.
The state $\hat{x}_k$ calculated by the extended Kalman filter algorithm is the inertial navigation-satellite navigation first-level positioning result $P_{GI}$.
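The prediction-correction cycle above can be sketched as one generic extended-Kalman-filter step. The example below runs it on a toy 1-D constant-velocity model (position observed, velocity inferred), not on the patent's full inertial navigation state; all names are illustrative.

```python
import numpy as np

def ekf_step(x, P, u, z, f, h, F, H, Q, R):
    """One EKF cycle: predict with the motion model f, then correct
    with the observation model h; F, H are the Jacobians of f, h."""
    # Prediction
    x_pred = f(x, u)
    P_pred = F @ P @ F.T + Q
    # Correction
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy model: state [position, velocity], only position observed.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
Hm = np.array([[1.0, 0.0]])
f = lambda x, u: F @ x
h = lambda x: Hm @ x
Q = 1e-4 * np.eye(2)
R = 1e-2 * np.eye(1)

x, P = np.zeros(2), np.eye(2)
for k in range(200):
    z = np.array([0.5 * dt * (k + 1)])   # exact positions, true velocity 0.5
    x, P = ekf_step(x, P, None, z, f, h, F, Hm, Q, R)
```

After a few hundred steps the filter recovers the unobserved velocity (about 0.5 here); in the patent's filter the same cycle runs with the 15-dimensional state and the satellite observations.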
Step 4, matching the two calculated first-level positioning results $P_{VI}$ and $P_{GI}$ by the timestamp, constructing an objective function from the two first-level positioning results, and performing global optimization to obtain the final positioning result.
The vision-inertial navigation first-level positioning result $P_{VI}$ and the inertial navigation-satellite navigation first-level positioning result $P_{GI}$ are matched through the timestamps, and the objective function is constructed:

$$\min_{T^{c_0}_{k,k+1},\,T_{GV}}\sum_{k}\left(\left\|T_{GV}^{-1}\,T^{G}_{k,k+1}\,T_{GV}-T^{c_0}_{k,k+1}\right\|^{2}+\left\|T^{c_0}_{V,k,k+1}-T^{c_0}_{k,k+1}\right\|^{2}\right)$$

In the formula, $T^{G}_{k,k+1}$ is the transformation matrix between two frames of inertial navigation-satellite navigation first-level positioning results in the longitude-latitude-height coordinate system; $T_{GV}$ is the transformation matrix from the inertial navigation-satellite navigation first-level positioning result to the vision-inertial navigation first-level positioning result; $T^{c_0}_{k,k+1}$ is the transformation matrix of the pose between two frames in the frame-0 camera coordinate system; $T^{c_0}_{V,k,k+1}$ is the transformation matrix between two frames of vision-inertial navigation first-level positioning results in the frame-0 camera coordinate system; and $T^{c_0}_{k,k+1}$, $T_{GV}$ are the pose variables to be optimized.
The optimization variables $T^{c_0}_{k,k+1}$ and $T_{GV}$ are adjusted by the Gauss-Newton method so that the objective function over the matched two frames of data reaches its minimum, and the final optimization variables $T^{c_0}_{k,k+1}$ and $T_{GV}$ are output as the final pose result.
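One simple way to obtain an initial value for the transformation between the two timestamp-matched first-level trajectories is rigid point-set alignment of their positions (the Kabsch method). This is a hedged sketch of such a pre-alignment, not the patent's Gauss-Newton pose-graph optimization; function and variable names are illustrative.

```python
import numpy as np

def align_trajectories(p_vio, p_gnss):
    """Estimate the rigid transform (R, t) mapping the INS-GNSS track
    onto the VIO track from timestamp-matched Nx3 position arrays."""
    mu_a, mu_b = p_gnss.mean(axis=0), p_vio.mean(axis=0)
    A, B = p_gnss - mu_a, p_vio - mu_b
    U, _, Vt = np.linalg.svd(A.T @ B)          # cross-covariance SVD
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                          # proper rotation
    t = mu_b - R @ mu_a
    return R, t

# Synthetic check: a known rotation + translation is recovered.
rng = np.random.default_rng(1)
p_vio = rng.normal(size=(50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
p_gnss = (p_vio - t_true) @ R_true   # each row: R_true^T (p_vio - t_true)
R_hat, t_hat = align_trajectories(p_vio, p_gnss)
```

The recovered `(R_hat, t_hat)` can seed the iterative pose-graph refinement described above.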
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or scope of the invention as defined in the appended claims.
Claims (10)
1. A vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing is characterized by comprising the following steps:
step 1, acquiring inertial navigation zero offset, an internal reference matrix of a binocular camera and a transformation matrix among three sensors of inertial navigation, the binocular camera and a satellite antenna;
step 2, obtaining the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation and the binocular vision data, matching the acceleration and angular velocity of inertial navigation with the binocular vision data through the timestamp, fusing the inertial navigation and vision data based on a graph-optimized visual SLAM framework, and calculating the vision-inertial navigation first-level positioning result with the same frequency as the binocular vision data;
step 2.1, calculating the integration result between frame $k$ and frame $k+1$ from the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation to obtain the position, velocity and attitude of frame $k+1$;
step 2.2, realizing mutual constraint of vision and inertial navigation according to the pre-integration information calculated in the step 2.1, and finishing initialization;
step 2.3, constructing an objective function, wherein the objective function comprises three parts of marginalized prior information, inertial measurement residual errors and visual re-projection residual errors, optimizing the state vector of each frame to enable the objective function to reach the minimum value, and solving the final pose result of each frame;
step 3, obtaining the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation and the satellite navigation data, matching the angular velocity and acceleration of inertial navigation with the satellite navigation data through the timestamp, fusing the inertial navigation and satellite navigation data based on extended Kalman filtering, and calculating the inertial navigation-satellite navigation first-level positioning result with the same frequency as the inertial navigation data;
2. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 1, the three sensors of inertial navigation, the binocular camera and the satellite antenna are rigidly fixed to the same carrier; the inertial navigation zero offset is calibrated in a static state; the internal reference matrix of the binocular camera and the transformation matrix between the binocular camera and inertial navigation are jointly calibrated through a calibration plate; the transformation matrix from the satellite antenna to inertial navigation and the transformation matrix from the satellite antenna to the binocular camera are measured; and the inertial navigation zero offset, the internal reference matrix of the binocular camera and the transformation matrices among the three sensors are thereby obtained.
3. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: step 2.1 calculates the integration result between frame $k$ and frame $k+1$ from the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation to obtain the position, velocity and attitude of frame $k+1$:

$$p^{w}_{b_{k+1}}=p^{w}_{b_k}+v^{w}_{b_k}\Delta t_k+\iint_{t\in[t_k,t_{k+1}]}\left(R^{w}_{t}\left(\hat{a}_t-b_{a_t}\right)-g^{w}\right)dt^{2}$$
$$v^{w}_{b_{k+1}}=v^{w}_{b_k}+\int_{t\in[t_k,t_{k+1}]}\left(R^{w}_{t}\left(\hat{a}_t-b_{a_t}\right)-g^{w}\right)dt$$
$$q^{w}_{b_{k+1}}=q^{w}_{b_k}\otimes\int_{t\in[t_k,t_{k+1}]}\frac{1}{2}\,\Omega\!\left(\hat{\omega}_t-b_{w_t}\right)q^{b_k}_{t}\,dt$$

In the formula, $p^{w}_{b_{k+1}}$ is the position of frame-$(k+1)$ inertial navigation in the world coordinate system; $p^{w}_{b_k}$ is the position of frame-$k$ inertial navigation in the world coordinate system; $v^{w}_{b_k}$ is the velocity of frame-$k$ inertial navigation in the world coordinate system; $\Delta t_k$ is the time interval between two key frames; $\hat{a}_t$ is the three-axis acceleration of inertial navigation in the frame-$t$ inertial navigation coordinate system; $R^{w}_{t}$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system; $b_{a_t}$ is the zero offset of the three-axis acceleration of inertial navigation in the frame-$t$ inertial navigation coordinate system; $g^{w}$ is the gravity vector in the world coordinate system; $v^{w}_{b_{k+1}}$ is the velocity of frame-$(k+1)$ inertial navigation in the world coordinate system; $q^{w}_{b_{k+1}}$ is the attitude of frame-$(k+1)$ inertial navigation in the world coordinate system; $q^{w}_{b_k}$ is the attitude of frame-$k$ inertial navigation in the world coordinate system; $\otimes$ is the quaternion multiplication symbol; $\Omega(\cdot)$ is the angular velocity transformation function, and the function value $\Omega(\hat{\omega}_t-b_{w_t})$ is calculated by bringing $\hat{\omega}_t-b_{w_t}$ into the function $\Omega$; $\hat{\omega}_t$ is the three-axis angular velocity of inertial navigation in the frame-$t$ inertial navigation coordinate system; $b_{w_t}$ is the zero offset of the three-axis angular velocity of inertial navigation in the frame-$t$ inertial navigation coordinate system; $q^{b_k}_{t}$ is the attitude variation of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system; because the inertial navigation data frequency is higher than the visual data frequency, $t$ is any inertial navigation frame between the frame-$k$ and frame-$(k+1)$ visual data; the world coordinate system is a right-hand coordinate system constructed with the position of frame-0 inertial navigation as the origin and the gravity, north and east directions as axes, and the inertial navigation coordinate system is a right-hand coordinate system constructed with the position of the current-frame inertial navigation as the origin and the axes of the gyroscope and accelerometer of the inertial navigation hardware as directions.
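A discrete (single-step Euler) sketch of this world-frame integration follows, assuming synchronous accelerometer/gyroscope samples; the quaternion kinematics are replaced by the equivalent rotation-matrix exponential for brevity, and all names are illustrative.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric (cross-product) matrix of a vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def exp_so3(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    a = np.linalg.norm(phi)
    if a < 1e-12:
        return np.eye(3) + skew(phi)
    K = skew(phi / a)
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

def imu_integrate(p, v, Rwb, acc, gyr, ba, bg, g_w, dt):
    """One Euler step of world-frame dead reckoning: rotate the
    bias-corrected specific force into the world frame, remove gravity,
    then advance position, velocity and attitude."""
    a_w = Rwb @ (acc - ba) - g_w
    p_new = p + v * dt + 0.5 * a_w * dt * dt
    v_new = v + a_w * dt
    R_new = Rwb @ exp_so3((gyr - bg) * dt)
    return p_new, v_new, R_new

# Sanity check: a stationary IMU sensing only gravity stays at the origin.
g_w = np.array([0.0, 0.0, 9.81])
p, v, Rwb = np.zeros(3), np.zeros(3), np.eye(3)
for _ in range(100):
    p, v, Rwb = imu_integrate(p, v, Rwb, np.array([0.0, 0.0, 9.81]),
                              np.zeros(3), np.zeros(3), np.zeros(3),
                              g_w, 0.005)
```

A production implementation would use midpoint or RK4 integration and propagate the quaternion directly, but the update structure is the same as the equations above.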
4. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 3, characterized in that: in step 2.1, the variables relative to frame $k$ are separated from the integral term from frame $k$ to frame $k+1$ to obtain:

$$\alpha^{b_k}_{b_{k+1}}=\iint_{t\in[t_k,t_{k+1}]}R^{b_k}_{t}\left(\hat{a}_t-b_{a_t}\right)dt^{2}$$
$$\beta^{b_k}_{b_{k+1}}=\int_{t\in[t_k,t_{k+1}]}R^{b_k}_{t}\left(\hat{a}_t-b_{a_t}\right)dt$$
$$\gamma^{b_k}_{b_{k+1}}=\int_{t\in[t_k,t_{k+1}]}\frac{1}{2}\,\Omega\!\left(\hat{\omega}_t-b_{w_t}\right)\gamma^{b_k}_{t}\,dt$$

In the formula, $\alpha^{b_k}_{b_{k+1}}$ is the variation of the position of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\beta^{b_k}_{b_{k+1}}$ is the variation of the velocity of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the attitude variation of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_{t}$ is the attitude variation of frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $R^{b_k}_{t}$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\hat{a}_t$ is the three-axis acceleration of inertial navigation in the frame-$t$ inertial navigation coordinate system; $b_{a_t}$ is the zero offset of the three-axis acceleration of inertial navigation in the frame-$t$ inertial navigation coordinate system; $\Omega(\cdot)$ is the angular velocity transformation function, and the function value $\Omega(\hat{\omega}_t-b_{w_t})$ is calculated by bringing $\hat{\omega}_t-b_{w_t}$ into the function $\Omega$; $\hat{\omega}_t$ is the three-axis angular velocity of inertial navigation in the frame-$t$ inertial navigation coordinate system; and $b_{w_t}$ is the zero offset of the three-axis angular velocity of inertial navigation in the frame-$t$ inertial navigation coordinate system.
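These separated pre-integration terms depend only on the IMU samples and the biases, so they can be accumulated once per keyframe pair and reused. A minimal Euler-accumulation sketch follows (names illustrative; the midpoint integration and covariance/Jacobian propagation of a full implementation are omitted):

```python
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def exp_so3(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    a = np.linalg.norm(phi)
    if a < 1e-12:
        return np.eye(3) + skew(phi)
    K = skew(phi / a)
    return np.eye(3) + np.sin(a) * K + (1.0 - np.cos(a)) * (K @ K)

def preintegrate(acc_samples, gyr_samples, ba, bg, dt):
    """Accumulate position/velocity/attitude increments (alpha, beta,
    gamma) expressed in the frame-k body frame; they depend only on the
    IMU samples and the biases, not on the world-frame state."""
    alpha, beta, gamma = np.zeros(3), np.zeros(3), np.eye(3)
    for acc, gyr in zip(acc_samples, gyr_samples):
        a = gamma @ (acc - ba)               # acceleration in frame k
        alpha = alpha + beta * dt + 0.5 * a * dt * dt
        beta = beta + a * dt
        gamma = gamma @ exp_so3((gyr - bg) * dt)
    return alpha, beta, gamma

# Constant 1 m/s^2 along x for 1 s with no rotation:
acc = [np.array([1.0, 0.0, 0.0])] * 100
gyr = [np.zeros(3)] * 100
alpha, beta, gamma = preintegrate(acc, gyr, np.zeros(3), np.zeros(3), 0.01)
# beta -> [1, 0, 0] (v = a*t); alpha -> [0.5, 0, 0] (p = a*t^2/2)
```

When a bias estimate changes slightly during optimization, the increments are corrected with first-order Jacobians instead of being re-integrated, which is the point of the separation in this claim.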
5. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 2.2, the zero offset of the inertial navigation data is constrained by the visual data, and scale information is provided for the visual data through the inertial navigation data; first, an objective function is constructed from the rotations of vision and inertial navigation, and the variable $\delta b_w$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset, the objective function being:

$$\min_{\delta b_w}\sum_{k\in\mathcal{B}}\left\|\left(q^{c_0}_{b_{k+1}}\right)^{-1}\otimes q^{c_0}_{b_k}\otimes\gamma^{b_k}_{b_{k+1}}\right\|^{2}$$

In the formula, $q^{c_0}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $q^{c_0}_{b_k}$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\otimes$ is the quaternion multiplication symbol; and the camera coordinate system is a right-hand coordinate system constructed with the position of the left eye of the current-frame camera as the origin and the transverse, longitudinal and forward directions of the image as axes;
wherein:

$$\gamma^{b_k}_{b_{k+1}}\approx\hat{\gamma}^{b_k}_{b_{k+1}}\otimes\begin{bmatrix}1\\ \frac{1}{2}J^{\gamma}_{b_w}\delta b_w\end{bmatrix}$$

In the formula, $\hat{\gamma}^{b_k}_{b_{k+1}}$ is the estimated value of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integral calculation of the observed angular velocity; $\otimes$ is the quaternion multiplication symbol; $J^{\gamma}_{b_w}$ is the Jacobian matrix of the rotation increment with respect to the angular velocity zero offset; and $\delta b_w$ is the increment of the angular velocity zero offset.
6. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 5, characterized in that: in step 2.2, an objective function is then constructed from the translation and rotation of vision and inertial navigation, and the optimization variables $\chi_I$ are adjusted so that the objective function reaches its minimum, thereby initializing the velocity, gravity and scale, the objective function being:

$$\min_{\chi_I}\sum_{k\in\mathcal{B}}\left\|r_I\!\left(\hat{z}^{\,b_k}_{b_{k+1}},\chi_I\right)\right\|^2,\qquad
r_I=\begin{bmatrix}\delta\alpha^{b_k}_{b_{k+1}}\\ \delta\beta^{b_k}_{b_{k+1}}\end{bmatrix}
=\begin{bmatrix}\hat{\alpha}^{b_k}_{b_{k+1}}-R^{b_k}_{c_0}\left(s\left(\bar{p}^{\,c_0}_{b_{k+1}}-\bar{p}^{\,c_0}_{b_k}\right)+\tfrac{1}{2}g^{c_0}\Delta t_k^{2}-R^{c_0}_{b_k}v^{b_k}\Delta t_k\right)\\ \hat{\beta}^{b_k}_{b_{k+1}}-R^{b_k}_{c_0}\left(R^{c_0}_{b_{k+1}}v^{b_{k+1}}+g^{c_0}\Delta t_k-R^{c_0}_{b_k}v^{b_k}\right)\end{bmatrix}$$

In the formula, $\chi_I$ represents the optimization variables; $r_I(\hat{z}^{\,b_k}_{b_{k+1}},\chi_I)$ is the inertial measurement residual constructed from the optimization variables $\chi_I$; $\min_{\chi_I}$ represents adjusting the optimization variables $\chi_I$ so that the inertial measurement residual $r_I$ reaches its minimum; $\delta\alpha^{b_k}_{b_{k+1}}$ is the pre-integrated position increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\delta\beta^{b_k}_{b_{k+1}}$ is the pre-integrated velocity increment error of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\hat{\alpha}^{b_k}_{b_{k+1}}$ is the pre-integrated position increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\hat{\beta}^{b_k}_{b_{k+1}}$ is the pre-integrated velocity increment of frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $R^{b_k}_{c_0}$ is the rotation matrix from the frame-0 camera coordinate system to the frame-$k$ inertial navigation coordinate system; $s$ is the scale; $R^{c_0}_{b_k}$ is the rotation matrix from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $v^{b_{k+1}}$ is the three-axis velocity in the frame-$(k+1)$ inertial navigation coordinate system; $v^{b_k}$ is the three-axis velocity in the frame-$k$ inertial navigation coordinate system; $R^{c_0}_{b_{k+1}}$ is the rotation matrix from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\bar{p}^{\,c_0}_{b_k}$ is the relative position of frame-$k$ inertial navigation in the frame-0 camera coordinate system; $\bar{p}^{\,c_0}_{b_{k+1}}$ is the relative position of frame-$(k+1)$ inertial navigation in the frame-0 camera coordinate system; $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system; and $\Delta t_k$ is the time interval between two key frames;
wherein:

$$\chi_I=\left[v^{b_0},\,v^{b_1},\,\cdots,\,v^{b_n},\,g^{c_0},\,s\right]^{T}\in\mathbb{R}^{3(n+1)+3+1}$$

In the formula, $\chi_I$ is the optimization variable comprising $3(n+1)+3+1$ unknowns: $3(n+1)$ refers to the three-axis velocity unknowns of the $n+1$ frames from frame 0 to frame $n$, 3 refers to the gravity vector unknowns expressed by $g^{c_0}$, and 1 refers to the scale unknown expressed by $s$; $v^{b_0},\cdots,v^{b_n}$ are the three-axis velocities of frames 0 to $n$ in their inertial navigation coordinate systems, $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system, and $s$ is the scale.
7. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: the objective function in step 2.3 is:

$$\min_{\chi}\left\{\left\|r_p-H_p\chi\right\|^{2}+\sum_{k\in\mathcal{B}}\left\|r_B\!\left(\hat{z}^{\,b_k}_{b_{k+1}},\chi\right)\right\|^{2}+\sum_{(m,j)\in\mathcal{C}}\left\|r_C\!\left(\hat{z}^{\,c_j}_{m},\chi\right)\right\|^{2}\right\}$$

In the formula, $r_p$ is the marginalized prior information, $H_p$ is the Jacobian matrix calculated from the prior information, $r_B(\hat{z}^{\,b_k}_{b_{k+1}},\chi)$ is the inertial measurement residual constructed from the optimization variables $\chi$, and $r_C(\hat{z}^{\,c_j}_{m},\chi)$ is the visual re-projection residual constructed from the optimization variables $\chi$;
wherein:

$$\chi=\left[x_0,\,x_1,\,\cdots,\,x_k,\,x^{b}_{c},\,\lambda_0,\,\lambda_1,\,\cdots,\,\lambda_m\right]$$
$$x_k=\left[p^{w}_{b_k},\,v^{w}_{b_k},\,q^{w}_{b_k},\,b_a,\,b_g\right],\qquad x^{b}_{c}=\left[p^{b}_{c},\,q^{b}_{c}\right]$$

Among the optimization variables $\chi$: $x_k$ is the state vector to be optimized of frame $k$; $x^{b}_{c}$ is the transformation from the camera coordinate system to the inertial navigation coordinate system; $\lambda_m$ is the inverse depth of the $m$-th feature point, namely the reciprocal of the depth;
in the formula, $x_0,\cdots,x_k$ are the state vectors to be optimized of frames 0 to $k$; $p^{w}_{b_k}$ is the position of frame-$k$ inertial navigation in the world coordinate system; $v^{w}_{b_k}$ is the velocity of frame-$k$ inertial navigation in the world coordinate system; $q^{w}_{b_k}$ is the attitude of frame-$k$ inertial navigation in the world coordinate system; $b_a$ is the zero offset of the three-axis acceleration of inertial navigation in the frame-$k$ inertial navigation coordinate system; $b_g$ is the zero offset of the three-axis angular velocity of inertial navigation in the frame-$k$ inertial navigation coordinate system; $p^{b}_{c}$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system; and $q^{b}_{c}$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system;
8. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 3, based on the three-axis acceleration $a$ and three-axis angular velocity $\omega$ data of inertial navigation and the satellite navigation data, the state transition equation and observation equation of Kalman filtering are constructed:

$$x_k=f\!\left(x_{k-1},u_{k-1}\right)+w_{k-1},\qquad z_k=h\!\left(x_k\right)+\nu_k$$

In the formula, $z_k$ is the observed value of frame $k$; $w_k$ is the process noise of frame $k$; $\nu_k$ is the observation noise of frame $k$; $x_k$ is the inertial navigation state of frame $k$, consisting of the position, velocity, attitude, gyroscope zero-offset and accelerometer zero-offset parameters, and is expressed as:

$$x_k=\left[p_k,\,v_k,\,q_k,\,b_{g,k},\,b_{a,k}\right]$$

The nonlinear functions are expanded by Taylor expansion about the estimate of the previous frame $\hat{x}_{k-1}$ and the state prediction value of the current frame $\hat{x}^{-}_{k}$:

$$f\!\left(x_{k-1},u_{k-1}\right)\approx f\!\left(\hat{x}_{k-1},u_{k-1}\right)+F_{k-1}\left(x_{k-1}-\hat{x}_{k-1}\right)$$
$$h\!\left(x_k\right)\approx h\!\left(\hat{x}^{-}_{k}\right)+H_k\left(x_k-\hat{x}^{-}_{k}\right)$$
9. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 8, characterized in that: the prediction equation in step 3 is:

$$\hat{x}^{-}_{k}=f\!\left(\hat{x}_{k-1},u_{k-1}\right),\qquad P^{-}_{k}=F_{k-1}P_{k-1}F^{T}_{k-1}+Q$$

In the formula, $\hat{x}^{-}_{k}$ is the predicted value of the frame-$k$ inertial navigation state; $P^{-}_{k}$ is the covariance matrix of the predicted value of the frame-$k$ inertial navigation state; $F_{k-1}$ is the Jacobian matrix of the function $f$ at $\hat{x}_{k-1}$; $P_{k-1}$ is the covariance matrix of the frame-$(k-1)$ inertial navigation state; and $Q$ is the covariance matrix of the process noise;
the correction equation is:

$$K_k=P^{-}_{k}H^{T}_{k}\left(H_kP^{-}_{k}H^{T}_{k}+R\right)^{-1}$$
$$\hat{x}_k=\hat{x}^{-}_{k}+K_k\left(z_k-h\!\left(\hat{x}^{-}_{k}\right)\right)$$
$$P_k=\left(I-K_kH_k\right)P^{-}_{k}$$

In the formula, $K_k$ is the Kalman gain of frame $k$; $P^{-}_{k}$ is the covariance matrix of the predicted value of the frame-$k$ inertial navigation state; $H_k$ is the Jacobian matrix of the function $h$ at $\hat{x}^{-}_{k}$; $R$ is the covariance matrix of the observation noise; $\hat{x}_k$ is the output value of the frame-$k$ inertial navigation state; $z_k$ is the observed value of frame $k$; $\hat{x}^{-}_{k}$ is the predicted value of the frame-$k$ inertial navigation state; and $I$ is an identity matrix;
10. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 4, the vision-inertial navigation first-level positioning result $P_{VI}$ and the inertial navigation-satellite navigation first-level positioning result $P_{GI}$ are matched through the timestamps, and the objective function is constructed:

$$\min_{T^{c_0}_{k,k+1},\,T_{GV}}\sum_{k}\left(\left\|T_{GV}^{-1}\,T^{G}_{k,k+1}\,T_{GV}-T^{c_0}_{k,k+1}\right\|^{2}+\left\|T^{c_0}_{V,k,k+1}-T^{c_0}_{k,k+1}\right\|^{2}\right)$$

In the formula, $T^{G}_{k,k+1}$ is the transformation matrix between two frames of inertial navigation-satellite navigation first-level positioning results in the longitude-latitude-height coordinate system; $T_{GV}$ is the transformation matrix from the inertial navigation-satellite navigation first-level positioning result to the vision-inertial navigation first-level positioning result; $T^{c_0}_{k,k+1}$ is the transformation matrix of the pose between two frames in the frame-0 camera coordinate system; $T^{c_0}_{V,k,k+1}$ is the transformation matrix between two frames of vision-inertial navigation first-level positioning results in the frame-0 camera coordinate system; and $T^{c_0}_{k,k+1}$, $T_{GV}$ are the pose variables to be optimized;
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310069650.0A CN115793001B (en) | 2023-02-07 | 2023-02-07 | Vision, inertial navigation and defending fusion positioning method based on inertial navigation multiplexing |
Publications (2)
Publication Number | Publication Date |
---|---|
CN115793001A true CN115793001A (en) | 2023-03-14 |
CN115793001B CN115793001B (en) | 2023-05-16 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111121767A (en) * | 2019-12-18 | 2020-05-08 | 南京理工大学 | GPS-fused robot vision inertial navigation combined positioning method |
CN112781582A (en) * | 2020-12-26 | 2021-05-11 | 复三人工智能科技(上海)有限公司 | Multi-sensor fusion high-precision pose estimation algorithm under satellite weak observation condition |
CN113155124A (en) * | 2021-04-27 | 2021-07-23 | 涵涡智航科技(玉溪)有限公司 | Multi-source auxiliary navigation method and device |
WO2021147391A1 (en) * | 2020-01-21 | 2021-07-29 | 魔门塔(苏州)科技有限公司 | Map generation method and device based on fusion of vio and satellite navigation system |
CN113358134A (en) * | 2021-04-20 | 2021-09-07 | 重庆知至科技有限公司 | Visual inertial odometer system |
US20220292711A1 (en) * | 2021-03-10 | 2022-09-15 | Beijing Tusen Zhitu Technology Co., Ltd. | Pose estimation method and device, related equipment and storage medium |
CN115479602A (en) * | 2022-10-14 | 2022-12-16 | 北京航空航天大学 | Visual inertial odometer method fusing event and distance |
WO2022262878A1 (en) * | 2021-06-16 | 2022-12-22 | 华南理工大学 | Ltc-dnn-based visual inertial navigation combined navigation system and self-learning method |
Non-Patent Citations (3)
Title |
---|
CHANGWU LIU et al.: "Variable Observability Constrained Visual-Inertial-GNSS EKF-Based Navigation" *
LI Ziyue et al.: "A Multi-Sensor-Fusion-Assisted AlexNet Model Image Recognition Algorithm" (in Chinese) *
JIN Hongxin et al.: "Application of Multi-Sensor Information Fusion Theory in UAV Relative Navigation" (in Chinese) *
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||