CN115793001A - Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing

Info

Publication number: CN115793001A (application CN202310069650.0A; granted as CN115793001B)
Authority: CN (China)
Original language: Chinese (zh)
Inventors: 邵慧超, 张彦, 胡正奎, 幺改明
Applicant and assignee: Leador Spatial Information Technology Co ltd
Legal status: Active (granted)

Classifications

  • Y02T10/40 Engine management systems (under Y02T10/10 Internal combustion engine [ICE] based vehicles, within Y02T Climate change mitigation technologies related to transportation)

Landscapes

  • Navigation (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
Abstract

The invention discloses a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing. First, the three sensors (inertial navigation unit, binocular camera and satellite antenna) are rigidly fixed to the same carrier, and the inertial navigation zero offset, the camera parameters and the transformation matrices among the three sensors are calibrated. Inertial navigation and visual data are then matched by timestamp and fused in a graph-optimized visual SLAM framework to obtain a vision-inertial primary positioning result at the same frequency as the visual data. Inertial navigation and satellite navigation data are likewise matched by timestamp and fused by extended Kalman filtering to obtain an inertial navigation-satellite navigation primary positioning result at the same frequency as the inertial data. The two primary positioning results are then fused a second time by graph optimization to obtain the final pose result. By multiplexing the inertial navigation data, the invention raises the pose output frequency of satellite navigation, reduces the time matching errors among inertial, visual and satellite navigation data, and improves pose accuracy.

Description

Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing
Technical Field
The invention belongs to the technical field of multi-source fusion positioning, and particularly relates to a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing.
Background
The traditional vision, inertial navigation and satellite navigation multi-source fusion positioning technique fuses vision and inertial navigation data to obtain a positioning result, fuses that result with satellite navigation data, and outputs a final positioning result. Given inertial navigation and satellite navigation data, fusing the two can raise the positioning output frequency before the result participates in the subsequent fusion. Existing vision, inertial navigation and satellite navigation multi-source fusion positioning methods have at least the following technical problems:
first, the multi-source data differ in frequency and the satellite navigation data frequency is low, so the time error in multi-source data matching is large;
second, satellite signals are strongly affected by the external environment, positioning accuracy differs across environments, and letting the satellite signals participate directly in multi-source fusion positioning strongly affects the final positioning accuracy;
third, the satellite navigation data among the multi-source data are used only as one residual term in the global optimization and are not fully exploited for positioning computation.
The prior-art methods therefore suffer from large matching time errors among multi-source data of different frequencies, environmental effects on satellite navigation positioning accuracy that propagate into the final positioning accuracy, and underutilization of the multi-source data.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, which solves the prior-art problems of large matching time errors among multi-source data of different frequencies, environmental degradation of satellite navigation positioning accuracy that propagates into the final positioning accuracy, and underutilization of multi-source data.
In order to achieve the above object, the technical scheme provided by the invention is a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, which comprises the following steps:
step 1, acquiring inertial navigation zero offset, an internal reference matrix of a binocular camera and a transformation matrix among three sensors of inertial navigation, the binocular camera and a satellite antenna;
step 2, obtaining the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the binocular vision data (left and right image sequences $I^L_k$, $I^R_k$); matching the inertial acceleration and angular velocity with the binocular vision data through timestamps, fusing the inertial and visual data in a graph-optimized visual SLAM framework, and computing a vision-inertial primary positioning result $x^{VI}_k$ with the same frequency as the binocular vision data;
step 2.1, from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data, computing the integration result between frame $k$ and frame $k+1$ to obtain the position, velocity and attitude of frame $k+1$;
step 2.2, using the pre-integration information computed in step 2.1 to impose mutual constraints between vision and inertial navigation and complete initialization;
step 2.3, constructing an objective function comprising three parts, the marginalization prior information, the inertial measurement residual and the visual reprojection residual; optimizing the state vector of each frame so that the objective function reaches its minimum, and solving the final pose result of each frame;
step 3, obtaining the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the satellite navigation data $z^G_k$; matching the inertial angular velocity and acceleration with the satellite navigation data through timestamps, fusing the inertial and satellite navigation data based on extended Kalman filtering, and computing an inertial navigation-satellite navigation primary positioning result $x^{IG}_k$ with the same frequency as the inertial data;
step 4, matching the two computed primary positioning results $x^{VI}_k$ and $x^{IG}_k$ through timestamps, constructing an objective function from the two primary positioning results, performing global optimization, and obtaining the final pose result $T_k$.
In step 1, the three sensors (inertial navigation unit, binocular camera and satellite antenna) are rigidly fixed to the same carrier; the inertial navigation zero offset is calibrated statically; the internal reference matrix of the binocular camera and the transformation matrix between the inertial navigation unit and the binocular camera are calibrated jointly with a calibration plate; the transformation matrix from the satellite antenna to the inertial navigation unit and the transformation matrix from the satellite antenna to the binocular camera are measured; this yields the inertial navigation zero offset, the internal reference matrix of the binocular camera and the transformation matrices among the three sensors.
Furthermore, in step 2.1, from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data, the integration result between frame $k$ and frame $k+1$ is computed to obtain the position, velocity and attitude of frame $k+1$:

$$p^w_{b_{k+1}} = p^w_{b_k} + v^w_{b_k}\Delta t_k + \iint_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt^2 \tag{1}$$

$$v^w_{b_{k+1}} = v^w_{b_k} + \int_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt \tag{2}$$

$$q^w_{b_{k+1}} = q^w_{b_k}\otimes\int_{t\in[t_k,\,t_{k+1}]}\frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)q^{b_k}_t\,dt \tag{3}$$

where $p^w_{b_{k+1}}$ is the position of the frame-$(k+1)$ inertial navigation in the world coordinate system; $p^w_{b_k}$ is the position of the frame-$k$ inertial navigation in the world coordinate system; $v^w_{b_k}$ is the velocity of the frame-$k$ inertial navigation in the world coordinate system; $\Delta t_k$ is the time interval between the two key frames; $\hat a_t$ is the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system; $R^w_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system; $b_{a_t}$ is the zero offset of the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system; $g^w$ is the gravity vector in the world coordinate system; $v^w_{b_{k+1}}$ is the velocity of the frame-$(k+1)$ inertial navigation in the world coordinate system; $q^w_{b_{k+1}}$ and $q^w_{b_k}$ are the attitudes of the frame-$(k+1)$ and frame-$k$ inertial navigation in the world coordinate system; $\otimes$ is the quaternion multiplication symbol; $\Omega(\cdot)$ is the angular-velocity transformation function, and $\Omega(\hat\omega_t - b_{\omega_t})$ is obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ is the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $b_{\omega_t}$ is the zero offset of the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $q^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system. Because the inertial data frequency is higher than the visual data frequency, $t$ indexes the inertial data frames between the frame-$k$ and frame-$(k+1)$ visual data. The world coordinate system is the right-hand coordinate system constructed with the position of the frame-0 inertial navigation as origin and the gravity, north and east directions as axes; the inertial navigation coordinate system is the right-hand coordinate system constructed with the position of the current inertial navigation frame as origin and the gyroscope and accelerometer directions of the inertial navigation hardware as axes.
The variable $q^w_{b_k}$ is separated out of the integral terms from frame $k$ to frame $k+1$, giving:

$$\alpha^{b_k}_{b_{k+1}} = \iint_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt^2 \tag{4}$$

$$\beta^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt \tag{5}$$

$$\gamma^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} \frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)\gamma^{b_k}_t\,dt \tag{6}$$

where $\alpha^{b_k}_{b_{k+1}}$ is the position change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\beta^{b_k}_{b_{k+1}}$ is the velocity change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the attitude change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $R^{b_k}_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\hat a_t$ and $b_{a_t}$ are the inertial three-axis acceleration and its zero offset in the frame-$t$ inertial navigation coordinate system; $\Omega(\cdot)$ is the angular-velocity transformation function, with $\Omega(\hat\omega_t - b_{\omega_t})$ obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ and $b_{\omega_t}$ are the inertial three-axis angular velocity and its zero offset in the frame-$t$ inertial navigation coordinate system.
In step 2.2, the zero offsets of the inertial data are constrained with the visual data, and scale information is provided to the visual data by the inertial data. An objective function is constructed from the rotations of the visual and inertial data, and the variable $\delta b_\omega$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset. The objective function is:

$$\min_{\delta b_\omega}\sum_{k\in\mathcal{B}}\left\|\left(q^{c_0}_{b_{k+1}}\right)^{-1}\otimes q^{c_0}_{b_k}\otimes\gamma^{b_k}_{b_{k+1}}\right\|^2 \tag{7}$$

where $q^{c_0}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $q^{c_0}_{b_k}$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\otimes$ is the quaternion multiplication symbol. The camera coordinate system is the right-hand coordinate system constructed with the position of the current frame camera's left eye as origin and the image horizontal, vertical and forward directions as axes.

Wherein:

$$\gamma^{b_k}_{b_{k+1}} \approx \hat\gamma^{b_k}_{b_{k+1}}\otimes\begin{bmatrix}1\\ \frac{1}{2}J^{\gamma}_{b_\omega}\delta b_\omega\end{bmatrix} \tag{8}$$

where $\hat\gamma^{b_k}_{b_{k+1}}$ is the estimate of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integrating the angular-velocity observations; $\otimes$ is the quaternion multiplication symbol; $J^{\gamma}_{b_\omega}$ is the Jacobian matrix of the rotation increment with respect to the angular-velocity zero offset; $\delta b_\omega$ is the increment of the angular-velocity zero offset.
Then an objective function is constructed from the translation and rotation of vision and inertial navigation, and the optimization variable $\mathcal{X}_I$ is adjusted so that the objective function reaches its minimum, thereby initializing velocity, gravity and scale. The objective function is:

$$\min_{\mathcal{X}_I}\sum_{k\in\mathcal{B}}\left\|\hat z^{b_k}_{b_{k+1}} - H^{b_k}_{b_{k+1}}\mathcal{X}_I\right\|^2 = \min_{\mathcal{X}_I}\sum_{k\in\mathcal{B}}\left\|\begin{bmatrix}\delta\alpha^{b_k}_{b_{k+1}}\\ \delta\beta^{b_k}_{b_{k+1}}\end{bmatrix}\right\|^2 \tag{9}$$

with

$$\delta\alpha^{b_k}_{b_{k+1}} = \hat\alpha^{b_k}_{b_{k+1}} - R^{b_k}_{c_0}\left(s\left(\bar p^{c_0}_{b_{k+1}} - \bar p^{c_0}_{b_k}\right) - R^{c_0}_{b_k}v^{b_k}\Delta t_k + \frac{1}{2}g^{c_0}\Delta t_k^2\right)$$

$$\delta\beta^{b_k}_{b_{k+1}} = \hat\beta^{b_k}_{b_{k+1}} - R^{b_k}_{c_0}\left(R^{c_0}_{b_{k+1}}v^{b_{k+1}} + g^{c_0}\Delta t_k - R^{c_0}_{b_k}v^{b_k}\right)$$

where $\mathcal{X}_I$ denotes the optimization variables; $\hat z^{b_k}_{b_{k+1}} - H^{b_k}_{b_{k+1}}\mathcal{X}_I$ is the inertial measurement residual constructed from $\mathcal{X}_I$, and $\min_{\mathcal{X}_I}$ means adjusting $\mathcal{X}_I$ so that this residual reaches its minimum; $\delta\alpha^{b_k}_{b_{k+1}}$ is the pre-integration position increment error of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\delta\beta^{b_k}_{b_{k+1}}$ is the corresponding pre-integration velocity increment error; $\hat\alpha^{b_k}_{b_{k+1}}$ is the pre-integration position increment of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\hat\beta^{b_k}_{b_{k+1}}$ is the corresponding pre-integration velocity increment; $R^{b_k}_{c_0}$ is the rotation matrix from the frame-0 camera coordinate system to the frame-$k$ inertial navigation coordinate system; $s$ is the scale; $R^{c_0}_{b_k}$ is the rotation matrix from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $v^{b_{k+1}}$ and $v^{b_k}$ are the three-axis velocities in the frame-$(k+1)$ and frame-$k$ inertial navigation coordinate systems; $R^{c_0}_{b_{k+1}}$ is the rotation matrix from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\bar p^{c_0}_{b_k}$ and $\bar p^{c_0}_{b_{k+1}}$ are the relative positions of the frame-$k$ and frame-$(k+1)$ inertial navigation in the frame-0 camera coordinate system; $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system; $\Delta t_k$ is the time interval between the two key frames.

Wherein:

$$\mathcal{X}_I = \begin{bmatrix}v^{b_0}_{b_0},\; v^{b_1}_{b_1},\; \ldots,\; v^{b_n}_{b_n},\; g^{c_0},\; s\end{bmatrix}^T \tag{10}$$

where $\mathcal{X}_I$ is the optimization variable containing $3(n+1)+3+1$ unknowns: $3(n+1)$ refers to the three-axis velocity unknowns of the $n+1$ frames from frame 0 to frame $n$, 3 refers to the gravity-vector unknowns represented by $g^{c_0}$, and 1 refers to the scale unknown represented by $s$; $v^{b_0}_{b_0},\ldots,v^{b_n}_{b_n}$ are the three-axis velocities of frames 0 through $n$ in their inertial navigation coordinate systems; $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system; $s$ is the scale.
Moreover, the objective function in step 2.3 is:

$$\min_{\mathcal{X}}\left\{\left\|r_p - H_p\mathcal{X}\right\|^2 + \sum_{k\in\mathcal{B}}\left\|r_{\mathcal{B}}\!\left(\hat z^{b_k}_{b_{k+1}},\mathcal{X}\right)\right\|^2 + \sum_{(l,j)\in\mathcal{C}}\left\|r_{\mathcal{C}}\!\left(\hat z^{c_j}_l,\mathcal{X}\right)\right\|^2\right\} \tag{11}$$

where $r_p$ is the marginalization prior information; $H_p$ is the Jacobian matrix computed from the prior information; $r_{\mathcal{B}}(\hat z^{b_k}_{b_{k+1}},\mathcal{X})$ is the inertial measurement residual constructed from the optimization variables $\mathcal{X}$; $r_{\mathcal{C}}(\hat z^{c_j}_l,\mathcal{X})$ is the visual reprojection residual constructed from the optimization variables $\mathcal{X}$.

Wherein:

$$\mathcal{X} = \begin{bmatrix}x_0,\; x_1,\; \ldots,\; x_n,\; x^b_c,\; \lambda_0,\; \lambda_1,\; \ldots,\; \lambda_m\end{bmatrix} \tag{12}$$

Among the optimization variables $\mathcal{X}$, $x_k$ is the state vector of frame $k$ to be optimized; $x^b_c$ is the transformation matrix from the camera coordinate system to the inertial navigation coordinate system; $\lambda_m$ is the inverse depth (the reciprocal of the depth) of the $m$-th feature point.

$$x_k = \begin{bmatrix}p^w_{b_k},\; v^w_{b_k},\; q^w_{b_k},\; b_a,\; b_g\end{bmatrix},\quad k\in[0,n] \tag{13}$$

$$x^b_c = \begin{bmatrix}p^b_c,\; q^b_c\end{bmatrix} \tag{14}$$

where $x_0,\ldots,x_n$ are the state vectors of frames 0 through $n$ to be optimized; $p^w_{b_k}$ is the position of the frame-$k$ inertial navigation in the world coordinate system; $v^w_{b_k}$ is the velocity of the frame-$k$ inertial navigation in the world coordinate system; $q^w_{b_k}$ is the attitude of the frame-$k$ inertial navigation in the world coordinate system; $b_a$ is the zero offset of the inertial three-axis acceleration in the frame-$k$ inertial navigation coordinate system; $b_g$ is the zero offset of the inertial three-axis angular velocity in the frame-$k$ inertial navigation coordinate system; $p^b_c$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system; $q^b_c$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system.

The optimization variables $\mathcal{X}$ (equation 12) are adjusted with the Gauss-Newton method until the objective function (equation 11) reaches its minimum; the state vectors obtained at that point are the vision-inertial primary positioning result $x^{VI}_k$.
Furthermore, in step 3, a Kalman-filter state transition equation and observation equation are constructed from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the satellite navigation data $z^G_k$:

$$x_k = f(x_{k-1}) + w_k \tag{15}$$

$$z_k = h(x_k) + v_k \tag{16}$$

where $z_k$ is the observed value of frame $k$; $w_k$ is the process noise of frame $k$; $v_k$ is the observation noise of frame $k$; $x_k$ is the inertial navigation state of frame $k$, consisting of the position, velocity, attitude, gyroscope zero-offset and accelerometer zero-offset parameters, expressed as:

$$x_k = \begin{bmatrix}p_k,\; v_k,\; q_k,\; b_g,\; b_a\end{bmatrix} \tag{17}$$

Using Taylor expansions, $f$ and $h$ are expanded about the estimate of the previous frame $\hat x_{k-1}$ and the state prediction of the current frame $\hat x^-_k$, respectively:

$$f(x_{k-1}) \approx f(\hat x_{k-1}) + F_k\left(x_{k-1} - \hat x_{k-1}\right) \tag{18}$$

$$h(x_k) \approx h(\hat x^-_k) + H_k\left(x_k - \hat x^-_k\right) \tag{19}$$

where $F_k$ is the Jacobian matrix of the function $f$ at $\hat x_{k-1}$, and $H_k$ is the Jacobian matrix of the function $h$ at $\hat x^-_k$.

The prediction equations are:

$$\hat x^-_k = f(\hat x_{k-1}) \tag{20}$$

$$P^-_k = F_k P_{k-1} F_k^T + Q_k \tag{21}$$

where $\hat x^-_k$ is the predicted value of the frame-$k$ inertial navigation state; $P^-_k$ is the covariance matrix of the predicted frame-$k$ inertial navigation state; $F_k$ is the Jacobian matrix of $f$ at $\hat x_{k-1}$; $P_{k-1}$ is the covariance matrix of the frame-$(k-1)$ inertial navigation state; $Q_k$ is the covariance matrix of the process noise.

The correction equations are:

$$K_k = P^-_k H_k^T\left(H_k P^-_k H_k^T + R_k\right)^{-1} \tag{22}$$

$$\hat x_k = \hat x^-_k + K_k\left(z_k - h(\hat x^-_k)\right) \tag{23}$$

$$P_k = \left(I - K_k H_k\right)P^-_k \tag{24}$$

where $K_k$ is the Kalman gain of frame $k$; $P^-_k$ is the covariance matrix of the predicted frame-$k$ inertial navigation state; $H_k$ is the Jacobian matrix of $h$ at $\hat x^-_k$; $R_k$ is the covariance matrix of the observation noise; $\hat x_k$ is the output value of the frame-$k$ inertial navigation state; $z_k$ is the observed value of frame $k$; $I$ is the identity matrix.

The states $\hat x_k$ computed by the above extended Kalman filter algorithm are the inertial navigation-satellite navigation primary positioning result $x^{IG}_k$.
Furthermore, in step 4, the vision-inertial primary positioning result $x^{VI}_k$ and the inertial navigation-satellite navigation primary positioning result $x^{IG}_k$ are matched through timestamps, and an objective function is constructed:

$$\min_{\{p_k,\,q_k\}}\sum_{k}\left(\left\|T^{c_0}_{k,k+1} - \hat T^{c_0}_{k,k+1}\right\|^2 + \left\|T^{c_0}_{k,k+1} - T^{c_0}_G\,T^G_{k,k+1}\left(T^{c_0}_G\right)^{-1}\right\|^2\right) \tag{25}$$

where $T^G_{k,k+1}$ is the transformation matrix between two frames of the inertial navigation-satellite navigation primary positioning result in the longitude-latitude-height coordinate system; $T^{c_0}_G$ is the transformation matrix from the inertial navigation-satellite navigation primary positioning result to the vision-inertial primary positioning result; $T^{c_0}_{k,k+1}$ is the transformation matrix of the pose between the two frames in the frame-0 camera coordinate system; $\hat T^{c_0}_{k,k+1}$ is the transformation matrix between the two frames of the vision-inertial primary positioning result in the frame-0 camera coordinate system; $p_k$ and $q_k$ are the pose variables to be optimized.

The optimization variables $p_k$ and $q_k$ are adjusted with the Gauss-Newton method until the objective function over the matched frames reaches its minimum; the final optimized $p_k$ and $q_k$ are output as the final pose result $T_k$.
Compared with the prior art, the invention has the following advantages:
Without adding new observation data, the fusion of inertial navigation and satellite navigation raises the output frequency and positioning accuracy of the satellite navigation results, effectively reduces the time matching error of the multi-source data, and improves the positioning accuracy.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Detailed Description
The invention provides a vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, and the technical scheme of the invention is further explained by combining the attached drawings and an embodiment.
As shown in fig. 1, the process of the embodiment of the present invention includes the following steps:
Step 1, rigidly fix the three sensors (inertial navigation unit, binocular camera and satellite antenna) to the same carrier; calibrate the inertial navigation zero offset statically; jointly calibrate the internal reference matrix of the binocular camera and the transformation matrix between the inertial navigation unit and the binocular camera with a calibration plate; measure the transformation matrix from the satellite antenna to the inertial navigation unit and the transformation matrix from the satellite antenna to the binocular camera; this yields the inertial navigation zero offset, the internal reference matrix of the binocular camera and the transformation matrices among the three sensors.
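For illustration, the timestamp matching used in steps 2 to 4 can be sketched as grouping the high-rate inertial samples by the low-rate epochs (camera frames or satellite fixes) they fall between. The following is a minimal sketch; the names (ImuSample, match_imu_to_epochs) are illustrative assumptions rather than part of the patent.

```python
from bisect import bisect_left
from typing import List, Tuple

# (timestamp, (ax, ay, az), (wx, wy, wz)) - one inertial sample
ImuSample = Tuple[float, Tuple[float, float, float], Tuple[float, float, float]]

def match_imu_to_epochs(imu: List[ImuSample],
                        epochs: List[float]) -> List[List[ImuSample]]:
    """Group inertial samples by the [epoch_k, epoch_{k+1}) interval they
    fall in; group k holds the frames indexed by t between frames k and k+1."""
    times = [s[0] for s in imu]
    groups = []
    for t0, t1 in zip(epochs[:-1], epochs[1:]):
        lo = bisect_left(times, t0)
        hi = bisect_left(times, t1)
        groups.append(imu[lo:hi])
    return groups
```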
Step 2, obtain the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the binocular vision data (left and right image sequences $I^L_k$, $I^R_k$); match the inertial acceleration and angular velocity with the binocular vision data through timestamps, fuse the inertial and visual data in a graph-optimized visual SLAM framework, and compute a vision-inertial primary positioning result $x^{VI}_k$ with the same frequency as the binocular vision data.

Step 2.1, from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data, compute the integration result between frame $k$ and frame $k+1$ to obtain the position, velocity and attitude of frame $k+1$:

$$p^w_{b_{k+1}} = p^w_{b_k} + v^w_{b_k}\Delta t_k + \iint_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt^2 \tag{1}$$

$$v^w_{b_{k+1}} = v^w_{b_k} + \int_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt \tag{2}$$

$$q^w_{b_{k+1}} = q^w_{b_k}\otimes\int_{t\in[t_k,\,t_{k+1}]}\frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)q^{b_k}_t\,dt \tag{3}$$

where $p^w_{b_{k+1}}$ is the position of the frame-$(k+1)$ inertial navigation in the world coordinate system (usually the right-hand system constructed with the position of the frame-0 inertial navigation as origin and the gravity, north and east directions as axes); $p^w_{b_k}$ is the position of the frame-$k$ inertial navigation in the world coordinate system; $v^w_{b_k}$ is the velocity of the frame-$k$ inertial navigation in the world coordinate system; $\Delta t_k$ is the time interval between the two key frames; $\hat a_t$ is the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system (the inertial navigation coordinate system, i.e. the Body system, is the right-hand coordinate system constructed with the position of the current inertial navigation frame as origin and the gyroscope and accelerometer directions of the inertial navigation hardware as axes, generally a front-right-down system); $R^w_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system; $b_{a_t}$ is the zero offset of the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system; $g^w$ is the gravity vector in the world coordinate system; $v^w_{b_{k+1}}$ is the velocity of the frame-$(k+1)$ inertial navigation in the world coordinate system; $q^w_{b_{k+1}}$ and $q^w_{b_k}$ are the attitudes of the frame-$(k+1)$ and frame-$k$ inertial navigation in the world coordinate system; $\otimes$ is the quaternion multiplication symbol; $\Omega(\cdot)$ is the angular-velocity transformation function, and $\Omega(\hat\omega_t - b_{\omega_t})$ is obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ is the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $b_{\omega_t}$ is the zero offset of the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $q^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system. Because the inertial data frequency is higher than the visual data frequency, $t$ indexes the inertial data frames between the frame-$k$ and frame-$(k+1)$ visual data.
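The angular-velocity transformation function $\Omega(\cdot)$ is only named in the text; the sketch below assumes the standard quaternion-kinematics form in which $\dot q = \tfrac{1}{2}\Omega(\omega)\,q$ for a quaternion stored as [w, x, y, z], and shows one Euler step of equation (3). The function names are illustrative, not from the patent.

```python
import numpy as np

def omega_matrix(w: np.ndarray) -> np.ndarray:
    """4x4 matrix Omega(w) such that q_dot = 0.5 * Omega(w) @ q."""
    wx, wy, wz = w
    return np.array([[0.0, -wx, -wy, -wz],
                     [wx,  0.0,  wz, -wy],
                     [wy, -wz,  0.0,  wx],
                     [wz,  wy, -wx,  0.0]])

def propagate_attitude(q: np.ndarray, gyro: np.ndarray,
                       bias_g: np.ndarray, dt: float) -> np.ndarray:
    """One Euler step of equation (3): q <- q + 0.5*Omega(gyro - bias)*q*dt."""
    q_new = q + 0.5 * omega_matrix(gyro - bias_g) @ q * dt
    return q_new / np.linalg.norm(q_new)  # keep the quaternion normalized
```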
Since the integration of the inertial data depends on the velocity and attitude of frame $k$, and the back-end nonlinear optimization iterates and updates the frame-$k$ velocity and attitude, the integration would have to be recomputed with the updated values after every iteration, at a very large computational cost. The variable $q^w_{b_k}$ is therefore separated out of the integral terms from frame $k$ to frame $k+1$, i.e.:

$$\alpha^{b_k}_{b_{k+1}} = \iint_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt^2 \tag{4}$$

$$\beta^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt \tag{5}$$

$$\gamma^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} \frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)\gamma^{b_k}_t\,dt \tag{6}$$

where $\alpha^{b_k}_{b_{k+1}}$ is the position change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\beta^{b_k}_{b_{k+1}}$ is the velocity change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the attitude change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $R^{b_k}_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\hat a_t$ and $b_{a_t}$ are the inertial three-axis acceleration and its zero offset in the frame-$t$ inertial navigation coordinate system; $\Omega(\cdot)$ is the angular-velocity transformation function, with $\Omega(\hat\omega_t - b_{\omega_t})$ obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ and $b_{\omega_t}$ are the inertial three-axis angular velocity and its zero offset in the frame-$t$ inertial navigation coordinate system.
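As a structural illustration of equations (4) to (6), the following sketch accumulates the pre-integration increments $\alpha^{b_k}_{b_{k+1}}$, $\beta^{b_k}_{b_{k+1}}$ and $\gamma^{b_k}_{b_{k+1}}$ with simple Euler steps, reusing omega_matrix and propagate_attitude from the sketch above; the patent does not specify the integration scheme, so the stepping rule is an assumption.

```python
import numpy as np

def quat_to_rot(q: np.ndarray) -> np.ndarray:
    """Rotation matrix of a unit quaternion [w, x, y, z]."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def preintegrate(samples, bias_a, bias_g):
    """Accumulate alpha (position), beta (velocity) and gamma (attitude)
    between visual frames k and k+1; samples is a list of (dt, accel, gyro)."""
    alpha, beta = np.zeros(3), np.zeros(3)
    gamma = np.array([1.0, 0.0, 0.0, 0.0])   # identity: frame t = frame b_k
    for dt, accel, gyro in samples:
        R_bk_t = quat_to_rot(gamma)           # rotation from frame t to b_k
        a = R_bk_t @ (accel - bias_a)         # bias-corrected, in frame b_k
        alpha += beta * dt + 0.5 * a * dt * dt
        beta += a * dt
        gamma = propagate_attitude(gamma, gyro, bias_g, dt)
    return alpha, beta, gamma
```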
Step 2.2, use the pre-integration information computed in step 2.1 to impose mutual constraints between vision and inertial navigation, and complete initialization.

The zero offsets of the inertial data are constrained with the visual data, and scale information is provided to the visual data by the inertial data, which facilitates solving the final pose of each frame later.

First, an objective function is constructed from the rotations of vision and inertial navigation, and the variable $\delta b_\omega$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset.

The objective function is:

$$\min_{\delta b_\omega}\sum_{k\in\mathcal{B}}\left\|\left(q^{c_0}_{b_{k+1}}\right)^{-1}\otimes q^{c_0}_{b_k}\otimes\gamma^{b_k}_{b_{k+1}}\right\|^2 \tag{7}$$

where $q^{c_0}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system (the camera coordinate system is the right-hand coordinate system constructed with the position of the current frame camera's left eye as origin and the image horizontal, vertical and forward directions as axes); $q^{c_0}_{b_k}$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\otimes$ is the quaternion multiplication symbol.

Wherein:

$$\gamma^{b_k}_{b_{k+1}} \approx \hat\gamma^{b_k}_{b_{k+1}}\otimes\begin{bmatrix}1\\ \frac{1}{2}J^{\gamma}_{b_\omega}\delta b_\omega\end{bmatrix} \tag{8}$$

where $\hat\gamma^{b_k}_{b_{k+1}}$ is the estimate of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integrating the angular-velocity observations; $\otimes$ is the quaternion multiplication symbol; $J^{\gamma}_{b_\omega}$ is the Jacobian matrix of the rotation increment with respect to the angular-velocity zero offset; $\delta b_\omega$ is the increment of the angular-velocity zero offset.
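A hedged sketch of the gyroscope zero-offset correction behind equations (7) and (8): each frame pair contributes one linearized constraint, and the normal equations are solved for the bias increment $\delta b_\omega$. The inputs (visual rotations, pre-integrated rotations and their bias Jacobians) are assumed to be available from the steps above; quaternions are [w, x, y, z], and all function names are illustrative.

```python
import numpy as np

def quat_mul(p: np.ndarray, q: np.ndarray) -> np.ndarray:
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def quat_inv(q: np.ndarray) -> np.ndarray:
    return np.array([q[0], -q[1], -q[2], -q[3]]) / np.dot(q, q)

def solve_gyro_bias(q_c0_b, gamma_hat, J_gamma_bg):
    """q_c0_b: visual rotations of frames 0..n; gamma_hat: pre-integrated
    rotations between consecutive frames; J_gamma_bg: Jacobians of the
    rotation increment w.r.t. the gyro zero offset. Returns delta_b_omega."""
    A, b = np.zeros((3, 3)), np.zeros(3)
    for k in range(len(gamma_hat)):
        # relative rotation between frames k and k+1 as seen by the camera
        q_vis = quat_mul(quat_inv(q_c0_b[k]), q_c0_b[k + 1])
        # vector part of gamma_hat^{-1} (x) q_vis, linearized via equation (8)
        r = 2.0 * quat_mul(quat_inv(gamma_hat[k]), q_vis)[1:]
        J = J_gamma_bg[k]
        A += J.T @ J
        b += J.T @ r
    return np.linalg.solve(A, b)
```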
Then an objective function is constructed from the translation and rotation of vision and inertial navigation, and the optimization variable $\mathcal{X}_I$ is adjusted so that the objective function reaches its minimum, thereby initializing velocity, gravity and scale.

The objective function is:

$$\min_{\mathcal{X}_I}\sum_{k\in\mathcal{B}}\left\|\hat z^{b_k}_{b_{k+1}} - H^{b_k}_{b_{k+1}}\mathcal{X}_I\right\|^2 = \min_{\mathcal{X}_I}\sum_{k\in\mathcal{B}}\left\|\begin{bmatrix}\delta\alpha^{b_k}_{b_{k+1}}\\ \delta\beta^{b_k}_{b_{k+1}}\end{bmatrix}\right\|^2 \tag{9}$$

with

$$\delta\alpha^{b_k}_{b_{k+1}} = \hat\alpha^{b_k}_{b_{k+1}} - R^{b_k}_{c_0}\left(s\left(\bar p^{c_0}_{b_{k+1}} - \bar p^{c_0}_{b_k}\right) - R^{c_0}_{b_k}v^{b_k}\Delta t_k + \frac{1}{2}g^{c_0}\Delta t_k^2\right)$$

$$\delta\beta^{b_k}_{b_{k+1}} = \hat\beta^{b_k}_{b_{k+1}} - R^{b_k}_{c_0}\left(R^{c_0}_{b_{k+1}}v^{b_{k+1}} + g^{c_0}\Delta t_k - R^{c_0}_{b_k}v^{b_k}\right)$$

where $\mathcal{X}_I$ denotes the optimization variables; $\hat z^{b_k}_{b_{k+1}} - H^{b_k}_{b_{k+1}}\mathcal{X}_I$ is the inertial measurement residual constructed from $\mathcal{X}_I$, and $\min_{\mathcal{X}_I}$ means adjusting $\mathcal{X}_I$ so that this residual reaches its minimum; $\delta\alpha^{b_k}_{b_{k+1}}$ is the pre-integration position increment error of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\delta\beta^{b_k}_{b_{k+1}}$ is the corresponding pre-integration velocity increment error; $\hat\alpha^{b_k}_{b_{k+1}}$ is the pre-integration position increment of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\hat\beta^{b_k}_{b_{k+1}}$ is the corresponding pre-integration velocity increment; $R^{b_k}_{c_0}$ is the rotation matrix from the frame-0 camera coordinate system to the frame-$k$ inertial navigation coordinate system; $s$ is the scale; $R^{c_0}_{b_k}$ is the rotation matrix from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $v^{b_{k+1}}$ and $v^{b_k}$ are the three-axis velocities in the frame-$(k+1)$ and frame-$k$ inertial navigation coordinate systems; $R^{c_0}_{b_{k+1}}$ is the rotation matrix from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\bar p^{c_0}_{b_k}$ and $\bar p^{c_0}_{b_{k+1}}$ are the relative positions of the frame-$k$ and frame-$(k+1)$ inertial navigation in the frame-0 camera coordinate system; $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system; $\Delta t_k$ is the time interval between the two key frames.

Wherein:

$$\mathcal{X}_I = \begin{bmatrix}v^{b_0}_{b_0},\; v^{b_1}_{b_1},\; \ldots,\; v^{b_n}_{b_n},\; g^{c_0},\; s\end{bmatrix}^T \tag{10}$$

where $\mathcal{X}_I$ is the optimization variable containing $3(n+1)+3+1$ unknowns: $3(n+1)$ refers to the three-axis velocity unknowns of the $n+1$ frames from frame 0 to frame $n$, 3 refers to the gravity-vector unknowns represented by $g^{c_0}$, and 1 refers to the scale unknown represented by $s$; $v^{b_0}_{b_0},\ldots,v^{b_n}_{b_n}$ are the three-axis velocities of frames 0 through $n$ in their inertial navigation coordinate systems; $g^{c_0}$ is the gravity vector in the frame-0 camera coordinate system; $s$ is the scale.
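Structurally, equations (9) and (10) form one linear least-squares problem in the stacked unknowns $[v^{b_0}_{b_0},\ldots,v^{b_n}_{b_n}, g^{c_0}, s]$. The sketch below stacks a 6-row block per frame pair and solves the system; the block builder is left as a hypothetical input, since the patent gives only the residual definitions.

```python
import numpy as np

def solve_velocity_gravity_scale(n_frames, pair_blocks):
    """n_frames: number of frames (n + 1); pair_blocks: list of (H_k, z_k)
    with H_k of shape (6, dim) and z_k of shape (6,), one block per
    consecutive frame pair as in equation (9)."""
    dim = 3 * n_frames + 3 + 1            # 3(n+1) velocities + gravity + scale
    H = np.vstack([Hk for Hk, _ in pair_blocks])
    z = np.concatenate([zk for _, zk in pair_blocks])
    assert H.shape[1] == dim
    x, *_ = np.linalg.lstsq(H, z, rcond=None)
    velocities = x[:3 * n_frames].reshape(n_frames, 3)
    g_c0, s = x[-4:-1], x[-1]             # layout of equation (10)
    return velocities, g_c0, s
```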
Step 2.3, construct an objective function comprising marginalization prior information, the inertial measurement residual and the visual reprojection residual; optimize the variables of each frame so that the objective function reaches its minimum, and solve the final pose result of each frame.

The objective function is:

$$\min_{\mathcal{X}}\left\{\left\|r_p - H_p\mathcal{X}\right\|^2 + \sum_{k\in\mathcal{B}}\left\|r_{\mathcal{B}}\!\left(\hat z^{b_k}_{b_{k+1}},\mathcal{X}\right)\right\|^2 + \sum_{(l,j)\in\mathcal{C}}\left\|r_{\mathcal{C}}\!\left(\hat z^{c_j}_l,\mathcal{X}\right)\right\|^2\right\} \tag{11}$$

where $r_p$ is the marginalization prior information; $H_p$ is the Jacobian matrix computed from the prior information; $r_{\mathcal{B}}(\hat z^{b_k}_{b_{k+1}},\mathcal{X})$ is the inertial measurement residual constructed from the optimization variables $\mathcal{X}$; $r_{\mathcal{C}}(\hat z^{c_j}_l,\mathcal{X})$ is the visual reprojection residual constructed from the optimization variables $\mathcal{X}$.

Wherein:

$$\mathcal{X} = \begin{bmatrix}x_0,\; x_1,\; \ldots,\; x_n,\; x^b_c,\; \lambda_0,\; \lambda_1,\; \ldots,\; \lambda_m\end{bmatrix} \tag{12}$$

Among the optimization variables $\mathcal{X}$, $x_k$ is the state vector of frame $k$ to be optimized; $x^b_c$ is the transformation matrix from the camera coordinate system to the inertial navigation coordinate system; $\lambda_m$ is the inverse depth (the reciprocal of the depth) of the $m$-th feature point.

$$x_k = \begin{bmatrix}p^w_{b_k},\; v^w_{b_k},\; q^w_{b_k},\; b_a,\; b_g\end{bmatrix},\quad k\in[0,n] \tag{13}$$

$$x^b_c = \begin{bmatrix}p^b_c,\; q^b_c\end{bmatrix} \tag{14}$$

where $x_0,\ldots,x_n$ are the state vectors of frames 0 through $n$ to be optimized; $p^w_{b_k}$ is the position of the frame-$k$ inertial navigation in the world coordinate system; $v^w_{b_k}$ is the velocity of the frame-$k$ inertial navigation in the world coordinate system; $q^w_{b_k}$ is the attitude of the frame-$k$ inertial navigation in the world coordinate system; $b_a$ is the zero offset of the inertial three-axis acceleration in the frame-$k$ inertial navigation coordinate system; $b_g$ is the zero offset of the inertial three-axis angular velocity in the frame-$k$ inertial navigation coordinate system; $p^b_c$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system; $q^b_c$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system.

The optimization variables $\mathcal{X}$ (equation 12) are adjusted with the Gauss-Newton method until the objective function (equation 11) reaches its minimum; the state vectors obtained at that point are the vision-inertial primary positioning result $x^{VI}_k$.
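The Gauss-Newton adjustment of the optimization variables can be sketched generically as below; the callback returning the stacked residual and Jacobian of objective (11) is an assumption standing in for the concatenated prior, inertial and reprojection terms.

```python
import numpy as np

def gauss_newton(x0, residual_and_jacobian, iters=10, tol=1e-8):
    """Minimize 0.5*||r(x)||^2 given a callback returning (r, J) at x."""
    x = x0.copy()
    for _ in range(iters):
        r, J = residual_and_jacobian(x)      # r: (m,), J: (m, n)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x
```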
Step 3, obtain the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the satellite navigation data $z^G_k$; match the inertial angular velocity and acceleration with the satellite navigation data through timestamps, fuse the inertial and satellite navigation data based on extended Kalman filtering, and compute an inertial navigation-satellite navigation primary positioning result $x^{IG}_k$ with the same frequency as the inertial data.

From the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the satellite navigation data $z^G_k$, the Kalman-filter state transition equation and observation equation are constructed:

$$x_k = f(x_{k-1}) + w_k \tag{15}$$

$$z_k = h(x_k) + v_k \tag{16}$$

where $z_k$ is the observed value of frame $k$; $w_k$ is the process noise of frame $k$; $v_k$ is the observation noise of frame $k$; $x_k$ is the inertial navigation state of frame $k$, consisting of the position, velocity, attitude, gyroscope zero-offset and accelerometer zero-offset parameters, expressed as:

$$x_k = \begin{bmatrix}p_k,\; v_k,\; q_k,\; b_g,\; b_a\end{bmatrix} \tag{17}$$

Using Taylor expansions, $f$ and $h$ are expanded about the estimate of the previous frame $\hat x_{k-1}$ and the state prediction of the current frame $\hat x^-_k$, respectively:

$$f(x_{k-1}) \approx f(\hat x_{k-1}) + F_k\left(x_{k-1} - \hat x_{k-1}\right) \tag{18}$$

$$h(x_k) \approx h(\hat x^-_k) + H_k\left(x_k - \hat x^-_k\right) \tag{19}$$

where $F_k$ is the Jacobian matrix of the function $f$ at $\hat x_{k-1}$, and $H_k$ is the Jacobian matrix of the function $h$ at $\hat x^-_k$.

The prediction equations are:

$$\hat x^-_k = f(\hat x_{k-1}) \tag{20}$$

$$P^-_k = F_k P_{k-1} F_k^T + Q_k \tag{21}$$

where $\hat x^-_k$ is the predicted value of the frame-$k$ inertial navigation state; $P^-_k$ is the covariance matrix of the predicted frame-$k$ inertial navigation state; $F_k$ is the Jacobian matrix of $f$ at $\hat x_{k-1}$; $P_{k-1}$ is the covariance matrix of the frame-$(k-1)$ inertial navigation state; $Q_k$ is the covariance matrix of the process noise.

The correction equations are:

$$K_k = P^-_k H_k^T\left(H_k P^-_k H_k^T + R_k\right)^{-1} \tag{22}$$

$$\hat x_k = \hat x^-_k + K_k\left(z_k - h(\hat x^-_k)\right) \tag{23}$$

$$P_k = \left(I - K_k H_k\right)P^-_k \tag{24}$$

where $K_k$ is the Kalman gain of frame $k$; $P^-_k$ is the covariance matrix of the predicted frame-$k$ inertial navigation state; $H_k$ is the Jacobian matrix of $h$ at $\hat x^-_k$; $R_k$ is the covariance matrix of the observation noise; $\hat x_k$ is the output value of the frame-$k$ inertial navigation state; $z_k$ is the observed value of frame $k$; $I$ is the identity matrix.

The states $\hat x_k$ computed by the extended Kalman filter algorithm are the inertial navigation-satellite navigation primary positioning result $x^{IG}_k$.
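The predict-correct cycle of equations (15) to (24) is summarized in the sketch below; the state transition f, the observation model h and their Jacobians F and H come from the inertial mechanization and the satellite observation model, which the patent does not spell out, so they are passed in as assumed callbacks.

```python
import numpy as np

def ekf_step(x_prev, P_prev, z, f, F, h, H, Q, R):
    """One extended-Kalman-filter step: prediction (20)-(21), correction (22)-(24)."""
    x_pred = f(x_prev)                     # equation (20)
    F_k = F(x_prev)
    P_pred = F_k @ P_prev @ F_k.T + Q      # equation (21)
    H_k = H(x_pred)
    S = H_k @ P_pred @ H_k.T + R
    K = P_pred @ H_k.T @ np.linalg.inv(S)  # equation (22)
    x_new = x_pred + K @ (z - h(x_pred))   # equation (23)
    P_new = (np.eye(len(x_prev)) - K @ H_k) @ P_pred  # equation (24)
    return x_new, P_new
```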
Step 4, match the two computed primary positioning results $x^{VI}_k$ and $x^{IG}_k$ through timestamps, construct an objective function from the two primary positioning results, perform global optimization, and obtain the final pose result $T_k$.

The vision-inertial primary positioning result $x^{VI}_k$ and the inertial navigation-satellite navigation primary positioning result $x^{IG}_k$ are matched through timestamps, and an objective function is constructed:

$$\min_{\{p_k,\,q_k\}}\sum_{k}\left(\left\|T^{c_0}_{k,k+1} - \hat T^{c_0}_{k,k+1}\right\|^2 + \left\|T^{c_0}_{k,k+1} - T^{c_0}_G\,T^G_{k,k+1}\left(T^{c_0}_G\right)^{-1}\right\|^2\right) \tag{25}$$

where $T^G_{k,k+1}$ is the transformation matrix between two frames of the inertial navigation-satellite navigation primary positioning result in the longitude-latitude-height coordinate system; $T^{c_0}_G$ is the transformation matrix from the inertial navigation-satellite navigation primary positioning result to the vision-inertial primary positioning result; $T^{c_0}_{k,k+1}$ is the transformation matrix of the pose between the two frames in the frame-0 camera coordinate system; $\hat T^{c_0}_{k,k+1}$ is the transformation matrix between the two frames of the vision-inertial primary positioning result in the frame-0 camera coordinate system; $p_k$ and $q_k$ are the pose variables to be optimized.

The optimization variables $p_k$ and $q_k$ are adjusted with the Gauss-Newton method until the objective function over the matched frames reaches its minimum; the final optimized $p_k$ and $q_k$ are output as the final pose result $T_k$.
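As one possible reading of objective (25), the secondary fusion compares, for each matched frame pair, the relative pose from the vision-inertial result with the relative pose from the inertial-satellite result mapped through the alignment transform. The sketch below builds such residual blocks from 4x4 homogeneous matrices; all names are illustrative assumptions.

```python
import numpy as np

def relative_pose(T_a: np.ndarray, T_b: np.ndarray) -> np.ndarray:
    return np.linalg.inv(T_a) @ T_b

def fusion_residuals(T_vi, T_ig, T_align):
    """T_vi, T_ig: timestamp-matched lists of 4x4 poses from the two primary
    positioning results; T_align: transform from the inertial-satellite
    result to the vision-inertial result."""
    res = []
    for k in range(len(T_vi) - 1):
        d_vi = relative_pose(T_vi[k], T_vi[k + 1])
        d_ig = relative_pose(T_ig[k], T_ig[k + 1])
        # express the inertial-satellite relative motion in the camera frame
        d_ig_c0 = T_align @ d_ig @ np.linalg.inv(T_align)
        res.append((d_vi - d_ig_c0).ravel())   # one residual block per pair
    return np.concatenate(res)
```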
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or scope of the invention as defined in the appended claims.

Claims (10)

1. A vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing, characterized by comprising the following steps:
step 1, acquiring the inertial navigation zero offset, the internal reference matrix of the binocular camera, and the transformation matrices among the three sensors (inertial navigation unit, binocular camera and satellite antenna);
step 2, obtaining the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the binocular vision data ($I^L_k$, $I^R_k$); matching the inertial acceleration and angular velocity with the binocular vision data through timestamps, fusing the inertial and visual data in a graph-optimized visual SLAM framework, and computing a vision-inertial primary positioning result $x^{VI}_k$ with the same frequency as the binocular vision data;
step 2.1, from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data, computing the integration result between frame $k$ and frame $k+1$ to obtain the position, velocity and attitude of frame $k+1$;
step 2.2, using the pre-integration information computed in step 2.1 to impose mutual constraints between vision and inertial navigation and complete initialization;
step 2.3, constructing an objective function comprising three parts, the marginalization prior information, the inertial measurement residual and the visual reprojection residual; optimizing the state vector of each frame so that the objective function reaches its minimum, and solving the final pose result of each frame;
step 3, obtaining the inertial navigation three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data and the satellite navigation data $z^G_k$; matching the inertial angular velocity and acceleration with the satellite navigation data through timestamps, fusing the inertial and satellite navigation data based on extended Kalman filtering, and computing an inertial navigation-satellite navigation primary positioning result $x^{IG}_k$ with the same frequency as the inertial data;
step 4, matching the two computed primary positioning results $x^{VI}_k$ and $x^{IG}_k$ through timestamps, constructing an objective function from the two primary positioning results, performing global optimization, and obtaining the final pose result $T_k$.
2. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing according to claim 1, characterized in that: in step 1, the three sensors (inertial navigation unit, binocular camera and satellite antenna) are rigidly fixed to the same carrier; the inertial navigation zero offset is calibrated statically; the internal reference matrix of the binocular camera and the transformation matrix between the inertial navigation unit and the binocular camera are calibrated jointly with a calibration plate; the transformation matrix from the satellite antenna to the inertial navigation unit and the transformation matrix from the satellite antenna to the binocular camera are measured; this yields the inertial navigation zero offset, the internal reference matrix of the binocular camera and the transformation matrices among the three sensors.
3. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing according to claim 1, characterized in that: in step 2.1, from the inertial three-axis acceleration $\hat a_t$ and three-axis angular velocity $\hat\omega_t$ data, the integration result between frame $k$ and frame $k+1$ is computed to obtain the position, velocity and attitude of frame $k+1$:

$$p^w_{b_{k+1}} = p^w_{b_k} + v^w_{b_k}\Delta t_k + \iint_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt^2 \tag{1}$$

$$v^w_{b_{k+1}} = v^w_{b_k} + \int_{t\in[t_k,\,t_{k+1}]}\left(R^w_t\left(\hat a_t - b_{a_t}\right) - g^w\right)dt \tag{2}$$

$$q^w_{b_{k+1}} = q^w_{b_k}\otimes\int_{t\in[t_k,\,t_{k+1}]}\frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)q^{b_k}_t\,dt \tag{3}$$

where $p^w_{b_{k+1}}$ is the position of the frame-$(k+1)$ inertial navigation in the world coordinate system; $p^w_{b_k}$ is the position of the frame-$k$ inertial navigation in the world coordinate system; $v^w_{b_k}$ is the velocity of the frame-$k$ inertial navigation in the world coordinate system; $\Delta t_k$ is the time interval between the two key frames; $\hat a_t$ is the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system; $R^w_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the world coordinate system; $b_{a_t}$ is the zero offset of the inertial three-axis acceleration in the frame-$t$ inertial navigation coordinate system; $g^w$ is the gravity vector in the world coordinate system; $v^w_{b_{k+1}}$ is the velocity of the frame-$(k+1)$ inertial navigation in the world coordinate system; $q^w_{b_{k+1}}$ and $q^w_{b_k}$ are the attitudes of the frame-$(k+1)$ and frame-$k$ inertial navigation in the world coordinate system; $\otimes$ is the quaternion multiplication symbol; $\Omega(\cdot)$ is the angular-velocity transformation function, and $\Omega(\hat\omega_t - b_{\omega_t})$ is obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ is the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $b_{\omega_t}$ is the zero offset of the inertial three-axis angular velocity in the frame-$t$ inertial navigation coordinate system; $q^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation coordinate system relative to the frame-$k$ inertial navigation coordinate system; because the inertial data frequency is higher than the visual data frequency, $t$ indexes the inertial data frames between the frame-$k$ and frame-$(k+1)$ visual data; the world coordinate system is the right-hand coordinate system constructed with the position of the frame-0 inertial navigation as origin and the gravity, north and east directions as axes, and the inertial navigation coordinate system is the right-hand coordinate system constructed with the position of the current inertial navigation frame as origin and the gyroscope and accelerometer directions of the inertial navigation hardware as axes.
4. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing according to claim 3, characterized in that: in step 2.1 the variable $q^w_{b_k}$ is separated out of the integral terms from frame $k$ to frame $k+1$, giving:

$$\alpha^{b_k}_{b_{k+1}} = \iint_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt^2 \tag{4}$$

$$\beta^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} R^{b_k}_t\left(\hat a_t - b_{a_t}\right)dt \tag{5}$$

$$\gamma^{b_k}_{b_{k+1}} = \int_{t\in[t_k,\,t_{k+1}]} \frac{1}{2}\,\Omega\!\left(\hat\omega_t - b_{\omega_t}\right)\gamma^{b_k}_t\,dt \tag{6}$$

where $\alpha^{b_k}_{b_{k+1}}$ is the position change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\beta^{b_k}_{b_{k+1}}$ is the velocity change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the attitude change of the frame-$(k+1)$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $\gamma^{b_k}_t$ is the attitude change of the frame-$t$ inertial navigation in the frame-$k$ inertial navigation coordinate system; $R^{b_k}_t$ is the rotation matrix from the frame-$t$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\hat a_t$ and $b_{a_t}$ are the inertial three-axis acceleration and its zero offset in the frame-$t$ inertial navigation coordinate system; $\Omega(\cdot)$ is the angular-velocity transformation function, with $\Omega(\hat\omega_t - b_{\omega_t})$ obtained by substituting $\hat\omega_t - b_{\omega_t}$ into $\Omega(\cdot)$; $\hat\omega_t$ and $b_{\omega_t}$ are the inertial three-axis angular velocity and its zero offset in the frame-$t$ inertial navigation coordinate system.
5. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing according to claim 1, characterized in that: in step 2.2 the zero offsets of the inertial data are constrained with the visual data, and scale information is provided to the visual data by the inertial data; first an objective function is constructed from the rotations of vision and inertial navigation, and the variable $\delta b_\omega$ is adjusted so that the objective function reaches its minimum, thereby correcting the gyroscope zero offset; the objective function is:

$$\min_{\delta b_\omega}\sum_{k\in\mathcal{B}}\left\|\left(q^{c_0}_{b_{k+1}}\right)^{-1}\otimes q^{c_0}_{b_k}\otimes\gamma^{b_k}_{b_{k+1}}\right\|^2 \tag{7}$$

where $q^{c_0}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-0 camera coordinate system; $q^{c_0}_{b_k}$ is the rotation from the frame-$k$ inertial navigation coordinate system to the frame-0 camera coordinate system; $\gamma^{b_k}_{b_{k+1}}$ is the rotation from the frame-$(k+1)$ inertial navigation coordinate system to the frame-$k$ inertial navigation coordinate system; $\otimes$ is the quaternion multiplication symbol; the camera coordinate system is the right-hand coordinate system constructed with the position of the current frame camera's left eye as origin and the image horizontal, vertical and forward directions as axes;

wherein:

$$\gamma^{b_k}_{b_{k+1}} \approx \hat\gamma^{b_k}_{b_{k+1}}\otimes\begin{bmatrix}1\\ \frac{1}{2}J^{\gamma}_{b_\omega}\delta b_\omega\end{bmatrix} \tag{8}$$

where $\hat\gamma^{b_k}_{b_{k+1}}$ is the estimate of the rotation of the frame-$(k+1)$ inertial navigation coordinate system in the frame-$k$ inertial navigation coordinate system, obtained by integrating the angular-velocity observations; $\otimes$ is the quaternion multiplication symbol; $J^{\gamma}_{b_\omega}$ is the Jacobian matrix of the rotation increment with respect to the angular-velocity zero offset; $\delta b_\omega$ is the increment of the angular-velocity zero offset.
6. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 5, characterized in that: step 2.2 then constructs an objective function from the translation and rotation of vision and inertial navigation, and the optimization variable $\mathcal{X}_I$ is adjusted so that the objective function reaches its minimum value, thereby initializing velocity, gravity and scale; the objective function is:

$$\min_{\mathcal{X}_I} \sum_{k} \left\| r_I\!\left(\hat z_{b_{k+1}}^{b_k}, \mathcal{X}_I\right) \right\|^2, \qquad r_I\!\left(\hat z_{b_{k+1}}^{b_k}, \mathcal{X}_I\right) = \begin{bmatrix} \delta\alpha_{b_{k+1}}^{b_k} \\[2pt] \delta\beta_{b_{k+1}}^{b_k} \end{bmatrix} = \begin{bmatrix} \hat\alpha_{b_{k+1}}^{b_k} - R_{c_0}^{b_k}\!\left( s\left(\bar p_{b_{k+1}}^{c_0} - \bar p_{b_k}^{c_0}\right) - R_{b_k}^{c_0} v_{b_k}^{b_k} \Delta t_k + \tfrac{1}{2} g^{c_0} \Delta t_k^2 \right) \\[4pt] \hat\beta_{b_{k+1}}^{b_k} - R_{c_0}^{b_k}\!\left( R_{b_{k+1}}^{c_0} v_{b_{k+1}}^{b_{k+1}} - R_{b_k}^{c_0} v_{b_k}^{b_k} + g^{c_0} \Delta t_k \right) \end{bmatrix} \qquad (9)$$

in the formula, $\mathcal{X}_I$ represents the optimization variable, $r_I(\hat z_{b_{k+1}}^{b_k}, \mathcal{X}_I)$ is the inertial measurement residual constructed from the optimization variable $\mathcal{X}_I$, $\min_{\mathcal{X}_I}$ represents adjusting the optimization variable $\mathcal{X}_I$ so that the inertial measurement residual $r_I(\hat z_{b_{k+1}}^{b_k}, \mathcal{X}_I)$ reaches its minimum, $\delta\alpha_{b_{k+1}}^{b_k}$ is the pre-integrated position increment error of the (k+1)-th frame inertial navigation in the k-th frame inertial navigation coordinate system, $\delta\beta_{b_{k+1}}^{b_k}$ is the pre-integrated velocity increment error of the (k+1)-th frame inertial navigation in the k-th frame inertial navigation coordinate system, $\hat\alpha_{b_{k+1}}^{b_k}$ is the pre-integrated position increment of the (k+1)-th frame inertial navigation in the k-th frame inertial navigation coordinate system, $\hat\beta_{b_{k+1}}^{b_k}$ is the pre-integrated velocity increment of the (k+1)-th frame inertial navigation in the k-th frame inertial navigation coordinate system, $R_{c_0}^{b_k}$ is the rotation matrix from the 0-th frame camera coordinate system to the k-th frame inertial navigation coordinate system, $s$ is the scale, $R_{b_k}^{c_0}$ is the rotation matrix from the k-th frame inertial navigation coordinate system to the 0-th frame camera coordinate system, $v_{b_{k+1}}^{b_{k+1}}$ is the three-axis velocity in the (k+1)-th frame inertial navigation coordinate system, $v_{b_k}^{b_k}$ is the three-axis velocity in the k-th frame inertial navigation coordinate system, $R_{b_{k+1}}^{c_0}$ is the rotation matrix from the (k+1)-th frame inertial navigation coordinate system to the 0-th frame camera coordinate system, $\bar p_{b_k}^{c_0}$ is the relative position of the k-th frame inertial navigation in the 0-th frame camera coordinate system, $\bar p_{b_{k+1}}^{c_0}$ is the relative position of the (k+1)-th frame inertial navigation in the 0-th frame camera coordinate system, $g^{c_0}$ is the gravity vector in the 0-th frame camera coordinate system, and $\Delta t_k$ is the time interval between two keyframes;

wherein:

$$\mathcal{X}_I = \left[ v_{b_0}^{b_0},\, v_{b_1}^{b_1},\, \ldots,\, v_{b_n}^{b_n},\, g^{c_0},\, s \right] \in \mathbb{R}^{3(n+1)+3+1} \qquad (10)$$

in the formula, $\mathcal{X}_I$ is the optimization variable containing $3(n+1)+3+1$ unknowns: $3(n+1)$ refers to the three-axis velocity unknowns of the $n+1$ frames from frame 0 to frame $n$, 3 refers to the gravity vector unknowns represented by $g^{c_0}$, and 1 refers to the scale unknown represented by $s$; $v_{b_0}^{b_0}, \ldots, v_{b_n}^{b_n}$ are the three-axis velocities of frames 0 through $n$ in the inertial navigation coordinate system, $g^{c_0}$ is the gravity vector in the 0-th frame camera coordinate system, and $s$ is the scale.
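A hedged sketch of the corresponding linear initialization: stacking the position and velocity constraints of equation (9) for all consecutive keyframe pairs yields an ordinary least-squares problem in $\mathcal{X}_I$. Variable names and the omission of the camera-IMU lever arm are simplifying assumptions:

```python
import numpy as np

def linear_alignment(R_c0_b, p_c0, alpha, beta, dt):
    """Least-squares initialization of per-frame velocities, gravity g^{c0}
    and metric scale s from visual poses and IMU pre-integration.

    R_c0_b: n rotation matrices R^{c0}_{b_k} (visual attitudes)
    p_c0:   n up-to-scale keyframe positions in the frame-0 camera system
    alpha, beta: n-1 pre-integrated position / velocity increments
    dt:     n-1 keyframe time intervals
    """
    n = len(R_c0_b)
    dim = 3 * n + 3 + 1                  # velocities + gravity + scale
    A = np.zeros((6 * (n - 1), dim))
    b = np.zeros(6 * (n - 1))
    for k in range(n - 1):
        Ri_T = R_c0_b[k].T               # R^{b_k}_{c0}
        r = 6 * k
        # position constraint (first block of the residual in equation (9))
        A[r:r+3, 3*k:3*k+3] = -dt[k] * np.eye(3)              # v_k
        A[r:r+3, 3*n:3*n+3] = 0.5 * dt[k]**2 * Ri_T           # g^{c0}
        A[r:r+3, 3*n+3] = Ri_T @ (p_c0[k+1] - p_c0[k])        # s
        b[r:r+3] = alpha[k]
        # velocity constraint (second block of the residual)
        A[r+3:r+6, 3*k:3*k+3] = -np.eye(3)                    # v_k
        A[r+3:r+6, 3*(k+1):3*(k+1)+3] = Ri_T @ R_c0_b[k+1]    # v_{k+1}
        A[r+3:r+6, 3*n:3*n+3] = dt[k] * Ri_T                  # g^{c0}
        b[r+3:r+6] = beta[k]
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    v = x[:3*n].reshape(n, 3)            # body-frame velocities
    g, s = x[3*n:3*n+3], x[3*n+3]        # gravity in c0, scale
    return v, g, s
```

In practice the gravity magnitude is usually refined to a known value (about 9.8 m/s^2) after this solve; that refinement step is omitted here.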
7. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: the objective function in step 2.3 is:

$$\min_{\mathcal{X}} \left\{ \left\| r_p - H_p \mathcal{X} \right\|^2 + \sum_{k} \left\| r_B\!\left(\hat z_{b_{k+1}}^{b_k}, \mathcal{X}\right) \right\|^2 + \sum_{(l,j)} \left\| r_C\!\left(\hat z_{l}^{c_j}, \mathcal{X}\right) \right\|^2 \right\} \qquad (11)$$

in the formula, $r_p$ is the marginalized prior information, $H_p$ is the Jacobian matrix calculated from the prior information, $r_B(\hat z_{b_{k+1}}^{b_k}, \mathcal{X})$ is the inertial measurement residual constructed from the optimization variable $\mathcal{X}$, and $r_C(\hat z_l^{c_j}, \mathcal{X})$ is the visual reprojection residual constructed from the optimization variable $\mathcal{X}$;

wherein:

$$\mathcal{X} = \left[ x_0,\, x_1,\, \ldots,\, x_n,\, x_c^b,\, \lambda_0,\, \lambda_1,\, \ldots,\, \lambda_m \right] \qquad (12)$$

in the optimization variable $\mathcal{X}$, $x_k$ is the state vector of the k-th frame to be optimized, $x_c^b$ is the transformation from the camera coordinate system to the inertial navigation coordinate system, and $\lambda_m$ is the inverse depth of the m-th feature point;

$$x_k = \left[ p_{b_k}^{w},\, v_{b_k}^{w},\, q_{b_k}^{w},\, b_a,\, b_\omega \right], \quad k \in [0, n] \qquad (13)$$

$$x_c^b = \left[ p_c^b,\, q_c^b \right] \qquad (14)$$

in the formula, $x_0, \ldots, x_n$ are the state vectors of frames 0 through $n$ to be optimized, $p_{b_k}^w$ is the position of the k-th frame inertial navigation in the world coordinate system, $v_{b_k}^w$ is the velocity of the k-th frame inertial navigation in the world coordinate system, $q_{b_k}^w$ is the attitude of the k-th frame inertial navigation in the world coordinate system, $b_a$ is the zero bias of the three-axis acceleration of inertial navigation in the k-th frame inertial navigation coordinate system, $b_\omega$ is the zero bias of the three-axis angular velocity of inertial navigation in the k-th frame inertial navigation coordinate system, $p_c^b$ is the lever-arm value from the camera coordinate system to the inertial navigation coordinate system, and $q_c^b$ is the rotation quaternion from the camera coordinate system to the inertial navigation coordinate system;

the optimization variable $\mathcal{X}$ (equation 12) is adjusted using the Gauss-Newton method so that the value of the objective function (equation 11) reaches its minimum, and the state vector $x_k$ obtained at this time is the vision-inertial navigation primary positioning result $P_{VI}$.
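For illustration, a generic Gauss-Newton loop of the kind used to minimize equation (11); a real sliding-window solver updates manifold states (quaternions) with local parameterizations and robust kernels, so this dense-matrix version is only a sketch and all names are illustrative:

```python
import numpy as np

def gauss_newton(x0, residual_blocks, iters=10, tol=1e-8):
    """Minimize sum_i ||r_i(x)||^2 over residual blocks.

    residual_blocks: callables x -> (r, J) returning a residual vector and
    its Jacobian; here they would be the marginalization prior, the IMU
    pre-integration residuals r_B and the visual reprojection residuals r_C.
    """
    x = x0.astype(float).copy()
    for _ in range(iters):
        H = np.zeros((x.size, x.size))
        g = np.zeros(x.size)
        for block in residual_blocks:
            r, J = block(x)
            H += J.T @ J                 # Gauss-Newton approximation of the Hessian
            g += J.T @ r                 # gradient of the quadratic model
        dx = np.linalg.solve(H, -g)      # normal equations
        x += dx
        if np.linalg.norm(dx) < tol:     # converged
            break
    return x
```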
8. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 3, a state transition equation and an observation equation of the Kalman filter are constructed from the three-axis acceleration $\hat a$ and three-axis angular velocity $\hat\omega$ data of inertial navigation together with the satellite navigation data $z_k$:

$$x_k = f(x_{k-1}, u_k) + w_k \qquad (15)$$

$$z_k = h(x_k) + v_k \qquad (16)$$

in the formula, $z_k$ is the observed value of the k-th frame, $w_k$ is the process noise of the k-th frame, $v_k$ is the observation noise of the k-th frame, $u_k$ is the inertial navigation input composed of $\hat a$ and $\hat\omega$, and $x_k$ is the state of the k-th frame inertial navigation, consisting of the position, velocity, attitude, gyroscope zero-bias and acceleration zero-bias parameters, expressed as:

$$x_k = \left[ p_k,\, v_k,\, q_k,\, b_\omega,\, b_a \right] \qquad (17)$$

using Taylor expansion, the state transition equation and the observation equation are expanded around the estimate of the previous frame $\hat x_{k-1}$ and the state prediction of the current frame $\check x_k$:

$$x_k \approx f(\hat x_{k-1}, u_k) + F_k \left( x_{k-1} - \hat x_{k-1} \right) + w_k \qquad (18)$$

$$z_k \approx h(\check x_k) + H_k \left( x_k - \check x_k \right) + v_k \qquad (19)$$

in the formula, $F_k$ is the Jacobian matrix of the function $f(\cdot)$ at $\hat x_{k-1}$, and $H_k$ is the Jacobian matrix of the function $h(\cdot)$ at $\check x_k$.
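Where analytic Jacobians $F_k$ and $H_k$ are inconvenient to derive, a finite-difference stand-in illustrates what equations (18)-(19) linearize; this numerical version is for illustration only and is not the patent's method:

```python
import numpy as np

def numerical_jacobian(func, x, eps=1e-6):
    """Finite-difference Jacobian of func at x, a stand-in for the analytic
    F_k = df/dx at x_hat_{k-1} and H_k = dh/dx at x_check_k."""
    f0 = np.asarray(func(x))
    J = np.zeros((f0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.asarray(func(x + dx)) - f0) / eps  # forward difference
    return J
```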
9. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 8, characterized in that: the prediction equations in step 3 are:

$$\check x_k = f(\hat x_{k-1}, u_k) \qquad (20)$$

$$\check P_k = F_k \hat P_{k-1} F_k^{T} + Q_k \qquad (21)$$

in the formula, $\check x_k$ is the predicted value of the k-th frame inertial navigation state, $\check P_k$ is the covariance matrix of the predicted value of the k-th frame inertial navigation state, $F_k$ is the Jacobian matrix of the function $f(\cdot)$ at $\hat x_{k-1}$, $\hat P_{k-1}$ is the covariance matrix of the (k-1)-th frame inertial navigation state, and $Q_k$ is the covariance matrix of the process noise;

the correction equations are:

$$K_k = \check P_k H_k^{T} \left( H_k \check P_k H_k^{T} + R_k \right)^{-1} \qquad (22)$$

$$\hat x_k = \check x_k + K_k \left( z_k - h(\check x_k) \right) \qquad (23)$$

$$\hat P_k = \left( I - K_k H_k \right) \check P_k \qquad (24)$$

in the formula, $K_k$ is the Kalman gain of the k-th frame, $\check P_k$ is the covariance matrix of the predicted value of the k-th frame inertial navigation state, $H_k$ is the Jacobian matrix of the function $h(\cdot)$ at $\check x_k$, $R_k$ is the covariance matrix of the observation noise, $\hat x_k$ is the output value of the k-th frame inertial navigation state, $z_k$ is the observed value of the k-th frame, $\check x_k$ is the predicted value of the k-th frame inertial navigation state, and $I$ is the identity matrix;

the $\hat x_k$ calculated by the above extended Kalman filter algorithm is the inertial navigation-satellite navigation primary positioning result $P_{IG}$.
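A compact sketch of one extended-Kalman-filter cycle implementing equations (20)-(24); the callbacks f, h and their Jacobians F, H are assumed supplied by the inertial mechanization and the satellite observation model, and all names are illustrative:

```python
import numpy as np

def ekf_step(x_prev, P_prev, u, z, f, h, F, H, Q, R):
    """One EKF prediction/correction cycle.

    f, h: state-transition and observation functions; F, H: their Jacobians
    evaluated at the expansion points; Q, R: process / observation noise
    covariances. Returns the corrected state and covariance.
    """
    # prediction (equations 20-21)
    x_pred = f(x_prev, u)
    Fk = F(x_prev, u)
    P_pred = Fk @ P_prev @ Fk.T + Q
    # correction (equations 22-24)
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R                 # innovation covariance
    K = P_pred @ Hk.T @ np.linalg.inv(S)       # Kalman gain
    x_out = x_pred + K @ (z - h(x_pred))       # corrected state
    P_out = (np.eye(len(x_out)) - K @ Hk) @ P_pred
    return x_out, P_out
```

Because the filter is driven at the IMU rate, each satellite fix corrects the high-rate inertial prediction, which is how the method raises the pose output frequency of satellite navigation.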
10. The vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing as claimed in claim 1, characterized in that: in step 4, the vision-inertial navigation primary positioning result $P_{VI}$ and the inertial navigation-satellite navigation primary positioning result $P_{IG}$ are matched through the timestamp, and an objective function is constructed:

$$\min_{T,\; T_{k,k+1}^{c_0}} \sum_{k} \left( \left\| T_{k,k+1}^{G} - T\, T_{k,k+1}^{c_0}\, T^{-1} \right\|^2 + \left\| \hat T_{k,k+1}^{c_0} - T_{k,k+1}^{c_0} \right\|^2 \right) \qquad (25)$$

in the formula, $T_{k,k+1}^{G}$ is the transformation matrix between two frames of inertial navigation-satellite navigation primary positioning results in the longitude-latitude-altitude coordinate system, $T$ is the transformation matrix from the inertial navigation-satellite navigation primary positioning result to the vision-inertial navigation primary positioning result, $T_{k,k+1}^{c_0}$ is the transformation matrix of the pose between the two frames in the 0-th frame camera coordinate system, $\hat T_{k,k+1}^{c_0}$ is the transformation matrix between two frames of vision-inertial navigation primary positioning results in the 0-th frame camera coordinate system, and $T$ and $T_{k,k+1}^{c_0}$ are the pose variables to be optimized;

the optimization variables $T$ and $T_{k,k+1}^{c_0}$ are adjusted by the Gauss-Newton method so that the objective function of the matched two frames of data reaches its minimum value, and the final optimization variables $T$ and $T_{k,k+1}^{c_0}$ are output as the final pose result $P_F$.
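As a simplified, illustrative stand-in for the graph optimization of equation (25): the rigid transform between timestamp-matched positions of the two primary results can be estimated in closed form by the Kabsch/Umeyama method. The patent instead optimizes pose variables with Gauss-Newton; this sketch only shows the alignment idea, and all names are illustrative:

```python
import numpy as np

def align_primary_results(p_vio, p_ins_gnss):
    """Closed-form rigid alignment of the inertial-satellite trajectory onto
    the vision-inertial trajectory from timestamp-matched positions.

    Returns (R, t) such that p_vio ~ R @ p_ins_gnss + t.
    """
    a = np.asarray(p_ins_gnss, dtype=float)   # source positions (N x 3)
    b = np.asarray(p_vio, dtype=float)        # target positions (N x 3)
    ca, cb = a.mean(axis=0), b.mean(axis=0)   # centroids
    H = (a - ca).T @ (b - cb)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # reflection guard keeps R a proper rotation (det = +1)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # optimal rotation
    t = cb - R @ ca                           # optimal translation
    return R, t
```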
CN202310069650.0A 2023-02-07 2023-02-07 Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing Active CN115793001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310069650.0A CN115793001B (en) 2023-02-07 2023-02-07 Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing

Publications (2)

Publication Number Publication Date
CN115793001A true CN115793001A (en) 2023-03-14
CN115793001B CN115793001B (en) 2023-05-16

Family

ID=85430078

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310069650.0A Active CN115793001B (en) 2023-02-07 2023-02-07 Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing

Country Status (1)

Country Link
CN (1) CN115793001B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111121767A (en) * 2019-12-18 2020-05-08 南京理工大学 GPS-fused robot vision inertial navigation combined positioning method
CN112781582A (en) * 2020-12-26 2021-05-11 复三人工智能科技(上海)有限公司 Multi-sensor fusion high-precision pose estimation algorithm under satellite weak observation condition
CN113155124A (en) * 2021-04-27 2021-07-23 涵涡智航科技(玉溪)有限公司 Multi-source auxiliary navigation method and device
WO2021147391A1 (en) * 2020-01-21 2021-07-29 魔门塔(苏州)科技有限公司 Map generation method and device based on fusion of vio and satellite navigation system
CN113358134A (en) * 2021-04-20 2021-09-07 重庆知至科技有限公司 Visual inertial odometer system
US20220292711A1 (en) * 2021-03-10 2022-09-15 Beijing Tusen Zhitu Technology Co., Ltd. Pose estimation method and device, related equipment and storage medium
CN115479602A (en) * 2022-10-14 2022-12-16 北京航空航天大学 Visual inertial odometer method fusing event and distance
WO2022262878A1 (en) * 2021-06-16 2022-12-22 华南理工大学 Ltc-dnn-based visual inertial navigation combined navigation system and self-learning method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CHANGWU LIU et al.: "Variable Observability Constrained Visual-Inertial-GNSS EKF-Based Navigation" *
LI Ziyue et al.: "An image recognition algorithm based on multi-sensor-fusion-assisted AlexNet model" *
JIN Hongxin et al.: "Application of multi-sensor information fusion theory in UAV relative navigation" *

Also Published As

Publication number Publication date
CN115793001B (en) 2023-05-16

Similar Documents

Publication Publication Date Title
CN109991636B (en) Map construction method and system based on GPS, IMU and binocular vision
CN110030994B (en) Monocular-based robust visual inertia tight coupling positioning method
CN109376785B (en) Navigation method based on iterative extended Kalman filtering fusion inertia and monocular vision
CN107289933B (en) Dual Kalman filtering navigation device and method based on MEMS sensor and VLC positioning fusion
CN111795686B (en) Mobile robot positioning and mapping method
CN111156987B (en) Inertia/astronomy combined navigation method based on residual compensation multi-rate CKF
WO2019071916A1 (en) Antenna beam attitude control method and system
CN110702143B (en) Rapid initial alignment method for SINS strapdown inertial navigation system moving base based on lie group description
CN113203418B (en) GNSS/INS visual fusion positioning method and system based on sequential Kalman filtering
CN110702107A (en) Monocular vision inertial combination positioning navigation method
CN108709552A (en) An IMU and GPS tightly-coupled navigation method based on MEMS
CN109520476B (en) System and method for measuring dynamic pose of rear intersection based on inertial measurement unit
CN115143954B (en) Unmanned vehicle navigation method based on multi-source information fusion
CN108344413B (en) Underwater glider navigation system and low-precision and high-precision conversion method thereof
CN112562077B (en) Pedestrian indoor positioning method integrating PDR and priori map
CN112857398B (en) Rapid initial alignment method and device for ship under mooring state
CN113503872B (en) Low-speed unmanned aerial vehicle positioning method based on fusion of camera and consumption-level IMU
CN114199239A (en) Double-vision auxiliary inertial differential cockpit head posture measuring system combined with Beidou navigation
CN114690229A (en) GPS-fused mobile robot visual inertial navigation method
Faizullin et al. Best axes composition: Multiple gyroscopes imu sensor fusion to reduce systematic error
CN112284381B (en) Visual inertia real-time initialization alignment method and system
CN112284388B (en) Unmanned aerial vehicle multisource information fusion navigation method
CN113008229A (en) Distributed autonomous integrated navigation method based on low-cost vehicle-mounted sensor
CN117073720A (en) Method and equipment for quick visual inertia calibration and initialization under weak environment and weak action control
CN115793001A (en) Vision, inertial navigation and satellite navigation fusion positioning method based on inertial navigation multiplexing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant