CN103424114B - Full integration method for visual navigation/inertial navigation - Google Patents

Full integration method for visual navigation/inertial navigation

Info

Publication number
CN103424114B
CN103424114B (application CN201210161621.9A)
Authority
CN
China
Prior art keywords
attitude
inertial navigation
vision
error
camera
Prior art date
Legal status
Expired - Fee Related
Application number
CN201210161621.9A
Other languages
Chinese (zh)
Other versions
CN103424114A (en)
Inventor
刘春
周发根
吴杭彬
姚连壁
简志伟
Current Assignee
Tongji University
Original Assignee
Tongji University
Priority date
Filing date
Publication date
Application filed by Tongji University
Priority to CN201210161621.9A
Publication of CN103424114A
Application granted
Publication of CN103424114B


Abstract

The present invention relates to a full integration method for visual navigation/inertial navigation, comprising the following steps: 1) visual navigation solution: observation equations are formed on the basis of the collinearity equations, the carrier position and attitude parameters are estimated by least-squares adjustment, and the variance-covariance matrix of the parameters is computed; 2) inertial navigation solution: the navigation solution is computed in the local-level coordinate frame, giving the position, velocity and attitude parameters of the carrier at each epoch, while the variance-covariance matrix of the parameters is computed at the same time; 3) correction of the inertial navigation system by the vision system: with Kalman filtering as the tool, the navigation parameter errors and sensor errors of the inertial navigation system are estimated, compensated and fed back, yielding the optimal estimate of all parameters of the inertial navigation system; 4) correction of the vision system by the inertial navigation system: sequential adjustment is applied to correct all parameters of the vision system. Compared with the prior art, the present invention has the advantages of theoretical rigor, stable performance and high efficiency.

Description

Full integration method for visual navigation/inertial navigation
Technical field
The present invention relates to a way of combining a visual navigation system with an inertial navigation system, and in particular to a full integration method for visual navigation/inertial navigation.
Background art
Visual navigation, also called visual odometry (VO), works on the following principle: as the carrier advances, a stereo camera captures images of the surrounding scene; feature points are matched between the left and right images and tracked between successive frames; observation equations are formed on the basis of the collinearity equations; and the spatial position and attitude of the camera are estimated by least-squares adjustment. Visual navigation is a high-precision autonomous navigation technique, but as a dead-reckoning technique its error accumulates over time, and the stability of the algorithm has always been a difficulty: when the background in the field of view is featureless or too self-similar, or when the dynamics of the carrier vehicle are too large (for example, an excessive turning angle), the overlap between adjacent images becomes too small and the visual navigation solution fails. The stability of the algorithm therefore has to be reinforced by other means.
An inertial navigation system (INS) works on the principle of Newtonian mechanics: an inertial measurement unit (IMU) mounted on the carrier senses the angular velocity and acceleration of the carrier, and the carrier's navigation parameters (position, velocity and attitude) are obtained by integration. Inertial navigation is completely autonomous and immune to interference, and it is highly accurate over short periods; because of the integration principle, however, its error accumulates over time, so its accuracy over long periods is poor. To reduce the error accumulation, external observations must be introduced to correct the inertial navigation system.
Visual navigation and inertial navigation, as two autonomous navigation techniques, therefore complement each other well. Their combination can solve navigation problems in GPS-denied environments and is of great significance for applications such as indoor navigation, robot path planning, autonomous UAV landing, self-driving car navigation, underground traffic positioning, and mine operation safety.
At present, combinations of visual and inertial navigation generally fall into three classes: (1) the complementary type, in which the vision system typically provides the position of the carrier and the inertial navigation system provides its attitude; (2) the correction type, in which one navigation system is used to correct the other; and (3) the fusion type, which generally uses Kalman filtering to fuse the data of the two. Among these, the complementary type is a low-level combination, usually used where the accuracy requirements are modest; high-precision applications usually adopt the correction type or the fusion type.
In the correction-type combination, the inertial navigation system is generally used to correct the vision system, and the procedure typically runs as follows: (1) the camera captures the image of the current epoch; (2) feature points are extracted and matched between the current left and right images and matched against the image of the previous epoch, and the exterior orientation elements (position and attitude) of the camera are then computed by resection; (3) the pitch and roll of the carrier, together with the heading increment, are computed from the IMU output; (4) using the carrier pitch, roll and heading increment, the camera position and attitude are corrected by Kalman filtering; (5) using the corrected camera position and attitude, the object coordinates of the feature points are obtained by forward intersection; (6) the above steps are repeated.
The fusion-type combination typically proceeds as follows: (1) the camera captures the image of the current epoch; (2) feature points are extracted and matched between the current left and right images and matched against the image of the previous epoch, and the exterior orientation elements (position and attitude) of the camera are then computed by resection; (3) the camera position and attitude are taken as measurements, and the position and attitude of the IMU are estimated by Kalman filtering; (4) the filtered IMU position and attitude are passed to the camera, and the object coordinates of the feature points are then obtained by forward intersection; (5) the above steps are repeated.
As can be seen from the above, the correction-type combination uses the IMU output only to provide attitude, not position, so it does not exploit the navigation information provided by the inertial navigation system to the fullest extent. The fusion-type combination makes better use of the information of the vision system and the inertial navigation system, so that both can be corrected; however, it does not consider the prior information of the object coordinates of the image feature points, so the combination cannot reach its best effect. In visual navigation the transfer of feature points between images is in fact crucial: most feature points extracted in the current image already obtained their object coordinates by least-squares adjustment in the previous frame, and the precision of those coordinates is also known, so this precision information must be taken into account.
Summary of the invention
The object of the present invention is to overcome the defects of the above prior art by providing a theoretically rigorous, stable and efficient full integration method for visual navigation/inertial navigation.
The object of the present invention is achieved through the following technical solution:
A full integration method for visual navigation/inertial navigation, the method comprising the following steps:
1) Visual navigation solution: the camera captures images as the carrier moves; observation equations are formed on the basis of the collinearity equations, and the carrier position and attitude parameters $X_{vo}$, together with the variance-covariance matrix $P(X_{vo})$ of the parameters, are estimated by least-squares adjustment;
2) Inertial navigation solution: the navigation solution is computed in the local-level coordinate frame, giving the position, velocity and attitude parameters $X_{ins}$ of the carrier at each epoch; at the same time, the variance-covariance matrix $P(X_{ins})$ of the parameters is computed from the error equations of the inertial navigation system by covariance propagation;
3) Correction of the inertial navigation system by the vision system: with Kalman filtering as the tool, the error equations of the inertial navigation system are taken as the state equation; the carrier position and attitude obtained by the visual navigation solution are differenced with the carrier position and attitude computed by the inertial navigation system, and the differences serve as the measurements of the Kalman filter; the navigation parameter errors and sensor errors of the inertial navigation system are estimated, then compensated and fed back, yielding the optimal estimate of all parameters of the inertial navigation system, denoted $\hat{X}_{ins}$;
4) Correction of the vision system by the inertial navigation system: the position and attitude of the IMU are separated out from the parameters of the inertial navigation system and, taking the geometric relationship between the camera and the IMU into account, converted into camera position and attitude information; observation equations are set up and processed by sequential adjustment, yielding the optimal estimate of all parameters of the vision system, denoted $\hat{X}_{vo}$.
The visual navigation solution specifically comprises the following steps:
11) Initialization of the camera exterior orientation elements: control points are laid out around the starting position of the visual navigation, and the exterior orientation elements of the camera are obtained by resection;
12) As the carrier keeps moving, the camera captures the image of the current epoch and feature points are extracted and matched;
13) On the basis of the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images:
$$\begin{cases} v = B_1\hat{x}_1 + B_2\hat{x}_2 + A_1\hat{y}_1 + A_2\hat{y}_2 - l \\ v(y_1) = \hat{y}_1 - l(y_1) \\ C(\hat{x}_1,\hat{x}_2) - W_x = 0 \end{cases}$$
In the above formula, $\hat{x}_1$ is the correction to the exterior orientation elements of the left camera, $\hat{x}_2$ the correction to the exterior orientation elements of the right camera, $\hat{y}_1$ the correction to the object coordinates of the known points, $\hat{y}_2$ the correction to the object coordinates of the points to be determined, $y_1$ the object coordinates of the known points, $v$ the residuals of the observations, and $l$ the constant term; $C(\hat{x}_1,\hat{x}_2)$ expresses the constraint between the left and right cameras, and $W_x$ is the constant term of the constraint equation;
14) The above formula is solved by the least-squares principle, giving the exterior orientation elements of the left and right cameras and the object coordinates of all feature points.
The feature point matching comprises two aspects: first, matching feature points between the left and right images of the current epoch; second, matching feature points between the current left image and the left image of the previous epoch. The feature points comprise known points and points to be determined: a known point is a feature point that already appeared in the previous frame, while a point to be determined is a feature point newly added in the current image.
The inertial navigation solution specifically comprises:
(1) Attitude update: using the angular velocity observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted $R_b^L(k+1)$, where $k$ denotes the previous epoch, $k+1$ the current epoch, $L$ the navigation coordinate frame, and $b$ the carrier coordinate frame;
(2) Attitude extraction: from the values of the elements of the attitude matrix $R_b^L(k+1)$, the attitude of the carrier, comprising heading, pitch and roll, is computed;
(3) Specific force decomposition: using the attitude matrix, the specific force observation $f^b$ output by the accelerometers is transformed from the carrier frame to the navigation frame, as follows
$$f^L = R_b^L(k+1)\,f^b$$
(4) Compensation of harmful accelerations: $f^L$ is corrected for the Coriolis acceleration and compensated for normal gravity, giving the acceleration $a$ of the carrier at the Earth's surface, as follows
$$a = f^L - \left(2\Omega_{ie}^L + \Omega_{eL}^L\right)v^L + g^L$$
where $\Omega_{ie}^L$ is the skew-symmetric matrix formed from the three components of the Earth rotation rate, $\Omega_{eL}^L$ is the skew-symmetric matrix formed from the three components of the rotation rate of the $L$ frame relative to the Earth-fixed frame, $v^L$ is the velocity vector, and $g^L$ is the gravity vector;
(5) Velocity by integration: integrating the acceleration gives the velocity increment, and hence the velocity at the current epoch, as follows
$$v_{k+1}^L = v_k^L + a\,\Delta t$$
(6) Position by integration: integrating the velocity gives the position increment, and hence the position at the current epoch, as follows
$$r_{k+1}^L = r_k^L + \tfrac{1}{2}\left(v_k^L + v_{k+1}^L\right)\Delta t$$
The step of correcting the inertial navigation system by the vision system specifically comprises:
31) The error equations of the inertial navigation system are taken as the state equation of the Kalman filter, of the form
$$\dot{X} = F\,X + G\,\omega$$
where $X$ is the error state vector, comprising the position error $\delta r^L$, velocity error $\delta v^L$, platform misalignment angle $\delta\varepsilon^L$, gyro drift $d$ and accelerometer bias $b$,
$$X = (\delta r^L,\ \delta v^L,\ \delta\varepsilon^L,\ d,\ b)$$
32) The difference between the position computed by inertial navigation and the position from the visual navigation solution, and the difference between the attitude computed by inertial navigation and the attitude from the visual navigation solution, are taken as the measurements; the measurement equation is as follows
$$\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} X + \begin{pmatrix} V_r \\ V_a \end{pmatrix}$$
where $Z_r$ and $Z_a$ denote the position error measurement and the attitude error measurement, $H_r$ and $H_a$ the position error and attitude error measurement matrices, and $V_r$ and $V_a$ the residuals of the position error and attitude error measurements, respectively;
The position error measurement $Z_r$ is computed as
$$Z_r = r_{ins} - r_{vo}$$
where $r_{ins}$ is the coordinate computed by inertial navigation and $r_{vo}$ the coordinate obtained by the visual navigation solution; the position error measurement matrix $H_r$ has the form
$$H_r = \begin{pmatrix} I_{3\times 3} & 0_{3\times 12} \end{pmatrix}$$
where $I$ is the identity matrix and $0$ the zero matrix; the attitude error measurement $Z_a$ is computed as
$$Z_a = a_{ins} - a_{vo}$$
where $a_{ins}$ is the attitude computed by inertial navigation and $a_{vo}$ the attitude obtained by the visual navigation solution; taking the rotation matrix between the camera and the IMU into account, the camera attitude obtained by the visual navigation solution is reduced to the attitude of the carrier:
$$a_{vo} = (p',\ r',\ y')^T$$
where $p'$, $r'$ and $y'$ are the carrier pitch, roll and heading obtained by reduction of the camera attitude; the attitude error measurement matrix $H_a$ has the form
$$H_a = \begin{pmatrix} 0_{3\times 6} & A_{3\times 3} & 0_{3\times 6} \end{pmatrix}$$
where the matrix $A$ has the form
$$A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\dfrac{\sin y}{\cos p} & \dfrac{\cos y}{\cos p} & 0 \\ \dfrac{\sin y\,\sin p}{\cos p} & -\dfrac{\cos y\,\sin p}{\cos p} & 1 \end{pmatrix}$$
in which $p$, $r$ and $y$ are the carrier pitch, roll and heading computed by inertial navigation, respectively;
33) The navigation parameter errors and sensor errors of the inertial navigation system are estimated by filtering, then compensated and fed back.
The step of correcting the vision system by the inertial navigation system specifically comprises:
41) The position and attitude of the IMU are separated out from the navigation parameters and, according to the geometric relationship between the camera and the IMU, converted into observations of the camera position and attitude;
42) Sequential adjustment is applied to complete the correction of the vision system by the inertial navigation system.
Step 42) specifically comprises:
The functional model of the correction of the vision system by the inertial navigation system is written in the following form:
$$X_{vo} = \operatorname{diag}(I, I, I, I)\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{vo}}, \qquad \begin{pmatrix} r_{ins} \\ a_{ins} \end{pmatrix} = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{ins}}$$
where $l_{X_{vo}}$ and $l_{X_{ins}}$ are the constant terms of the two equations.
The above is rearranged into observation-equation form, as follows:
$$\begin{cases} v(x_{vo}) = \hat{x}_{vo} - l(x_{vo}) \\ v(r_{ins}, a_{ins}) = B\,\hat{x}_{vo} - l(r_{ins}, a_{ins}) \end{cases}$$
where the matrix $B$ has the form
$$B = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix}$$
The stochastic model of the observations, i.e. their variance-covariance matrices, is as follows:
$$D(X_{vo}) = P(X_{vo}), \qquad D(r_{ins}, a_{ins}) = P(r_{ins}, a_{ins})$$
where $P(r_{ins}, a_{ins})$ is the submatrix of $P(X_{ins})$ relating to position and attitude; from the above functional model and stochastic model, the correction formulas of the vision system can be derived, as follows
$$\begin{aligned} \hat{x}_{vo} &= x_{vo} + D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}\left(l(r_{ins},a_{ins}) - B\,x_{vo}\right) \\ D(\hat{x}_{vo}) &= D(X_{vo}) - D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}B\,D(X_{vo}) \end{aligned}$$
where $x_{vo}$ is the vision parameter vector obtained by the visual navigation solution, $\hat{x}_{vo}$ the corrected vision parameter vector, and $D(\hat{x}_{vo})$ the corrected variance-covariance matrix.
Compared with the prior art, the present invention has the following advantages:
(1) Correction of the inertial navigation system by the vision system. Kalman filtering estimates not only the navigation parameter errors of the inertial navigation system (position error, velocity error, attitude error) but also its sensor errors (gyro drift, accelerometer bias); the navigation parameter errors are compensated, and the sensor errors are fed back into the navigation solution of the next cycle, so that the inertial navigation system maintains high accuracy over long periods. When the visual navigation solution fails, the combined system relies entirely on inertial navigation to provide the carrier's navigation parameters; since the various errors have already been estimated and corrected, the accuracy of the whole system does not decline noticeably within a short period because of the failed visual navigation solution.
(2) Correction of the vision system by the inertial navigation system. Using the position and attitude computed by inertial navigation, sequential adjustment improves not only the exterior orientation elements of the camera but also the object coordinates of the image feature points. The object coordinates of the image feature points corrected by sequential adjustment, together with their precision information (the variance-covariance matrix of the object coordinates), are passed on to the navigation solution of the next frame and processed by least-squares collocation. This method takes the prior information of the object coordinates of the image feature points into account and enables the vision system to maintain high accuracy over long periods.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the steps of the present invention;
Fig. 2 is a schematic diagram of the correction of the inertial navigation system by the vision system;
Fig. 3 is a schematic diagram of the correction of the vision system by the inertial navigation system;
Fig. 4 shows the carrier trajectories computed by the different navigation methods;
Fig. 5 shows the position innovations in the Kalman filtering process;
Fig. 6 shows the attitude innovations in the Kalman filtering process.
Detailed description of the embodiments
The present invention is described in detail below with reference to the drawings and a specific embodiment.
Embodiment
As shown in Fig. 1, a full integration method for visual navigation/inertial navigation comprises the following steps:
1) Visual navigation solution
The steps are as follows:
(1) Initialization of the camera exterior orientation elements: a number of control points are generally laid out around the starting position of the visual navigation, and the exterior orientation elements of the camera are obtained by resection.
(2) As the carrier keeps moving, the stereo camera captures the image of the current epoch and feature points are extracted and matched. The feature point matching comprises two aspects: first, matching feature points between the left and right images of the current epoch; second, matching feature points between the current left image and the left image of the previous epoch. A minimal sketch of this two-way matching is given below.
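The patent does not prescribe a particular detector or matcher; purely as an illustration, the following Python sketch performs the stereo and temporal matching steps with ORB features and brute-force Hamming matching from OpenCV. The file names, detector choice and parameter values are assumptions, not part of the patent.

```python
import cv2

# placeholder file names; any synchronized stereo pair plus the previous left frame will do
left_img      = cv2.imread("left_t1.png",  cv2.IMREAD_GRAYSCALE)
right_img     = cv2.imread("right_t1.png", cv2.IMREAD_GRAYSCALE)
prev_left_img = cv2.imread("left_t0.png",  cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)                   # detector choice is an assumption
bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # cross-check rejects one-way matches

def match(img_a, img_b):
    """Detect ORB features in two images and return cross-checked matches."""
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matches = bf.match(des_a, des_b)
    return kp_a, kp_b, sorted(matches, key=lambda m: m.distance)  # strongest first

# first aspect: stereo matching between the current left and right images
kp_l, kp_r, stereo_matches = match(left_img, right_img)
# second aspect: temporal matching between the current and previous left images
kp_l1, kp_l0, track_matches = match(left_img, prev_left_img)
```

Feature points appearing in the temporal matches correspond to the known points carried over from the previous frame; the remainder are the points to be determined.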
(3) On the basis of the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images. The feature points here fall into two classes: known points and points to be determined. A known point is a feature point that appeared in the previous frame, so its object coordinates are known and their precision is also known; a point to be determined is a feature point newly added in the current image, whose object coordinates have no prior information whatsoever. The present invention treats the prior information of the known points by the least-squares collocation principle, so the observation equations can be written in the following form:
$$\begin{cases} v = B_1\hat{x}_1 + B_2\hat{x}_2 + A_1\hat{y}_1 + A_2\hat{y}_2 - l \\ v(y_1) = \hat{y}_1 - l(y_1) \\ C(\hat{x}_1,\hat{x}_2) - W_x = 0 \end{cases} \tag{1}$$
The above comprises three equations. In the first, $\hat{x}_1$ is the correction to the exterior orientation elements of the left camera, $\hat{x}_2$ the correction to the exterior orientation elements of the right camera, $\hat{y}_1$ the correction to the object coordinates of the known points, $\hat{y}_2$ the correction to the object coordinates of the points to be determined, $y_1$ the object coordinates of the known points, $v$ the residuals of the observations, and $l$ the constant term. The second equation is the pseudo-observation equation set up from the prior information of the known points. The third equation is the constraint equation between the left and right cameras, established from the geometric relationship between the cameras and the IMU calibrated by experiment beforehand.
(4) Formula (1) is solved by the least-squares principle. Formula (1) is in fact an indirect adjustment model with constraints, so adjustment yields the exterior orientation elements of the left and right cameras and the object coordinates of all feature points. A numerical sketch of this type of adjustment is given below.
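As a minimal numerical sketch of an indirect adjustment with a constraint equation, in the spirit of formula (1): the design matrix, weights and constraint below are toy values rather than a collinearity-equation linearization, and the prior information of the known points would enter as additional pseudo-observation rows, exactly as in the second equation of (1).

```python
import numpy as np

def constrained_lsq(B, l, P, C, w):
    """Indirect adjustment with constraints:
    minimize (B x - l)^T P (B x - l) subject to C x = w,
    solved via the bordered (KKT) normal-equation system."""
    N = B.T @ P @ B                      # normal matrix
    u = B.T @ P @ l
    k = C.shape[0]
    K = np.block([[N, C.T], [C, np.zeros((k, k))]])
    sol = np.linalg.solve(K, np.concatenate([u, w]))
    return sol[: B.shape[1]]             # parameter estimates

# toy example: 3 parameters, 5 observations, one linear constraint
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0, 0.5])      # satisfies the constraint below
B = rng.normal(size=(5, 3))
l = B @ x_true + 0.01 * rng.normal(size=5)
P = np.eye(5)                            # equal weights
C = np.array([[1.0, 1.0, 0.0]])          # constraint: x0 + x1 = -1
w = np.array([-1.0])
print(constrained_lsq(B, l, P, C, w))    # close to x_true
```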
2) Inertial navigation solution
The local-level coordinate frame (the $L$ frame) is chosen as the navigation frame, with its three axes oriented east, north and up; the three axes of the carrier frame (the $b$ frame) are oriented right, forward and up. The navigation solution proceeds as follows:
(1) Attitude update. Using the angular velocity observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted $R_b^L(k+1)$.
(2) Attitude extraction. From the values of the elements of the attitude matrix $R_b^L(k+1)$, the attitude of the carrier, comprising heading, pitch and roll, is computed.
(3) Specific force decomposition. Using the attitude matrix, the specific force observation $f^b$ output by the accelerometers is transformed from the carrier frame to the navigation frame, as follows
$$f^L = R_b^L(k+1)\,f^b \tag{2}$$
(4) Compensation of harmful accelerations. $f^L$ is corrected for the Coriolis acceleration and compensated for normal gravity, giving the acceleration of the carrier at the Earth's surface, as follows
$$a = f^L - \left(2\Omega_{ie}^L + \Omega_{eL}^L\right)v^L + g^L \tag{3}$$
(5) Velocity by integration. Integrating the acceleration gives the velocity increment, and hence the velocity at the current epoch, as follows
$$v_{k+1}^L = v_k^L + a\,\Delta t \tag{4}$$
(6) Position by integration. Integrating the velocity gives the position increment, and hence the position at the current epoch, as follows
$$r_{k+1}^L = r_k^L + \tfrac{1}{2}\left(v_k^L + v_{k+1}^L\right)\Delta t \tag{5}$$
A sketch of one epoch of this mechanization is given below.
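For concreteness, the following Python sketch runs one epoch of the mechanization in equations (2) to (5), assuming a first-order quaternion update from the gyro rates and taking the Earth-rate and transport-rate vectors as given inputs; all names and numerical values are illustrative only.

```python
import numpy as np

def skew(v):
    """Skew-symmetric matrix of a 3-vector."""
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def quat_mult(q, p):
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_dcm(q):
    """Attitude matrix R_b^L of a unit quaternion (body to navigation frame)."""
    w, x, y, z = q
    return np.array([[1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
                     [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
                     [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def ins_epoch(q, v, r, gyro, f_b, omega_ie_L, omega_eL_L, g_L, dt):
    # attitude update: first-order quaternion propagation from body rates
    dq = np.concatenate([[1.0], 0.5 * gyro * dt])
    q = quat_mult(q, dq)
    q /= np.linalg.norm(q)
    R_bL = quat_to_dcm(q)
    f_L = R_bL @ f_b                                        # eq. (2)
    # harmful-acceleration compensation, eq. (3)
    a = f_L - (2 * skew(omega_ie_L) + skew(omega_eL_L)) @ v + g_L
    v_new = v + a * dt                                      # eq. (4)
    r_new = r + 0.5 * (v + v_new) * dt                      # eq. (5)
    return q, v_new, r_new

# one epoch with illustrative values: static carrier, 100 Hz IMU
q1, v1, r1 = ins_epoch(np.array([1.0, 0, 0, 0]), np.zeros(3), np.zeros(3),
                       gyro=np.zeros(3), f_b=np.array([0, 0, 9.81]),
                       omega_ie_L=np.zeros(3), omega_eL_L=np.zeros(3),
                       g_L=np.array([0, 0, -9.81]), dt=0.01)
```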
3) Correction of the inertial navigation system by the vision system
The correction of the inertial navigation system by the vision system is realized by Kalman filtering, with the error equations of the inertial navigation system taken as the state equation; Fig. 2 shows the principle and process. As can be seen from the figure, the measurements of the system are the difference between the position computed by inertial navigation and the position from the visual navigation solution, and the difference between the attitude computed by inertial navigation and the attitude from the visual navigation solution.
A 15-dimensional error state vector is taken, comprising the position error, velocity error, platform misalignment angle, gyro drift and accelerometer bias, as follows
$$X = (\delta r^L,\ \delta v^L,\ \delta\varepsilon^L,\ d,\ b) \tag{6}$$
The state equation is the error equation of the inertial navigation system, of the form
$$\dot{X} = F\,X + G\,\omega \tag{7}$$
The external measurement information of the system comes from the camera position and attitude obtained by visual navigation. The measurement equation can therefore be written as
$$\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} X + \begin{pmatrix} V_r \\ V_a \end{pmatrix} \tag{8}$$
where $Z_r$ and $Z_a$ denote the position error measurement and the attitude error measurement, respectively. The position error measurement $Z_r$ is computed as
$$Z_r = r_{ins} - r_{vo} \tag{9}$$
where $r_{ins}$ is the coordinate computed by the INS navigation solution and $r_{vo}$ the coordinate obtained by the visual navigation solution. Since the coordinate determined by visual navigation is that of the camera projection center, while the INS navigation solution gives the coordinate of the IMU geometric center, $r_{vo}$ must be transformed by deducting the three-dimensional offset between the camera and the IMU, thereby reducing the coordinate of the camera projection center to the IMU geometric center.
The position error measurement matrix $H_r$ has the form
$$H_r = \begin{pmatrix} I_{3\times 3} & 0_{3\times 12} \end{pmatrix} \tag{10}$$
where $I$ is the identity matrix and $0$ the zero matrix. The attitude error measurement $Z_a$ is computed as
$$Z_a = a_{ins} - a_{vo} \tag{11}$$
where $a_{ins}$ is the attitude computed by the INS navigation solution and $a_{vo}$ the attitude obtained by the visual navigation solution. Note that the attitude computed by the INS is the attitude of the carrier, whereas the visual navigation solution yields the attitude of the camera, namely the three rotation Euler angles from the image space coordinate system to the world coordinate system; the rotation matrix between the camera and the IMU must therefore be taken into account to reduce the camera attitude to the carrier attitude:
$$a_{vo} = (p',\ r',\ y')^T \tag{12}$$
where $p'$, $r'$ and $y'$ are the carrier pitch, roll and heading obtained by reduction of the camera attitude. The attitude error measurement matrix $H_a$ has the form
$$H_a = \begin{pmatrix} 0_{3\times 6} & A_{3\times 3} & 0_{3\times 6} \end{pmatrix} \tag{13}$$
The matrix $A$ has the form
$$A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\dfrac{\sin y}{\cos p} & \dfrac{\cos y}{\cos p} & 0 \\ \dfrac{\sin y\,\sin p}{\cos p} & -\dfrac{\cos y\,\sin p}{\cos p} & 1 \end{pmatrix} \tag{14}$$
where $p$, $r$ and $y$ are the carrier pitch, roll and heading computed by inertial navigation, respectively. The functional model of the Kalman filter is thus established; filtering estimates the navigation parameter errors and sensor errors of the inertial navigation system, which are then compensated and fed back, so that the inertial navigation system maintains high accuracy over long periods. A sketch of the corresponding measurement update is given below.
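The following Python sketch shows the measurement update implied by equations (8) to (14), assuming the 15-state ordering of equation (6), with the predicted state, the process model of equation (7) and the measurement noise covariance R supplied from outside; the function and variable names are illustrative, not the patent's.

```python
import numpy as np

def build_H(p, y):
    """Measurement matrix of eq. (8): position rows over attitude rows."""
    H = np.zeros((6, 15))
    H[0:3, 0:3] = np.eye(3)                           # H_r = (I 0), eq. (10)
    A = np.array([                                    # eq. (14)
        [np.cos(y),             np.sin(y),            0],
        [-np.sin(y)/np.cos(p),  np.cos(y)/np.cos(p),  0],
        [np.sin(y)*np.tan(p),  -np.cos(y)*np.tan(p),  1]])
    H[3:6, 6:9] = A                                   # H_a = (0 A 0), eq. (13)
    return H

def vision_update(x, P, r_ins, a_ins, r_vo, a_vo, R):
    """Kalman measurement update with z = (r_ins - r_vo, a_ins - a_vo)."""
    p, y = a_ins[0], a_ins[2]                         # pitch and heading from the INS
    H = build_H(p, y)
    z = np.concatenate([r_ins - r_vo, a_ins - a_vo])  # eqs. (9) and (11)
    S = H @ P @ H.T + R                               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)                    # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(15) - K @ H) @ P
    return x, P
```

The estimated position, velocity and attitude errors are then used to compensate the navigation parameters, while the estimated gyro drift and accelerometer bias are fed back into the next mechanization cycle.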
4) Correction of the vision system by the inertial navigation system
The correction of the vision system by the inertial navigation system is realized by sequential adjustment; Fig. 3 shows the principle and process. As can be seen from the figure, the inertial navigation solution yields the navigation parameters at each epoch, while the variance-covariance matrix of the parameters is computed from the error equations of the inertial navigation system by covariance propagation. The position and attitude are then separated out from the navigation parameters and, according to the geometric relationship between the camera and the IMU, converted into observations of the camera position and attitude; sequential adjustment then realizes the optimization of the navigation parameters of the vision system.
Sequential adjustment is generally used when the data arrive in batches, that is, the system is observed at different epochs; the earlier observations are used to adjust the system, giving parameter estimates together with the precision information of the parameters (their variance-covariance matrix). The problem sequential adjustment solves is how to use the previously obtained parameter estimates and precision information together with the newly added observations to obtain better parameter estimates. Let the overall observation equation be
$$v = B\hat{x} - l \tag{15}$$
If the observation data are divided into two groups, the observation equations can be written in the following form
$$\begin{cases} v_1 = B_1\hat{x} - l_1 \\ v_2 = B_2\hat{x} - l_2 \end{cases} \tag{16}$$
Assuming that the number of observations in the first equation is sufficient, solving the first equation by the least-squares principle gives
$$\hat{x}_1 = (B_1^T P_1 B_1)^{-1} B_1^T P_1 l_1, \qquad Q_{\hat{x}_1} = (B_1^T P_1 B_1)^{-1} \tag{17}$$
where $P_1$ is the weight matrix of the observations in the first equation. Solving the two equations simultaneously gives
$$\hat{x} = (B_1^T P_1 B_1 + B_2^T P_2 B_2)^{-1}(B_1^T P_1 l_1 + B_2^T P_2 l_2), \qquad Q_{\hat{x}} = (B_1^T P_1 B_1 + B_2^T P_2 B_2)^{-1} \tag{18}$$
where $P_2$ is the weight matrix of the observations in the second equation. By the matrix inversion formula, the computation formulas of sequential adjustment can be derived, as follows
$$\begin{aligned} \hat{x} &= \hat{x}_1 + Q_{\hat{x}_1} B_2^T \left(P_2^{-1} + B_2 Q_{\hat{x}_1} B_2^T\right)^{-1}\left(l_2 - B_2 \hat{x}_1\right) \\ Q_{\hat{x}} &= Q_{\hat{x}_1} - Q_{\hat{x}_1} B_2^T \left(P_2^{-1} + B_2 Q_{\hat{x}_1} B_2^T\right)^{-1} B_2 Q_{\hat{x}_1} \end{aligned} \tag{19}$$
A numerical sketch of this update, checked against the batch solution, is given below.
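A minimal numerical sketch of equation (19), verified against the simultaneous solution of equation (18); the test matrices are arbitrary.

```python
import numpy as np

def sequential_update(x1, Q1, B2, l2, P2):
    """Sequential adjustment step of eq. (19): fold the new observation
    group (B2, l2, P2) into the previous estimate (x1, Q1)."""
    S = np.linalg.inv(P2) + B2 @ Q1 @ B2.T
    G = Q1 @ B2.T @ np.linalg.inv(S)
    return x1 + G @ (l2 - B2 @ x1), Q1 - G @ B2 @ Q1

rng = np.random.default_rng(1)
B1, l1 = rng.normal(size=(6, 3)), rng.normal(size=6)
B2, l2 = rng.normal(size=(2, 3)), rng.normal(size=2)
P1, P2 = np.eye(6), np.eye(2)

# first group alone, eq. (17)
Q1 = np.linalg.inv(B1.T @ P1 @ B1)
x1 = Q1 @ B1.T @ P1 @ l1
# the sequential result reproduces the simultaneous solution of eq. (18)
x_seq, Q_seq = sequential_update(x1, Q1, B2, l2, P2)
x_batch = np.linalg.solve(B1.T @ P1 @ B1 + B2.T @ P2 @ B2,
                          B1.T @ P1 @ l1 + B2.T @ P2 @ l2)
assert np.allclose(x_seq, x_batch)
```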
We now describe how sequential adjustment realizes the correction of the vision system by the inertial navigation system. After the visual navigation solution is complete, the estimate $X_{vo}$ of the camera parameters is obtained, and precision assessment gives the variance-covariance matrix $P(X_{vo})$ of the parameters. The inertial navigation system obtains, through its navigation solution, the navigation parameters $X_{ins}$ of the corresponding epoch, while the variance-covariance matrix of the parameters, denoted $P(X_{ins})$, is computed from the error equations of the inertial navigation system by covariance propagation. The position $r_{ins}$ and attitude $a_{ins}$ are separated out from the navigation parameters $X_{ins}$, the submatrix $P(r_{ins}, a_{ins})$ relating to position and attitude is separated out from the variance-covariance matrix $P(X_{ins})$, and, taking the geometric relationship between the camera and the IMU into account, these are converted into observations of the vision parameters. The functional model of the correction of the vision system by the inertial navigation system can then be written in the following form:
$$X_{vo} = \operatorname{diag}(I, I, I, I)\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{vo}}, \qquad \begin{pmatrix} r_{ins} \\ a_{ins} \end{pmatrix} = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{ins}} \tag{20}$$
The above is rearranged into observation-equation form, as follows:
$$\begin{cases} v(x_{vo}) = \hat{x}_{vo} - l(x_{vo}) \\ v(r_{ins}, a_{ins}) = B\,\hat{x}_{vo} - l(r_{ins}, a_{ins}) \end{cases} \tag{21}$$
where the matrix $B$ has the form
$$B = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix} \tag{22}$$
The stochastic model of the observations (their variance-covariance matrices) is as follows:
$$D(X_{vo}) = P(X_{vo}), \qquad D(r_{ins}, a_{ins}) = P(r_{ins}, a_{ins}) \tag{23}$$
From the above functional model and stochastic model, the correction formulas of the vision system can be derived, as follows
$$\begin{aligned} \hat{x}_{vo} &= x_{vo} + D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}\left(l(r_{ins},a_{ins}) - B\,x_{vo}\right) \\ D(\hat{x}_{vo}) &= D(X_{vo}) - D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}B\,D(X_{vo}) \end{aligned} \tag{24}$$
where $x_{vo}$ is the vision parameter vector obtained by the visual navigation solution, $\hat{x}_{vo}$ the corrected vision parameter vector, and $D(\hat{x}_{vo})$ the corrected variance-covariance matrix. Computing with the above formula corrects all parameters of the vision system, including the positions and attitudes of the left and right cameras and the object coordinates of all feature points; a short continuation of the earlier sketch illustrates this.
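Equation (24) is exactly the update of equation (19) applied with $B_2 = B = (I\ 0\ 0\ 0)$ and $P_2^{-1} = D(r_{ins}, a_{ins})$. Continuing the previous sketch (reusing `sequential_update`, `np` and `rng` from it), with placeholder dimensions, e.g. a 6-parameter left-camera pose block inside a 12-parameter vision vector:

```python
n_vo, n_pose = 12, 6                     # placeholder dimensions only
B = np.hstack([np.eye(n_pose), np.zeros((n_pose, n_vo - n_pose))])  # eq. (22)
x_vo  = rng.normal(size=n_vo)            # vision solution X_vo
D_vo  = np.eye(n_vo)                     # D(X_vo) = P(X_vo), eq. (23)
l_ins = rng.normal(size=n_pose)          # camera pose derived from r_ins, a_ins
D_ins = 0.1 * np.eye(n_pose)             # D(r_ins, a_ins)
# eq. (24) as a sequential update with P2 = D(r_ins, a_ins)^-1
x_corr, D_corr = sequential_update(x_vo, D_vo, B, l_ins, np.linalg.inv(D_ins))
```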
Purely algorithmically, steps 1 and 2 in fact run as a pair, and so do steps 3 and 4: the vision system and the inertial navigation system first each compute their own navigation solution and obtain the corresponding parameters; the two subsystems then assist each other, each subsystem is corrected, and the optimal estimates are obtained. The IMU optimal estimate $\hat{X}_{ins}$ obtained in step 3 and the camera optimal estimate $\hat{X}_{vo}$ obtained in step 4 are both optimal estimates of the whole system, or rather of the carrier's motion state. In theory, when the computational errors are ignored, the two are equivalent; in other words, once the geometric relationship between the camera and the IMU is taken into account, $\hat{X}_{ins}$ and $\hat{X}_{vo}$ should be consistent.
To verify the reliability and stability of the method of the present invention, many experiments and data processing runs were carried out; one experiment and its results are described below. The experiment site is in Beijing, and the carrier vehicle was equipped with a stereo camera, an IMU and other data acquisition equipment. A GPS receiver was additionally installed; differenced against a GPS base station, it reaches centimeter-level precision and therefore serves as the reference for checking the precision of the present method. The calibration of the interior orientation elements and relative exterior orientation of the stereo camera, and of the relative position and attitude between the camera and the IMU, was completed in advance. In the experiment, landmark points with known absolute coordinates were first used to initialize the camera exterior orientation elements. After initialization, the carrier vehicle was driven through the test area while collecting IMU data, image data and GPS data. The output rate of the IMU is 100 Hz, with a gyro drift of 0.005°/s and an accelerometer bias of 10 mg; the stereo camera, time-tagged by the onboard clock of the IMU, acquired stereo images by synchronized exposure at 2 Hz, with horizontal and vertical fields of view of 42° and 31°, respectively. The output rate of the GPS is 1 Hz. In the experiment the mobile platform traveled about 1150 m in total, the experiment lasted about 45 min, and about 4300 frames of stereo images were acquired.
The experimental data were processed with two navigation methods. Method one is the pure visual navigation solution; when the visual navigation solution fails, a constant-velocity motion model is used to predict the carrier motion and fill the gap in the visual navigation. Method two is the combination of visual navigation and inertial navigation. A pure inertial navigation solution was in fact also attempted, but over the 45 min of experimental data its navigation error reaches a magnitude of more than 10000 m; pure inertial navigation is therefore meaningless and is not discussed here. The results of the two methods, together with the GPS reference trajectory, are shown in Fig. 4.
In the legend of Fig. 4, GPS denotes the differential GPS result, VO/INS the combined visual/inertial navigation result, and VO the pure visual navigation result. As can be seen from the figure, the trajectory of the combined visual/inertial navigation is very close to the GPS trajectory, and its precision is higher than that of pure visual navigation. Table 1 lists the errors of the two navigation methods: over the 1150 m traveled, the accumulated position error of the combined visual/inertial navigation is 8.6 m, a relative accuracy of 0.75%.
Table 1. Errors of the different navigation methods

    Data processing method         Position error (m)    Relative error
    Visual navigation                   18.67                 1.6%
    Visual/inertial combination          8.63                 0.75%
During the experiment the carrier vehicle sometimes crossed bare road surfaces and sometimes turned through large angles, causing six brief failures of the visual navigation solution, during which the combined system relied entirely on inertial navigation to provide the navigation parameters. As can be seen from the VO/INS combined trajectory in Fig. 4, the system shows no obvious loss of precision when the visual navigation solution fails, which demonstrates the reliability and high precision of the present method.
Figs. 5 and 6 show the innovations of the Kalman filter during the correction of the inertial navigation system by the vision system. In the Kalman filter adopted by the present method, the innovation is the difference between the navigation parameters computed by inertial navigation and those obtained by the visual navigation solution.
As can be seen from Fig. 5, the maximum position difference is within 0.15 m, and in most cases within 0.05 m; as can be seen from Fig. 6, the maximum attitude difference is within 0.1°, and in most cases within 0.01°. This shows that the filtering is quite stable and that the visual and inertial navigation systems combine well; together with the earlier accuracy evaluation against the GPS absolute reference, this fully demonstrates the reliability and superiority of the combination algorithm proposed by the present invention.

Claims (6)

1. A full integration method for visual navigation/inertial navigation, characterized in that the method comprises the following steps:
1) Visual navigation solution: the camera captures images as the carrier moves; observation equations are formed on the basis of the collinearity equations, and the carrier position and attitude parameters $X_{vo}$, together with the variance-covariance matrix $P(X_{vo})$ of the parameters, are estimated by least-squares adjustment;
The visual navigation solution specifically comprises the following steps:
11) Initialization of the camera exterior orientation elements: control points are laid out around the starting position of the visual navigation, and the exterior orientation elements of the camera are obtained by resection;
12) As the carrier keeps moving, the camera captures the image of the current epoch and feature points are extracted and matched;
13) On the basis of the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images:
$$\begin{cases} v = B_1\hat{x}_1 + B_2\hat{x}_2 + A_1\hat{y}_1 + A_2\hat{y}_2 - l \\ v(y_1) = \hat{y}_1 - l(y_1) \\ C(\hat{x}_1,\hat{x}_2) - W_x = 0 \end{cases}$$
In the above formula, $\hat{x}_1$ is the correction to the exterior orientation elements of the left camera, $\hat{x}_2$ the correction to the exterior orientation elements of the right camera, $\hat{y}_1$ the correction to the object coordinates of the known points, $\hat{y}_2$ the correction to the object coordinates of the points to be determined, $y_1$ the object coordinates of the known points, $v$ the residuals of the observations, and $l$ the constant term; $C(\hat{x}_1,\hat{x}_2)$ expresses the constraint between the left and right cameras, $W_x$ is the constant term of the constraint equation, and $B_1$, $B_2$, $A_1$ and $A_2$ are the corresponding coefficient matrices;
14) The above formula is solved by the least-squares principle, giving the exterior orientation elements of the left and right cameras and the object coordinates of all feature points;
2) Inertial navigation solution: the navigation solution is computed in the local-level coordinate frame, giving the position, velocity and attitude parameters $X_{ins}$ of the carrier at each epoch; at the same time, the variance-covariance matrix $P(X_{ins})$ of the parameters is computed from the error equations of the inertial navigation system by covariance propagation;
3) Correction of the inertial navigation system by the vision system: with Kalman filtering as the tool, the error equations of the inertial navigation system are taken as the state equation; the carrier position and attitude obtained by the visual navigation solution are differenced with the carrier position and attitude computed by the inertial navigation system, and the differences serve as the measurements of the Kalman filter; the navigation parameter errors and sensor errors of the inertial navigation system are estimated, then compensated and fed back, yielding the optimal estimate of all parameters of the inertial navigation system, denoted $\hat{X}_{ins}$;
4) Correction of the vision system by the inertial navigation system: the position and attitude of the IMU are separated out from the parameters of the inertial navigation system and, taking the geometric relationship between the camera and the IMU into account, converted into camera position and attitude information; observation equations are set up and processed by sequential adjustment, yielding the optimal estimate of all parameters of the vision system, denoted $\hat{X}_{vo}$.
2. The full integration method for visual navigation/inertial navigation according to claim 1, characterized in that, in step 12), the feature point matching comprises two aspects: first, matching feature points between the left and right images of the current epoch; second, matching feature points between the current left image and the left image of the previous epoch; the feature points comprise known points and points to be determined: a known point is a feature point that already appeared in the previous frame, while a point to be determined is a feature point newly added in the current image.
3. The full integration method for visual navigation/inertial navigation according to claim 1, characterized in that the inertial navigation solution specifically comprises:
(1) Attitude update: using the angular velocity observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted $R_b^L(k+1)$, where $k$ denotes the previous epoch, $k+1$ the current epoch, $L$ the navigation coordinate frame, and $b$ the carrier coordinate frame;
(2) Attitude extraction: from the values of the elements of the attitude matrix $R_b^L(k+1)$, the attitude of the carrier, comprising heading, pitch and roll, is computed;
(3) Specific force decomposition: using the attitude matrix, the specific force observation $f^b$ output by the accelerometers is transformed from the carrier frame to the navigation frame, as follows
$$f^L = R_b^L(k+1)\,f^b$$
(4) Compensation of harmful accelerations: $f^L$ is corrected for the Coriolis acceleration and compensated for normal gravity, giving the acceleration $a$ of the carrier at the Earth's surface, as follows
$$a = f^L - \left(2\Omega_{ie}^L + \Omega_{eL}^L\right)v^L + g^L$$
where $\Omega_{ie}^L$ is the skew-symmetric matrix formed from the three components of the Earth rotation rate, $\Omega_{eL}^L$ is the skew-symmetric matrix formed from the three components of the rotation rate of the $L$ frame relative to the Earth-fixed frame, $v^L$ is the velocity vector, and $g^L$ is the gravity vector;
(5) Velocity by integration: integrating the acceleration gives the velocity increment, and hence the velocity at the current epoch, as follows
$$v_{k+1}^L = v_k^L + a\,\Delta t$$
(6) Position by integration: integrating the velocity gives the position increment, and hence the position at the current epoch, as follows
$$r_{k+1}^L = r_k^L + \tfrac{1}{2}\left(v_k^L + v_{k+1}^L\right)\Delta t.$$
4. The full integration method for visual navigation/inertial navigation according to claim 1, characterized in that the step of correcting the inertial navigation system by the vision system specifically comprises:
31) The error equations of the inertial navigation system are taken as the state equation of the Kalman filter, of the form
$$\dot{X} = F\,X + G\,\omega$$
where $X$ is the error state vector, comprising the position error $\delta r^L$, velocity error $\delta v^L$, platform misalignment angle $\delta\varepsilon^L$, gyro drift $d$ and accelerometer bias $b$,
$$X = (\delta r^L,\ \delta v^L,\ \delta\varepsilon^L,\ d,\ b)$$
32) The difference between the position computed by inertial navigation and the position from the visual navigation solution, and the difference between the attitude computed by inertial navigation and the attitude from the visual navigation solution, are taken as the measurements; the measurement equation is as follows
$$\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} X + \begin{pmatrix} V_r \\ V_a \end{pmatrix}$$
where $Z_r$ and $Z_a$ denote the position error measurement and the attitude error measurement, $H_r$ and $H_a$ the position error and attitude error measurement matrices, and $V_r$ and $V_a$ the residuals of the position error and attitude error measurements, respectively;
The position error measurement $Z_r$ is computed as
$$Z_r = r_{ins} - r_{vo}$$
where $r_{ins}$ is the coordinate computed by inertial navigation and $r_{vo}$ the coordinate obtained by the visual navigation solution; the position error measurement matrix $H_r$ has the form
$$H_r = \begin{pmatrix} I_{3\times 3} & 0_{3\times 12} \end{pmatrix}$$
where $I$ is the identity matrix and $0$ the zero matrix; the attitude error measurement $Z_a$ is computed as
$$Z_a = a_{ins} - a_{vo}$$
where $a_{ins}$ is the attitude computed by inertial navigation and $a_{vo}$ the attitude obtained by the visual navigation solution; taking the rotation matrix between the camera and the IMU into account, the camera attitude obtained by the visual navigation solution is reduced to the attitude of the carrier:
$$a_{vo} = (p',\ r',\ y')^T$$
where $p'$, $r'$ and $y'$ are the carrier pitch, roll and heading obtained by reduction of the camera attitude; the attitude error measurement matrix $H_a$ has the form
$$H_a = \begin{pmatrix} 0_{3\times 6} & A_{3\times 3} & 0_{3\times 6} \end{pmatrix}$$
The matrix $A$ has the form
$$A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\dfrac{\sin y}{\cos p} & \dfrac{\cos y}{\cos p} & 0 \\ \dfrac{\sin y\,\sin p}{\cos p} & -\dfrac{\cos y\,\sin p}{\cos p} & 1 \end{pmatrix}$$
where $p$, $r$ and $y$ are the carrier pitch, roll and heading computed by inertial navigation, respectively;
33) The navigation parameter errors and sensor errors of the inertial navigation system are estimated by filtering, then compensated and fed back.
5. The full integration method for visual navigation/inertial navigation according to claim 1, characterized in that the step of correcting the vision system by the inertial navigation system specifically comprises:
41) The position and attitude of the IMU are separated out from the navigation parameters and, according to the geometric relationship between the camera and the IMU, converted into observations of the camera position and attitude;
42) Sequential adjustment is applied to complete the correction of the vision system by the inertial navigation system.
6. The full integration method for visual navigation/inertial navigation according to claim 5, characterized in that step 42) specifically comprises:
The functional model of the correction of the vision system by the inertial navigation system is written in the following form:
$$X_{vo} = \operatorname{diag}(I, I, I, I)\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{vo}}, \qquad \begin{pmatrix} r_{ins} \\ a_{ins} \end{pmatrix} = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix}\begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{ins}}$$
where $l_{X_{vo}}$ and $l_{X_{ins}}$ are the constant terms of the two equations;
The above is rearranged into observation-equation form, as follows:
$$\begin{cases} v(x_{vo}) = \hat{x}_{vo} - l(x_{vo}) \\ v(r_{ins}, a_{ins}) = B\,\hat{x}_{vo} - l(r_{ins}, a_{ins}) \end{cases}$$
where the matrix $B$ has the form
$$B = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix}$$
The stochastic model of the observations, i.e. their variance-covariance matrices, is as follows:
$$D(X_{vo}) = P(X_{vo}), \qquad D(r_{ins}, a_{ins}) = P(r_{ins}, a_{ins})$$
where $P(r_{ins}, a_{ins})$ is the submatrix of $P(X_{ins})$ relating to position and attitude; from the above functional model and stochastic model, the correction formulas of the vision system can be derived, as follows
$$\begin{aligned} \hat{x}_{vo} &= x_{vo} + D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}\left(l(r_{ins},a_{ins}) - B\,x_{vo}\right) \\ D(\hat{x}_{vo}) &= D(X_{vo}) - D(X_{vo})B^T\left(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\right)^{-1}B\,D(X_{vo}) \end{aligned}$$
where $x_{vo}$ is the vision parameter vector obtained by the visual navigation solution, $\hat{x}_{vo}$ the corrected vision parameter vector, and $D(\hat{x}_{vo})$ the corrected variance-covariance matrix.
CN201210161621.9A 2012-05-22 2012-05-22 Full integration method for visual navigation/inertial navigation Expired - Fee Related CN103424114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210161621.9A CN103424114B (en) 2012-05-22 2012-05-22 Full integration method for visual navigation/inertial navigation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201210161621.9A CN103424114B (en) 2012-05-22 2012-05-22 Full integration method for visual navigation/inertial navigation

Publications (2)

Publication Number Publication Date
CN103424114A CN103424114A (en) 2013-12-04
CN103424114B true CN103424114B (en) 2016-01-20

Family

ID=49649216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210161621.9A Expired - Fee Related CN103424114B (en) 2012-05-22 2012-05-22 Full integration method for visual navigation/inertial navigation

Country Status (1)

Country Link
CN (1) CN103424114B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885571A (en) * 2017-03-07 2017-06-23 辽宁工程技术大学 A kind of lunar surface rover method for rapidly positioning of combination IMU and navigation image

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102016551B1 (en) * 2014-01-24 2019-09-02 한화디펜스 주식회사 Apparatus and method for estimating position
CN103994765B (en) * 2014-02-27 2017-01-11 北京工业大学 Positioning method of inertial sensor
CN103914065B (en) * 2014-03-24 2016-09-07 深圳市大疆创新科技有限公司 The method and apparatus that flight state is revised in real time
WO2015143615A1 (en) 2014-03-24 2015-10-01 深圳市大疆创新科技有限公司 Method and apparatus for correcting aircraft state in real time
CN103925926B (en) * 2014-04-25 2016-08-24 哈尔滨工程大学 A kind of quaternary number measuring method based on CAMERA/MIMU indoor integrated navigation system
CN105675013B (en) * 2014-11-21 2019-03-01 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN104808685A (en) * 2015-04-27 2015-07-29 中国科学院长春光学精密机械与物理研究所 Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
WO2016187757A1 (en) * 2015-05-23 2016-12-01 SZ DJI Technology Co., Ltd. Sensor fusion using inertial and image sensors
CN113093808A (en) * 2015-05-23 2021-07-09 深圳市大疆创新科技有限公司 Sensor fusion using inertial and image sensors
CN105043392B (en) * 2015-08-17 2018-03-02 中国人民解放军63920部队 A kind of aircraft pose determines method and device
CN105208348B (en) * 2015-10-08 2018-08-21 成都时代星光科技有限公司 The aerial unmanned plane of railway line is patrolled and real time image collection Transmission system automatically
CN106708066B (en) * 2015-12-20 2019-07-26 中国电子科技集团公司第二十研究所 View-based access control model/inertial navigation unmanned plane independent landing method
CN105588563B (en) * 2016-01-15 2018-06-12 武汉光庭科技有限公司 Binocular camera and inertial navigation combined calibrating method in a kind of intelligent driving
US10132933B2 (en) * 2016-02-02 2018-11-20 Qualcomm Incorporated Alignment of visual inertial odometry and satellite positioning system reference frames
CN105698765B (en) * 2016-02-22 2018-09-18 天津大学 Object pose method under double IMU monocular visions measurement in a closed series noninertial systems
CN105953795B (en) * 2016-04-28 2019-02-26 南京航空航天大学 A kind of navigation device and method for the tour of spacecraft surface
CN106052683A (en) * 2016-05-25 2016-10-26 速感科技(北京)有限公司 Robot motion attitude estimating method
CN105973264A (en) * 2016-07-21 2016-09-28 触景无限科技(北京)有限公司 Intelligent blind guiding system
CN106843470B (en) * 2016-12-28 2020-04-03 歌尔科技有限公司 Visual angle control method and device and VR system
CN108253940B (en) * 2016-12-29 2020-09-22 东莞前沿技术研究院 Positioning method and device
US10371530B2 (en) * 2017-01-04 2019-08-06 Qualcomm Incorporated Systems and methods for using a global positioning system velocity in visual-inertial odometry
CN106705965A (en) * 2017-01-12 2017-05-24 苏州中德睿博智能科技有限公司 Scene three-dimensional data registration method and navigation system error correction method
CN107063246A (en) * 2017-04-24 2017-08-18 齐鲁工业大学 A kind of Loosely coupled air navigation aid of vision guided navigation/inertial navigation
CN107389088B (en) * 2017-05-27 2020-11-17 纵目科技(上海)股份有限公司 Error correction method, device, medium and equipment for vehicle-mounted inertial navigation
CN108253942B (en) * 2017-06-08 2020-05-01 中国科学院遥感与数字地球研究所 Method for improving oblique photography measurement space-three quality
CN107197154B (en) * 2017-06-22 2020-04-21 北京奇艺世纪科技有限公司 Panoramic video stabilization method and device
CN107621266B (en) * 2017-08-14 2020-12-15 上海宇航系统工程研究所 Space non-cooperative target relative navigation method based on feature point tracking
CN109073407B (en) * 2017-10-26 2022-07-05 深圳市大疆创新科技有限公司 Drift calibration method and device of inertial measurement unit and unmanned aerial vehicle
CN110361003B (en) * 2018-04-09 2023-06-30 中南大学 Information fusion method, apparatus, computer device and computer readable storage medium
CN108801248B (en) * 2018-06-22 2020-09-15 深圳市北斗产业互联网研究院 Planar vision inertial navigation method based on UKF
CN108937742A (en) * 2018-09-06 2018-12-07 苏州领贝智能科技有限公司 A kind of the gyroscope angle modification method and sweeper of sweeper
CN109520476B (en) * 2018-10-24 2021-02-19 天津大学 System and method for measuring dynamic pose of rear intersection based on inertial measurement unit
CN109724597B (en) * 2018-12-19 2021-04-02 上海交通大学 Inertial navigation resolving method and system based on function iteration integral
CN109827569A (en) * 2019-02-21 2019-05-31 奇瑞汽车股份有限公司 Unmanned vehicle localization method and system
CN111750896B (en) * 2019-03-28 2022-08-05 杭州海康机器人技术有限公司 Holder calibration method and device, electronic equipment and storage medium
CN110068325A (en) * 2019-04-11 2019-07-30 同济大学 A kind of lever arm error compensating method of vehicle-mounted INS/ visual combination navigation system
CN111829552B (en) * 2019-04-19 2023-01-06 北京魔门塔科技有限公司 Error correction method and device for visual inertial system
CN110081881B (en) * 2019-04-19 2022-05-10 成都飞机工业(集团)有限责任公司 Carrier landing guiding method based on unmanned aerial vehicle multi-sensor information fusion technology
CN110177220B (en) * 2019-05-23 2020-09-01 上海图趣信息科技有限公司 Camera with external time service function and control method thereof
CN110412635B (en) * 2019-07-22 2023-11-24 武汉大学 GNSS/SINS/visual tight combination method under environment beacon support
CN110567486B (en) * 2019-08-15 2021-04-13 深圳市瑞立视多媒体科技有限公司 Mathematical model construction method for calibrating 3D rotation difference, calibration method and device thereof
CN112461258A (en) * 2019-09-06 2021-03-09 北京三快在线科技有限公司 Parameter correction method and device
CN110645975A (en) * 2019-10-16 2020-01-03 北京华捷艾米科技有限公司 Monocular vision positioning IMU (inertial measurement unit) auxiliary tracking method and device
CN111432113B (en) * 2019-12-30 2022-04-05 科沃斯机器人股份有限公司 Data calibration method, device and storage medium
CN111207742B (en) * 2020-01-17 2020-12-15 西安科技大学 Coal mining machine positioning and attitude determining method with additional external orientation element constraint
CN111780752B (en) * 2020-06-10 2022-01-04 北京航天控制仪器研究所 Method for improving inertial guidance precision with observable attitude error
CN112304304B (en) * 2020-10-23 2023-04-18 国网智能科技股份有限公司 Patrol unmanned aerial vehicle, system and method suitable for transformer substation
CN113012224B (en) * 2021-03-12 2022-06-03 浙江商汤科技开发有限公司 Positioning initialization method and related device, equipment and storage medium
CN113109831A (en) * 2021-03-26 2021-07-13 国家电网有限公司 Data processing method for polling transmission line by using laser radar
CN114485574B (en) * 2021-12-21 2023-03-21 武汉大学 Three-linear array image POS auxiliary ground positioning method based on Kalman filtering model
CN114942026A (en) * 2022-06-01 2022-08-26 四川大学 Multimode three-dimensional image navigation system based on intelligent data

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Method for navigating and positioning aerocraft based on landmark capturing and tracking
CN101576384A (en) * 2009-06-18 2009-11-11 北京航空航天大学 Indoor movable robot real-time navigation method based on visual information correction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI397671B (en) * 2009-12-16 2013-06-01 Ind Tech Res Inst System and method for locating carrier, estimating carrier posture and building map

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Method for navigating and positioning aerocraft based on landmark capturing and tracking
CN101576384A (en) * 2009-06-18 2009-11-11 北京航空航天大学 Indoor movable robot real-time navigation method based on visual information correction

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Research on a visual/SINS integrated navigation algorithm based on artificial landmarks; 宋琳娜 et al.; 《科学技术与工程》; 2012-02-29; vol. 12, no. 4; p. 863, left column *
Motion state estimation method for a micro/small UAV based on binocular vision and inertial devices; 李建 et al.; 《航空学报》; 2011-12-25; vol. 32, no. 12; full text *
Research on an inertial/visual integrated navigation method for coal mine rescue robots; 朱晓飞; 《矿山机械》; 2010-12-31; vol. 38, no. 4; full text *
Tightly coupled INS/vision relative pose measurement method; 王龙 et al.; 《中国惯性技术学报》; 2011-12-31; vol. 19, no. 6; p. 687 right column, p. 688 right column, p. 689 right column *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885571A (en) * 2017-03-07 2017-06-23 辽宁工程技术大学 A kind of lunar surface rover method for rapidly positioning of combination IMU and navigation image

Also Published As

Publication number Publication date
CN103424114A (en) 2013-12-04

Similar Documents

Publication Publication Date Title
CN103424114B (en) Full integration method for visual navigation/inertial navigation
CN102608596B (en) Information fusion method for airborne inertia/Doppler radar integrated navigation system
CN103994763B (en) The SINS/CNS deep integrated navigation system of a kind of Marsokhod and its implementation
CN103743414B (en) Initial Alignment Method between the traveling of vehicle-mounted SINS assisted by a kind of speedometer
CN102901514B (en) Collaborative initial alignment method based on multiple-inertia-unit informational constraint
Xiong et al. G-VIDO: A vehicle dynamics and intermittent GNSS-aided visual-inertial state estimator for autonomous driving
CN103759730A (en) Collaborative navigation system based on navigation information bilateral fusion for pedestrian and intelligent mobile carrier and navigation method thereof
CN101949703A (en) Strapdown inertial/satellite combined navigation filtering method
CN101266150B (en) Un-manned machine side navigation method
Kang et al. Vins-vehicle: A tightly-coupled vehicle dynamics extension to visual-inertial state estimator
CN106052688A (en) Terrain contour matching-based inertial navigation system speed accumulative error correction method
CN101900573B (en) Method for realizing landtype inertial navigation system movement aiming
CN105318876A (en) Inertia and mileometer combination high-precision attitude measurement method
CN103792561B (en) A kind of tight integration reduced-dimensions filtering method based on GNSS passage difference
CN105698822A (en) Autonomous inertial navigation action initial alignment method based on reverse attitude tracking
CN103217174B (en) A kind of strapdown inertial navitation system (SINS) Initial Alignment Method based on low precision MEMS (micro electro mechanical system)
CN102788580A (en) Flight path synthetic method in unmanned aerial vehicle visual navigation
CN109084760B (en) Navigation system between buildings
Park et al. MEMS 3D DR/GPS integrated system for land vehicle application robust to GPS outages
Niu et al. Camera-based lane-aided multi-information integration for land vehicle navigation
Sun et al. A vehicle-carried INS positioning accuracy improvement method by using lateral constraint in GPS-denied environment
CN111220151B (en) Inertia and milemeter combined navigation method considering temperature model under load system
CN104634348A (en) Attitude angle calculation method in integrated navigation
Ma et al. Vehicle model aided inertial navigation
CN103616026A (en) AUV (Autonomous Underwater Vehicle) manipulating model auxiliary strapdown inertial navigation combined navigation method based on H infinity filtering

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160120

Termination date: 20180522