CN103424114B — A fully integrated vision/inertial navigation method
Description
Technical field
The present invention relates to a way of combining a vision navigation system with an inertial navigation system, and in particular to a fully integrated vision/inertial navigation method.
Background art
Vision navigation, also called visual odometry (Visual Odometry, VO), works as follows: as the carrier advances, a stereo camera captures images of the surrounding scene; feature points are matched between the left and right images and tracked between successive frames; observation equations are formed on the basis of the collinearity equations, and the position and attitude of the camera are solved by least-squares adjustment. Vision navigation is a high-precision autonomous navigation mode, but, being a dead-reckoning technique, its error accumulates over time, and the stability of the algorithm has always been a difficulty: when the background in the field of view is featureless or too self-similar, or the carrier dynamics are too large (for example, an excessive turning angle leaves too little overlap between adjacent images), the vision solution fails. The stability of the algorithm therefore has to be reinforced by other means.
An inertial navigation system (Inertial Navigation System, INS) works on the laws of Newtonian mechanics: an inertial measurement unit (Inertial Measurement Unit, IMU) mounted on the carrier senses the angular velocity and acceleration of the carrier, and the navigation parameters of the carrier (position, velocity and attitude) are obtained by integration. Inertial navigation is completely autonomous and immune to interference, and is highly accurate over short periods; because of the integration principle, however, its error accumulates over time, so its accuracy over long operation is poor. To limit the error accumulation, external observations must be introduced to correct the inertial navigation system.
Vision navigation and inertial navigation, as two autonomous navigation modes, are therefore well complementary. Their combination can solve navigation in GPS-denied environments, and is significant for application demands such as indoor navigation, robot path planning, autonomous landing of unmanned aerial vehicles, automatic driving, underground traffic positioning and mine operation safety.
At present there are generally three classes of combinations of vision navigation and inertial navigation systems: (1) the collocated type, in which the vision system generally provides the position of the carrier and the inertial navigation system provides its attitude; (2) the correction type, in which one navigation system is used to correct the other; (3) the fusion type, which generally fuses the data of both by Kalman filtering. Among these three classes, the collocated type is a low-level combination, usually used where accuracy requirements are modest; high-precision applications usually adopt the correction type or the fusion type.
In the correction-type combination, the inertial navigation system is generally used to correct the vision system, and the procedure is typically: (1) the camera acquires the image of the current epoch; (2) feature points are extracted and matched between the current left and right images, and matched against the images of the previous epoch, after which the exterior orientation elements (position and attitude) of the camera are computed by resection; (3) the pitch and roll angles of the carrier, and the heading-angle increment, are computed from the IMU output data; (4) these angles and the heading increment are used to correct the camera position and attitude by Kalman filtering; (5) with the corrected camera position and attitude, the object coordinates of the feature points are obtained by forward intersection; (6) the above steps are repeated.
The procedure of the fusion-type combination is typically: (1) the camera acquires the image of the current epoch; (2) feature points are extracted and matched between the current left and right images, and matched against the images of the previous epoch, after which the exterior orientation elements (position and attitude) of the camera are computed by resection; (3) the camera position and attitude are used as measurements, and the position and attitude of the IMU are estimated by Kalman filtering; (4) the filtered IMU position and attitude are passed back to the camera, and the object coordinates of the feature points are obtained by forward intersection; (5) the above steps are repeated.
As can be seen from the above, the correction-type combination uses the output data of the IMU only for attitude and not for position, and so does not exploit the navigation information provided by the inertial navigation system to the fullest extent. The fusion-type combination makes better use of the information of both systems and allows each to be corrected, but it does not consider the prior information of the object coordinates of the image feature points, so the combination cannot be optimal. In vision navigation the propagation of feature points between images is in fact crucial: most of the feature points extracted in the current image already obtained their object coordinates by least-squares adjustment in the previous frame, and the precision of those coordinates is also known, so this precision information must be taken into account.
Summary of the invention
The object of the present invention is to overcome the defects of the above prior art by providing a theoretically rigorous, stable and efficient fully integrated vision/inertial navigation method.
The object of the present invention is achieved through the following technical solution:
A fully integrated vision/inertial navigation method, comprising the following steps:
1) vision navigation solution: the camera acquires images as the carrier moves; observation equations are formed on the basis of the collinearity equations, and the carrier position and attitude parameters X_vo are solved by least-squares adjustment, together with the variance-covariance matrix of the parameters, P(X_vo);
2) inertial navigation computation: the navigation solution is computed in the local-level frame, giving the position, velocity and attitude parameters X_ins of the carrier at each epoch; at the same time, the variance-covariance matrix of the parameters, P(X_ins), is computed from the error equations of the inertial navigation system by covariance propagation;
3) the vision system corrects the inertial navigation system: by means of Kalman filtering, the error equations of the inertial navigation system are taken as the state equation; the differences between the carrier position and attitude obtained by the vision solution and those computed by the inertial navigation system are taken as the measurements; the navigation-parameter errors and device errors of the inertial navigation system are estimated, then compensated and fed back, yielding the optimal estimate of all parameters of the inertial navigation system, denoted X̂_ins;
4) the inertial navigation system corrects the vision system: the position and attitude of the IMU are separated out of the parameters of the inertial navigation system and, considering the geometric relationship between camera and IMU, converted into the position and attitude of the camera; observation equations are set up and processed by sequential adjustment, yielding the optimal estimate of all parameters of the vision system, denoted X̂_vo.
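Purely to illustrate how the four steps above interlock, the following one-dimensional toy sketch lets a drifting inertial channel (a biased accelerometer) be corrected by an exact position fix each epoch, with the estimated device error fed back into the next mechanization cycle. All values, the noise-free fix and the reset-style correction are invented stand-ins for the Kalman filtering and sequential adjustment actually used by the method:

```python
# One-dimensional toy of the mutual-correction loop: the "inertial" channel
# integrates a biased accelerometer and would drift quadratically; the
# "vision" channel supplies an exact position (and velocity) fix each epoch.
# The innovation is fed back to estimate the device error, in the spirit of
# step 3; all values and gains are invented for illustration.
dt = 0.1               # epoch length, s
acc_bias = 0.05        # true accelerometer bias (device error), m/s^2
bias_est = 0.0         # filter's running estimate of that bias
pos, vel = 0.0, 0.0    # "INS" state; the carrier is truly at rest at 0
for k in range(100):
    # inertial mechanization, compensating the currently estimated bias
    vel += (acc_bias - bias_est) * dt
    pos += vel * dt
    # vision fix of the resting carrier (exact in this noise-free sketch)
    pos_vo, vel_vo = 0.0, 0.0
    # vision corrects inertial: after a reset, the one-epoch position error
    # equals (residual bias) * dt^2, so the innovation identifies the bias
    dz = pos - pos_vo
    bias_est += dz / dt**2
    pos, vel = pos_vo, vel_vo
```

With the bias identified after the first epoch, the inertial channel stops drifting, which is the behaviour the combination relies on when the vision solution temporarily fails.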
The vision navigation solution specifically comprises:
11) initialization of the camera exterior orientation elements: control points are laid out around the starting position of vision navigation, and the exterior orientation elements of the camera are obtained by resection;
12) as the carrier moves, the camera acquires the image of the current epoch and feature points are extracted and matched;
13) observation equations based on the collinearity equations are set up for all feature points successfully matched between the left and right images:

v = A·x̂_l + B·x̂_r + C·ŷ_1 + D·ŷ_2 − l
v_y = ŷ_1 − l_y
C_x·(x̂_l, x̂_r)^T − W_x = 0

where x̂_l is the correction to the exterior orientation elements of the left camera, x̂_r the correction to the exterior orientation elements of the right camera, ŷ_1 the corrections to the object coordinates of the known points, ŷ_2 the corrections to the object coordinates of the points to be located, l_y the constant term formed from the object coordinates y_1 of the known points, v the observation residuals, l the constant term, A, B, C, D the corresponding coefficient matrices, C_x the constraint relation between the left and right cameras, and W_x the constant term of the constraint equation;
14) the above equations are solved according to the least-squares principle, yielding the exterior orientation elements of the left and right cameras and the object coordinates of all feature points.
The feature point matching comprises two parts: first, the left and right images of the current epoch are matched; second, the current left image is matched against the left image of the previous epoch. The feature points comprise known points and points to be located: a known point is a feature point that already appeared in the previous frame image, while a point to be located is a feature point newly added in the current image.
The inertial navigation computation specifically comprises:
(1) attitude update: the angular-velocity observations output by the gyros are used to update the attitude by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted C_b^L(k+1), where k denotes the previous epoch, k+1 the current epoch, L the navigation frame and b the body frame;
(2) attitude extraction: the attitude of the carrier, comprising the heading, pitch and roll angles, is computed from the values of the elements of the attitude matrix C_b^L;
(3) specific force decomposition: using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame to the navigation frame:

f^L = C_b^L·f^b

(4) compensation of harmful accelerations: f^L is corrected for the Coriolis acceleration and compensated for normal gravity, giving the acceleration a^L of the carrier with respect to the Earth's surface:

a^L = f^L − (2·Ω_ie^L + Ω_eL^L)·v^L + g^L

where Ω_ie^L is the skew-symmetric matrix formed from the three components of the Earth rotation rate, Ω_eL^L the skew-symmetric matrix formed from the three components of the rotation rate of the L frame relative to the Earth-fixed frame, v^L the velocity vector, and g^L the gravity vector;
(5) velocity by integration: the acceleration is integrated to give the velocity increment and hence the velocity of the current epoch:

v^L(k+1) = v^L(k) + a^L·Δt

(6) position by integration: the velocity is integrated to give the position increment and hence the position of the current epoch:

r(k+1) = r(k) + v^L(k+1)·Δt
The step of the vision system correcting the inertial navigation system specifically comprises:
31) the error equations of the inertial navigation system are taken as the state equation of the Kalman filter, of the form

Ẋ = F·X + G·w

where X is the error state vector, comprising the position error δr^L, velocity error δv^L, platform misalignment angles δε^L, gyro drifts d and accelerometer biases b:

X = (δr^L, δv^L, δε^L, d, b)^T

32) the difference between the position computed by inertial navigation and the position from the vision solution, and the difference between the corresponding attitudes, are taken as the measurements; the measurement equations are

Z_r = H_r·X + V_r,  Z_a = H_a·X + V_a

where Z_r and Z_a are the position-error and attitude-error measurements, H_r and H_a the position-error and attitude-error measurement matrices, and V_r and V_a the residuals of the position-error and attitude-error measurements;
the position-error measurement Z_r is computed as

Z_r = r_ins − r_vo

where r_ins is the coordinate computed by inertial navigation and r_vo the coordinate obtained by the vision solution; the position-error measurement matrix H_r has the form

H_r = (I 0 0 0 0)

where I is the identity matrix and 0 the zero matrix; the attitude-error measurement Z_a is computed as

Z_a = a_ins − a_vo

where a_ins is the attitude computed by inertial navigation and a_vo the attitude obtained by the vision solution; considering the rotation matrix between camera and IMU, the camera attitude obtained by the vision solution is reduced to the attitude of the carrier:

a_vo = (p′, r′, y′)^T

where p′, r′ and y′ are the carrier pitch, roll and heading angles obtained by reducing the camera attitude; the attitude-error measurement matrix H_a has the form

H_a = (0 0 A 0 0)

where the entries of the matrix A are trigonometric functions of p, r and y, the carrier pitch, roll and heading angles computed by inertial navigation;
33) the navigation-parameter errors and device errors of the inertial navigation system are estimated by filtering, then compensated and fed back.
The step of the inertial navigation system correcting the vision system specifically comprises:
41) the position and attitude of the IMU are separated out of the navigation parameters and, according to the geometric relationship between camera and IMU, converted into observations of the camera position and attitude;
42) sequential adjustment is applied, completing the correction of the vision system by the inertial navigation system.
Step 42) specifically comprises:
the function model of the correction of the vision system by the inertial navigation system is written in the following form:

v_r = r̂ − l_r,  v_a = â − l_a

where r̂ and â are the camera position and attitude parameters, and l_r and l_a are the constant terms of the two equations, formed from the position r_ins and attitude a_ins computed by inertial navigation;
the above is organized into the form of an observation equation:

V = B·X̂_vo − L

where the matrix B has the form

B = (I 0 0 0)

the stochastic model of the observations, i.e. their variance-covariance matrix, is

D(L) = P(r_ins, a_ins)

where P(r_ins, a_ins) is the submatrix relating to position and attitude separated out of P(X_ins); according to the above function model and stochastic model, the correction formulas of the vision system are derived:

X̂′_vo = X_vo + P(X_vo)·B^T·[B·P(X_vo)·B^T + P(r_ins, a_ins)]^{-1}·(L − B·X_vo)
P′(X_vo) = P(X_vo) − P(X_vo)·B^T·[B·P(X_vo)·B^T + P(r_ins, a_ins)]^{-1}·B·P(X_vo)

where X_vo is the vision parameter vector obtained by the vision solution, X̂′_vo the corrected vision parameters, and P′(X_vo) the corrected variance-covariance matrix.
Compared with the prior art, the present invention has the following advantages:
(1) The vision system corrects the inertial navigation system. Kalman filtering estimates not only the navigation-parameter errors of the inertial navigation system (position, velocity and attitude errors) but also the device errors (gyro drifts, accelerometer biases); the navigation-parameter errors are compensated, and the device errors are fed back into the navigation computation of the next cycle, so that the inertial navigation system keeps high accuracy over a long time. When the vision solution fails, the combined system relies entirely on inertial navigation to provide the navigation parameters of the carrier; because the various errors have already been estimated and corrected, a vision solution failure does not noticeably degrade the accuracy of the whole system over a short period.
(2) The inertial navigation system corrects the vision system. Using the position and attitude computed by inertial navigation, sequential adjustment improves not only the exterior orientation elements of the camera but also the object coordinates of the image feature points. The corrected object coordinates and their precision information (the variance-covariance matrix of the object coordinates) are passed on to the navigation computation of the next frame and processed by least-squares collocation; this approach accounts for the prior information of the object coordinates of the image feature points and keeps the vision system accurate over a long time.
Brief description of the drawings
Fig. 1 is a schematic flow chart of the steps of the present invention;
Fig. 2 is a schematic diagram of the vision system correcting the inertial navigation system;
Fig. 3 is a schematic diagram of the inertial navigation system correcting the vision system;
Fig. 4 shows the carrier trajectories computed by the different navigation methods;
Fig. 5 shows the position innovations in the Kalman filtering process;
Fig. 6 shows the attitude innovations in the Kalman filtering process.
Detailed description of the embodiments
The present invention is described in detail below with reference to the drawings and a specific embodiment.
Embodiment
As shown in Fig. 1, a fully integrated vision/inertial navigation method comprises the following steps:
1) Vision navigation solution
The steps are as follows:
(1) Initialization of the camera exterior orientation elements: some control points are generally laid out around the starting position of vision navigation, and the exterior orientation elements of the camera are obtained by resection.
(2) As the carrier moves, the stereo camera acquires the image of the current epoch and feature points are extracted and matched. The matching comprises two parts: first, the left and right images of the current epoch are matched; second, the current left image is matched against the left image of the previous epoch.
(3) Observation equations based on the collinearity equations are set up for all feature points successfully matched between the left and right images. The feature points here fall into two classes: known points and points to be located. A known point is a feature point that appeared in the previous frame image, so its object coordinates, and their precision, are known; a point to be located is a feature point newly added in the current image, whose object coordinates have no prior information. The present invention treats the prior information of the known points by the least-squares collocation principle, so the observation equations can be written in the following form:

v = A·x̂_l + B·x̂_r + C·ŷ_1 + D·ŷ_2 − l
v_y = ŷ_1 − l_y                                   (1)
C_x·(x̂_l, x̂_r)^T − W_x = 0

The above comprises three equations. In the first, x̂_l is the correction to the exterior orientation elements of the left camera, x̂_r that of the right camera, ŷ_1 the corrections to the object coordinates of the known points, ŷ_2 the corrections to the object coordinates of the points to be located, v the observation residuals, l the constant term, and A, B, C, D the corresponding coefficient matrices. The second equation is the virtual observation equation set up from the prior information of the known points, with l_y the constant term formed from the known object coordinates y_1. The third equation is the constraint between the left and right cameras, established from the calibration of the geometric relationship between the cameras and the IMU carried out before the experiment.
(4) Formula (1) is solved according to the least-squares principle. Formula (1) is in fact an indirect adjustment model with constraints, so adjustment yields the exterior orientation elements of the left and right cameras and the object coordinates of all feature points.
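As a minimal numerical sketch of such an indirect adjustment with constraints, the following solves min (A·x − l)^T·P·(A·x − l) subject to C·x = W through the bordered (Lagrangian) normal equations; the two-parameter line-fit data stand in for real collinearity equations and are invented for illustration (numpy assumed available):

```python
import numpy as np

def constrained_lsq(A, P, l, C, w):
    """Indirect adjustment with constraint equations: minimize
    (A x - l)^T P (A x - l) subject to C x = w, via the bordered
    (Lagrangian) normal equations."""
    n, m = A.shape[1], C.shape[0]
    N = A.T @ P @ A                      # normal-equation matrix
    K = np.block([[N, C.T], [C, np.zeros((m, m))]])
    rhs = np.concatenate([A.T @ P @ l, w])
    sol = np.linalg.solve(K, rhs)
    return sol[:n], sol[n:]              # parameters and Lagrange multipliers

# invented miniature example: fit y = a*t + b under the constraint a + b = 1
t = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([t, np.ones_like(t)])
l = np.array([0.9, 2.1, 2.9, 4.1])
P = np.eye(4)                            # weight matrix of the observations
C = np.array([[1.0, 1.0]])
w = np.array([1.0])
x, lam = constrained_lsq(A, P, l, C, w)
```

The returned solution satisfies the constraint exactly while minimizing the weighted residuals, which is the same structure as formula (1) with the camera-to-camera constraint.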
2) Inertial navigation computation
The local-level frame (L frame) is chosen as the navigation frame, with its three axes oriented east-north-up; the three axes of the body frame (b frame) are oriented right-forward-up. The steps of the navigation computation are as follows:
(1) Attitude update. The angular-velocity observations output by the gyros are used to update the attitude by the quaternion method, and the updated quaternion is converted into the attitude matrix C_b^L.
(2) Attitude extraction. The attitude of the carrier, comprising the heading, pitch and roll angles, is computed from the values of the elements of the attitude matrix C_b^L.
(3) Specific force decomposition. Using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame to the navigation frame:

f^L = C_b^L·f^b    (2)

(4) Compensation of harmful accelerations. f^L is corrected for the Coriolis acceleration and compensated for normal gravity, giving the acceleration of the carrier with respect to the Earth's surface:

a^L = f^L − (2·Ω_ie^L + Ω_eL^L)·v^L + g^L    (3)

where Ω_ie^L is the skew-symmetric matrix formed from the three components of the Earth rotation rate, Ω_eL^L the skew-symmetric matrix formed from the three components of the rotation rate of the L frame relative to the Earth-fixed frame, v^L the velocity vector, and g^L the gravity vector.
(5) Velocity by integration. The acceleration is integrated to give the velocity increment and hence the velocity of the current epoch:

v^L(k+1) = v^L(k) + a^L·Δt    (4)

(6) Position by integration. The velocity is integrated to give the position increment and hence the position of the current epoch:

r(k+1) = r(k) + v^L(k+1)·Δt    (5)
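The six mechanization steps above can be sketched for one epoch as follows. Earth-rate compensation of the gyro reading and the transport-rate term Ω_eL^L are omitted for brevity, and the gravity value and rates are illustrative, so this is a simplified sketch under stated assumptions rather than the full mechanization:

```python
import numpy as np

# One epoch of strapdown mechanization in the local-level ENU frame (L),
# body frame b = right-forward-up, following steps (1)-(6) above. Earth-rate
# compensation of the gyro reading and the transport-rate term are omitted
# for brevity; gravity and rates are illustrative values.
OMEGA_E = 7.2921150e-5                     # earth rotation rate, rad/s

def quat_mul(p, q):
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([pw*qw - px*qx - py*qy - pz*qz,
                     pw*qx + px*qw + py*qz - pz*qy,
                     pw*qy - px*qz + py*qw + pz*qx,
                     pw*qz + px*qy - py*qx + pz*qw])

def quat_to_dcm(q):                        # attitude matrix C_b^L
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)]])

def skew(v):
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def mechanize(q, v, r, gyro, acc, lat, dt):
    ang = gyro * dt                        # (1) quaternion attitude update
    n = np.linalg.norm(ang)
    dq = np.array([1.0, *(0.5 * ang)]) if n < 1e-12 else \
         np.concatenate([[np.cos(n / 2)], np.sin(n / 2) * ang / n])
    q = quat_mul(q, dq)
    q = q / np.linalg.norm(q)
    C = quat_to_dcm(q)                     # (2) Euler-angle extraction omitted
    fL = C @ acc                           # (3) specific force in the L frame
    omega_ie = OMEGA_E * np.array([0.0, np.cos(lat), np.sin(lat)])
    gL = np.array([0.0, 0.0, -9.80665])    # gravity vector g^L
    a = fL - 2 * skew(omega_ie) @ v + gL   # (4) harmful accelerations removed
    v = v + a * dt                         # (5) velocity integration
    r = r + v * dt                         # (6) position integration
    return q, v, r

# stationary carrier: the accelerometer senses the reaction to gravity,
# the gyro reading is idealized to zero
q, v, r = np.array([1.0, 0.0, 0.0, 0.0]), np.zeros(3), np.zeros(3)
for _ in range(1000):
    q, v, r = mechanize(q, v, r, np.zeros(3),
                        np.array([0.0, 0.0, 9.80665]), np.radians(40.0), 0.01)
```

For a stationary carrier the specific force and gravity cancel in step (4), so the integrated velocity and position stay at zero, which is a convenient sanity check of the mechanization.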
3) The vision system corrects the inertial navigation system
The correction of the inertial navigation system by the vision system is realized by Kalman filtering, taking the error equations of the inertial navigation system as the state equation; Fig. 2 shows the principle and process. As can be seen from the figure, the measurements of the system are the difference between the position computed by inertial navigation and the position from the vision solution, and the difference between the corresponding attitudes.
A 15th-order error state vector is taken, comprising the position error, velocity error, platform misalignment angles, gyro drifts and accelerometer biases:

X = (δr^L, δv^L, δε^L, d, b)^T    (6)

The state equation takes the error equations of the inertial navigation system, of the form

Ẋ = F·X + G·w    (7)

The external measurement information of the system comes from the camera position and attitude obtained by vision navigation. The measurement equations can therefore be written as

Z_r = H_r·X + V_r,  Z_a = H_a·X + V_a    (8)

where Z_r and Z_a are the position-error and attitude-error measurements respectively. Z_r is computed as

Z_r = r_ins − r_vo    (9)

where r_ins is the coordinate computed by the INS and r_vo the coordinate obtained by the vision solution. Since the coordinate determined by vision navigation is that of the camera projection center, while the INS computes the coordinate of the geometric center of the IMU, r_vo must first be transformed by deducting the three-dimensional offset between camera and IMU, reducing the coordinate of the camera projection center to the geometric center of the IMU.
The position-error measurement matrix H_r has the form

H_r = (I 0 0 0 0)    (10)

where I is the identity matrix and 0 the zero matrix. The attitude-error measurement Z_a is computed as

Z_a = a_ins − a_vo    (11)

where a_ins is the attitude computed by the INS and a_vo the attitude obtained by the vision solution. Note that the attitude computed by the INS is the attitude of the carrier, whereas the vision solution yields the attitude of the camera, namely the three rotation Euler angles from the image-space coordinate system to the world coordinate system; the rotation matrix between camera and IMU must therefore be considered, reducing the camera attitude to the carrier attitude:

a_vo = (p′, r′, y′)^T    (12)

where p′, r′ and y′ are the carrier pitch, roll and heading angles obtained by reducing the camera attitude. The attitude-error measurement matrix H_a has the form

H_a = (0 0 A 0 0)    (13)

The matrix A relates the platform misalignment angles to the attitude-angle errors:

A = ∂(p, r, y)/∂δε^L    (14)

where p, r and y are the carrier pitch, roll and heading angles computed by inertial navigation; the entries of A are trigonometric functions of these angles. With this, the function model of the Kalman filter is established; filtering estimates the navigation-parameter errors and device errors of the inertial navigation system, which are compensated and fed back, so that the inertial navigation system keeps high accuracy over a long time.
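A minimal sketch of the resulting measurement update, with invented numbers: the 15-state covariance and the INS-minus-VO position and attitude differences are made up, and the attitude Jacobian is taken as the identity (small-angle assumption) in place of the trigonometric A matrix above:

```python
import numpy as np

# Measurement update of the 15-state error Kalman filter. The measurement is
# the INS-minus-VO position and attitude difference; H_r selects the
# position-error block and H_a the misalignment block. The attitude Jacobian
# is taken as identity here (small angles); all numbers are invented.
I3, Z3 = np.eye(3), np.zeros((3, 3))
H = np.block([[I3, Z3, Z3, Z3, Z3],      # H_r : position errors
              [Z3, Z3, I3, Z3, Z3]])     # H_a : misalignment angles (A = I)
P = np.eye(15) * 0.1                     # predicted error covariance
R = np.eye(6) * 0.01                     # measurement noise covariance
x = np.zeros(15)                         # predicted error state

r_ins = np.array([10.0, 5.0, 1.0]);  r_vo = np.array([10.2, 4.9, 1.05])
a_ins = np.array([0.010, 0.020, 1.500]); a_vo = np.array([0.012, 0.018, 1.510])
z = np.concatenate([r_ins - r_vo, a_ins - a_vo])   # Z_r and Z_a

K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
x = x + K @ (z - H @ x)        # estimated navigation-parameter errors
P = (np.eye(15) - K @ H) @ P   # updated covariance
```

After the update, the estimated position and misalignment errors are compensated and the device-error states are fed back, while the covariance of the observed blocks shrinks and the unobserved blocks keep their predicted uncertainty.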
4) The inertial navigation system corrects the vision system
The correction of the vision system by the inertial navigation system is realized by sequential adjustment; Fig. 3 shows the principle and process. As can be seen from the figure, the navigation computation of the inertial system yields the navigation parameters of each epoch, while the variance-covariance matrix of the parameters is computed from the error equations of the inertial system by covariance propagation. The position and attitude are then separated out of the navigation parameters and, according to the geometric relationship between camera and IMU, converted into observations of the camera position and attitude; sequential adjustment then optimizes the navigation parameters of the vision system.
Sequential adjustment is generally used when data arrive in batches, that is, the system is observed at different epochs; the earlier observations are adjusted to obtain parameter estimates and their precision information (variance-covariance matrix). The problem sequential adjustment solves is how to use the previously obtained estimates and precision information, together with newly added observations, to obtain better parameter estimates. Let the total observation equation be

V = B·X̂ − L    (15)

The observations are divided into two groups, so the observation equations can be written as

V_1 = B_1·X̂ − L_1,  V_2 = B_2·X̂ − L_2    (16)

Assuming the number of observations in the first equation is sufficient, the first equation is solved by the least-squares principle:

X̂_1 = (B_1^T·P_1·B_1)^{-1}·B_1^T·P_1·L_1,  Q_1 = (B_1^T·P_1·B_1)^{-1}    (17)

where P_1 is the weight matrix of the observations in the first equation. Solving the two equations simultaneously gives

X̂ = (B_1^T·P_1·B_1 + B_2^T·P_2·B_2)^{-1}·(B_1^T·P_1·L_1 + B_2^T·P_2·L_2)    (18)

where P_2 is the weight matrix of the observations in the second equation. According to the matrix inversion formula, the computation formulas of sequential adjustment are derived:

X̂_2 = X̂_1 + Q_1·B_2^T·(P_2^{-1} + B_2·Q_1·B_2^T)^{-1}·(L_2 − B_2·X̂_1)
Q_2 = Q_1 − Q_1·B_2^T·(P_2^{-1} + B_2·Q_1·B_2^T)^{-1}·B_2·Q_1    (19)
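The sequential formulas above can be checked numerically against the batch solution; the data below are random stand-ins for real observation groups:

```python
import numpy as np

# Sequential adjustment: solve with observation group 1 alone, then fold in
# group 2 with the update formulas derived from the matrix inversion lemma,
# and verify against the batch solution of both groups together.
rng = np.random.default_rng(0)
B1, l1 = rng.standard_normal((8, 3)), rng.standard_normal(8)
B2, l2 = rng.standard_normal((4, 3)), rng.standard_normal(4)
P1, P2 = np.eye(8), 2.0 * np.eye(4)          # weight matrices

# first group alone
Q1 = np.linalg.inv(B1.T @ P1 @ B1)           # cofactor matrix of the estimate
x1 = Q1 @ B1.T @ P1 @ l1

# sequential update with the second group
S = np.linalg.inv(np.linalg.inv(P2) + B2 @ Q1 @ B2.T)
x2 = x1 + Q1 @ B2.T @ S @ (l2 - B2 @ x1)
Q2 = Q1 - Q1 @ B2.T @ S @ B2 @ Q1

# batch solution over both groups for comparison
N = B1.T @ P1 @ B1 + B2.T @ P2 @ B2
xb = np.linalg.solve(N, B1.T @ P1 @ l1 + B2.T @ P2 @ l2)
```

The sequential estimate and cofactor matrix coincide with the batch results, which is exactly what allows the second "observation group" to be the inertial position and attitude folded into an already-adjusted vision solution.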
We now describe how sequential adjustment realizes the correction of the vision system by the inertial navigation system. When the vision solution is completed, the estimate X_vo of the camera parameters is obtained, and accuracy assessment gives the variance-covariance matrix of the parameters, P(X_vo). The inertial navigation system obtains the navigation parameters X_ins of the corresponding epoch from the navigation computation, while the variance-covariance matrix P(X_ins) is computed from the error equations of the inertial system by covariance propagation. The position r_ins and attitude a_ins are separated out of X_ins, and the submatrix P(r_ins, a_ins) relating to position and attitude is separated out of P(X_ins); taking account of the geometric relationship between camera and IMU, these are converted into observations of the vision parameters, so the function model of the correction of the vision system by the inertial navigation system can be written in the following form:

v_r = r̂ − l_r,  v_a = â − l_a    (20)

where r̂ and â are the camera position and attitude parameters, and l_r and l_a are the constant terms of the two equations, formed from r_ins and a_ins.
The above is organized into the form of an observation equation:

V = B·X̂_vo − L    (21)

where the matrix B has the form

B = (I 0 0 0)    (22)

The stochastic model of the observations, i.e. their variance-covariance matrix, is

D(L) = P(r_ins, a_ins)    (23)

According to the above function model and stochastic model, the correction formulas of the vision system are derived:

X̂′_vo = X_vo + P(X_vo)·B^T·[B·P(X_vo)·B^T + P(r_ins, a_ins)]^{-1}·(L − B·X_vo)
P′(X_vo) = P(X_vo) − P(X_vo)·B^T·[B·P(X_vo)·B^T + P(r_ins, a_ins)]^{-1}·B·P(X_vo)    (24)

where X_vo is the vision parameter vector obtained by the vision solution, X̂′_vo the corrected vision parameters, and P′(X_vo) the corrected variance-covariance matrix. Computing with the above formulas corrects all parameters of the vision system, including the positions and attitudes of the left and right cameras and the object coordinates of all feature points.
Purely algorithmically, steps 1) and 2) in fact run as a pair, as do steps 3) and 4): the vision system and the inertial navigation system first carry out their navigation computations separately and obtain their respective parameters; the two subsystems then assist each other, each being corrected to yield its optimal estimate. The IMU optimal estimate X̂_ins obtained in step 3) and the camera optimal estimate X̂_vo obtained in step 4) are both optimal estimates of the whole system, or in other words of the carrier motion state. In theory, when computation errors are neglected, X̂_ins and X̂_vo are equivalent; that is, once the geometric relationship between camera and IMU is taken into account, X̂_ins and X̂_vo should be consistent.
To verify the reliability and stability of the method of the present invention, many experiments and data processings were carried out; one experiment and its results are described below. The experiment site is in Beijing; the carrier vehicle was equipped with data acquisition equipment including a stereo camera and an IMU. A GPS receiver was additionally installed; differenced against a base-station GPS, its result reaches centimetre-level accuracy and therefore serves as the reference for checking the accuracy of the method. The calibration of the interior orientation elements and relative exterior orientation of the stereo camera, and of the relative position and attitude between camera and IMU, was completed beforehand. In the experiment, landmark points with known absolute coordinates were first used to initialize the camera exterior orientation elements. After initialization, the carrier vehicle was driven through the test area collecting IMU data, image data and GPS data. The output rate of the IMU is 100 Hz, its gyro drift 0.005°/s and its accelerometer bias 10 mg; the stereo camera, time-tagged by the onboard clock of the IMU, acquired stereo images by synchronized exposure at 2 Hz, with horizontal and vertical fields of view of 42° and 31° respectively; the output rate of the GPS is 1 Hz. In the experiment the mobile platform travelled about 1150 m in about 45 min, acquiring about 4300 stereo image frames.
The experimental data were processed with two navigation methods. Method one is the pure vision solution, in which, whenever the vision solution fails, the motion of the carrier is predicted with a constant-velocity model to bridge the gap; method two is the combined vision/inertial navigation. The authors in fact also attempted a pure inertial computation, but over the 45 min of experimental data its navigation error reaches the order of more than 10000 m, so pure inertial navigation is meaningless and is not discussed here. The results of the two methods, together with the GPS reference trajectory, are shown in Fig. 4.
In the legend of Fig. 4, GPS denotes the differential GPS result, VO/INS the visual/inertial combined navigation result, and VO the pure visual navigation result. As can be seen from the figure, the trajectory of the visual/inertial combined navigation is very close to the GPS trajectory, and its accuracy is higher than that of pure visual navigation. Table 1 lists the errors of the two navigation methods. As can be seen from the table, over the 1150 m operating distance the accumulated position error of the visual/inertial combined navigation is 8.6 m, a relative accuracy of 0.75%.
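The relative accuracy figure follows directly from the two reported numbers; a simple check (the helper name is ours):

```python
def relative_accuracy_percent(accumulated_error_m, distance_m):
    """Relative positioning accuracy: accumulated position error
    expressed as a percentage of the distance travelled."""
    return 100.0 * accumulated_error_m / distance_m

# 8.6 m accumulated error over the 1150 m route:
print(f"{relative_accuracy_percent(8.6, 1150.0):.2f}%")  # → 0.75%
```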
Table 1 Errors of the different navigation methods
During the experiment the carrier vehicle sometimes crossed feature-poor road surfaces, and some turning angles were large, causing six brief failures of the visual navigation solution; at those moments the combined system relied entirely on inertial navigation to provide the navigation parameters. As can be seen from the VO/INS combined trajectory in Fig. 4, no obvious drop in accuracy occurs when the visual navigation solution fails, which demonstrates the reliability and high accuracy of the method.
Fig. 5 and Fig. 6 show the innovation sequences of the Kalman filter while the vision system corrects the inertial navigation system. In the Kalman filter adopted by the method of the present invention, the innovation is the difference between the navigation parameters computed by inertial navigation and those obtained by the visual navigation solution.
As can be seen from Fig. 5, the maximum position innovation is within 0.15 m, and in most cases within 0.05 m; as can be seen from Fig. 6, the maximum attitude innovation is within 0.1°, and in most cases within 0.01°. This indicates that the filtering is stable and that the visual and inertial systems combine well. Together with the earlier accuracy assessment against the absolute GPS reference, this fully demonstrates the reliability and superiority of the combination algorithm proposed by the present invention.
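The innovation examined in Figs. 5 and 6 is simply the INS-minus-VO difference of like navigation parameters. A minimal sketch follows; the function names are ours, and the gate values are taken from the extreme magnitudes reported above, not from the patent's filter design:

```python
import numpy as np

def vo_ins_innovation(ins_pos, ins_att, vo_pos, vo_att):
    """Kalman-filter innovation: navigation parameters computed by
    inertial navigation minus those solved by visual navigation.
    Positions in metres, attitude angles in degrees."""
    return (np.asarray(ins_pos, float) - np.asarray(vo_pos, float),
            np.asarray(ins_att, float) - np.asarray(vo_att, float))

def innovation_within_bounds(d_pos, d_att, pos_gate=0.15, att_gate=0.1):
    """Check one innovation sample against the extreme magnitudes the
    experiment reports (|position| < 0.15 m, |attitude| < 0.1 deg)."""
    return bool(np.all(np.abs(d_pos) < pos_gate) and
                np.all(np.abs(d_att) < att_gate))

d_pos, d_att = vo_ins_innovation([10.02, 5.01, 0.00], [0.003, -0.002, 0.001],
                                 [10.00, 5.00, 0.01], [0.000,  0.000, 0.000])
print(innovation_within_bounds(d_pos, d_att))  # → True
```

A consistently small innovation of this kind is what Figs. 5 and 6 depict: the two systems agree closely enough for the vision measurements to correct the inertial solution without destabilizing the filter.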
Claims (6)
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

CN201210161621.9A CN103424114B (en)  20120522  20120522  A kind of full combined method of vision guided navigation/inertial navigation 
Publications (2)
Publication Number  Publication Date 

CN103424114A CN103424114A (en)  20131204 
CN103424114B true CN103424114B (en)  20160120 
Cited By (1)
Publication number  Priority date  Publication date  Assignee  Title 

CN106885571A (en) *  20170307  20170623  辽宁工程技术大学  A kind of lunar surface rover method for rapidly positioning of combination IMU and navigation image 
Citations (3)
Publication number  Priority date  Publication date  Assignee  Title 

CN101270993A (en) *  20071212  20080924  北京航空航天大学  Remote highprecision independent combined navigation locating method 
CN101532841A (en) *  20081230  20090916  华中科技大学  Method for navigating and positioning aerocraft based on landmark capturing and tracking 
CN101576384A (en) *  20090618  20091111  北京航空航天大学  Indoor movable robot realtime navigation method based on visual information correction 
Family Cites Families (1)
Publication number  Priority date  Publication date  Assignee  Title 

TWI397671B (en) *  20091216  20130601  Ind Tech Res Inst  System and method for locating carrier, estimating carrier posture and building map 

Non-Patent Citations (4)
Title

Song Linna et al., "Research on a vision/SINS integrated navigation algorithm based on artificial landmarks", Science Technology and Engineering, Feb. 2012, vol. 12, no. 4, p. 863 *
Li Jian et al., "Motion state estimation method for a micro unmanned aerial vehicle based on binocular vision and inertial devices", Acta Aeronautica et Astronautica Sinica, 25 Dec. 2011, vol. 32, no. 12 *
Zhu Xiaofei, "Research on an inertial/visual integrated navigation method for a coal-mine rescue robot", Mining Machinery, 2010, vol. 38, no. 4 *
Wang Long et al., "Tightly coupled INS/vision relative pose measurement method", Journal of Chinese Inertial Technology, Dec. 2011, vol. 19, no. 6, pp. 687-689 *
Legal Events
Code  Title

PB01  Publication
C06  Publication
SE01  Entry into force of request for substantive examination
C10  Entry into substantive examination
GR01  Patent grant
C14  Grant of patent or utility model
CF01  Termination of patent right due to non-payment of annual fee (granted publication date: 2016-01-20; termination date: 2018-05-22)