CN103424114B - A full combination method for vision navigation / inertial navigation - Google Patents

A full combination method for vision navigation / inertial navigation

Publication number: CN103424114B (grant); CN103424114A (application)
Application number: CN201210161621.9A
Authority: CN (China)
Inventors: 刘春, 周发根, 吴杭彬, 姚连壁, 简志伟
Assignee: 同济大学 (Tongji University)
Application filed by 同济大学; priority to CN201210161621.9A
Abstract

The present invention relates to a full combination method for vision navigation / inertial navigation, comprising the following steps: 1) vision navigation solution: observation equations are set up from the collinearity equations and solved by least-squares adjustment for the carrier position and attitude parameters, and the variance-covariance matrix between the parameters is computed; 2) inertial navigation computation: the navigation computation is performed in the local-level frame, yielding the carrier position, velocity and attitude at each epoch, while the variance-covariance matrix between the parameters is computed; 3) correction of the inertial system by the vision system: using Kalman filtering, the navigation-parameter errors and instrument errors of the inertial system are estimated, compensated and fed back, yielding optimal estimates of all inertial-system parameters; 4) correction of the vision system by the inertial system: sequential adjustment is applied to correct all parameters of the vision system. Compared with the prior art, the present invention has the advantages of theoretical rigor, stable performance and high efficiency.

Description

A full combination method for vision navigation / inertial navigation

Technical field

The present invention relates to a way of combining a vision navigation system with an inertial navigation system, and in particular to a full combination method for vision navigation / inertial navigation.

Background technology

Vision navigation, also called visual odometry (Visual Odometry, VO), works as follows: as the carrier advances, a stereo camera acquires images of the surrounding scene; feature points are matched between the left and right images and tracked between consecutive frames; observation equations are then set up from the collinearity equations, and the spatial position and attitude of the camera are solved by least-squares adjustment. Vision navigation is a high-precision autonomous navigation mode, but as a dead-reckoning method its error accumulates over time, and the stability of the algorithm has always been a difficult problem: when the background in the field of view is featureless or too self-similar, or the carrier dynamics are too large (for example, a turn angle so large that the overlap between adjacent images becomes too small), the vision solution fails. The stability of the algorithm therefore has to be reinforced by other means.

An inertial navigation system (Inertial Navigation System, INS) works on the principle of Newtonian mechanics: an inertial measurement unit (Inertial Measurement Unit, IMU) mounted on the carrier senses its angular velocity and acceleration, and the navigation parameters of the carrier (position, velocity and attitude) are obtained by integration. Inertial navigation is fully autonomous and immune to interference, and is highly accurate over short periods; because of the integration principle, however, its error accumulates over time, so its long-term accuracy is poor. To limit this error accumulation, external observations must be introduced to correct the inertial system.

Vision navigation and inertial navigation are therefore two autonomous navigation modes with good complementarity. Their combination can solve navigation in GPS-denied environments and is significant for applications such as indoor navigation, robot path planning, autonomous UAV landing, driverless cars, underground traffic positioning and mine safety.

At present there are three general classes of vision/inertial combination: (1) the cooperative type, in which the vision system typically provides the carrier position and the inertial system provides the carrier attitude; (2) the correction type, in which one navigation system is used to correct the other; (3) the fusion type, which generally uses Kalman filtering to fuse the data of both. Of these, the cooperative type is a low-level combination usually used where accuracy requirements are modest; high-precision applications usually adopt the correction or fusion type.

In the correction type, the inertial system is generally used to correct the vision system, typically as follows: (1) the camera acquires the image of the current epoch; (2) feature points are extracted from and matched between the current left and right images, and matched against the previous epoch's images, after which the exterior orientation elements (position and attitude) of the camera are computed by resection; (3) the IMU output is used to compute the pitch and roll of the carrier and the heading increment; (4) these are used to correct the camera position and attitude by Kalman filtering; (5) with the corrected camera position and attitude, the object coordinates of the feature points are computed by forward intersection; (6) the above steps are repeated.

The fusion type typically proceeds as follows: (1) the camera acquires the image of the current epoch; (2) feature points are extracted from and matched between the current left and right images, and matched against the previous epoch's images, after which the exterior orientation elements (position and attitude) of the camera are computed by resection; (3) the camera position and attitude are used as measurements in a Kalman filter that estimates the IMU position and attitude; (4) the filtered IMU position and attitude are passed back to the camera, and the object coordinates of the feature points are computed by forward intersection; (5) the above steps are repeated.

As the above shows, the correction type uses the IMU output only to provide attitude, not position, so it does not exploit the navigation information of the inertial system to the fullest extent. The fusion type makes better use of the information of both systems, allowing each to be corrected, but it ignores the prior information on the object coordinates of the image feature points, so the combination cannot reach its best effect. In vision navigation the propagation of feature points between images is in fact crucial: most of the feature points extracted in the current image already received object coordinates by least-squares adjustment in the previous frame, and the precision of those coordinates is also known, so this precision information must be taken into account.

Summary of the invention

The object of the present invention is to overcome the defects of the above prior art by providing a theoretically rigorous, stable and efficient full combination method for vision navigation / inertial navigation.

Object of the present invention can be achieved through the following technical solutions:

A full combination method for vision navigation / inertial navigation, comprising the following steps:

1) Vision navigation solution: the camera acquires images as the carrier moves; observation equations are set up from the collinearity equations and solved by least-squares adjustment for the carrier position and attitude parameters X_vo, together with the variance-covariance matrix P(X_vo) between the parameters;

2) Inertial navigation computation: the navigation computation is performed in the local-level frame, yielding the carrier position, velocity and attitude parameters X_ins at each epoch; at the same time, the variance-covariance matrix P(X_ins) between the parameters is computed from the error equations of the inertial system by covariance propagation;

3) Correction of the inertial system by the vision system: with Kalman filtering, the error equations of the inertial system are taken as the state equation; the differences between the carrier position and attitude of the vision solution and those of the inertial computation serve as the measurements; the navigation-parameter errors and instrument errors of the inertial system are estimated, compensated and fed back, yielding the optimal estimate of all inertial-system parameters, denoted X̂_ins;

4) Correction of the vision system by the inertial system: the IMU position and attitude are separated out of the inertial-system parameters and, taking the geometric relationship between camera and IMU into account, converted into camera position and attitude; observation equations are set up and processed by sequential adjustment, yielding the optimal estimate of all vision-system parameters, denoted X̂_vo.

The vision navigation solution specifically comprises:

11) Initialization of the camera exterior orientation elements: control points are laid out around the starting position of vision navigation, and the exterior orientation elements of the camera are obtained by resection;

12) As the carrier moves, the camera acquires the image of the current epoch and feature points are extracted and matched;

13) Based on the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images:

$$
\begin{aligned}
v &= B_1\hat{x}_1 + B_2\hat{x}_2 + A_1\hat{y}_1 + A_2\hat{y}_2 - l \\
v(y_1) &= \hat{y}_1 - l(y_1) \\
C(\hat{x}_1,\hat{x}_2) - W_x &= 0
\end{aligned}
$$

In the above, x̂_1 is the correction to the exterior orientation elements of the left camera, x̂_2 the correction for the right camera, ŷ_1 the correction to the object coordinates of the known points, ŷ_2 the correction to the object coordinates of the points to be located, y_1 the object coordinates of the known points, v the observation residuals and l the constant term; C(x̂_1, x̂_2) expresses the constraint between the left and right cameras, and W_x is the constant term of the constraint equation;

14) The above system is solved by the principle of least squares, yielding the exterior orientation elements of both cameras and the object coordinates of all feature points.

The feature-point matching comprises two parts: first, matching between the current left and right images; second, matching between the current left image and the previous left image. The feature points comprise known points and points to be located: a known point is a feature point that already appeared in the previous frame, while a point to be located is a feature point newly added in the current image.

The inertial navigation computation specifically comprises:

(1) Attitude update: using the angular-velocity observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted R_b^L(k+1), where k denotes the previous epoch, k+1 the current epoch, L the navigation frame and b the body frame;

(2) Attitude extraction: from the elements of the attitude matrix R_b^L(k+1), the attitude of the carrier is computed, comprising heading, pitch and roll;

(3) Specific-force decomposition: using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame into the navigation frame:

$$f^L = R_b^L(k+1) \cdot f^b$$

(4) Compensation of harmful accelerations: f^L is corrected for the Coriolis acceleration and compensated for normal gravity, yielding the carrier acceleration a at the Earth's surface:

$$a = f^L - \bigl(2\Omega_{ie}^L + \Omega_{eL}^L\bigr)\,v^L + g^L$$

where Ω_ie^L is the antisymmetric matrix formed from the three components of the Earth rotation rate, Ω_eL^L is the antisymmetric matrix formed from the three components of the rotation rate of the L frame relative to the Earth-fixed frame, v^L is the velocity vector and g^L is the gravity vector;

(5) Velocity from integration: the acceleration is integrated to obtain the velocity increment and hence the velocity at the current epoch:

$$v_{k+1}^L = v_k^L + a \cdot \Delta t$$

(6) Position from integration: the velocity is integrated to obtain the position increment and hence the position at the current epoch:

$$r_{k+1}^L = r_k^L + \tfrac{1}{2}\bigl(v_k^L + v_{k+1}^L\bigr)\cdot \Delta t$$

The correction of the inertial system by the vision system specifically comprises:

31) The error equations of the inertial system are taken as the state equation of the Kalman filter, in the form

$$\dot{X} = F \cdot X + G \cdot \omega$$

where X is the error-state vector, comprising the position error δr^L, velocity error δv^L, platform misalignment angle δε^L, gyro drift d and accelerometer bias b:

$$X = \bigl(\delta r^L,\ \delta v^L,\ \delta\varepsilon^L,\ d,\ b\bigr)$$

32) The differences between the position computed by inertial navigation and that of the vision solution, and between the attitude computed by inertial navigation and that of the vision solution, are taken as the measurements; the measurement equation is

$$\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} X + \begin{pmatrix} V_r \\ V_a \end{pmatrix}$$

where Z_r and Z_a are the position-error and attitude-error measurements, H_r and H_a the position-error and attitude-error measurement matrices, and V_r and V_a the residuals of the position-error and attitude-error measurements;

The position-error measurement Z_r is computed as

$$Z_r = r_{ins} - r_{vo}$$

where r_ins is the coordinate computed by inertial navigation and r_vo the coordinate obtained by the vision solution; the position-error measurement matrix H_r has the form

$$H_r = \bigl(\,I_{3\times 3}\quad 0_{3\times 12}\,\bigr)$$

where I is the identity matrix and 0 the zero matrix. The attitude-error measurement Z_a is computed as

$$Z_a = a_{ins} - a_{vo}$$

where a_ins is the attitude computed by inertial navigation and a_vo the attitude obtained by the vision solution; taking the rotation matrix between camera and IMU into account, the camera attitude of the vision solution is reduced to the attitude of the carrier:

$$a_{vo} = (p',\ r',\ y')^T$$

where p′, r′ and y′ are the carrier pitch, roll and heading obtained by reducing the camera attitude; the attitude-error measurement matrix H_a has the form

$$H_a = \bigl(\,0_{3\times 6}\quad A_{3\times 3}\quad 0_{3\times 6}\,\bigr)$$

The matrix A has the form

$$A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\sin y/\cos p & \cos y/\cos p & 0 \\ \sin y \tan p & -\cos y \tan p & 1 \end{pmatrix}$$

where p, r and y are the carrier pitch, roll and heading computed by inertial navigation;

33) The navigation-parameter errors and instrument errors of the inertial system are estimated by filtering, then compensated and fed back.

The correction of the vision system by the inertial system specifically comprises:

41) The IMU position and attitude are separated out of the navigation parameters and, according to the geometric relationship between camera and IMU, converted into observations of the camera position and attitude;

42) Sequential adjustment is applied, completing the correction of the vision system by the inertial system.

Step 42) specifically comprises:

The functional model of the correction of the vision system by the inertial system is written as:

$$
\begin{aligned}
X_{vo} &= \bigl(\,I\;\; I\;\; I\;\; I\,\bigr)\,\bigl(\hat{x}_1,\ \hat{x}_2,\ \hat{y}_1,\ \hat{y}_2\bigr)^T - l_{X_{vo}} \\
\bigl(r_{ins},\ a_{ins}\bigr)^T &= \bigl(\,I\;\; 0\;\; 0\;\; 0\,\bigr)\,\bigl(\hat{x}_1,\ \hat{x}_2,\ \hat{y}_1,\ \hat{y}_2\bigr)^T - l_{X_{ins}}
\end{aligned}
$$

where l_{X_vo} and l_{X_ins} are the constant terms of the two equations.

Rearranging the above into observation-equation form:

$$
\begin{aligned}
v(x_{vo}) &= \hat{x}_{vo} - l(x_{vo}) \\
v(r_{ins}, a_{ins}) &= B\,\hat{x}_{vo} - l(r_{ins}, a_{ins})
\end{aligned}
$$

where the matrix B has the form

$$B = \bigl(\,I\;\; 0\;\; 0\;\; 0\,\bigr)$$

The stochastic model of the observations, i.e. the variance-covariance matrices, is:

$$
\begin{aligned}
D(X_{vo}) &= P(X_{vo}) \\
D(r_{ins}, a_{ins}) &= P(r_{ins}, a_{ins})
\end{aligned}
$$

where P(r_ins, a_ins) is the sub-matrix of P(X_ins) relating to position and attitude. From this functional and stochastic model the correction formula of the vision system can be derived:

$$
\begin{aligned}
\hat{x}_{vo} &= x_{vo} + D(X_{vo})B^T\bigl(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\bigr)^{-1}\bigl(l(r_{ins},a_{ins}) - B\,x_{vo}\bigr) \\
D(\hat{x}_{vo}) &= D(X_{vo}) - D(X_{vo})B^T\bigl(D(r_{ins},a_{ins}) + B\,D(X_{vo})B^T\bigr)^{-1} B\,D(X_{vo})
\end{aligned}
$$

where x_vo is the vision parameter vector obtained by the vision solution, x̂_vo the corrected vision parameters, and D(x̂_vo) the corrected variance-covariance matrix.

Compared with the prior art, the present invention has the following advantages:

(1) Correction of the inertial system by the vision system. Kalman filtering estimates not only the navigation-parameter errors of the inertial system (position, velocity and attitude errors) but also its instrument errors (gyro drift, accelerometer bias); the navigation-parameter errors are compensated and the instrument errors are fed back into the navigation computation of the next cycle, so the inertial system maintains high precision over long periods. When the vision solution fails, the combined system relies entirely on inertial navigation for the carrier's navigation parameters; since the various errors have already been estimated and corrected, a vision failure does not noticeably degrade the precision of the whole system over a short period.

(2) Correction of the vision system by the inertial system. Using the position and attitude computed by inertial navigation, sequential adjustment improves not only the exterior orientation elements of the cameras but also the object coordinates of the image feature points. The corrected object coordinates and their precision information (the variance-covariance matrix between object coordinates) are passed into the navigation computation of the next frame and processed by least-squares collocation; this method accounts for the prior information on the object coordinates of the feature points and keeps the vision system highly precise over long periods.

Brief description of the drawings

Fig. 1 is a schematic flow chart of the steps of the present invention;

Fig. 2 is a schematic diagram of the correction of the inertial system by the vision system;

Fig. 3 is a schematic diagram of the correction of the vision system by the inertial system;

Fig. 4 shows the carrier trajectories computed by the different navigation methods;

Fig. 5 shows the position innovations in the Kalman filtering process;

Fig. 6 shows the attitude innovations in the Kalman filtering process.

Detailed description of the embodiments

The present invention is described in detail below with reference to the drawings and a specific embodiment.

Embodiment

As shown in Fig. 1, a full combination method for vision navigation / inertial navigation comprises the following steps:

1) Vision navigation solution

The steps are as follows:

(1) Initialization of the camera exterior orientation elements: several control points are laid out around the starting position of vision navigation, and the exterior orientation elements of the camera are obtained by resection.

(2) As the carrier moves, the stereo camera acquires the image of the current epoch and feature points are extracted and matched. The matching comprises two parts: first, matching between the current left and right images; second, matching between the current left image and the previous left image.

(3) Based on the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images. The feature points fall into two classes: known points and points to be located. A known point is a feature point that appeared in the previous frame, so its object coordinates and their precision are known; a point to be located is a feature point newly added in the current image, whose object coordinates have no prior information. The present invention treats the prior information of the known points by the principle of least-squares collocation, so the observation equations can be written as:

$$
\begin{aligned}
v &= B_1\hat{x}_1 + B_2\hat{x}_2 + A_1\hat{y}_1 + A_2\hat{y}_2 - l \\
v(y_1) &= \hat{y}_1 - l(y_1) \\
C(\hat{x}_1,\hat{x}_2) - W_x &= 0
\end{aligned} \tag{1}
$$

The above comprises three equations. In the first, x̂_1 is the correction to the exterior orientation elements of the left camera, x̂_2 the correction for the right camera, ŷ_1 the correction to the object coordinates of the known points, ŷ_2 the correction to the object coordinates of the points to be located, y_1 the object coordinates of the known points, v the observation residuals and l the constant term. The second equation is the virtual observation equation set up from the prior information of the known points. The third is the constraint equation between the left and right cameras, established from prior experimental calibration of the geometric relationship between the cameras and the IMU.

(4) Formula (1) is solved by the principle of least squares; formula (1) is in fact an indirect adjustment model with constraints, so adjustment yields the exterior orientation elements of both cameras and the object coordinates of all feature points.
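The constrained adjustment of formula (1) can be sketched numerically. This is a minimal illustration rather than the patent's implementation: a generic weighted least-squares problem v = A·u − l with a hard constraint C·u = w is solved via Lagrange multipliers, standing in for the B_1, B_2, A_1, A_2 blocks and the left-right camera constraint; all matrix values are invented toy numbers.

```python
import numpy as np

def constrained_ls(A, l, P, C, w):
    """Indirect adjustment with constraints: minimize v'Pv for
    v = A u - l subject to C u = w, via Lagrange multipliers."""
    N = A.T @ P @ A                       # normal matrix
    rhs = A.T @ P @ l
    k = C.shape[0]                        # number of constraints
    # bordered normal-equation system [N C'; C 0] [u; lam] = [rhs; w]
    K = np.block([[N, C.T], [C, np.zeros((k, k))]])
    sol = np.linalg.solve(K, np.concatenate([rhs, w]))
    return sol[:A.shape[1]]               # estimated corrections u

# toy example: 2 unknowns, 3 observations, 1 constraint u0 + u1 = 1
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
l = np.array([0.4, 0.6, 1.0])
P = np.eye(3)
C = np.array([[1.0, 1.0]])
w = np.array([1.0])
u = constrained_ls(A, l, P, C, w)
```

With A assembled from the exterior-orientation and object-coordinate blocks and C encoding the left-right camera relation, the same bordered normal-equation pattern solves formula (1).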

2) Inertial navigation computation

The local-level frame (L frame) is chosen as the navigation frame, with its three axes oriented east, north and up; the three axes of the body frame (b frame) are oriented right, forward and up. The navigation computation proceeds as follows:

(1) Attitude update. Using the angular-velocity observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted R_b^L(k+1).

(2) Attitude extraction. From the elements of the attitude matrix R_b^L(k+1), the attitude of the carrier is computed, comprising heading, pitch and roll.

(3) Specific-force decomposition. Using the attitude matrix, the specific-force observation f^b output by the accelerometers is transformed from the body frame into the navigation frame:

$$f^L = R_b^L(k+1) \cdot f^b \tag{2}$$

(4) Compensation of harmful accelerations. f^L is corrected for the Coriolis acceleration and compensated for normal gravity, yielding the carrier acceleration at the Earth's surface:

$$a = f^L - \bigl(2\Omega_{ie}^L + \Omega_{eL}^L\bigr)\,v^L + g^L \tag{3}$$

(5) Velocity from integration. The acceleration is integrated to obtain the velocity increment and hence the velocity at the current epoch:

$$v_{k+1}^L = v_k^L + a \cdot \Delta t \tag{4}$$

(6) Position from integration. The velocity is integrated to obtain the position increment and hence the position at the current epoch:

$$r_{k+1}^L = r_k^L + \tfrac{1}{2}\bigl(v_k^L + v_{k+1}^L\bigr)\cdot \Delta t \tag{5}$$
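One mechanization cycle of steps (1)-(6) can be sketched as below. This is a hedged toy, not a production mechanization: the Earth-rate and transport-rate terms of formula (3) are set to zero, a first-order quaternion increment replaces a coning-corrected update, and all sensor values are invented (a stationary, level carrier whose accelerometers sense only gravity reaction).

```python
import numpy as np

def quat_mult(q, p):
    w1, x1, y1, z1 = q; w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_dcm(q):
    w, x, y, z = q
    return np.array([
        [1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
        [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
        [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

def ins_step(q, v_L, r_L, omega_b, f_b, g_L, dt):
    # (1) attitude update from the gyro increment (small-angle quaternion)
    dq = np.concatenate([[1.0], 0.5 * omega_b * dt])
    q = quat_mult(q, dq); q /= np.linalg.norm(q)
    R_bL = quat_to_dcm(q)                   # (2) attitude matrix R_b^L(k+1)
    f_L = R_bL @ f_b                        # (3) specific-force decomposition, formula (2)
    a = f_L + g_L                           # (4) gravity compensation; Coriolis terms ~ 0 here
    v_new = v_L + a * dt                    # (5) velocity integration, formula (4)
    r_new = r_L + 0.5 * (v_L + v_new) * dt  # (6) trapezoidal position integration, formula (5)
    return q, v_new, r_new

# stationary carrier, level attitude: accelerometers sense +g upward
q0 = np.array([1.0, 0.0, 0.0, 0.0])
g_L = np.array([0.0, 0.0, -9.8])
f_b = np.array([0.0, 0.0, 9.8])
q, v, r = ins_step(q0, np.zeros(3), np.zeros(3), np.zeros(3), f_b, g_L, 0.01)
```

For this stationary toy case the state stays put, which is the expected behaviour when specific force exactly cancels gravity.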

3) Correction of the inertial system by the vision system

The correction of the inertial system by the vision system is realized by Kalman filtering, with the error equations of the inertial system as the state equation; Fig. 2 shows the principle and process. As the figure shows, the measurements of the system are the differences between the position computed by inertial navigation and that of the vision solution, and between the attitude computed by inertial navigation and that of the vision solution.

A 15-state error vector is used, comprising position error, velocity error, platform misalignment, gyro drift and accelerometer bias:

$$X = \bigl(\delta r^L,\ \delta v^L,\ \delta\varepsilon^L,\ d,\ b\bigr) \tag{6}$$

The state equation is the error equation of the inertial system, in the form

$$\dot{X} = F \cdot X + G \cdot \omega \tag{7}$$

The external measurement information of the system comes from the camera position and attitude obtained by vision navigation, so the measurement equation can be written as

$$\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} X + \begin{pmatrix} V_r \\ V_a \end{pmatrix} \tag{8}$$

where Z_r and Z_a are the position-error and attitude-error measurements. The position-error measurement Z_r is computed as

$$Z_r = r_{ins} - r_{vo} \tag{9}$$

where r_ins is the coordinate computed by INS navigation and r_vo the coordinate obtained by the vision solution. Since the coordinate determined by vision navigation is that of the camera projection centre, while INS navigation computes the coordinate of the IMU geometric centre, r_vo must be transformed: the three-dimensional offset between camera and IMU is deducted, reducing the coordinate of the camera projection centre to the IMU geometric centre.
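The camera-to-IMU reduction just described can be sketched as a lever-arm correction. The body-frame lever arm and all coordinates below are invented toy values; the idea is simply that the known offset is rotated into the navigation frame with the attitude matrix and subtracted before forming Z_r.

```python
import numpy as np

def reduce_to_imu(r_cam_L, R_bL, lever_b):
    """Reduce the camera projection-centre coordinate (navigation frame)
    to the IMU geometric centre using the body-frame lever arm."""
    return r_cam_L - R_bL @ lever_b

R_bL = np.eye(3)                           # level attitude for the toy case
lever_b = np.array([0.10, 0.25, 0.05])     # assumed camera offset [m]
r_cam = np.array([100.0, 200.0, 50.0])
r_vo = reduce_to_imu(r_cam, R_bL, lever_b)
```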

The position-error measurement matrix H_r has the form

$$H_r = \bigl(\,I_{3\times 3}\quad 0_{3\times 12}\,\bigr) \tag{10}$$

where I is the identity matrix and 0 the zero matrix. The attitude-error measurement Z_a is computed as

$$Z_a = a_{ins} - a_{vo} \tag{11}$$

where a_ins is the attitude computed by INS navigation and a_vo the attitude obtained by the vision solution. Note that INS navigation computes the attitude of the carrier, whereas the vision solution yields the attitude of the camera, namely the three Euler rotation angles from the image-space frame to the world frame; the rotation matrix between camera and IMU must therefore be taken into account to reduce the camera attitude to the carrier attitude:

$$a_{vo} = (p',\ r',\ y')^T \tag{12}$$

where p′, r′ and y′ are the carrier pitch, roll and heading obtained by reducing the camera attitude. The attitude-error measurement matrix H_a has the form

$$H_a = \bigl(\,0_{3\times 6}\quad A_{3\times 3}\quad 0_{3\times 6}\,\bigr) \tag{13}$$

The matrix A has the form

$$A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\sin y/\cos p & \cos y/\cos p & 0 \\ \sin y \tan p & -\cos y \tan p & 1 \end{pmatrix} \tag{14}$$

where p, r and y are the carrier pitch, roll and heading computed by inertial navigation. The functional model of the Kalman filter is thus established; filtering estimates the navigation-parameter errors and instrument errors of the inertial system, which are compensated and fed back, keeping the inertial system highly precise over long periods.
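A single measurement update with the model (8)-(14) can be sketched as follows. This is a hedged toy: the prior covariance P, the measurement noise R, the INS pitch/heading and the innovation values are all invented, and the time-propagation step with F and G of formula (7) is omitted. What the sketch shows is the shape of H: rows (10) pick off the three position-error states, and rows (13) map the platform misalignment into attitude-angle differences through the A matrix of formula (14).

```python
import numpy as np

def a_matrix(p, y):
    """Matrix A of formula (14), built from INS pitch p and heading y."""
    return np.array([
        [np.cos(y),            np.sin(y),            0.0],
        [-np.sin(y)/np.cos(p), np.cos(y)/np.cos(p),  0.0],
        [np.sin(y)*np.tan(p), -np.cos(y)*np.tan(p),  1.0]])

def measurement_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

p_ang, y_ang = 0.02, 0.5               # INS pitch/heading [rad] (toy values)
H = np.zeros((6, 15))
H[0:3, 0:3] = np.eye(3)                # H_r rows, formula (10)
H[3:6, 6:9] = a_matrix(p_ang, y_ang)   # H_a rows, formula (13)

x = np.zeros(15)                       # error state of formula (6)
P = np.eye(15)
R = 0.01 * np.eye(6)
z = np.array([0.5, -0.2, 0.1, 0.01, 0.0, -0.02])  # (Z_r, Z_a) innovations
x, P = measurement_update(x, P, z, H, R)
```

After the update, the estimated errors x would be compensated and the instrument-error states fed back, as described in step 33).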

4) Correction of the vision system by the inertial system

The correction of the vision system by the inertial system is realized by sequential adjustment; Fig. 3 shows the principle and process. As the figure shows, the inertial computation yields the navigation parameters at each epoch, and the variance-covariance matrix between the parameters is computed from the inertial error equations by covariance propagation. The position and attitude are then separated out of the navigation parameters and, according to the geometric relationship between camera and IMU, converted into observations of the camera position and attitude; sequential adjustment then optimizes the navigation parameters of the vision system.

Sequential adjustment is generally used for data that arrive in batches: the system is observed at different epochs, and the earlier observations are adjusted to yield parameter estimates together with their precision information (variance-covariance matrix). The problem sequential adjustment solves is how to combine the previously obtained estimates and precision information with newly added observations to obtain an improved parameter estimate. Let the total observation equation be

$$v = B\hat{x} - l \tag{15}$$

Dividing the observations into two groups, the observation equations can be written as

$$\begin{aligned} v_1 &= B_1\hat{x} - l_1 \\ v_2 &= B_2\hat{x} - l_2 \end{aligned} \tag{16}$$

Assuming the first group contains enough observations, it is solved alone by the principle of least squares:

$$\begin{aligned} \hat{x}_1 &= (B_1^T P_1 B_1)^{-1} B_1^T P_1 l_1 \\ Q_{\hat{x}_1} &= (B_1^T P_1 B_1)^{-1} \end{aligned} \tag{17}$$

where P_1 is the weight matrix of the first group of observations. Solving both groups simultaneously:

$$\begin{aligned} \hat{x} &= (B_1^T P_1 B_1 + B_2^T P_2 B_2)^{-1}(B_1^T P_1 l_1 + B_2^T P_2 l_2) \\ Q_{\hat{x}} &= (B_1^T P_1 B_1 + B_2^T P_2 B_2)^{-1} \end{aligned} \tag{18}$$

where P_2 is the weight matrix of the second group of observations. By the matrix inversion formula, the computing formulas of sequential adjustment can be derived:

$$\begin{aligned} \hat{x} &= \hat{x}_1 + Q_{\hat{x}_1} B_2^T \bigl(P_2^{-1} + B_2 Q_{\hat{x}_1} B_2^T\bigr)^{-1} (l_2 - B_2 \hat{x}_1) \\ Q_{\hat{x}} &= Q_{\hat{x}_1} - Q_{\hat{x}_1} B_2^T \bigl(P_2^{-1} + B_2 Q_{\hat{x}_1} B_2^T\bigr)^{-1} B_2 Q_{\hat{x}_1} \end{aligned} \tag{19}$$
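Formula (19) can be checked numerically against the batch solution (18): processing the second group sequentially must reproduce the batch result exactly. The design matrices, misclosures and weights below are arbitrary toy values.

```python
import numpy as np

rng = np.random.default_rng(0)
B1 = rng.normal(size=(5, 3)); l1 = rng.normal(size=5)
B2 = rng.normal(size=(4, 3)); l2 = rng.normal(size=4)
P1 = np.eye(5); P2 = 2.0 * np.eye(4)

# first group alone, formula (17)
Q1 = np.linalg.inv(B1.T @ P1 @ B1)
x1 = Q1 @ B1.T @ P1 @ l1

# sequential update with the second group, formula (19)
S = np.linalg.inv(np.linalg.inv(P2) + B2 @ Q1 @ B2.T)
x_seq = x1 + Q1 @ B2.T @ S @ (l2 - B2 @ x1)
Q_seq = Q1 - Q1 @ B2.T @ S @ B2 @ Q1

# batch solution of both groups, formula (18)
Q_bat = np.linalg.inv(B1.T @ P1 @ B1 + B2.T @ P2 @ B2)
x_bat = Q_bat @ (B1.T @ P1 @ l1 + B2.T @ P2 @ l2)
```

The equivalence of the two routes is exactly the matrix-inversion-formula identity used in the derivation of (19).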

We now apply sequential adjustment to the correction of the vision system by the inertial system. After the vision solution is completed, the estimate X_vo of the camera parameters is obtained, and accuracy assessment yields the variance-covariance matrix P(X_vo) between the parameters. The inertial navigation computation yields the navigation parameters X_ins of the corresponding epoch, while the variance-covariance matrix between the parameters, denoted P(X_ins), is computed from the inertial error equations by covariance propagation. The position r_ins and attitude a_ins are separated out of X_ins, and the sub-matrix P(r_ins, a_ins) relating to position and attitude is separated out of P(X_ins); taking the geometric relationship between camera and IMU into account, these are converted into observations of the vision parameters, so the functional model of the correction of the vision system by the inertial system can be written as:

X_{vo} = \begin{pmatrix} I & 0 & 0 & 0 \\ 0 & I & 0 & 0 \\ 0 & 0 & I & 0 \\ 0 & 0 & 0 & I \end{pmatrix} \begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{vo}}, \qquad \begin{pmatrix} r_{ins} \\ a_{ins} \end{pmatrix} = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{ins}}    (20)

The above formula is organized into the form of observation equations, as follows:

v(X_{vo}) = \hat{x}_{vo} - l(X_{vo}), \qquad v(r_{ins}, a_{ins}) = B\hat{x}_{vo} - l(r_{ins}, a_{ins})    (21)

In the above formula, the matrix B has the form

B = (I \quad 0 \quad 0 \quad 0)    (22)

The stochastic model (variance-covariance matrices) of the observations is as follows:

D(X_{vo}) = P(X_{vo}), \qquad D(r_{ins}, a_{ins}) = P(r_{ins}, a_{ins})    (23)

From the functional model and stochastic model above, the correction formula of the vision system can be derived as follows

\hat{x}_{vo} = x_{vo} + D(X_{vo}) B^T (D(r_{ins}, a_{ins}) + B D(X_{vo}) B^T)^{-1}(l(r_{ins}, a_{ins}) - B x_{vo}), \qquad D(\hat{x}_{vo}) = D(X_{vo}) - D(X_{vo}) B^T (D(r_{ins}, a_{ins}) + B D(X_{vo}) B^T)^{-1} B D(X_{vo})    (24)

In the above formula, x_vo denotes the vision parameters obtained from the visual navigation solution, \hat{x}_{vo} the corrected vision parameters, and D(\hat{x}_{vo}) the corrected variance-covariance matrix. Applying this formula corrects all parameters of the vision system, including the positions and attitudes of the left and right cameras and the object-space coordinates of all feature points.
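A minimal numeric sketch of the correction formula (24). The dimensions and covariances are hypothetical (6 exterior-orientation parameters per camera plus a few point coordinates, diagonal covariances); in the patent's model B = (I 0 0 0) selects the left-camera position/attitude block that the INS observes.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 18                          # hypothetical: 6 + 6 camera parameters + 6 point coordinates
x_vo = rng.standard_normal(n)   # vision solution X_vo
D_vo = 0.04 * np.eye(n)         # its variance-covariance matrix D(X_vo)

# B = (I 0 0 0): INS observes only the first (left-camera) parameter block
B = np.zeros((6, n)); B[:, :6] = np.eye(6)

l_ins = rng.standard_normal(6)  # INS-derived observation l(r_ins, a_ins)
D_ins = 0.01 * np.eye(6)        # P(r_ins, a_ins), assumed more accurate than vision

# Eq. (24): sequential-adjustment update of the vision parameters
K = D_vo @ B.T @ np.linalg.inv(D_ins + B @ D_vo @ B.T)
x_hat = x_vo + K @ (l_ins - B @ x_vo)
D_hat = D_vo - K @ B @ D_vo

# the corrected covariance can only shrink: the update removes uncertainty
assert np.all(np.linalg.eigvalsh(D_hat) <= np.linalg.eigvalsh(D_vo) + 1e-12)
```

Note that although only the left-camera block is observed directly, any off-diagonal terms in D(X_vo) would propagate the correction to the remaining camera and point parameters as well, which is why the method corrects "all parameters of the vision system".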

Purely algorithmically, steps 1 and 2 in fact run as a pair, as do steps 3 and 4. That is, the vision system and the inertial navigation system first carry out their navigation computations separately to obtain the corresponding parameters; the two subsystems then assist each other, each subsystem is corrected, and optimal estimates are obtained. The optimal IMU estimate \hat{X}_{ins} obtained in step 3 and the optimal camera estimate \hat{X}_{vo} obtained in step 4 are both optimal estimates of the whole system, or in other words of the carrier's motion state. In theory, neglecting computational error, the two are equivalent: once the geometric relationship between the camera and the IMU is taken into account, \hat{X}_{ins} and \hat{X}_{vo} should be consistent.

To verify the reliability and stability of the proposed method, numerous experiments and data-processing runs were carried out; one experiment and its results are described below. The experiment site was in Beijing, and the carrier vehicle was equipped with data-acquisition devices including a stereo camera and an IMU. A GPS receiver was also installed; differential processing against a GPS base station yielded centimetre-level accuracy, so the differential result serves as the reference value for checking the accuracy of the proposed method. The calibration of the stereo camera's interior orientation elements and relative exterior orientation elements, as well as the calibration of the relative position and attitude between the camera and the IMU, were completed beforehand. During the experiment, landmark points with known absolute coordinates were first used to initialize the camera's exterior orientation elements. After initialization, the carrier vehicle was driven through the test area while IMU data, image data, and GPS data were collected. The IMU output frequency is 100 Hz, its gyro drift is 0.005°/s, and its accelerometer bias is 10 mg; the stereo camera, time-synchronized by the IMU's onboard clock, acquired stereo image pairs by synchronous exposure at 2 Hz, with horizontal and vertical fields of view of 42° and 31°, respectively. The GPS output frequency is 1 Hz. In the experiment the mobile platform travelled a cumulative distance of about 1150 m over roughly 45 min, acquiring about 4300 stereo image frames.

The experimental data were processed with two navigation methods. Method one is the pure visual navigation solution: whenever the visual navigation solution fails, a constant-velocity motion model is used to predict the carrier motion and fill the gap. Method two is the integrated vision/inertial navigation. A pure inertial navigation computation was in fact also attempted, but over the 45 min of experimental data its navigation error exceeded 10000 m, so pure inertial navigation is meaningless here and is not discussed further. The results of the two methods, together with the GPS reference trajectory, are shown in Fig. 4.

In the legend of Fig. 4, GPS denotes the differential GPS result, VO/INS the vision/inertial integrated navigation result, and VO the pure visual navigation result. The figure shows that the vision/inertial integrated trajectory is very close to the GPS trajectory, and its accuracy is higher than that of pure visual navigation. Table 1 lists the errors of the two navigation methods. Over the 1150 m travelled, the accumulated position error of vision/inertial integrated navigation is 8.63 m, a relative accuracy of 0.75%.

Table 1  Errors of the different navigation methods

Data processing method         Position error (m)   Relative error
Vision guided navigation       18.67                1.6%
Vision/inertial integration    8.63                 0.75%

During the experiment the carrier vehicle sometimes crossed featureless road surfaces or made sharp turns, causing six brief failures of the visual navigation solution; at those times the integrated system relied entirely on inertial navigation to provide the navigation parameters. As the VO/INS trajectory in Fig. 4 shows, no obvious drop in accuracy occurs when the visual navigation solution fails, which demonstrates the reliability and high accuracy of the method.

Fig. 5 and Fig. 6 show the innovation of the Kalman filter during the correction of the inertial navigation system by the vision system. In the Kalman filter adopted by the proposed method, the innovation represents the difference between the navigation parameters computed by inertial navigation and those obtained by the visual navigation solution.

Fig. 5 shows that the maximum position difference is within 0.15 m, and in most cases within 0.05 m; Fig. 6 shows that the maximum attitude difference is within 0.1°, and in most cases within 0.01°. This indicates that the filtering is stable and that the visual navigation and inertial navigation systems combine well. Together with the earlier accuracy evaluation against the absolute GPS reference, this fully demonstrates the reliability and superiority of the proposed integration algorithm.

Claims (6)

1. A full combined method of vision guided navigation/inertial navigation, characterized in that the method comprises the following steps:
1) visual navigation solution: the camera acquires images during carrier motion; observation equations are set up based on the collinearity equations, least-squares adjustment is used to determine the carrier position and attitude parameters X_vo, and the variance-covariance matrix P(X_vo) of the parameters is computed;
The visual navigation solution specifically comprises:
11) initialization of the camera exterior orientation elements: control points are laid out around the visual navigation start position, and the exterior orientation elements of the camera are determined by resection;
12) as the carrier moves, the camera acquires the image of the current epoch, and feature point extraction and matching are carried out;
13) based on the collinearity equations, observation equations are set up for all feature points successfully matched between the left and right images:
v = B_1 \hat{x}_1 + B_2 \hat{x}_2 + A_1 \hat{y}_1 + A_2 \hat{y}_2 - l, \qquad v(y_1) = \hat{y}_1 - l(y_1), \qquad C(\hat{x}_1, \hat{x}_2) - W_x = 0
In the above formula, \hat{x}_1 denotes the corrections to the exterior orientation elements of the left camera, \hat{x}_2 the corrections to the exterior orientation elements of the right camera, \hat{y}_1 the corrections to the object-space coordinates of the known points, \hat{y}_2 the corrections to the object-space coordinates of the points to be located, y_1 the object-space coordinates of the known points, v the observation residuals, l a constant term, C(\hat{x}_1, \hat{x}_2) the constraint relationship between the left and right cameras, W_x the constant term of the constraint equation, and B_1, B_2, A_1, A_2 the corresponding coefficient matrices;
14) the above equations are solved by the principle of least squares to obtain the exterior orientation elements of the left and right cameras and the object-space coordinates of all feature points;
2) inertial navigation computation: navigation computation is carried out in the local-level coordinate frame to obtain the position, velocity, and attitude parameters X_ins of the carrier at each epoch; meanwhile, according to the error equation of the inertial navigation system and the law of covariance propagation, the variance-covariance matrix P(X_ins) of the parameters is computed;
3) correction of the inertial navigation system by the vision system: with Kalman filtering as the means, the error equation of the inertial navigation system is taken as the state equation; the differences between the carrier position and attitude obtained by the visual navigation solution and the carrier position and attitude computed by the inertial navigation system are taken as the measurements of the Kalman filter; the navigation parameter errors and device errors of the inertial navigation system are estimated, and compensation and feedback correction are applied, thereby obtaining the optimal estimate of all parameters of the inertial navigation system, denoted \hat{X}_{ins};
4) correction of the vision system by the inertial navigation system: the position and attitude of the IMU are separated from the parameters of the inertial navigation system; taking the geometric relationship between the camera and the IMU into account, the IMU position and attitude are converted into position and attitude information of the camera; observation equations are set up and processed by sequential adjustment to obtain the optimal estimate of all parameters of the vision system, denoted \hat{X}_{vo}.
2. The full combined method of vision guided navigation/inertial navigation according to claim 1, characterized in that in step 12), feature point matching comprises two aspects: first, matching feature points between the left and right images of the current epoch; second, matching feature points between the left image of the current epoch and the left image of the previous epoch. The feature points comprise known points and points to be located: a known point is a feature point that appeared in the previous image frame, while a point to be located is a feature point newly appearing in the current image.
3. The full combined method of vision guided navigation/inertial navigation according to claim 1, characterized in that the inertial navigation computation step specifically comprises:
(1) attitude update: using the angular rate observations output by the gyros, the attitude is updated by the quaternion method, and the updated quaternion is converted into the attitude matrix, denoted R_b^L(k+1), where k denotes the previous epoch, k+1 the current epoch, L the navigation coordinate frame, and b the carrier coordinate frame;
(2) attitude extraction: the carrier attitude, comprising heading angle, pitch angle, and roll angle, is computed from the elements of the attitude matrix R_b^L(k+1);
(3) specific force decomposition: using the attitude matrix, the specific force observation f^b output by the accelerometers is transformed from the carrier frame into the navigation frame, as follows
f^L = R_b^L(k+1) \cdot f^b
(4) compensation of harmful accelerations: Coriolis correction and normal gravity compensation are applied to f^L to obtain the carrier acceleration a at the Earth's surface, as follows
a = f^L - (2\Omega_{ie}^L + \Omega_{eL}^L) v^L + g^L
where \Omega_{ie}^L is the antisymmetric matrix formed from the three components of the Earth rotation rate, \Omega_{eL}^L is the antisymmetric matrix formed from the three components of the angular rate of the L frame relative to the Earth-fixed frame, v^L is the velocity vector, and g^L is the gravity vector;
(5) velocity by integration: the acceleration is integrated to obtain the velocity increment, yielding the velocity of the current epoch as follows
v_{k+1}^L = v_k^L + a \cdot \Delta t
(6) position by integration: the velocity is integrated to obtain the position increment, yielding the position of the current epoch as follows
r_{k+1}^L = r_k^L + \frac{1}{2}(v_k^L + v_{k+1}^L) \cdot \Delta t.
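The six mechanization steps of claim 3 can be sketched as one update cycle. This is a simplified illustration only: it uses first-order quaternion integration of the gyro rate and takes the Earth-rate antisymmetric matrices as supplied constants; the function and variable names are illustrative, not from the patent.

```python
import numpy as np

def quat_mult(q, p):
    # Hamilton product of two (w, x, y, z) quaternions
    w1, x1, y1, z1 = q; w2, x2, y2, z2 = p
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def quat_to_dcm(q):
    # attitude matrix R_b^L from a unit quaternion
    w, x, y, z = q
    return np.array([[1-2*(y*y+z*z), 2*(x*y-w*z),   2*(x*z+w*y)],
                     [2*(x*y+w*z),   1-2*(x*x+z*z), 2*(y*z-w*x)],
                     [2*(x*z-w*y),   2*(y*z+w*x),   1-2*(x*x+y*y)]])

def ins_step(q, v_L, r_L, omega_b, f_b, dt, Omega_ieL, Omega_eLL, g_L):
    # (1) attitude update: gyro rate integrated as a small rotation quaternion
    ang = omega_b * dt; a = np.linalg.norm(ang)
    dq = np.array([1.0, 0, 0, 0]) if a < 1e-12 else \
         np.concatenate([[np.cos(a/2)], np.sin(a/2) * ang / a])
    q = quat_mult(q, dq); q /= np.linalg.norm(q)
    R_bL = quat_to_dcm(q)                                # (2) attitude matrix R_b^L(k+1)
    f_L = R_bL @ f_b                                     # (3) specific force in the L frame
    acc = f_L - (2*Omega_ieL + Omega_eLL) @ v_L + g_L    # (4) Coriolis + gravity compensation
    v_new = v_L + acc * dt                               # (5) velocity integration
    r_new = r_L + 0.5 * (v_L + v_new) * dt               # (6) trapezoidal position integration
    return q, v_new, r_new

# sanity check: a stationary carrier with Earth-rate terms zeroed stays put
q0 = np.array([1.0, 0, 0, 0])
g = np.array([0, 0, -9.81]); f_b = np.array([0, 0, 9.81])
Z = np.zeros((3, 3))
q, v, r = ins_step(q0, np.zeros(3), np.zeros(3), np.zeros(3), f_b, 0.01, Z, Z, g)
assert np.allclose(v, 0) and np.allclose(r, 0)
```

A production mechanization would additionally update the Earth-rate and transport-rate terms each epoch and use higher-order coning/sculling corrections; the sketch only shows the data flow of steps (1) through (6).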
4. The full combined method of vision guided navigation/inertial navigation according to claim 1, characterized in that the step of correcting the inertial navigation system by the vision system specifically comprises:
31) the error equation of the inertial navigation system is taken as the state equation of the Kalman filter, with the form
\dot{X} = F \cdot X + G \cdot \omega
where X is the error state vector, comprising the position error \delta r^L, velocity error \delta v^L, platform misalignment angle \delta\varepsilon^L, gyro drift d, and accelerometer bias b:

X = (\delta r^L, \delta v^L, \delta\varepsilon^L, d, b)
32) the difference between the position computed by inertial navigation and the position from the visual navigation solution, and the difference between the attitude computed by inertial navigation and the attitude from the visual navigation solution, are taken as the measurement values; the measurement equation is as follows
\begin{pmatrix} Z_r \\ Z_a \end{pmatrix} = \begin{pmatrix} H_r \\ H_a \end{pmatrix} \cdot X + \begin{pmatrix} V_r \\ V_a \end{pmatrix}
where Z_r and Z_a denote the position-error and attitude-error measurements, H_r and H_a the position-error and attitude-error measurement matrices, and V_r and V_a the residuals of the position-error and attitude-error measurements, respectively;
The position-error measurement Z_r is computed as
Z_r = r_{ins} - r_{vo}
where r_ins is the coordinate computed by inertial navigation and r_vo the coordinate obtained by the visual navigation solution; the position-error measurement matrix H_r has the form
H_r = (I_{3\times3} \quad 0_{3\times12})
where I is the identity matrix and 0 the zero matrix; the attitude-error measurement Z_a is computed as
Z_a = a_{ins} - a_{vo}
where a_ins denotes the attitude computed by inertial navigation and a_vo the attitude obtained by the visual navigation solution; taking the rotation matrix between the camera and the IMU into account, the camera attitude from the visual navigation solution is reduced to the carrier attitude:
a_{vo} = (p', r', y')^T
where p', r', y' denote the carrier pitch, roll, and heading angles obtained by reduction from the camera attitude; the attitude-error measurement matrix H_a has the form
H_a = (0_{3\times6} \quad A_{3\times3} \quad 0_{3\times6})
where the matrix A has the form
A = \begin{pmatrix} \cos y & \sin y & 0 \\ -\sin y / \cos p & \cos y / \cos p & 0 \\ \sin y \sin p / \cos p & -\cos y \sin p / \cos p & 1 \end{pmatrix}
where p, r, y denote the carrier pitch, roll, and heading angles computed by inertial navigation, respectively;
33) the navigation parameter errors and device errors of the inertial navigation system are estimated by filtering, and compensation and feedback correction are applied.
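The measurement construction of steps 31) and 32) can be sketched as follows, assuming the 15-dimensional error state (3 position, 3 velocity, 3 misalignment, 3 gyro-drift, 3 accelerometer-bias components) and the A matrix as given above. The helper function is hypothetical and takes angles in radians.

```python
import numpy as np

def measurement(r_ins, r_vo, a_ins, a_vo_carrier):
    # a_ins = (pitch p, roll r, heading y) from inertial navigation;
    # a_vo_carrier = camera attitude already reduced to the carrier frame
    p, roll, y = a_ins
    Z_r = r_ins - r_vo                    # position-error measurement Z_r
    Z_a = a_ins - a_vo_carrier            # attitude-error measurement Z_a
    # H_r = (I_3x3  0_3x12): position errors occupy the first state block
    H_r = np.hstack([np.eye(3), np.zeros((3, 12))])
    # A maps platform misalignment angles to Euler-angle errors
    A = np.array([[ np.cos(y),             np.sin(y),            0],
                  [-np.sin(y)/np.cos(p),   np.cos(y)/np.cos(p),  0],
                  [ np.sin(y)*np.tan(p),  -np.cos(y)*np.tan(p),  1]])
    # H_a = (0_3x6  A_3x3  0_3x6): attitude errors map to the misalignment block
    H_a = np.hstack([np.zeros((3, 6)), A, np.zeros((3, 6))])
    return np.concatenate([Z_r, Z_a]), np.vstack([H_r, H_a])

# illustrative numbers: small INS-vs-vision discrepancies
z, H = measurement(np.array([1.0, 2.0, 3.0]), np.array([1.1, 2.0, 2.9]),
                   np.array([0.02, 0.01, 0.5]), np.array([0.021, 0.009, 0.51]))
assert z.shape == (6,) and H.shape == (6, 15)
```

The vector z and matrix H produced here would feed a standard Kalman update of the 15-state error filter; the gain computation and feedback correction of step 33) are omitted from the sketch.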
5. The full combined method of vision guided navigation/inertial navigation according to claim 1, characterized in that the step of correcting the vision system by the inertial navigation system specifically comprises:
41) the position and attitude of the IMU are separated from the navigation parameters and, according to the geometric relationship between the camera and the IMU, converted into observations of the camera position and attitude;
42) sequential adjustment is used to complete the correction of the vision system by the inertial navigation system.
6. The full combined method of vision guided navigation/inertial navigation according to claim 5, characterized in that step 42) specifically comprises:
The functional model of the correction of the vision system by the inertial navigation system is written in the following form:
X_{vo} = \begin{pmatrix} I & 0 & 0 & 0 \\ 0 & I & 0 & 0 \\ 0 & 0 & I & 0 \\ 0 & 0 & 0 & I \end{pmatrix} \begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{vo}}, \qquad \begin{pmatrix} r_{ins} \\ a_{ins} \end{pmatrix} = \begin{pmatrix} I & 0 & 0 & 0 \end{pmatrix} \begin{pmatrix} \hat{x}_1 \\ \hat{x}_2 \\ \hat{y}_1 \\ \hat{y}_2 \end{pmatrix} - l_{X_{ins}}
where l_{X_{vo}} and l_{X_{ins}} are the corresponding constant terms in the two equations;
The above formula is organized into the form of observation equations, as follows:
v(X_{vo}) = \hat{x}_{vo} - l(X_{vo}), \qquad v(r_{ins}, a_{ins}) = B\hat{x}_{vo} - l(r_{ins}, a_{ins})
In the above formula, the matrix B has the form
B = (I \quad 0 \quad 0 \quad 0)
The stochastic model of the observations, namely the variance-covariance matrices, is as follows:
D(X_{vo}) = P(X_{vo}), \qquad D(r_{ins}, a_{ins}) = P(r_{ins}, a_{ins})
where P(r_ins, a_ins) is the position- and attitude-related submatrix separated from P(X_ins); from the above functional model and stochastic model, the correction formula of the vision system can be derived as follows
\hat{x}_{vo} = x_{vo} + D(X_{vo}) B^T (D(r_{ins}, a_{ins}) + B D(X_{vo}) B^T)^{-1}(l(r_{ins}, a_{ins}) - B x_{vo}), \qquad D(\hat{x}_{vo}) = D(X_{vo}) - D(X_{vo}) B^T (D(r_{ins}, a_{ins}) + B D(X_{vo}) B^T)^{-1} B D(X_{vo})
where x_vo denotes the vision parameters obtained by the visual navigation solution, \hat{x}_{vo} the corrected vision parameters, and D(\hat{x}_{vo}) the corrected variance-covariance matrix.
CN201210161621.9A 2012-05-22 2012-05-22 A kind of full combined method of vision guided navigation/inertial navigation CN103424114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210161621.9A CN103424114B (en) 2012-05-22 2012-05-22 A kind of full combined method of vision guided navigation/inertial navigation


Publications (2)

Publication Number Publication Date
CN103424114A CN103424114A (en) 2013-12-04
CN103424114B true CN103424114B (en) 2016-01-20

Family

ID=49649216

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210161621.9A CN103424114B (en) 2012-05-22 2012-05-22 A kind of full combined method of vision guided navigation/inertial navigation

Country Status (1)

Country Link
CN (1) CN103424114B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106885571A (en) * 2017-03-07 2017-06-23 辽宁工程技术大学 A kind of lunar surface rover method for rapidly positioning of combination IMU and navigation image

Families Citing this family (24)

Publication number Priority date Publication date Assignee Title
KR102016551B1 (en) * 2014-01-24 2019-09-02 한화디펜스 주식회사 Apparatus and method for estimating position
CN103994765B (en) * 2014-02-27 2017-01-11 北京工业大学 Positioning method of inertial sensor
JP6132981B2 (en) 2014-03-24 2017-05-24 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method and apparatus for correcting plane conditions in real time
CN103914065B (en) * 2014-03-24 2016-09-07 深圳市大疆创新科技有限公司 The method and apparatus that flight state is revised in real time
CN103925926B (en) * 2014-04-25 2016-08-24 哈尔滨工程大学 A kind of quaternary number measuring method based on CAMERA/MIMU indoor integrated navigation system
CN105675013B (en) * 2014-11-21 2019-03-01 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN104808685A (en) * 2015-04-27 2015-07-29 中国科学院长春光学精密机械与物理研究所 Vision auxiliary device and method for automatic landing of unmanned aerial vehicle
CN107850436A (en) * 2015-05-23 2018-03-27 深圳市大疆创新科技有限公司 Merged using the sensor of inertial sensor and imaging sensor
JP6320542B2 (en) 2015-05-23 2018-05-09 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Method, system, and program for estimating one or more external parameters for a movable object having a plurality of sensors having an initial configuration
CN105043392B (en) * 2015-08-17 2018-03-02 中国人民解放军63920部队 A kind of aircraft pose determines method and device
CN105208348B (en) * 2015-10-08 2018-08-21 成都时代星光科技有限公司 The aerial unmanned plane of railway line is patrolled and real time image collection Transmission system automatically
CN106708066B (en) * 2015-12-20 2019-07-26 中国电子科技集团公司第二十研究所 View-based access control model/inertial navigation unmanned plane independent landing method
CN105588563B (en) * 2016-01-15 2018-06-12 武汉光庭科技有限公司 Binocular camera and inertial navigation combined calibrating method in a kind of intelligent driving
CN105698765B (en) * 2016-02-22 2018-09-18 天津大学 Object pose method under double IMU monocular visions measurement in a closed series noninertial systems
CN105953795B (en) * 2016-04-28 2019-02-26 南京航空航天大学 A kind of navigation device and method for the tour of spacecraft surface
CN106052683A (en) * 2016-05-25 2016-10-26 速感科技(北京)有限公司 Robot motion attitude estimating method
CN105973264A (en) * 2016-07-21 2016-09-28 触景无限科技(北京)有限公司 Intelligent blind guiding system
CN106843470A (en) * 2016-12-28 2017-06-13 歌尔科技有限公司 A kind of method of controlling viewing angle, device and VR systems
CN108253940A (en) * 2016-12-29 2018-07-06 东莞前沿技术研究院 Localization method and device
CN107389088A (en) * 2017-05-27 2017-11-24 纵目科技(上海)股份有限公司 Error correcting method, device, medium and the equipment of vehicle-mounted inertial navigation
CN108253942A (en) * 2017-06-08 2018-07-06 中国科学院遥感与数字地球研究所 A kind of method for improving oblique photograph and measuring empty three mass
CN107197154A (en) * 2017-06-22 2017-09-22 北京奇艺世纪科技有限公司 A kind of panoramic video antihunt means and device
WO2019080046A1 (en) * 2017-10-26 2019-05-02 深圳市大疆创新科技有限公司 Drift calibration method and device for inertial measurement unit, and unmanned aerial vehicle
CN108801248A (en) * 2018-06-22 2018-11-13 深圳市北斗产业互联网研究院 Plane visual inertial navigation method based on UKF

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101270993A (en) * 2007-12-12 2008-09-24 北京航空航天大学 Remote high-precision independent combined navigation locating method
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Method for navigating and positioning aerocraft based on landmark capturing and tracking
CN101576384A (en) * 2009-06-18 2009-11-11 北京航空航天大学 Indoor movable robot real-time navigation method based on visual information correction

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
TWI397671B (en) * 2009-12-16 2013-06-01 Ind Tech Res Inst System and method for locating carrier, estimating carrier posture and building map


Non-Patent Citations (4)

Title
Research on a vision/SINS integrated navigation algorithm based on artificial landmarks; Song Linna et al.; Science Technology and Engineering; 2012-02; Vol. 12, No. 4; p. 863, left column *
Motion state estimation for a micro unmanned aerial vehicle based on binocular vision and inertial devices; Li Jian et al.; Acta Aeronautica et Astronautica Sinica; 2011-12-25; Vol. 32, No. 12; full text *
Research on an inertial/vision integrated navigation method for coal-mine rescue robots; Zhu Xiaofei; Mining Machinery; 2010; Vol. 38, No. 4; full text *
A tightly coupled INS/vision relative pose measurement method; Wang Long et al.; Journal of Chinese Inertial Technology; 2011; Vol. 19, No. 6; pp. 687, 688, 689, right columns *


Also Published As

Publication number Publication date
CN103424114A (en) 2013-12-04


Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant
C14 Grant of patent or utility model
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160120

Termination date: 20180522
