CN103954283B - Inertial integrated navigation method based on scene matching/visual odometry


Info

Publication number: CN103954283B
Application number: CN201410128459.XA
Authority: CN (China)
Prior art keywords: image, step, navigation, UAV, position
Other languages: Chinese (zh)
Other versions: CN103954283A
Inventors: 赵春晖, 王荣志, 张天武, 潘泉, 马鑫
Original assignee: 西北工业大学 (Northwestern Polytechnical University)
Application filed by 西北工业大学; priority to CN201410128459.XA
Publication of CN103954283A; application granted; publication of CN103954283B


Abstract

The present invention relates to an inertial integrated navigation method based on scene matching/visual odometry. Following the visual odometry principle, the homography matrices of the UAV's aerial real-time image sequence are computed, and the UAV's current position is obtained recursively by accumulating the relative displacement between consecutive real-time frames. Because visual odometry accumulates error over time, a scene matching algorithm based on FREAK features is introduced for auxiliary correction; scene matching offers high positioning accuracy, strong autonomy, and resistance to electromagnetic interference, can provide high-precision fixes within matching-suitable areas, and effectively compensates the cumulative error produced by long-running visual odometry. An error model of the inertial navigation system and a measurement model of the vision data are established, the optimal estimate is obtained by Kalman filtering, and the inertial navigation system is corrected accordingly. The present invention effectively improves navigation accuracy and helps improve the UAV's autonomous flight capability.

Description

Inertial integrated navigation method based on scene matching/visual odometry

Technical field

The invention belongs to the field of UAV navigation and positioning technology, and relates to an inertial integrated navigation method based on scene matching/visual odometry.

Background art

High-precision, highly dynamic, and highly reliable autonomous navigation is one of the key technologies for ensuring that a UAV successfully completes its various missions, and is of great significance for strengthening the UAV's autonomous capability and improving its operational effectiveness. Navigation modes include satellite navigation, radio navigation, inertial navigation, and others. Among them, inertial navigation (INS), with its outstanding advantage of full autonomy, occupies a special position in navigation technology; existing UAV navigation systems are all integrated navigation systems built around INS, completing the autonomous navigation of the UAV in complex environments.

At present, the dominant UAV navigation mode is the INS/GPS integrated navigation system. However, in unknown, dynamically changing, complex environments, the GPS signal is weak and vulnerable to electromagnetic interference, and may even be lost entirely in signal blind zones, causing large errors in the integrated navigation system with unpredictable consequences. Moreover, China's BeiDou satellite navigation system is still in its development stage, and its reliability and accuracy remain insufficient for high-accuracy navigation requirements such as military applications.

Summary of the invention

Technical problem to be solved

To address the deficiencies of the prior art, the present invention proposes an inertial integrated navigation method based on scene matching/visual odometry, achieving fully autonomous navigation of a UAV in complex environments.

Technical solution

An inertial integrated navigation method based on scene matching/visual odometry, characterized in that the steps are as follows:

Step 1: during UAV flight, the airborne downward-looking camera acquires ground image a in real time;

Step 2: using image a and the previous frame image a', determine the visual odometry of the UAV, with the following steps:

A. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';

B. For each feature point (x, y)^T in image a', search within a square region centered at (x, y)^T in image a for the match point with the highest neighborhood cross-correlation; meanwhile, for each feature point (x, y)^T in image a, search within a square region centered at (x, y)^T in image a' for the match point with the highest neighborhood cross-correlation;

C. Use the RANSAC robust estimation method to obtain the maximal consensus set and an estimate of the homography matrix H. The process is: first randomly draw 4 matching point pairs to form a random sample and compute the homography matrix H; then, for each matching pair in step B, compute the distance dis; set a threshold t, and if dis < t the pair is an inlier, otherwise reject it, and tally the number of inliers; repeat the above process k times and select the H with the largest inlier count;

D. Re-estimate H from all the matching pairs classified as inliers of the H with the largest inlier count, and use the re-estimated H to compute the point (x1, y1)^T in image a' corresponding to each feature point (x, y)^T in image a. Then, using the method of step B, search in image a' within a square region centered at (x1, y1)^T for the match point with the highest neighborhood cross-correlation to each feature point (x, y)^T in image a; meanwhile, search in image a within a square region centered at (x, y)^T for the match point with the highest neighborhood cross-correlation to each feature point (x1, y1)^T in image a';

E. Repeat steps B to D until the number of matching pairs is stable;

Step 3: when the UAV enters a matching-suitable area, coarsely position the UAV using the inertial navigation system, find in the onboard storage device the reference image b corresponding to the coarse positioning result, and perform scene matching against real-time image a to determine the position of the UAV;

Step 4: use the UAV position obtained in step 3 within the matching-suitable area to correct the error accumulated by visual odometry, obtaining a relatively accurate UAV position;

Step 5: use the inertial navigation system to provide the UAV's current position and attitude;

Step 6: take the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up coordinate system as the navigation frame, and take the difference between the position obtained in step 4 and the position obtained in step 5 as the measurement; estimate the drift error of the inertial system with a Kalman filter, and use this drift error to correct the inertial navigation system, obtaining the fused navigation parameters.

The process of step 3 is as follows: first the onboard downward-looking camera acquires ground image a in real time, and image a is preprocessed to obtain image c; then the FAST features of image b and image c are extracted; the extracted FAST features are then described with the FREAK feature descriptor; feature matching is performed using the nearest-Hamming-distance similarity criterion to obtain the matched position, i.e., the current UAV position.

The ground image a is an optical image or an infrared image.

Beneficial effects

The inertial integrated navigation method based on scene matching/visual odometry proposed by the present invention follows the visual odometry principle: the homography matrices of the UAV's aerial real-time image sequence are computed, and the UAV's current position is obtained recursively by accumulating the relative displacement between consecutive real-time frames. Because visual odometry accumulates error over time, a scene matching algorithm based on FREAK features is introduced for auxiliary correction; scene matching offers high positioning accuracy, strong autonomy, and resistance to electromagnetic interference, can provide high-precision fixes within matching-suitable areas, and effectively compensates the cumulative error produced by long-running visual odometry. An error model of the inertial navigation system and a measurement model of the vision data are established, the optimal estimate is obtained by Kalman filtering, and the inertial navigation system is corrected accordingly. The present invention effectively improves navigation accuracy and helps improve the UAV's autonomous flight capability.

However, research on scene matching suitability discrimination shows that only matching within matching-suitable areas can provide reliable position information for the UAV; over unsuitable areas such as deserts and seas, scene matching cannot work properly.

Visual odometry is a technique that estimates motion information by processing two consecutive image frames; as a new navigation and positioning mode, it has been successfully applied to autonomous mobile robots. During UAV flight, two consecutive frames are captured by the same sensor on the same UAV platform, within the same time period, and under identical conditions, so they share the same noise distribution and imaging errors, without the imaging differences caused by changes in natural conditions; visual odometry can therefore provide relatively accurate position information over unsuitable areas. Meanwhile, two consecutive frames are much smaller than the reference map used in scene matching, so matching takes less time, improving the real-time performance of the navigation system.

Therefore, the present invention proposes a UAV inertial integrated navigation method based on scene matching/visual odometry. The method has the advantages of strong autonomy, light payload, low equipment cost, and strong interference immunity, and provides a feasible technical solution for the engineering application of UAV navigation systems. As a navigation mode based on computer vision, it offers high positioning accuracy, strong anti-jamming capability, small onboard equipment size, and low cost; it can eliminate the cumulative error produced by long-running inertial navigation, substantially improve the positioning accuracy of the inertial navigation system, and serve as a backup navigation method when GPS fails or its accuracy degrades.

Brief description of the drawings

Fig. 1 is the framework flowchart of the present invention.

Fig. 2 illustrates the homography relationship between two views.

Fig. 3 illustrates the FREAK descriptor sampling pattern.

Detailed description of the invention

The invention is further described below in conjunction with an embodiment and the accompanying drawings:

In the inertial integrated navigation method based on scene matching/visual odometry of the present invention, reconnaissance means are used beforehand to obtain terrain imagery of the UAV's planned flight area as the reference map. When a UAV carrying a vision sensor flies over the planned area, the locally acquired imagery is converted, according to parameters such as pixel resolution, flight altitude, and field of view, into a real-time image of a certain size; by matching the real-time image against the reference map, the position of the real-time image within the reference map is found, which in turn determines the UAV's current accurate position. Scene matching accuracy is not affected by navigation time, and the approach offers strong anti-electromagnetic-interference capability, good autonomy, high precision, low cost, small volume, and rich information; combining it with inertial navigation can greatly improve the overall performance of the navigation system. Carrying out inertial/scene-matching autonomous integrated navigation research for UAVs therefore helps break away from external aiding systems and strengthens the UAV's reliability, mobility, stealth, anti-interference capability, and survivability. The method mainly comprises five parts, namely inertial navigation, scene matching, visual odometry, fusion correction, and integrated-navigation Kalman filtering, and includes the following steps:

In the first step, the airborne downward-looking camera acquires ground image a in real time, and the visual odometry of the UAV is determined by estimating the homography matrix between real-time image a and the previous real-time frame a'.

The specific implementation steps are as follows:

1. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';

2. For each feature point (x, y)^T in a', search within a square region centered at (x, y)^T in a for the match point with the highest neighborhood cross-correlation. Likewise, search in a' for the match of each feature point in a, finally determining the matching feature-point pairs;

3. Use the RANSAC robust estimation method to obtain the maximal consensus set and compute the homography matrix H between the two consecutive real-time images a and a'. The procedure is: (1) randomly draw 4 matching pairs to form a random sample and compute the homography matrix H; (2) for each matching pair from step 2, compute the distance dis; (3) set a threshold t (t is less than half the side length of the real-time image); if dis < t, the pair is an inlier, otherwise reject it, and tally the number of inliers; (4) repeat the above process k times and select the H with the largest inlier count;

4. Recompute the homography matrix H from all matches classified as inliers, and use H to determine the position of the search region in step 2, thereby determining more accurate matching pairs;

5. Repeat steps 2 to 4 until the number of matching pairs is stable, and use the finally determined stable matching pairs to compute the homography matrix H.

6. Using the homography matrix H obtained in step 5, together with the attitude information provided by the inertial measurement unit and the altitude information provided by the barometric altimeter, obtain the relative displacement of the camera between the two consecutive frames; by accumulating the camera displacement and using the previously estimated position, the UAV's current position is computed recursively. A sketch of the matching stage is shown below.
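As a concrete illustration of steps 1 to 5, the following Python sketch substitutes standard OpenCV routines for the hand-written ones: Harris corners via goodFeaturesToTrack, normalized cross-correlation via matchTemplate, and RANSAC homography estimation via findHomography. It matches in one direction only for brevity, and the window sizes, thresholds, and iteration handling are illustrative assumptions rather than values taken from the patent.

```python
# Visual-odometry matching stage (sketch): Harris corners + NCC search
# windows + RANSAC homography. Inputs are grayscale uint8 frames.
import cv2
import numpy as np

def harris_points(img, max_pts=500):
    """Harris corner locations as an (N, 2) array of (x, y)."""
    pts = cv2.goodFeaturesToTrack(img, maxCorners=max_pts, qualityLevel=0.01,
                                  minDistance=7, useHarrisDetector=True, k=0.04)
    return pts.reshape(-1, 2) if pts is not None else np.empty((0, 2))

def ncc_match(src, dst, pts, win=9, search=24, thresh=0.8):
    """For each point in src, find the best NCC match in dst inside a
    square search region centered at the same (x, y)."""
    half = win // 2
    matches = []
    for (x, y) in pts.astype(int):
        if not (half <= x < src.shape[1] - half and
                half <= y < src.shape[0] - half):
            continue
        tmpl = src[y-half:y+half+1, x-half:x+half+1]   # template around corner
        x0, y0 = max(x - search, half), max(y - search, half)
        x1 = min(x + search, dst.shape[1] - half - 1)
        y1 = min(y + search, dst.shape[0] - half - 1)
        region = dst[y0-half:y1+half+1, x0-half:x1+half+1]
        res = cv2.matchTemplate(region, tmpl, cv2.TM_CCOEFF_NORMED)
        _, score, _, loc = cv2.minMaxLoc(res)          # best NCC score/location
        if score > thresh:
            # loc is the template's top-left in region; its center in dst:
            matches.append(((x, y), (x0 + loc[0], y0 + loc[1])))
    return matches

def estimate_homography(prev_frame, curr_frame):
    """H mapping points in prev_frame to curr_frame, or None."""
    pts = harris_points(prev_frame)
    matches = ncc_match(prev_frame, curr_frame, pts)
    if len(matches) < 4:
        return None
    src = np.float32([m[0] for m in matches])
    dst = np.float32([m[1] for m in matches])
    # RANSAC: random 4-point samples, inlier test against a threshold,
    # re-estimation from the consensus set (OpenCV runs the k-times loop).
    H, _mask = cv2.findHomography(src, dst, cv2.RANSAC,
                                  ransacReprojThreshold=3.0)
    return H
```

Accumulating the per-frame displacement derived from each H (see the derivation and the displacement-recovery sketch in the detailed description below) yields the recursive position estimate.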

In the second step, when the UAV enters a matching-suitable area, the inertial navigation system is used to coarsely position the UAV. According to the coarse positioning result, a circular reference subimage b is cut from the prestored aerial reference map of the UAV's flight area; the radius of the reference subimage must exceed 1.5 times the product of the IMU's average drift distance per unit time and the flight time between two matching-suitable areas. Real-time image a is then scene-matched against reference subimage b to determine the UAV's position. The specific scene matching steps are as follows:

1. Convert real-time image a and reference subimage b to grayscale, and extract the FAST feature points of both.

2. Compute the FREAK feature descriptor of each FAST feature point, as follows:

As shown in Fig. 3, FREAK builds circular sampling regions centered on each feature point: dense at the center, sparse toward the periphery, with the number of samples decreasing exponentially. Each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.

The FREAK descriptor is composed of a bit string of differences of Gaussians. The criterion T for a sampling pair is defined as

$$T(P_a) = \begin{cases} 1, & I\!\left(P_a^{r_1}\right) - I\!\left(P_a^{r_2}\right) > 0 \\ 0, & \text{otherwise} \end{cases}$$

where $P_a$ denotes a sampling pair, $P_a^{r_1}$ and $P_a^{r_2}$ its two receptive fields, and $I(\cdot)$ the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary criterion is defined as

$$F = \sum_{0 \le a < N} 2^{a}\, T(P_a)$$

which yields an N-dimensional binary bit string.

Sampling pairs are selected by a coarse-to-fine process, with the following criteria:

a) Create a large matrix D that can hold 50,000 feature points, each row representing one feature point; comparing the smoothed intensities of each feature point's 43 sampling circles pairwise yields a descriptor of roughly 1000 dimensions;

b) Compute the mean and variance of each column of D; a mean of 0.5 maximizes the variance and ensures the uniqueness of the feature descriptor;

c) Sort the columns in descending order of variance, so that the column with the largest variance is first;

d) Keep the first N columns to describe each feature point, yielding an N-dimensional binary bit string. The present invention selects N = 256.

As in Fig. 3, M (M = 45) symmetric sampling pairs are selected among all the sampling circles, and the local gradient is computed as

$$O = \frac{1}{M} \sum_{P_O \in G} \left( I\!\left(P_O^{r_1}\right) - I\!\left(P_O^{r_2}\right) \right) \frac{P_O^{r_1} - P_O^{r_2}}{\left\lVert P_O^{r_1} - P_O^{r_2} \right\rVert}$$

where $P_O^{r_1}$ and $P_O^{r_2}$ are the two-dimensional position vectors of the corresponding sampling circles and $I(\cdot)$ is the smoothed intensity of a sampling circle.

3. Compute the nearest Hamming distance between the FREAK descriptors of the two feature points to be matched. To filter mismatches, the present invention sets a threshold of 10: matches with distance above the threshold are discarded directly, and points with distance below the threshold are regarded as mutually matched.

4. Perform feature matching according to steps 2 and 3; determining the positions of the feature points of real-time image a within reference subimage b determines the position of image a in image b, and thereby the aircraft's current position.
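A minimal sketch of this scene-matching step, assuming OpenCV's contrib build (opencv-contrib-python), which provides FAST and a FREAK implementation; the detector threshold is an illustrative assumption, and the Hamming threshold of 10 comes from the text above.

```python
# Scene matching (sketch): FAST keypoints + FREAK descriptors + mutual
# nearest-Hamming-distance matching. Inputs are grayscale uint8 images.
import cv2

def scene_match(real_time_img, reference_img, max_hamming=10):
    """Return matched keypoint pairs between a real-time image and a
    reference subimage."""
    fast = cv2.FastFeatureDetector_create(threshold=20)
    freak = cv2.xfeatures2d.FREAK_create()   # 512-bit binary descriptor

    kp_a = fast.detect(real_time_img, None)
    kp_b = fast.detect(reference_img, None)
    kp_a, desc_a = freak.compute(real_time_img, kp_a)
    kp_b, desc_b = freak.compute(reference_img, kp_b)
    if desc_a is None or desc_b is None:
        return []

    # Brute-force matcher with Hamming norm; crossCheck enforces that the
    # match is mutual, mirroring the bidirectional criterion in the text.
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = bf.match(desc_a, desc_b)

    # Keep only matches below the Hamming-distance threshold (10 above).
    good = [m for m in matches if m.distance < max_hamming]
    # The offset between matched keypoints locates image a inside image b.
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in good]
```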

In the third step, the UAV position obtained in the second step is used to correct the error accumulated by visual odometry, yielding a more accurate UAV position $P_{vision}$, while the inertial navigation system provides the UAV's current position $P_{ins}$ and attitude $A_{ins}$.

Assuming that the UAV position error obtained by scene matching within a matching-suitable area is very small, the scene matching result is used directly to reset the position computed by visual odometry.

In the fourth step, a Kalman filter is used to estimate from $P_{vision}$, $P_{ins}$, and $A_{ins}$, obtaining the optimal navigation information.

The specific embodiment is as follows:

1. During UAV flight, the airborne downward-looking camera acquires ground image a in real time.

The downward-looking optical camera or thermal infrared camera onboard the UAV acquires the ground image sequence in real time, but only the current frame and the previous frame need to be stored.

2. Using image a and the previous frame image a', the visual odometry of the UAV is determined by estimating the homography matrix between a and a'.

As shown in Fig. 2, with the UAV in flight, the airborne camera captures two frames $I_1$ and $I_2$ at different poses, with corresponding camera coordinate frames F and F' (at positions c1 and c2). Suppose a point P on plane π maps to point p in image $I_1$ and point p' in image $I_2$, corresponding to vectors $\vec{p}$ and $\vec{p}\,'$ in F and F'. Then

$$\vec{p}\,' = R_{c_1}^{c_2}\,\vec{p} + t$$

where $R_{c_1}^{c_2}$ and $t$ represent, respectively, the rotation and the translation of the UAV between the two shots.

From Fig. 2, with n the normal vector of plane π expressed at c1 and d the distance from c1 to plane π, we have

$$n^{T}\vec{p} = d$$

Therefore,

$$\vec{p}\,' = R_{c_1}^{c_2}\,\vec{p} + \frac{t\,n^{T}\vec{p}}{d} = H_c\,\vec{p}$$

Wherein,

$$H_c = R_{c_1}^{c_2} + \frac{t\,n^{T}}{d}$$

which is called the homography matrix of plane π with respect to the camera.

Taking the camera intrinsic parameter matrix K into account,

$$p = K\vec{p}, \qquad p' = K\vec{p}\,'$$

Therefore,

$$p' = K H_c K^{-1} p = H p$$

Wherein,

$$H = K\left(R_{c_1}^{c_2} + \frac{t\,n^{T}}{d}\right)K^{-1}$$

which is called the homography matrix of plane π with respect to the two images.

According to the literature, the homography matrix is a 3 × 3 matrix with 8 degrees of freedom, so 4 pairs of matching points between the two images are needed to compute it. To avoid degeneracy, the plane containing these 4 matching pairs must not pass through the camera optical center, and no 3 of the 4 space points may be collinear.

Furthermore, since the camera is rigidly attached to the UAV, the camera attitude can be regarded as consistent with the UAV attitude, whose yaw angle ψ, pitch angle θ, and roll angle γ are provided by the airborne inertial navigation device. Therefore

$$R_{c_1}^{c_2} = R_{n}^{c_2}\, R_{c_1}^{n}$$

where $R_{n}^{c_2}$ is the rotation matrix from the navigation frame to the camera frame at c2, given by

$$R_{n}^{c_2} = \begin{bmatrix} \cos\gamma\cos\psi - \sin\theta\sin\gamma\sin\psi & \cos\gamma\sin\psi + \sin\theta\sin\gamma\cos\psi & -\cos\theta\sin\gamma \\ -\cos\theta\sin\psi & \cos\theta\cos\psi & \sin\theta \\ \sin\gamma\cos\psi + \sin\theta\cos\gamma\sin\psi & \sin\gamma\sin\psi - \sin\theta\cos\gamma\cos\psi & \cos\theta\cos\gamma \end{bmatrix}$$

Similarly, $R_{c_1}^{n}$ can be obtained (as the transpose of the corresponding $R_{n}^{c_1}$).

Assuming the ground is planar, n = (0, 0, 1)^T. After the homography matrix is computed, the attitude information provided by the inertial navigation and the altitude information provided by the barometric altimeter are used in the above formulas to obtain the relative displacement of the camera between the two consecutive frames; by accumulating the camera displacement and using the previously estimated position, the UAV's current position is computed recursively (a worked sketch of this displacement recovery is given after step 2.5 below). The specific implementation steps are as follows:

2.1. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';

2.2. For each feature point (x, y)^T in a', search within a square region centered at (x, y)^T in a for the match with the highest neighborhood cross-correlation; symmetrically, search in a' for the match of each feature point in a, finally determining the matching feature-point pairs;

2.3. Use the RANSAC robust estimation method to obtain the maximal consensus set and an estimate of the homography matrix H. The procedure is: (1) randomly draw 4 matching pairs to form a random sample and compute the homography matrix H; (2) for each matching pair from step 2.2, compute the distance dis; (3) set a threshold t; if dis < t, the pair is an inlier, otherwise reject it, and tally the number of inliers; (4) repeat the above process k times and select the H with the largest inlier count;

2.4. Re-estimate H from all matching pairs classified as inliers, and use H to determine the search region of step 2.2, determining more accurate matching pairs;

2.5. Repeat steps 2.2 to 2.4 until the number of matching pairs is stable.
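To make the displacement recovery of step 6 concrete, the following NumPy sketch extracts the camera translation t from the estimated homography via H = K(R + t n^T / d)K^{-1} with n = (0, 0, 1)^T. It assumes K comes from calibration, R from the IMU attitude, and d from the barometric altimeter; the scale and sign normalization is a standard property of Euclidean homographies, not a detail stated in the patent.

```python
# Recover the camera translation between two frames from the homography H.
import numpy as np

def translation_from_homography(H, K, R, d):
    """H : 3x3 image-to-image homography (defined only up to scale)
    K : 3x3 camera intrinsic matrix
    R : 3x3 rotation R_{c1}^{c2} built from IMU yaw/pitch/roll
    d : camera height above the (assumed planar) ground, from the altimeter
    Returns t such that K^{-1} H K ~ R + t n^T / d with n = (0, 0, 1)^T."""
    Hc = np.linalg.inv(K) @ H @ K            # calibrated homography
    # Fix the unknown scale: a true Euclidean homography has its middle
    # singular value equal to 1.
    Hc /= np.linalg.svd(Hc, compute_uv=False)[1]
    if np.trace(Hc) < 0:                     # resolve sign ambiguity (heuristic)
        Hc = -Hc
    # With n = (0, 0, 1)^T, the outer product t n^T places t in the
    # third column, so t is d times the third column of (Hc - R).
    t = d * (Hc - R)[:, 2]
    return t
```

Accumulating t frame by frame, rotated into the navigation frame, gives the recursive position estimate described in step 6.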

3. When the UAV enters a matching-suitable area, the inertial navigation system is used to coarsely position the UAV; the reference image b corresponding to the coarse positioning result is found in the onboard storage device and scene-matched against real-time image a to determine the UAV's position.

The present invention uses a scene matching algorithm based on the FREAK descriptor to match the real-time image against the reference map. As shown in Fig. 3, the FREAK descriptor uses a retina-like sampling pattern, building circular sampling regions centered on each feature point: dense at the center, sparse toward the periphery, with the number of samples decreasing exponentially. To strengthen the robustness of the sampling circles and improve the stability and uniqueness of the descriptor, each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.

The FREAK descriptor is composed of a bit string of differences of Gaussians. The criterion T for a sampling pair is defined as

$$T(P_a) = \begin{cases} 1, & I\!\left(P_a^{r_1}\right) - I\!\left(P_a^{r_2}\right) > 0 \\ 0, & \text{otherwise} \end{cases}$$

where $P_a$ denotes a sampling pair, $P_a^{r_1}$ and $P_a^{r_2}$ its two receptive fields, and $I(\cdot)$ the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary criterion is defined as

$$F = \sum_{0 \le a < N} 2^{a}\, T(P_a)$$

which yields an N-dimensional binary bit string.

Sampling pairs are selected by a coarse-to-fine process, with the following criteria:

a) Create a large matrix D that can hold 50,000 feature points, each row representing one feature point; comparing the smoothed intensities of each feature point's 43 sampling circles pairwise yields a descriptor of roughly 1000 dimensions;

b) Compute the mean and variance of each column of D; a mean of 0.5 maximizes the variance and ensures the uniqueness of the feature descriptor;

c) Sort the columns in descending order of variance, so that the column with the largest variance is first;

d) Keep the first N columns to describe each feature point, yielding an N-dimensional binary bit string. Here N = 256 is selected.

This pair-selection criterion ensures the grayscale invariance of the descriptor.

As in Fig. 3, M (M = 45) symmetric sampling pairs are selected among all the sampling circles, and the local gradient is computed as

$$O = \frac{1}{M} \sum_{P_O \in G} \left( I\!\left(P_O^{r_1}\right) - I\!\left(P_O^{r_2}\right) \right) \frac{P_O^{r_1} - P_O^{r_2}}{\left\lVert P_O^{r_1} - P_O^{r_2} \right\rVert}$$

where $P_O^{r_1}$ and $P_O^{r_2}$ are the two-dimensional position vectors of the corresponding sampling circles and $I(\cdot)$ is the smoothed intensity of a sampling circle. The symmetry of the sampling pairs ensures the rotational invariance of the descriptor.

The FREAK descriptor is a binary bit string of 1s and 0s, so the similarity criterion uses the nearest Hamming distance, i.e., the two descriptors to be matched are XORed bit by bit. For a 512-dimensional FREAK descriptor, the maximum Hamming distance is 512 and the minimum is 0; to filter mismatches, a threshold T is set and matches above T are discarded directly. Here the threshold is set to 10.
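As a small illustration of the XOR-based criterion, the Hamming distance between binary descriptors stored as uint8 byte arrays (64 bytes for a 512-bit FREAK descriptor) can be computed as follows; this is a generic sketch, not code from the patent.

```python
# Bitwise-XOR Hamming distance for binary descriptors stored as uint8 bytes.
import numpy as np

def hamming_distance(desc1: np.ndarray, desc2: np.ndarray) -> int:
    """Number of differing bits between two binary descriptors."""
    xor = np.bitwise_xor(desc1, desc2)       # differing bits, packed in bytes
    return int(np.unpackbits(xor).sum())     # popcount over all bytes

# Two descriptors match only if their distance falls below the threshold
# (10 in the text above).
```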

4. The UAV position obtained by scene matching within the matching-suitable area is used to correct the error accumulated by visual odometry, yielding a relatively accurate UAV position.

Since the positioning result of scene matching is reliable within the matching-suitable area, the UAV position obtained by scene matching is used directly to reset the position obtained by visual odometry, eliminating the cumulative error produced by long-running visual odometry.

5. The inertial navigation system provides the UAV's current position and attitude.

6. A Kalman filtering algorithm is used to estimate the UAV's current accurate position and attitude.

The error equation of the inertial navigation system is taken as the state equation of the integrated navigation system, and the East-North-Up coordinate system is chosen as the navigation frame; the difference between the positions obtained in steps 4 and 5 is taken as the measurement, the drift error of the inertial system is estimated by the Kalman filter, and the inertial navigation system is then corrected, yielding the fused navigation parameters.
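The following is an illustrative sketch only: the patent's state equation is the full INS error model, whereas here the state is reduced to a 3-dimensional position-error random walk and the noise covariances are assumed values. It shows how the INS-minus-vision position difference enters a Kalman filter as the measurement and how the estimated drift corrects the INS output.

```python
# Simplified fusion step: linear Kalman filter over the INS position error.
import numpy as np

class PositionErrorKF:
    def __init__(self, dt, q=0.01, r=25.0):
        self.x = np.zeros(3)              # estimated INS position error (ENU)
        self.P = np.eye(3) * 100.0        # state covariance
        self.F = np.eye(3)                # random-walk error dynamics
        self.Q = np.eye(3) * q * dt       # process noise (assumed)
        self.R = np.eye(3) * r            # measurement noise of vision fix

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, p_ins, p_vision):
        z = p_ins - p_vision              # measurement: position difference
        y = z - self.x                    # innovation (H = I)
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K) @ self.P
        return p_ins - self.x             # corrected (fused) position
```

In the full system the state vector also carries attitude, velocity, and sensor-bias errors, and the estimated errors are fed back to correct the INS mechanization.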

Claims (3)

1. An inertial integrated navigation method based on scene matching/visual odometry, characterized in that the steps are as follows:
Step 1: during UAV flight, the airborne downward-looking camera acquires ground image a in real time;
Step 2: using image a and the previous frame image a', determine the visual odometry of the UAV, with the following steps:
A. use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';
B. for each feature point (x, y)^T in image a', search within a square region centered at (x, y)^T in image a for the match point with the highest neighborhood cross-correlation; meanwhile, for each feature point (x, y)^T in image a, search within a square region centered at (x, y)^T in image a' for the match point with the highest neighborhood cross-correlation;
C. use the RANSAC robust estimation method to obtain the maximal consensus set and an estimate of the homography matrix H, the process being: first randomly draw 4 matching point pairs to form a random sample and compute the homography matrix H; then, for each matching pair in step B, compute the distance dis; set a threshold t, and if dis < t the pair is an inlier, otherwise reject it, and tally the number of inliers; repeat the above process k times and select the H with the largest inlier count;
D. re-estimate H from all the matching pairs classified as inliers of the H with the largest inlier count, and use the re-estimated H to compute the point (x1, y1)^T in image a' corresponding to each feature point (x, y)^T in image a; then, using the method of step B, search in image a' within a square region centered at (x1, y1)^T for the match point with the highest neighborhood cross-correlation to each feature point (x, y)^T in image a; meanwhile, search in image a within a square region centered at (x, y)^T for the match point with the highest neighborhood cross-correlation to each feature point (x1, y1)^T in image a';
E. repeat steps B to D until the number of matching pairs is stable, and use the finally determined stable matching pairs to compute the homography matrix H;
F. using the homography matrix H obtained in step E, together with the attitude information provided by the inertial measurement unit and the altitude information provided by the barometric altimeter, obtain the relative displacement of the camera between the two consecutive frames; by accumulating the camera displacement and using the previously estimated position, recursively compute the UAV's current position;
Step 3: when the UAV enters a matching-suitable area, coarsely position the UAV using the inertial navigation system, find in the onboard storage device the reference map b corresponding to the coarse positioning result, and perform scene matching against real-time image a to determine the position of the UAV;
Step 4: use the UAV position obtained in step 3 within the matching-suitable area to correct the error accumulated by visual odometry, obtaining a relatively accurate UAV position;
Step 5: use the inertial navigation system to provide the UAV's current position and attitude;
Step 6: take the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up coordinate system as the navigation frame, and take the difference between the position obtained in step 4 and the position obtained in step 5 as the measurement; estimate the drift error of the inertial system with a Kalman filter, and use this drift error to correct the inertial navigation system, obtaining the fused navigation parameters.
2. The inertial integrated navigation method based on scene matching/visual odometry according to claim 1, characterized in that: the ground image a is an optical image.
3. The inertial integrated navigation method based on scene matching/visual odometry according to claim 1, characterized in that: the process of step 3 is as follows: first the onboard downward-looking camera acquires ground image a in real time, and image a is preprocessed to obtain image c; then the FAST features of image b and image c are extracted; the extracted FAST features are then described with the FREAK feature descriptor; feature matching is performed using the nearest-Hamming-distance similarity criterion to obtain the matched position, i.e., the current UAV position.
CN201410128459.XA 2014-04-01 2014-04-01 Inertial integrated navigation method based on scene matching/visual odometry CN103954283B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertial integrated navigation method based on scene matching/visual odometry

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertial integrated navigation method based on scene matching/visual odometry

Publications (2)

Publication Number Publication Date
CN103954283A CN103954283A (en) 2014-07-30
CN103954283B true CN103954283B (en) 2016-08-31

Family

ID=51331593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertial integrated navigation method based on scene matching/visual odometry

Country Status (1)

Country Link
CN (1) CN103954283B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105675013B (en) * 2014-11-21 2019-03-01 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105045276B (en) * 2015-07-03 2019-01-11 深圳一电航空技术有限公司 UAV Flight Control method and device
CN105180933B (en) * 2015-09-14 2017-11-21 中国科学院合肥物质科学研究院 Mobile robot reckoning update the system and method based on the detection of straight trip crossing
CN105222772B (en) * 2015-09-17 2018-03-16 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105374043B (en) * 2015-12-02 2017-04-05 福州华鹰重工机械有限公司 Visual odometry filtering background method and device
CN105865451B (en) * 2016-04-19 2019-10-01 深圳市神州云海智能科技有限公司 Method and apparatus for mobile robot indoor positioning
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
US10012517B2 (en) * 2016-08-01 2018-07-03 Infinity Augmented Reality Israel Ltd. Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data
CN107067415B (en) * 2017-03-21 2019-07-30 南京航空航天大学 A kind of object localization method based on images match
CN107167140B (en) * 2017-05-26 2019-11-08 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107843240A (en) * 2017-09-14 2018-03-27 中国人民解放军92859部队 A kind of seashore region unmanned plane image same place information rapid extracting method
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6856894B1 (en) * 2003-10-23 2005-02-15 International Business Machines Corporation Navigating a UAV under remote control and manual control with three dimensional flight depiction
EP1975646A2 (en) * 2007-03-28 2008-10-01 Honeywell International Inc. Lader-based motion estimation for navigation
CN101598556A (en) * 2009-07-15 2009-12-09 北京航空航天大学 Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A micro-UAV attitude algorithm based on monocular vision (一种基于单目视觉的微型无人机姿态算法); Zhuang Tong (庄瞳) et al.; Computer Engineering (《计算机工程》); 2012-08-31; Vol. 38, No. 15; full text *
UAV scene matching navigation based on geometric constraints of spatial relations (基于空间关系几何约束的无人机景象匹配导航); Li Yaojun (李耀军) et al.; Application Research of Computers (《计算机应用研究》); 2010-10-31; Vol. 27, No. 10; full text *
Research on fast scene matching algorithms in inertial integrated navigation systems (惯性组合导航系统中的快速景象匹配算法研究); Chen Fang (陈方) et al.; Journal of Astronautics (《宇航学报》); 2009-11-30; Vol. 30, No. 6; full text *

Also Published As

Publication number Publication date
CN103954283A (en) 2014-07-30


Legal Events

Date Code Title Description
PB01 Publication
C06 Publication
SE01 Entry into force of request for substantive examination
C10 Entry into substantive examination
GR01 Patent grant
C14 Grant of patent or utility model
TR01 Transfer of patent right

Effective date of registration: 20190826

Address after: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee after: Xi'an Northwestern Polytechnical University Asset Management Co., Ltd.

Address before: 710072 Xi'an friendship West Road, Shaanxi, No. 127

Patentee before: Northwestern Polytechnical University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20191113

Address after: 710072 floor 19, building B, innovation and technology building, northwest Polytechnic University, No.127, Youyi West Road, Beilin District, Xi'an, Shaanxi Province

Patentee after: Shaanxi CISCO Rudi Network Security Technology Co., Ltd.

Address before: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee before: Xi'an Northwestern Polytechnical University Asset Management Co., Ltd.

TR01 Transfer of patent right