CN103954283B - Inertial integrated navigation method based on scene matching/visual odometry - Google Patents
Description
Technical field
The invention belongs to the field of unmanned aerial vehicle (UAV) navigation and positioning technology, and relates to an inertial integrated navigation method based on scene matching and visual odometry.
Background technology
High-precision, highly dynamic and highly reliable autonomous navigation is one of the key technologies for ensuring that a UAV completes its various missions, and is of great significance for strengthening UAV autonomous capability and improving operational effectiveness. Navigation methods include satellite navigation, radio navigation, inertial navigation, etc. Among them, inertial navigation (INS), with its outstanding advantage of full autonomy, occupies a special position in navigation technology; existing UAV navigation systems are all integrated navigation systems built around INS, which complete the autonomous navigation of the UAV in complex environments.
At present, the main mode of UAV navigation is the INS/GPS integrated navigation system. In unknown, dynamically changing and complex environments, however, the GPS signal is weak, vulnerable to electromagnetic interference, and may even stop working in signal blind zones, causing large errors in the integrated navigation system with unpredictable consequences. Moreover, China's BeiDou satellite navigation system is still under development, and its reliability and accuracy do not yet meet high-accuracy navigation requirements such as those of military applications.
Summary of the invention
Technical problem to be solved
To address the deficiencies of the prior art, the present invention proposes an inertial integrated navigation method based on scene matching and visual odometry, achieving fully autonomous navigation of a UAV in complex environments.
Technical scheme
An inertial integrated navigation method based on scene matching/visual odometry, characterized by the following steps:
Step 1: during UAV flight, the onboard downward-looking camera acquires a ground image a in real time;
Step 2: using image a and the previous frame image a', determine the visual odometry of the UAV as follows:
A. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';
B. For each feature point (x, y)^T of image a', search the square region of image a centered on (x, y)^T for the match point with the highest neighborhood cross-correlation; likewise, for each feature point (x, y)^T of image a, search the square region of image a' centered on (x, y)^T for the match point with the highest neighborhood cross-correlation;
C. Use the RANSAC robust estimation method to obtain the largest consistent point set and an estimate of the homography matrix H: first randomly draw 4 matched point pairs to form a random sample and compute a homography H; then, for each matched pair from step B, compute the distance dis; set a threshold t, and if dis < t the pair is an inlier, otherwise reject it, counting the number of inliers; repeat this process k times and select the H with the largest number of inliers;
D. Re-estimate H from all matched pairs classified as inliers; use the re-estimated H to compute, for each feature point (x, y)^T of image a, the corresponding point (x1, y1)^T in image a'; then, using the method of step B, search the square region of image a' centered on (x1, y1)^T for the match point with the highest neighborhood cross-correlation, and likewise search image a for the match of each feature point of a', determining more accurate matched pairs;
E. Repeat steps B to D until the number of matched pairs is stable.
Step 3: when the UAV enters a suitable-matching area, perform coarse positioning of the UAV with the inertial navigation system, find in the onboard storage the reference image b corresponding to the coarse positioning result, and perform scene matching with real-time image a to determine the position of the UAV;
Step 4: use the UAV position obtained in step 3 in the suitable-matching area to correct the error produced by visual odometry, obtaining a relatively accurate UAV position;
Step 5: the inertial navigation system provides the current position and attitude of the UAV;
Step 6: take the error equation of the inertial navigation system as the state equation of the integrated navigation system, choose the East-North-Up frame as the navigation frame, and take the difference between the positions obtained in steps 4 and 5 as the measurement. Estimate the drift error of the inertial system with a Kalman filter, and use this drift error to correct the inertial navigation system, obtaining the fused navigation parameters.
The process of step 3 is as follows: first the onboard downward-looking camera acquires ground image a in real time, and image a is preprocessed to obtain image c; then the FAST features of image b and image c are extracted; the extracted FAST features are described with the FREAK feature descriptor; feature matching under the nearest-Hamming-distance similarity criterion yields the matched position, i.e. the current UAV position.
The ground image a is an optical image or an infrared image.
Beneficial effect
The inertial integrated navigation method based on scene matching/visual odometry proposed by the present invention computes, according to the visual odometry principle, the homography matrices of the aerial real-time image sequence; by accumulating the relative displacement between consecutive real-time frames, the current position of the UAV is computed recursively. Because visual odometry accumulates error over time, a scene matching algorithm based on FREAK features is introduced for auxiliary correction. Scene matching offers high positioning accuracy, strong autonomy and resistance to electromagnetic interference; it provides high-precision fixes within suitable-matching areas and effectively compensates the cumulative error produced by long-running visual odometry. The error model of the inertial navigation system and the measurement model of the visual data are established, an optimal estimate is obtained by Kalman filtering, and the inertial navigation system is corrected. The invention effectively improves navigation accuracy and helps improve the autonomous flight capability of the UAV.
However, research on scene-matching suitability shows that only matching within suitable areas can provide reliable position information for the UAV; over unsuitable areas such as desert or sea, scene matching cannot work normally.
Visual odometry is a technique that estimates motion by processing two consecutive images. As a new navigation and positioning approach, it has been successfully applied to autonomous mobile robots. During UAV flight, two consecutive frames are captured by the same sensor on the same platform, in the same period and under identical conditions; they share the same noise distribution and imaging errors, and are free of the imaging differences caused by changing natural conditions, so visual odometry can provide relatively accurate position information over unsuitable areas. Moreover, two consecutive frames are much smaller than the reference map used in scene matching, so matching time is short, which improves the real-time performance of the navigation system.
Therefore, the present invention proposes a UAV inertial integrated navigation method based on scene matching/visual odometry. The method has the advantages of strong autonomy, light load, low equipment cost and strong interference resistance, and provides a feasible technical scheme for the engineering application of UAV navigation systems. As a navigation mode based on computer vision, it features high positioning accuracy, strong anti-interference capability, small airborne equipment and low cost; it can eliminate the cumulative error of a long-running inertial navigation system, substantially improve its positioning accuracy, and serve as a backup navigation method when GPS fails or its accuracy degrades.
Brief description of the drawings
Fig. 1 is the overall framework flow of the present invention
Fig. 2 is the two-view homography relationship diagram
Fig. 3 is the FREAK descriptor sampling pattern
Detailed description of the invention
The invention is further described below in conjunction with the embodiments and the accompanying drawings:
In the inertial integrated navigation method based on scene matching/visual odometry of the present invention, reconnaissance means are used to obtain terrain imagery of the UAV's planned flight area as the reference map. When the UAV carrying the vision sensor flies over the planned area, a real-time terrain image of a certain size is generated from the acquired local imagery according to parameters such as pixel resolution, flight altitude and field of view; by matching the real-time image against the reference map, the position of the real-time image within the reference map is found, and thus the accurate current position of the UAV is determined. Scene matching accuracy is not affected by navigation time, and the technique is strongly resistant to electromagnetic interference, highly autonomous, accurate, low-cost, small and information-rich; combining it with inertial navigation greatly improves the overall performance of the navigation system. Research on inertial/scene-matching autonomous integrated navigation for UAVs therefore helps free the system from external aids and enhances the reliability, mobility, covertness, anti-interference capability and survivability of the UAV. The method mainly comprises five parts: inertial navigation, scene matching, visual odometry, fusion correction and integrated-navigation Kalman filtering, with the following steps:
First step: the onboard downward-looking camera acquires ground image a in real time; by estimating the homography matrix between real-time image a and the previous real-time frame a', the visual odometry of the UAV is determined.
The implementation steps are as follows:
1. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';
2. For each feature point (x, y)^T of a', search the square region of a centered on (x, y)^T for the match point with the highest neighborhood cross-correlation; similarly, for each feature point of a, search for its match point in a'; finally determine the matched feature point pairs;
3. Use the RANSAC robust estimation method to obtain the largest consistent point set and compute the homography matrix H between the two consecutive real-time images a and a'. The specific flow is: (1) randomly draw 4 matched point pairs to form a random sample and compute a homography H; (2) for each matched pair from step 2, compute the distance dis; (3) set a threshold t (t less than half the side length of the real-time image); if dis < t the pair is an inlier, otherwise reject it, and count the inliers; (4) repeat this process k times and select the H with the largest number of inliers;
4. Recompute the homography H from all matched points classified as inliers, and use H to determine the positions of the search regions in step 2, thereby determining more accurate matched pairs;
5. Repeat steps 2 to 4 until the number of matched pairs is stable, and compute the homography H from the finally determined stable matches.
6. Using the homography H obtained in step 5, together with the attitude information provided by the inertial unit and the height information provided by the barometric altimeter, obtain the relative displacement of the camera between the two consecutive frames; by accumulating the camera displacements onto the previously estimated position, recursively compute the current position of the UAV.
Second step: when the UAV enters a suitable-matching area, perform coarse positioning of the UAV with the inertial navigation system and, according to the coarse positioning result, cut a circular reference sub-image b from the pre-stored aerial reference map of the flight area; the radius of the reference sub-image must exceed 1.5 times the product of the inertial unit's average drift distance per unit time and the flight time between two suitable-matching areas. Scene-match real-time image a against reference sub-image b to determine the position of the UAV. Scene matching proceeds as follows:
1. Convert real-time image a and reference sub-image b to grayscale, and extract the FAST feature points of both.
2. Compute the FREAK feature descriptor of each FAST feature point, as follows:
As shown in Fig. 3, FREAK builds a circular sampling pattern centered on each feature point, dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. Each sampling circle is pre-smoothed with a different Gaussian kernel, the standard deviation of the kernel being proportional to the size of the circle.
The FREAK descriptor consists of a bit string of differences of Gaussians. The criterion T of a sampling pair is defined as

T(P_a) = 1 if I(P_a^{r1}) - I(P_a^{r2}) > 0, and 0 otherwise,

where P_a denotes a sampling pair and I(.) denotes the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary descriptor is defined as

F = sum over 0 <= a < N of 2^a T(P_a),

which yields an N-dimensional binary bit string.
Selecting the sampling pairs is a coarse-to-fine process with the following criteria:
A) Create a large matrix D that can hold 50,000 feature points, one feature point per row; comparing the 43 sampling circles of each feature point pairwise in intensity yields a descriptor of roughly 1000 dimensions;
B) Compute the mean and variance of every column of D; a mean of 0.5 maximizes the variance and ensures the distinctiveness of the feature descriptor.
C) Sort the columns in descending order of variance, with the highest-variance column first.
D) Keep the first N columns to describe each feature point, obtaining an N-dimensional binary bit string. The present invention selects N = 256.
As in Fig. 3, M (M = 45) symmetric sampling pairs are selected among all sampling circles, and the local gradient is computed as

O = (1/M) * sum over pairs P_o of ( I(P_o^{r1}) - I(P_o^{r2}) ) * (P_o^{r1} - P_o^{r2}) / || P_o^{r1} - P_o^{r2} ||,

where P_o^{r1} and P_o^{r2} are the two-dimensional position vectors of the sampling circles of pair P_o, and I(.) is the smoothed intensity of a sampling circle.
3. Compute the nearest Hamming distance between the FREAK descriptors of the two feature points to be matched. To filter mismatches, the present invention sets a threshold of 10: pairs whose distance exceeds the threshold are discarded directly, and points below the threshold are regarded as mutually matched.
4. Perform feature matching according to the methods of steps 2 and 3; the positions of the feature points of real-time image a within reference sub-image b determine the position of real-time image a within reference sub-image b, and thus the current position of the aircraft.
Third step: use the UAV position obtained in the second step to correct the error produced by visual odometry, obtaining a more accurate position P_vision; the inertial navigation system provides the current UAV position P_ins and attitude A_ins.
Assuming that the UAV position error obtained by scene matching within a suitable-matching area is very small, the result of scene matching is used directly to reset the position computed by visual odometry.
Fourth step: use a Kalman filter to estimate from P_vision, P_ins and A_ins, obtaining the optimal navigation information.
A specific embodiment is as follows:
1. During UAV flight, the onboard downward-looking camera acquires ground image a in real time.
The onboard downward-looking optical camera or infrared camera acquires the ground image sequence in real time; only the current frame and the previous frame need to be stored.
2. Using image a and the previous frame a', determine the visual odometry of the UAV by estimating the homography matrix between a and a'.
As shown in Fig. 2, during flight the airborne camera captures two frames I_1 and I_2 under different poses, with corresponding camera coordinate frames F and F'. Assume a point P on plane pi is imaged as point p in I_1 and point p' in I_2, with corresponding vectors P_c1 and P_c2 in F and F'. Then

P_c2 = R P_c1 + t,

where R and t denote the rotation and translation of the UAV between the two shots.
From Fig. 2, n is the normal vector of plane pi expressed at c1, and d is the distance from c1 to plane pi, so for points on the plane n^T P_c1 = d. Therefore,

P_c2 = (R + t n^T / d) P_c1,

where

H_c = R + t n^T / d

is called the homography matrix of plane pi with respect to the camera.
Taking the camera intrinsic matrix K into account, p = K P_c1 and p' = K P_c2 up to scale. Therefore,

p' = K H_c K^{-1} p = H p,

where

H = K H_c K^{-1}

is called the homography matrix of plane pi with respect to the two images.
According to the literature, the homography matrix is a 3 x 3 matrix with 8 degrees of freedom, so 4 pairs of matched points between the two images are required to compute it; to prevent degeneracy, the plane containing the 4 point pairs must not pass through the camera optical center, and no 3 of the 4 space points may be collinear.
Further, since the camera is rigidly attached to the UAV, the attitude of the camera can be taken as the attitude of the UAV; the airborne inertial navigation device provides the yaw angle psi, pitch angle theta and roll angle gamma, from which the rotation matrix R^n_c2 from the camera frame at c2 to the navigation frame is composed. The rotation matrix R^n_c1 at c1 is obtained in the same way, and the inter-frame rotation is R = (R^n_c2)^T R^n_c1.
If the ground is assumed to be a plane, then n = (0, 0, 1)^T. After the homography matrix is computed, the attitude information provided by inertial navigation and the height information provided by the barometric altimeter are used in the above formulas to obtain the relative displacement of the camera between the two consecutive frames; by accumulating the camera displacements onto the previously estimated position, the current position of the UAV is computed recursively. The implementation steps are as follows:
2.1. Use the Harris corner detection algorithm to extract feature points in the two consecutive real-time images a and a';
2.2. For each feature point (x, y)^T of a', search the square region of a centered on (x, y)^T for the match with the highest neighborhood cross-correlation; symmetrically, for each feature point of a, search for its match in a'; finally determine the matched feature point pairs;
2.3. Use the RANSAC robust estimation method to obtain the largest consistent point set and an estimate of the homography matrix H. The specific flow is: (1) randomly draw 4 matched point pairs to form a random sample and compute a homography H; (2) for each matched pair from step 2.2, compute the distance dis; (3) set a threshold t; if dis < t the pair is an inlier, otherwise reject it, and count the inliers; (4) repeat this process k times and select the H with the largest number of inliers;
2.4. Re-estimate H from all matched pairs classified as inliers, and use H to determine the search regions in step 2.2, determining more accurate matched pairs;
2.5. Repeat steps 2.2 to 2.4 until the number of matched pairs is stable.
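The displacement recovery described above can be sketched under the stated assumptions (flat ground with normal n = (0, 0, 1)^T, inter-frame rotation R known from the INS attitude, plane distance d from the barometric altimeter). Since H_c = R + t n^T / d, the translation follows from (H_c - R) n = t / d. A minimal numpy sketch; the intrinsic matrix K and all numbers are illustrative:

```python
import numpy as np

def displacement_from_homography(H, K, R, d, n=np.array([0.0, 0.0, 1.0])):
    """Recover the inter-frame camera translation t from the image
    homography H, given intrinsics K, inter-frame rotation R (from the
    INS attitude) and plane distance d (from the barometric altimeter).

    H = K (R + t n^T / d) K^{-1}  =>  t = d * (K^{-1} H K - R) @ n
    """
    H_c = np.linalg.inv(K) @ H @ K       # calibrated homography
    return d * (H_c - R) @ n

# Illustrative check: build H from a known motion and recover t.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.eye(3)                            # level flight, no rotation
t_true = np.array([2.0, -1.0, 0.1])      # metres between the two frames
d = 100.0                                # height above the ground plane
H = K @ (R + np.outer(t_true, [0, 0, 1]) / d) @ np.linalg.inv(K)
t = displacement_from_homography(H, K, R, d)
```

In practice a homography estimated by RANSAC is known only up to scale, so it must be normalized consistently before this decomposition.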
3. When the UAV enters a suitable-matching area, perform coarse positioning of the UAV with the inertial navigation system, find in the onboard storage the reference image b corresponding to the coarse positioning result, and perform scene matching with real-time image a to determine the position of the UAV.
The present invention matches the real-time image against the reference map with a scene matching algorithm based on the FREAK descriptor. As shown in Fig. 3, the FREAK descriptor uses a retina-like sampling pattern: circular sampling regions are built around each feature point, dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. To strengthen the robustness of the sampling circles and improve the stability and distinctiveness of the descriptor, each sampling circle is pre-smoothed with a different Gaussian kernel, the standard deviation of the kernel being proportional to the size of the circle.
The FREAK descriptor consists of a bit string of differences of Gaussians. The criterion T of a sampling pair is defined as

T(P_a) = 1 if I(P_a^{r1}) - I(P_a^{r2}) > 0, and 0 otherwise,

where P_a denotes a sampling pair and I(.) denotes the smoothed intensity of a sampling circle. Selecting N sampling pairs, the binary descriptor is defined as

F = sum over 0 <= a < N of 2^a T(P_a),

which yields an N-dimensional binary bit string.
Selecting the sampling pairs is a coarse-to-fine process with the following criteria:
A) Create a large matrix D that can hold 50,000 feature points, one feature point per row; comparing the 43 sampling circles of each feature point pairwise in intensity yields a descriptor of roughly 1000 dimensions;
B) Compute the mean and variance of every column of D; a mean of 0.5 maximizes the variance and ensures the distinctiveness of the feature descriptor.
C) Sort the columns in descending order of variance, with the highest-variance column first.
D) Keep the first N columns to describe each feature point, obtaining an N-dimensional binary bit string. Here N = 256 is selected.
This pair-selection criterion ensures the grayscale invariance of the descriptor.
As in Fig. 3, M (M = 45) symmetric sampling pairs are selected among all sampling circles, and the local gradient is computed as

O = (1/M) * sum over pairs P_o of ( I(P_o^{r1}) - I(P_o^{r2}) ) * (P_o^{r1} - P_o^{r2}) / || P_o^{r1} - P_o^{r2} ||,

where P_o^{r1} and P_o^{r2} are the two-dimensional position vectors of the sampling circles of pair P_o, and I(.) is the smoothed intensity of a sampling circle. The symmetry of the sampling pairs ensures the rotational invariance of the descriptor.
The FREAK descriptor is a binary bit string of 0s and 1s, so the similarity criterion uses the nearest Hamming distance: the two descriptors to be matched are XORed bit by bit. For a 512-dimensional FREAK descriptor the maximum Hamming distance is 512 and the minimum is 0; to filter mismatches a threshold T is set, and distances above T are discarded directly. Here T is set to 10.
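The XOR-and-count matching rule can be sketched in pure numpy on packed binary descriptors (uint8 byte strings, the layout OpenCV emits for binary features). This is an illustrative implementation using the threshold of 10 from the text:

```python
import numpy as np

def hamming(d1, d2):
    """Bitwise XOR, then popcount: the Hamming distance between two
    packed binary descriptors (uint8 arrays)."""
    return int(np.unpackbits(np.bitwise_xor(d1, d2)).sum())

def match_descriptors(desc_a, desc_b, threshold=10):
    """For each descriptor in desc_a find the nearest descriptor in
    desc_b by Hamming distance; keep the pair only if the distance is
    below the threshold (10 in the text), filtering mismatches."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = [hamming(da, db) for db in desc_b]
        j = int(np.argmin(dists))
        if dists[j] < threshold:
            matches.append((i, j))
    return matches
```

For example, a 64-byte (512-bit) descriptor differing from its counterpart in 3 bits passes the threshold, while one differing in 24 bits is rejected.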
4. Use the UAV position obtained by scene matching in the suitable-matching area to correct the error produced by visual odometry, obtaining a relatively accurate UAV position.
Since the positioning result of scene matching is reliable within a suitable-matching area, the position obtained by visual odometry is reset directly to the UAV position obtained by scene matching, eliminating the cumulative error produced by long-running visual odometry.
5. The inertial navigation system provides the current position and attitude of the UAV.
6. A Kalman filtering algorithm estimates the current accurate position and attitude of the UAV.
The error equation of the inertial navigation system is taken as the state equation of the integrated navigation system, the East-North-Up frame is chosen as the navigation frame, and the difference between the positions from steps 4 and 5 is taken as the measurement; the Kalman filter estimates the drift error of the inertial system, which is then used to correct the inertial navigation system, obtaining the fused navigation parameters.
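The fusion step can be illustrated with a one-axis error-state Kalman filter: the state is the INS position drift, the measurement is the difference between the INS and vision positions, and the estimated drift corrects the INS output. This is a deliberately simplified sketch; the actual system uses the full INS error equations as the state model, and all noise values here are illustrative:

```python
import numpy as np

def fuse(p_ins, p_vision, q=0.01, r=0.25):
    """One-axis error-state Kalman filter for the INS/vision fusion:
    estimate the INS drift x from measurements z = p_ins - p_vision,
    then subtract it from the INS positions."""
    x, P = 0.0, 1.0                  # drift estimate and its variance
    corrected = []
    for ins, vis in zip(p_ins, p_vision):
        P += q                       # predict: drift as a random walk
        z = ins - vis                # measurement: INS minus vision
        K = P / (P + r)              # Kalman gain
        x += K * (z - x)             # update the drift estimate
        P *= (1 - K)
        corrected.append(ins - x)    # correct the INS output
    return np.array(corrected), x

# Illustrative run: INS drifts linearly, vision is unbiased but noisy.
rng = np.random.default_rng(3)
truth = np.linspace(0, 100, 200)             # true 1-D positions
p_ins = truth + 0.05 * np.arange(200)        # INS with growing drift
p_vision = truth + rng.normal(0, 0.5, 200)   # noisy vision fixes
corrected, drift = fuse(p_ins, p_vision)
```

The corrected trajectory tracks the truth far more closely than the raw INS output, which is the point of the closed-loop correction in step 6.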
Claims (3)
Priority Applications (1)
Application Number  Priority Date  Filing Date  Title 

CN201410128459.XA CN103954283B (en)  20140401  20140401  Inertia integrated navigation method based on scene matching aided navigation/vision mileage 
Publications (2)
Publication Number  Publication Date 

CN103954283A CN103954283A (en)  20140730 
CN103954283B true CN103954283B (en)  20160831 