CN103954283A - Scene matching/visual odometry-based inertial integrated navigation method - Google Patents

Scene matching/visual odometry-based inertial integrated navigation method Download PDF

Info

Publication number
CN103954283A
CN103954283A (application CN201410128459.XA)
Authority
CN
China
Prior art keywords
image
unmanned aerial
matching
aerial vehicle
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410128459.XA
Other languages
Chinese (zh)
Other versions
CN103954283B (en)
Inventor
赵春晖
王荣志
张天武
潘泉
马鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Chenxiang Zhuoyue Technology Co ltd
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN201410128459.XA priority Critical patent/CN103954283B/en
Publication of CN103954283A publication Critical patent/CN103954283A/en
Application granted granted Critical
Publication of CN103954283B publication Critical patent/CN103954283B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass initial alignment, calibration or starting-up of inertial devices

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a scene matching/visual odometry-based inertial integrated navigation method. The method computes the homography matrix of the real-time image sequence taken by the unmanned aerial vehicle's aerial camera according to the visual odometry principle, and recursively obtains the current position of the unmanned aerial vehicle by accumulating the relative displacement between two consecutive real-time frames. Because visual odometry navigation accumulates error as time increases, a scene matching algorithm based on FREAK features is introduced for aided correction; scene matching offers high positioning accuracy, strong autonomy and resistance to electromagnetic interference, so high-precision positioning in the adaptation areas effectively compensates the error accumulated by long-term visual odometry navigation. An error model of the inertial navigation system and a measurement model of the visual data are established, Kalman filtering yields the optimal estimate, and the inertial navigation system is corrected accordingly. The method effectively improves navigation accuracy and helps improve the autonomous flight capability of the unmanned aerial vehicle.

Description

Inertial integrated navigation method based on scene matching/visual odometry
Technical Field
The invention belongs to the technical field of unmanned aerial vehicle navigation and positioning, and relates to an inertial integrated navigation method based on scene matching/visual odometry.
Background
High-precision, high-dynamic and high-reliability autonomous navigation is one of the key technologies for ensuring that an unmanned aerial vehicle completes its tasks, and is of great significance for enhancing the autonomous behaviour of the unmanned aerial vehicle and improving its combat effectiveness. Available navigation modes include satellite navigation, radio navigation and inertial navigation; among them, inertial navigation (INS) has the outstanding advantage of full autonomy and occupies a special position in navigation technology, so existing unmanned aerial vehicle navigation systems take inertial navigation as the core of an integrated navigation system to achieve autonomous navigation in complex environments.
At present, the main mode of unmanned aerial vehicle navigation is the INS/GPS integrated navigation system. However, in unknown and dynamically changing complex environments the GPS signal is weak, easily subject to electromagnetic interference and may even fail completely in signal blind areas, so the integrated navigation system can suffer large, unpredictable errors. Meanwhile, China's Beidou satellite navigation system is still under development, and its reliability and accuracy do not yet meet high-accuracy navigation requirements such as military applications.
Disclosure of Invention
Technical problem to be solved
In order to avoid the defects of the prior art, the invention provides an inertial integrated navigation method based on scene matching/visual odometry that achieves fully autonomous navigation of an unmanned aerial vehicle in complex environments.
Technical scheme
An inertial integrated navigation method based on scene matching/visual odometry, characterized by comprising the following steps:
step 1: in the flight process of the unmanned aerial vehicle, an airborne downward-looking camera acquires a ground image a in real time;
step 2: determining the visual odometry of the unmanned aerial vehicle by using the image a and the previous frame image a', comprising the following steps:
A. respectively extracting characteristic points from two continuous frames of real-time images a and a' by using a Harris angular point detection algorithm;
B. for each feature point (x, y)^T in image a', searching image a, within a square region centered at (x, y)^T, for the matching point with the highest neighborhood cross-correlation; at the same time, for each feature point (x, y)^T in image a, searching image a', within a square region centered at (x, y)^T, for the matching point with the highest neighborhood cross-correlation;
C. obtaining the maximum consistent point set and an estimate of the homography matrix H by the RANSAC robust estimation method, the process being: randomly extracting 4 groups of matching point pairs to form a random sample and calculating a homography matrix H; then calculating the distance dis for each matching point pair of step B; setting a threshold t, and if dis < t the matching point pair is an inlier, otherwise it is discarded; counting the number of inliers; repeating this process k times and selecting the H with the largest number of inliers;
D. re-estimating H from all matching point pairs classified as inliers; using the re-estimated H, calculating for each feature point (x, y)^T in image a' and in image a its corresponding point (x1, y1)^T in the other image. Then, using the method of step B, searching image a, within a square region centered at (x1, y1)^T, for the matching point with the highest neighborhood cross-correlation with each feature point (x, y)^T of image a'; at the same time, searching image a', within a square region centered at (x1, y1)^T, for the matching point with the highest neighborhood cross-correlation with each feature point (x, y)^T of image a;
E. repeating the step B to the step D until the number of the matching point pairs is stable;
step 3: when the unmanned aerial vehicle enters an adaptation area, coarsely positioning the unmanned aerial vehicle according to the inertial navigation system, finding the reference image b corresponding to the coarse positioning result in the onboard storage device, carrying out scene matching with the real-time image a, and determining the position of the unmanned aerial vehicle;
step 4: correcting the error accumulated by the visual odometry with the unmanned aerial vehicle position obtained in step 3 in the adaptation area, so as to obtain a more accurate position of the unmanned aerial vehicle;
step 5: the current position and attitude of the unmanned aerial vehicle are given by the inertial navigation system;
step 6: using the error equation of the inertial navigation system as the state equation of the integrated navigation system, choosing the east-north-up frame as the navigation coordinate system, and taking the difference between the position obtained in step 4 and the position obtained in step 5 as the measurement; a Kalman filter estimates the drift error of the inertial system, which is then used to correct the inertial navigation system and obtain the fused navigation parameters.
The process of step 3 is as follows: firstly, the airborne down-looking camera of the unmanned aerial vehicle acquires the ground image a in real time, and image a is preprocessed to obtain image c; FAST features are then extracted from image b and image c; the extracted FAST features are described with the FREAK feature descriptor; and feature matching is performed using the minimum Hamming distance as the similarity criterion, the matching position giving the current position of the unmanned aerial vehicle.
The ground image a is an optical image or an infrared image.
Advantageous effects
The invention provides an inertial integrated navigation method based on scene matching/visual odometry. The homography matrix of the unmanned aerial vehicle's real-time aerial image sequence is computed according to the visual odometry principle, and the current position of the unmanned aerial vehicle is obtained recursively by accumulating the relative displacement between consecutive real-time frames. Because visual odometry navigation accumulates error as time increases, a scene matching algorithm based on FREAK features is introduced for aided correction; scene matching offers high positioning accuracy, strong autonomy and resistance to electromagnetic interference, so it provides high-precision positioning in the adaptation areas and effectively compensates the error accumulated by long-term visual odometry navigation. An error model of the inertial navigation system and a measurement model of the visual data are established, the optimal estimate is obtained by Kalman filtering, and the inertial navigation system is corrected. The invention effectively improves navigation accuracy and helps improve the autonomous flight capability of the unmanned aerial vehicle.
However, research on scene matching suitability shows that only matching performed inside adaptation areas can provide reliable position information for the unmanned aerial vehicle; in non-adaptation areas such as deserts and seas, scene matching cannot work normally.
Visual odometry is a technique that estimates motion by processing two consecutive image frames, and as a new navigation and positioning method it has been successfully applied to autonomous mobile robots. During the flight of the unmanned aerial vehicle, two consecutive frames are taken by the same sensor, in the same time period and under the same conditions on the unmanned aerial vehicle platform; they share the same noise distribution and imaging errors and show no imaging differences caused by changing natural conditions, so more accurate positioning information can be provided in non-adaptation areas. Meanwhile, two consecutive frames are much smaller than the reference image used in scene matching, so the matching time is shorter, which improves the real-time performance of the navigation system.
Therefore, the invention provides an unmanned aerial vehicle inertial integrated navigation method based on scene matching/visual odometry, which has the advantages of strong autonomy, light payload, low equipment cost and strong anti-interference capability, and provides a feasible technical scheme for the engineering application of unmanned aerial vehicle navigation systems. As a navigation mode based on computer vision it has the notable advantages of high positioning accuracy, strong resistance to electronic interference, small airborne equipment and low cost; it can eliminate the error accumulated by long-term operation of the inertial navigation system, greatly improve the positioning accuracy of the inertial navigation system, and serve as a backup navigation method when GPS is unavailable, has failed or has degraded accuracy.
Drawings
FIG. 1 is the overall flow chart of the invention
FIG. 2 is a diagram of the homography relation between two views
FIG. 3 is the FREAK descriptor sampling pattern
Detailed Description
The invention will now be further described with reference to the following examples and drawings:
In the inertial integrated navigation method based on scene matching/visual odometry, the ground scene of the unmanned aerial vehicle's planned flight area is obtained in advance by reconnaissance and stored as the reference image. When the unmanned aerial vehicle carrying a vision sensor flies over the planned area, it acquires local images and, according to parameters such as pixel resolution, flight height and field of view, generates a ground scene image of a certain size as the real-time image. By matching the real-time image against the reference image, the position of the real-time image within the reference image is found, which in turn determines the current accurate position of the unmanned aerial vehicle. Scene matching accuracy is not affected by navigation time, and the method has strong resistance to electromagnetic interference, good autonomy, high accuracy, low cost, small size and rich information; combining it with inertial navigation can greatly improve the overall performance of the navigation system. For this reason, inertial/scene matching autonomous integrated navigation is studied for unmanned aerial vehicle navigation; it helps remove the dependence on external aiding systems and enhances the reliability, maneuverability, concealment, anti-interference capability and survivability of the unmanned aerial vehicle. The method mainly comprises five parts: inertial navigation, scene matching, visual odometry, fusion correction and integrated navigation Kalman filtering, and proceeds as follows:
First, the airborne down-looking camera acquires the ground image a in real time, and the visual odometry of the unmanned aerial vehicle is determined by estimating the homography matrix between the real-time image a and the previous real-time frame a'.
The specific implementation steps are as follows:
1. respectively extracting characteristic points from two continuous frames of real-time images a and a' by using a Harris angular point detection algorithm;
2. for each feature point (x, y)^T in a, search a' within a square region centered at (x, y)^T for the matching point with the highest neighborhood cross-correlation; similarly, search a for the match of each feature point in a', and finally determine the matched feature point pairs (an illustrative sketch of this bidirectional matching follows this list);
3. obtain the maximum consistent point set with the RANSAC robust estimation method and compute the homography matrix H between the two consecutive real-time frames a and a'. The specific process is: (1) randomly extract 4 groups of matching point pairs as a random sample and compute a homography matrix H; (2) compute the distance dis for each matching point pair of step 2; (3) set a threshold t (t is less than half the side length of the real-time image); if dis < t the matching point pair is an inlier, otherwise it is discarded; count the number of inliers; (4) repeat this process k times and select the H with the largest number of inliers;
4. recompute the homography matrix H from all matching point pairs classified as inliers, and use H to determine the position of the search region of step 2, thereby determining more accurate matching point pairs;
5. repeat steps 2 to 4 until the number of matching point pairs is stable, and compute the homography matrix H from the finally determined stable matching point pairs.
6. Using the homography matrix H obtained in step 5, together with the attitude information provided by the inertial sensors and the height information provided by the barometric altimeter, calculate the relative displacement of the camera between the two consecutive frames; the current position of the unmanned aerial vehicle is then computed recursively by accumulating the camera displacements onto the previously estimated position.
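For illustration only, the bidirectional cross-correlation matching of steps 1 and 2 can be sketched in Python with OpenCV as follows; this is a minimal sketch rather than the on-board implementation, and the window size, search radius and corner-detector parameters are assumed values.

```python
import cv2
import numpy as np

def harris_corners(gray, max_pts=200):
    # Step 1: Harris corner detection (parameter values are assumed)
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=max_pts, qualityLevel=0.01,
                                  minDistance=10, useHarrisDetector=True, k=0.04)
    return [] if pts is None else [tuple(p.ravel().astype(int)) for p in pts]

def best_ncc_match(src, dst, pt, win=7, search=20):
    # Step 2: search a square region of dst centred on pt for the patch with the
    # highest normalized cross-correlation with the patch around pt in src
    x, y = pt
    h, w = src.shape                      # both frames assumed the same size
    if not (win <= x < w - win and win <= y < h - win):
        return None
    tmpl = src[y - win:y + win + 1, x - win:x + win + 1]
    x0, y0 = max(x - search, 0), max(y - search, 0)
    x1, y1 = min(x + search, w - 1), min(y + search, h - 1)
    roi = dst[y0:y1 + 1, x0:x1 + 1]
    if roi.shape[0] <= 2 * win or roi.shape[1] <= 2 * win:
        return None
    res = cv2.matchTemplate(roi, tmpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, loc = cv2.minMaxLoc(res)     # location of the best correlation score
    return (x0 + loc[0] + win, y0 + loc[1] + win)

def mutual_matches(frame_prev, frame_cur):
    # Keep only pairs that are (approximately) each other's best match in both directions
    pairs = []
    for p in harris_corners(frame_prev):
        q = best_ncc_match(frame_prev, frame_cur, p)
        if q is None:
            continue
        r = best_ncc_match(frame_cur, frame_prev, q)
        if r is not None and abs(r[0] - p[0]) <= 1 and abs(r[1] - p[1]) <= 1:
            pairs.append((p, q))
    return pairs
```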
Second, when the unmanned aerial vehicle enters an adaptation area, it is coarsely positioned according to the inertial navigation system, and a circular reference sub-image b is cut from the prestored aerial reference image of the flight area around the coarse positioning result; the radius of the reference sub-image must be larger than 1.5 times the product of the average drift distance of the inertial element per unit time and the flight time of the unmanned aerial vehicle between two adaptation areas. Scene matching is then performed between the real-time image a and the reference sub-image b to determine the position of the unmanned aerial vehicle (a cropping and geo-referencing sketch follows the matching steps below). The specific scene matching steps are:
1. and graying the real-time image a and the reference sub-image b respectively, and extracting FAST characteristic points of the real-time image a and the reference sub-image b.
2. Finding the FREAK feature description operator of each FAST feature point, wherein the steps are as follows:
As shown in fig. 3, FREAK constructs circular sampling regions centered on each feature point; the sampling is dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. Each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.
The FREAK descriptor is a bit string of differences of Gaussians; the comparison criterion T of a sampling pair is defined as

$$T(P_a) = \begin{cases} 1 & \text{if } I(P_a^{r_1}) - I(P_a^{r_2}) > 0 \\ 0 & \text{otherwise} \end{cases}$$

where P_a denotes a sampling pair and I(·) the smoothed intensity value of a sampling circle. Selecting N sampling pairs defines the binary descriptor
$$F = \sum_{0 \le a < N} 2^{a}\, T(P_a)$$
Thus an N-dimensional binary bit string is obtained.
Sampling pairs are selected coarse-to-fine according to the following criteria:
a) create a large matrix D holding about 50,000 feature points, one row per feature point; comparing the intensities of the 43 sampling circles of each feature point pairwise yields a descriptor of roughly one thousand dimensions;
b) compute the mean and variance of each column of matrix D; the variance is largest when the mean is 0.5, which ensures the distinctiveness of the feature descriptor;
c) sort the columns by variance, with the largest variance in the first column;
d) keep the first N columns, so that each feature point is described by an N-dimensional binary bit string. The invention selects N = 256.
As shown in fig. 3, M (M = 45) symmetric sampling pairs are selected from all the sampling circles, and the local gradient is computed as
$$O = \frac{1}{M} \sum_{P_O \in G} \left( I(P_O^{r_1}) - I(P_O^{r_2}) \right) \frac{P_O^{r_1} - P_O^{r_2}}{\left\| P_O^{r_1} - P_O^{r_2} \right\|}$$
where P_O^{r1} and P_O^{r2} are the two-dimensional position vectors of the sampling circles in pair P_O, and I(·) denotes the smoothed intensity value of a sampling circle.
3. computing the Hamming distance between the FREAK descriptors of candidate feature points, the minimum Hamming distance being the similarity criterion. To filter out mismatches, a threshold of 10 is set: candidate pairs above the threshold are discarded directly, and pairs below it are accepted as mutually matching points.
4. performing feature matching according to the methods of steps 2 and 3, and determining the position of the real-time image a within the reference sub-image b from the positions of its feature points in b, thereby determining the current position of the aircraft.
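For illustration, cutting the circular reference sub-image b around the coarse INS fix and converting a match location back to a geographic position could look like the following sketch; it assumes a north-up, geo-referenced reference image with known ground resolution, and all names and default values are illustrative rather than part of the method.

```python
import numpy as np

def crop_reference_subimage(ref_img, ins_px, drift_mps, flight_time_s,
                            metres_per_px, margin=1.5):
    # Radius rule from the text: larger than 1.5 x (average drift per unit time)
    # x (flight time between two adaptation areas), converted to pixels here.
    radius = int(np.ceil(margin * drift_mps * flight_time_s / metres_per_px))
    cx, cy = ins_px                                   # coarse INS fix in reference-image pixels
    h, w = ref_img.shape[:2]
    x0, x1 = max(cx - radius, 0), min(cx + radius, w)
    y0, y1 = max(cy - radius, 0), min(cy + radius, h)
    sub = ref_img[y0:y1, x0:x1].copy()
    yy, xx = np.ogrid[y0:y1, x0:x1]
    sub[(xx - cx) ** 2 + (yy - cy) ** 2 > radius ** 2] = 0   # mask pixels outside the circle
    return sub, (x0, y0)

def pixel_to_geo(match_px, sub_origin, ref_topleft_geo, metres_per_px):
    # Convert a match location in sub-image b back to an east/north position,
    # assuming a north-up reference image whose top-left geo-position is known.
    u, v = match_px
    x, y = sub_origin[0] + u, sub_origin[1] + v       # back to full reference-image pixels
    east = ref_topleft_geo[0] + x * metres_per_px
    north = ref_topleft_geo[1] - y * metres_per_px    # image rows increase southwards
    return east, north
```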
Third, the error accumulated by the visual odometry is corrected with the unmanned aerial vehicle position obtained in the second step, giving a more accurate position P_vision of the unmanned aerial vehicle; the inertial navigation system provides the current position P_ins and attitude A_ins of the unmanned aerial vehicle.
Assuming that the position error of the unmanned aerial vehicle obtained by scene matching in the adaptation area is small, this step resets the position calculated by visual odometry directly to the scene matching result.
Fourth, a Kalman filter processes P_vision, P_ins and A_ins to estimate the optimal navigation information.
The specific embodiment is as follows:
1. in the flight process of the unmanned aerial vehicle, the airborne downward-looking camera acquires a ground image a in real time.
The ground image sequence is acquired in real time by the down-looking optical or infrared camera carried by the unmanned aerial vehicle; only the current frame and the previous frame need to be stored.
2. The visual odometry of the unmanned aerial vehicle is determined by estimating the homography matrix between image a and the previous frame image a'.
As shown in FIG. 2, during flight the airborne camera captures two consecutive frames I_1 and I_2 at different poses, with corresponding camera coordinate systems F and F'. Assume a point P on the plane π is imaged as point p in I_1 and point p' in I_2, and denote its coordinate vectors in F and F' by $\vec{p}$ and $\vec{p}\,'$; then
$$\vec{p}\,' = R_{c1}^{c2}\, \vec{p} + t$$
where $R_{c1}^{c2}$ and t denote the rotation and translation of the camera between the two shots, respectively.
As shown in FIG. 2, let n be the normal vector of the plane π expressed in the frame at c1 and let d be the distance from c1 to π; then
$$n^{T} \vec{p} = d$$
Therefore,
$$\vec{p}\,' = R\, \vec{p} + t\, \frac{n^{T} \vec{p}}{d} = H_c\, \vec{p}$$
wherein,
$$H_c = R_{c1}^{c2} + \frac{t\, n^{T}}{d}$$
which is called the homography matrix of the plane π with respect to the camera.
Taking the camera intrinsic parameter matrix K into account,
$$p = K \vec{p}, \qquad p' = K \vec{p}\,'$$
Therefore,
$$p' = K H_c K^{-1} p = H p$$
wherein,
$$H = K \left( R_{c1}^{c2} + \frac{t\, n^{T}}{d} \right) K^{-1}$$
which is called the homography matrix of the plane π between the two image frames.
According to the literature, the homography matrix is a 3 × 3 matrix with 8 degrees of freedom, so 4 groups of matching point pairs between the two images are needed to compute it; to avoid degeneracy, the plane containing the 4 point pairs must not pass through the optical center of the camera, and no 3 of the 4 spatial points may be collinear.
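As a hedged illustration of this estimation, OpenCV's RANSAC-based homography fit follows the same logic (random 4-point samples, inlier counting against a distance threshold, re-estimation from the inliers); the reprojection threshold below is an assumed value.

```python
import cv2
import numpy as np

def estimate_homography(pairs, ransac_thresh_px=3.0):
    # pairs: list of ((x, y), (x', y')) matches between the two frames.
    # cv2.findHomography with cv2.RANSAC draws random 4-point samples, counts
    # inliers against the reprojection threshold and re-estimates H from them.
    if len(pairs) < 4:                     # at least 4 non-degenerate pairs are required
        return None, None
    src = np.float32([p for p, _ in pairs]).reshape(-1, 1, 2)
    dst = np.float32([q for _, q in pairs]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, ransac_thresh_px)
    return H, inlier_mask
```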
In addition, since the camera is rigidly attached to the unmanned aerial vehicle, the attitude of the camera can be taken as identical to that of the unmanned aerial vehicle. The attitude is provided by the airborne inertial navigation device as the yaw angle ψ, pitch angle θ and roll angle γ, so that
$$R_{c1}^{c2} = R_{n}^{c2}\, R_{c1}^{n}$$
where $R_{n}^{c2}$ is the rotation matrix from the navigation coordinate system to the camera coordinate system at c2:
$$R_{n}^{c2} = \begin{bmatrix} \cos\gamma\cos\psi - \sin\theta\sin\gamma\sin\psi & \cos\gamma\sin\psi + \sin\theta\sin\gamma\cos\psi & -\cos\theta\sin\gamma \\ -\cos\theta\sin\psi & \cos\theta\cos\psi & \sin\theta \\ \sin\gamma\cos\psi + \sin\theta\cos\gamma\sin\psi & \sin\gamma\sin\psi - \sin\theta\cos\gamma\cos\psi & \cos\theta\cos\gamma \end{bmatrix}$$
and $R_{c1}^{n}$ is obtained in the same way.
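For reference, the matrix above can be transcribed directly into NumPy; this sketch simply mirrors the printed formula and assumes angles in radians.

```python
import numpy as np

def R_n_c2(psi, theta, gamma):
    # Rotation matrix from the navigation frame to the camera frame at c2,
    # transcribed from the formula above (yaw psi, pitch theta, roll gamma).
    cps, sps = np.cos(psi), np.sin(psi)
    cth, sth = np.cos(theta), np.sin(theta)
    cga, sga = np.cos(gamma), np.sin(gamma)
    return np.array([
        [cga * cps - sth * sga * sps,  cga * sps + sth * sga * cps, -cth * sga],
        [-cth * sps,                   cth * cps,                    sth],
        [sga * cps + sth * cga * sps,  sga * sps - sth * cga * cps,  cth * cga],
    ])
```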
If the ground is assumed to be a plane, then n = (0, 0, 1)^T. After the homography matrix has been calculated, the relative displacement of the camera between the two consecutive frames is obtained from the above formula using the attitude information provided by inertial navigation and the height information provided by the barometric altimeter, and the current position of the unmanned aerial vehicle is computed recursively by accumulating the camera displacements onto the previously estimated position (a sketch of this displacement recovery follows steps 2.1 to 2.5 below). The specific implementation steps are as follows:
2.1, extracting feature points from the two consecutive real-time frames a and a' with the Harris corner detection algorithm;
2.2, for each feature point (x, y)^T in a', searching a square region centered at (x, y)^T in a for the match with the highest neighborhood cross-correlation; symmetrically searching for the match of each feature point in a, and finally determining the matched feature point pairs;
2.3, obtaining the maximum consistent point set and an estimate of the homography matrix H with the RANSAC robust estimation method. The specific process is: (1) randomly extract 4 groups of matching point pairs as a random sample and compute a homography matrix H; (2) compute the distance dis for each matching point pair of step 2.2; (3) set a threshold t; if dis < t the matching point pair is an inlier, otherwise it is discarded; count the number of inliers; (4) repeat this process k times and select the H with the largest number of inliers;
2.4, re-estimating H from all matching point pairs classified as inliers, and using H to determine the search region of step 2.2, thereby determining the matching point pairs more accurately;
2.5, repeating steps 2.2 to 2.4 until the number of matching point pairs is stable.
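A minimal sketch of the displacement recovery described above, assuming the flat-ground normal n = (0, 0, 1)^T and a homography H already normalized to the correct overall scale; signs and frame conventions would need to be checked against the actual camera installation, so this is illustrative only.

```python
import numpy as np

def relative_displacement(H, K, R_c1_c2, height_d):
    # From H = K (R + t n^T / d) K^-1 it follows that K^-1 H K - R = t n^T / d.
    # With the flat-ground normal n = (0, 0, 1)^T, the third column of that
    # matrix equals t / d, so t is recovered by scaling with the height d.
    # H is assumed already normalized to the correct overall scale.
    A = np.linalg.inv(K) @ H @ K - R_c1_c2
    t = height_d * A[:, 2]
    return t                                   # camera displacement between the two frames

def accumulate_position(prev_position, t_cam, R_cam_to_nav):
    # Visual-odometry recursion: rotate the camera-frame displacement into the
    # navigation frame and add it to the previously estimated position.
    return prev_position + R_cam_to_nav @ t_cam
```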
3. When the unmanned aerial vehicle enters the adaptation area, the unmanned aerial vehicle is roughly positioned according to the inertial navigation system, a reference image b corresponding to a rough positioning result is found in the onboard storage device, scene matching is carried out on the reference image b and the real-time image a, and the position of the unmanned aerial vehicle is determined.
The invention matches the real-time image against the reference image with a scene matching algorithm based on the FREAK descriptor. As shown in FIG. 3, the FREAK descriptor uses a retina-like sampling pattern: circular sampling regions are constructed around each feature point, dense at the center and sparse toward the periphery, with the number of samples decreasing exponentially. To make the sampling circles more robust and improve the stability and uniqueness of the descriptor, each sampling circle is pre-smoothed with a different Gaussian kernel whose standard deviation is proportional to the size of the circle.
The FREAK descriptor is a bit string of differences of Gaussians; the comparison criterion T of a sampling pair is defined as

$$T(P_a) = \begin{cases} 1 & \text{if } I(P_a^{r_1}) - I(P_a^{r_2}) > 0 \\ 0 & \text{otherwise} \end{cases}$$

where P_a denotes a sampling pair and I(·) the smoothed intensity value of a sampling circle. Selecting N sampling pairs defines the binary descriptor
$$F = \sum_{0 \le a < N} 2^{a}\, T(P_a)$$
Thus an N-dimensional binary bit string is obtained.
Sampling pairs are selected coarse-to-fine according to the following criteria:
a) create a large matrix D holding about 50,000 feature points, one row per feature point; comparing the intensities of the 43 sampling circles of each feature point pairwise yields a descriptor of roughly one thousand dimensions;
b) compute the mean and variance of each column of matrix D; the variance is largest when the mean is 0.5, which ensures the distinctiveness of the feature descriptor;
c) sort the columns by variance, with the largest variance in the first column;
d) keep the first N columns, so that each feature point is described by an N-dimensional binary bit string. Here N = 256 is chosen.
The selection criterion of the sampling pairs guarantees the grey scale invariance of the descriptors.
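The coarse-to-fine selection a) to d) can be sketched with NumPy as follows; D is the matrix of pairwise comparisons over a training set of keypoints, and for a binary column the variance is largest when its mean is 0.5, so sorting by the distance of the mean from 0.5 is equivalent to sorting by variance. Any decorrelation step used in the published FREAK training is omitted from this sketch.

```python
import numpy as np

def select_sampling_pairs(D, n_keep=256):
    # D: (num_keypoints, num_comparisons) 0/1 matrix; each column is one pairwise
    # comparison of smoothed sampling-circle intensities over the training set.
    # For a binary column the variance p(1 - p) is largest at mean 0.5, so the
    # most discriminative columns are those whose mean is closest to 0.5.
    means = D.mean(axis=0)
    order = np.argsort(np.abs(means - 0.5))    # most balanced columns first
    return order[:n_keep]

def freak_bitstring(comparisons, keep):
    # Build the N-dimensional binary descriptor F = sum_a 2^a * T(P_a) for one
    # keypoint, restricted to the kept sampling pairs, packed as N/8 bytes.
    bits = comparisons[keep].astype(np.uint8)
    return np.packbits(bits)
```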
As shown in fig. 3, M (M = 45) symmetric sampling pairs are selected from all the sampling circles, and the local gradient is computed as
$$O = \frac{1}{M} \sum_{P_O \in G} \left( I(P_O^{r_1}) - I(P_O^{r_2}) \right) \frac{P_O^{r_1} - P_O^{r_2}}{\left\| P_O^{r_1} - P_O^{r_2} \right\|}$$
where P_O^{r1} and P_O^{r2} are the two-dimensional position vectors of the sampling circles in pair P_O, and I(·) denotes the smoothed intensity value of a sampling circle. The symmetry of the sampling pairs guarantees the rotation invariance of the descriptor.
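A small NumPy sketch of the orientation estimate defined by the gradient formula above; the coordinates of the M symmetric pairs and their smoothed intensities are assumed inputs.

```python
import numpy as np

def freak_orientation(pair_coords, intensities):
    # pair_coords: (M, 2, 2) positions of the two circles of each symmetric pair;
    # intensities: (M, 2) smoothed intensities I(P^r1), I(P^r2).
    d = pair_coords[:, 0, :] - pair_coords[:, 1, :]          # P^r1 - P^r2
    d = d / np.linalg.norm(d, axis=1, keepdims=True)         # unit directions
    w = intensities[:, 0] - intensities[:, 1]                # intensity differences
    O = (w[:, None] * d).sum(axis=0) / len(pair_coords)      # the 2-D gradient vector O
    return np.arctan2(O[1], O[0])                            # orientation angle in radians
```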
Since the FREAK descriptor is a binary bit string of 0s and 1s, the similarity criterion is the minimum Hamming distance, i.e. the two descriptors to be matched are compared with a bitwise exclusive-or. For a 512-dimensional FREAK descriptor the maximum Hamming distance is 512 and the minimum is 0; to filter out mismatches a threshold T is set, and candidates above T are discarded directly. The threshold is set to 10 here.
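The FAST + FREAK extraction and minimum-Hamming-distance matching between the real-time image a and the reference sub-image b can be sketched with OpenCV as follows (the FREAK implementation lives in the opencv-contrib package); the Hamming threshold of 10 follows the text, while the FAST threshold and other parameters are assumed values.

```python
import cv2

def scene_match(realtime_gray, reference_gray, max_hamming=10):
    # FAST keypoints + FREAK descriptors, matched by minimum Hamming distance.
    fast = cv2.FastFeatureDetector_create(threshold=20)
    freak = cv2.xfeatures2d.FREAK_create()        # requires opencv-contrib-python
    kp_a = fast.detect(realtime_gray, None)
    kp_b = fast.detect(reference_gray, None)
    kp_a, des_a = freak.compute(realtime_gray, kp_a)
    kp_b, des_b = freak.compute(reference_gray, kp_b)
    if des_a is None or des_b is None:
        return []
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_a, des_b)
    good = [m for m in matches if m.distance <= max_hamming]   # threshold of 10 in the text
    return [(kp_a[m.queryIdx].pt, kp_b[m.trainIdx].pt) for m in good]
```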
4. The error accumulated by the visual odometry is corrected with the unmanned aerial vehicle positions obtained by scene matching in the adaptation areas, giving a more accurate position of the unmanned aerial vehicle.
Since the positioning result of scene matching in the adaptation area is reliable, the position calculated by visual odometry is reset directly to the unmanned aerial vehicle position obtained by scene matching, eliminating the error accumulated during long-term visual odometry operation.
5. The current position and attitude of the unmanned aerial vehicle are given by the inertial navigation system.
6. The current accurate position and attitude of the unmanned aerial vehicle are estimated with a Kalman filtering algorithm.
The error equation of the inertial navigation system is used as the state equation of the integrated navigation system, the navigation coordinate system is chosen as the east-north-up frame, and the difference between the position of step 4 and the position of step 5 is used as the measurement; a Kalman filter estimates the drift error of the inertial system, and the inertial navigation system is then corrected to obtain the fused navigation parameters.
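Step 6 can be illustrated with a deliberately simplified linear Kalman filter: the state is a 2-D INS position error (a full implementation would use the complete INS error equations with attitude, velocity and sensor-drift states), the measurement is the difference between the vision-derived position of step 4 and the INS position of step 5, and the estimated error is fed back to correct the INS output. Noise values are assumed for illustration.

```python
import numpy as np

class PositionErrorKF:
    # Minimal sketch: 2-D INS position-error state with identity dynamics.
    def __init__(self, q=0.5, r=5.0):
        self.x = np.zeros(2)            # estimated INS position error (east, north)
        self.P = np.eye(2) * 100.0      # error covariance
        self.Q = np.eye(2) * q          # process noise (assumed)
        self.R = np.eye(2) * r          # measurement noise (assumed)

    def step(self, p_ins, p_vision):
        # Predict: the error state is assumed constant over one filter step
        self.P = self.P + self.Q
        # Measurement: difference between the INS position (step 5) and the
        # vision-derived position (step 4)
        z = p_ins - p_vision
        S = self.P + self.R
        K = self.P @ np.linalg.inv(S)
        self.x = self.x + K @ (z - self.x)
        self.P = (np.eye(2) - K) @ self.P
        # Feed back the estimated drift error to correct the INS output
        return p_ins - self.x
```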

Claims (3)

1. An inertial integrated navigation method based on scene matching/visual odometry, characterized by comprising the following steps:
step 1: in the flight process of the unmanned aerial vehicle, an airborne downward-looking camera acquires a ground image a in real time;
step 2: determining the visual odometry of the unmanned aerial vehicle by using the image a and the previous frame image a', comprising the following steps:
A. respectively extracting characteristic points from two continuous frames of real-time images a and a' by using a Harris angular point detection algorithm;
B. for each feature point (x, y)^T in image a', searching image a, within a square region centered at (x, y)^T, for the matching point with the highest neighborhood cross-correlation; at the same time, for each feature point (x, y)^T in image a, searching image a', within a square region centered at (x, y)^T, for the matching point with the highest neighborhood cross-correlation;
C. obtaining the maximum consistent point set and an estimate of the homography matrix H by the RANSAC robust estimation method, the process being: randomly extracting 4 groups of matching point pairs to form a random sample and calculating a homography matrix H; then calculating the distance dis for each matching point pair of step B; setting a threshold t, and if dis < t the matching point pair is an inlier, otherwise it is discarded; counting the number of inliers; repeating this process k times and selecting the H with the largest number of inliers;
D. re-estimating H from all matching point pairs classified as inliers; using the re-estimated H, calculating for each feature point (x, y)^T in image a' and in image a its corresponding point (x1, y1)^T in the other image. Then, using the method of step B, searching image a, within a square region centered at (x1, y1)^T, for the matching point with the highest neighborhood cross-correlation with each feature point (x, y)^T of image a'; at the same time, searching image a', within a square region centered at (x1, y1)^T, for the matching point with the highest neighborhood cross-correlation with each feature point (x, y)^T of image a;
E. repeating the step B to the step D until the number of the matching point pairs is stable;
step 3: when the unmanned aerial vehicle enters an adaptation area, coarsely positioning the unmanned aerial vehicle according to the inertial navigation system, finding the reference image b corresponding to the coarse positioning result in the onboard storage device, carrying out scene matching with the real-time image a, and determining the position of the unmanned aerial vehicle;
step 4: correcting the error accumulated by the visual odometry with the unmanned aerial vehicle position obtained in step 3 in the adaptation area, so as to obtain a more accurate position of the unmanned aerial vehicle;
step 5: the current position and attitude of the unmanned aerial vehicle are given by the inertial navigation system;
step 6: using the error equation of the inertial navigation system as the state equation of the integrated navigation system, choosing the east-north-up frame as the navigation coordinate system, and taking the difference between the position obtained in step 4 and the position obtained in step 5 as the measurement; a Kalman filter estimates the drift error of the inertial system, which is then used to correct the inertial navigation system and obtain the fused navigation parameters.
2. The scene matching/visual odometry-based inertial combined navigation method according to claim 1, characterized in that: the ground image a is an optical image or an infrared image.
3. The scene matching/visual odometry-based inertial combined navigation method according to claim 1, characterized in that the process of step 3 is as follows: firstly, the airborne down-looking camera of the unmanned aerial vehicle acquires the ground image a in real time, and image a is preprocessed to obtain image c; FAST features are then extracted from image b and image c; the extracted FAST features are described with the FREAK feature descriptor; and feature matching is performed using the minimum Hamming distance as the similarity criterion, the matching position giving the current position of the unmanned aerial vehicle.
CN201410128459.XA 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage Active CN103954283B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410128459.XA CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Publications (2)

Publication Number Publication Date
CN103954283A true CN103954283A (en) 2014-07-30
CN103954283B CN103954283B (en) 2016-08-31

Family

ID=51331593

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410128459.XA Active CN103954283B (en) 2014-04-01 2014-04-01 Inertia integrated navigation method based on scene matching aided navigation/vision mileage

Country Status (1)

Country Link
CN (1) CN103954283B (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105045276A (en) * 2015-07-03 2015-11-11 深圳一电科技有限公司 Method and apparatus for controlling flight of unmanned plane
CN105180933A (en) * 2015-09-14 2015-12-23 中国科学院合肥物质科学研究院 Mobile robot track plotting correcting system based on straight-running intersection and mobile robot track plotting correcting method
CN105222772A (en) * 2015-09-17 2016-01-06 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105374043A (en) * 2015-12-02 2016-03-02 福州华鹰重工机械有限公司 Method and device of background filtering of visual odometry
CN105675013A (en) * 2014-11-21 2016-06-15 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN107067415A (en) * 2017-03-21 2017-08-18 南京航空航天大学 A kind of quick accurate positioning method of target based on images match
CN107167140A (en) * 2017-05-26 2017-09-15 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107270904A (en) * 2017-06-23 2017-10-20 西北工业大学 Unmanned plane auxiliary guiding control system and method based on image registration
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to
CN107843240A (en) * 2017-09-14 2018-03-27 中国人民解放军92859部队 A kind of seashore region unmanned plane image same place information rapid extracting method
CN107967691A (en) * 2016-10-20 2018-04-27 株式会社理光 A kind of visual odometry calculates method and apparatus
CN108196285A (en) * 2017-11-30 2018-06-22 中山大学 A kind of Precise Position System based on Multi-sensor Fusion
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature
CN108846857A (en) * 2018-06-28 2018-11-20 清华大学深圳研究生院 The measurement method and visual odometry of visual odometry
CN108981692A (en) * 2018-06-14 2018-12-11 兰州晨阳启创信息科技有限公司 It is a kind of based on inertial navigation/visual odometry train locating method and system
CN109102013A (en) * 2018-08-01 2018-12-28 重庆大学 A kind of improvement FREAK Feature Points Matching digital image stabilization method suitable for tunnel environment characteristic
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109341700A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision assists landing navigation method under a kind of low visibility
CN109341685A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 A kind of fixed wing aircraft vision auxiliary landing navigation method based on homograph
CN109360295A (en) * 2018-10-31 2019-02-19 张维玲 A kind of mileage measuring system and method based on Aberration Analysis
CN109523579A (en) * 2018-11-12 2019-03-26 北京联海信息系统有限公司 A kind of matching process and device of UAV Video image and three-dimensional map
CN109782012A (en) * 2018-12-29 2019-05-21 中国电子科技集团公司第二十研究所 A kind of speed-measuring method based on photoelectric image feature association
CN109791048A (en) * 2016-08-01 2019-05-21 无限增强现实以色列有限公司 Usage scenario captures the method and system of the component of data calibration Inertial Measurement Unit (IMU)
CN110388939A (en) * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 One kind being based on the matched vehicle-mounted inertial navigation position error modification method of Aerial Images
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
CN112577493A (en) * 2021-03-01 2021-03-30 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN114265427A (en) * 2021-12-06 2022-04-01 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN114764005A (en) * 2021-03-11 2022-07-19 深圳市科卫泰实业发展有限公司 Monocular vision odometer method for unmanned aerial vehicle
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107796417B (en) * 2016-09-06 2021-02-05 北京自动化控制设备研究所 Method for adaptively estimating scene matching and inertial navigation installation error
CN107966147B (en) * 2016-10-20 2021-02-05 北京自动化控制设备研究所 Scene matching method under large-locomotive condition

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6856894B1 (en) * 2003-10-23 2005-02-15 International Business Machines Corporation Navigating a UAV under remote control and manual control with three dimensional flight depiction
EP1975646A2 (en) * 2007-03-28 2008-10-01 Honeywell International Inc. Lader-based motion estimation for navigation
CN101598556A (en) * 2009-07-15 2009-12-09 北京航空航天大学 Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known
CN102435188A (en) * 2011-09-15 2012-05-02 南京航空航天大学 Monocular vision/inertia autonomous navigation method for indoor environment
CN102538781A (en) * 2011-12-14 2012-07-04 浙江大学 Machine vision and inertial navigation fusion-based mobile robot motion attitude estimation method
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ZHUANG TONG et al.: "A Monocular-Vision-Based Attitude Algorithm for Micro UAVs", Computer Engineering (《计算机工程》) *
LI YAOJUN et al.: "UAV Scene Matching Navigation Based on Geometric Constraints of Spatial Relations", Application Research of Computers (《计算机应用研究》) *
CHEN FANG et al.: "Research on Fast Scene Matching Algorithms in Inertial Integrated Navigation Systems", Journal of Astronautics (《宇航学报》) *

Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105675013B (en) * 2014-11-21 2019-03-01 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105675013A (en) * 2014-11-21 2016-06-15 中国飞行试验研究院 Civil aircraft inertial navigation dynamic calibration method
CN105045276A (en) * 2015-07-03 2015-11-11 深圳一电科技有限公司 Method and apparatus for controlling flight of unmanned plane
CN105180933A (en) * 2015-09-14 2015-12-23 中国科学院合肥物质科学研究院 Mobile robot track plotting correcting system based on straight-running intersection and mobile robot track plotting correcting method
CN105180933B (en) * 2015-09-14 2017-11-21 中国科学院合肥物质科学研究院 Mobile robot reckoning update the system and method based on the detection of straight trip crossing
CN105222772A (en) * 2015-09-17 2016-01-06 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105222772B (en) * 2015-09-17 2018-03-16 泉州装备制造研究所 A kind of high-precision motion track detection system based on Multi-source Information Fusion
CN105374043A (en) * 2015-12-02 2016-03-02 福州华鹰重工机械有限公司 Method and device of background filtering of visual odometry
CN105865451A (en) * 2016-04-19 2016-08-17 深圳市神州云海智能科技有限公司 Method and device applied to indoor location of mobile robot
CN105865451B (en) * 2016-04-19 2019-10-01 深圳市神州云海智能科技有限公司 Method and apparatus for mobile robot indoor positioning
CN105953796A (en) * 2016-05-23 2016-09-21 北京暴风魔镜科技有限公司 Stable motion tracking method and stable motion tracking device based on integration of simple camera and IMU (inertial measurement unit) of smart cellphone
CN109791048A (en) * 2016-08-01 2019-05-21 无限增强现实以色列有限公司 Usage scenario captures the method and system of the component of data calibration Inertial Measurement Unit (IMU)
US11125581B2 (en) 2016-08-01 2021-09-21 Alibaba Technologies (Israel) LTD. Method and system for calibrating components of an inertial measurement unit (IMU) using scene-captured data
CN107967691B (en) * 2016-10-20 2021-11-23 株式会社理光 Visual mileage calculation method and device
CN107967691A (en) * 2016-10-20 2018-04-27 株式会社理光 A kind of visual odometry calculates method and apparatus
CN107067415B (en) * 2017-03-21 2019-07-30 南京航空航天大学 A kind of object localization method based on images match
CN107067415A (en) * 2017-03-21 2017-08-18 南京航空航天大学 A kind of quick accurate positioning method of target based on images match
CN107462244A (en) * 2017-04-24 2017-12-12 北京航空航天大学 A kind of air remote sensing platform attitude angle high-precision measuring method matched based on GPS location and aerial map picture
CN107167140B (en) * 2017-05-26 2019-11-08 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107167140A (en) * 2017-05-26 2017-09-15 江苏大学 A kind of unmanned plane vision positioning accumulated error suppressing method
CN107270904B (en) * 2017-06-23 2020-07-03 西北工业大学 Unmanned aerial vehicle auxiliary guide control system and method based on image registration
CN107270904A (en) * 2017-06-23 2017-10-20 西北工业大学 Unmanned plane auxiliary guiding control system and method based on image registration
CN107843240A (en) * 2017-09-14 2018-03-27 中国人民解放军92859部队 A kind of seashore region unmanned plane image same place information rapid extracting method
CN107498559A (en) * 2017-09-26 2017-12-22 珠海市微半导体有限公司 The detection method and chip that the robot of view-based access control model turns to
CN108196285A (en) * 2017-11-30 2018-06-22 中山大学 A kind of Precise Position System based on Multi-sensor Fusion
CN110388939A (en) * 2018-04-23 2019-10-29 湖南海迅自动化技术有限公司 One kind being based on the matched vehicle-mounted inertial navigation position error modification method of Aerial Images
CN108544494B (en) * 2018-05-31 2023-10-24 珠海一微半导体股份有限公司 Positioning device, method and robot based on inertia and visual characteristics
CN108544494A (en) * 2018-05-31 2018-09-18 珠海市微半导体有限公司 A kind of positioning device, method and robot based on inertia and visual signature
CN108981692A (en) * 2018-06-14 2018-12-11 兰州晨阳启创信息科技有限公司 It is a kind of based on inertial navigation/visual odometry train locating method and system
CN108846857A (en) * 2018-06-28 2018-11-20 清华大学深圳研究生院 The measurement method and visual odometry of visual odometry
CN109102013A (en) * 2018-08-01 2018-12-28 重庆大学 A kind of improvement FREAK Feature Points Matching digital image stabilization method suitable for tunnel environment characteristic
CN109143305A (en) * 2018-09-30 2019-01-04 百度在线网络技术(北京)有限公司 Automobile navigation method and device
CN109360295A (en) * 2018-10-31 2019-02-19 张维玲 A kind of mileage measuring system and method based on Aberration Analysis
CN109523579A (en) * 2018-11-12 2019-03-26 北京联海信息系统有限公司 A kind of matching process and device of UAV Video image and three-dimensional map
CN109341700A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision assists landing navigation method under a kind of low visibility
CN109341685B (en) * 2018-12-04 2023-06-30 中国航空工业集团公司西安航空计算技术研究所 Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN109341685A (en) * 2018-12-04 2019-02-15 中国航空工业集团公司西安航空计算技术研究所 A kind of fixed wing aircraft vision auxiliary landing navigation method based on homograph
CN109782012A (en) * 2018-12-29 2019-05-21 中国电子科技集团公司第二十研究所 A kind of speed-measuring method based on photoelectric image feature association
CN111238488A (en) * 2020-03-18 2020-06-05 湖南云顶智能科技有限公司 Aircraft accurate positioning method based on heterogeneous image matching
CN112577493A (en) * 2021-03-01 2021-03-30 中国人民解放军国防科技大学 Unmanned aerial vehicle autonomous positioning method and system based on remote sensing map assistance
CN114764005A (en) * 2021-03-11 2022-07-19 深圳市科卫泰实业发展有限公司 Monocular vision odometer method for unmanned aerial vehicle
CN113432594A (en) * 2021-07-05 2021-09-24 北京鑫海宜科技有限公司 Unmanned aerial vehicle automatic navigation system based on map and environment
CN113390410A (en) * 2021-08-04 2021-09-14 北京云恒科技研究院有限公司 Inertial integrated navigation method suitable for unmanned aerial vehicle
CN114265427A (en) * 2021-12-06 2022-04-01 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN114265427B (en) * 2021-12-06 2024-02-02 江苏方天电力技术有限公司 Inspection unmanned aerial vehicle auxiliary navigation system and method based on infrared image matching
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering

Also Published As

Publication number Publication date
CN103954283B (en) 2016-08-31

Similar Documents

Publication Publication Date Title
CN103954283B (en) Inertia integrated navigation method based on scene matching aided navigation/vision mileage
US10438366B2 (en) Method for fast camera pose refinement for wide area motion imagery
CN107833249B (en) Method for estimating attitude of shipboard aircraft in landing process based on visual guidance
CN102435188B (en) Monocular vision/inertia autonomous navigation method for indoor environment
US8766975B2 (en) Method of correlating images with terrain elevation maps for navigation
CN106595659A (en) Map merging method of unmanned aerial vehicle visual SLAM under city complex environment
CN108917753B (en) Aircraft position determination method based on motion recovery structure
Troglio et al. Automatic extraction of ellipsoidal features for planetary image registration
CN114216454A (en) Unmanned aerial vehicle autonomous navigation positioning method based on heterogeneous image matching in GPS rejection environment
Dumble et al. Airborne vision-aided navigation using road intersection features
CN117253029B (en) Image matching positioning method based on deep learning and computer equipment
CN115861352A (en) Monocular vision, IMU and laser radar data fusion and edge extraction method
Cao et al. Template matching based on convolution neural network for UAV visual localization
Ebadi et al. Semantic mapping in unstructured environments: Toward autonomous localization of planetary robotic explorers
CN117994543A (en) Scene matching method based on deep learning
CN117115414B (en) GPS-free unmanned aerial vehicle positioning method and device based on deep learning
CN112288813A (en) Pose estimation method based on multi-view vision measurement and laser point cloud map matching
Xia et al. Dense matching comparison between classical and deep learning based algorithms for remote sensing data
Ding et al. Stereo vision SLAM-based 3D reconstruction on UAV development platforms
CN113239936B (en) Unmanned aerial vehicle visual navigation method based on deep learning and feature point extraction
CN114842224A (en) Monocular unmanned aerial vehicle absolute vision matching positioning scheme based on geographical base map
Timotheatos et al. Visual horizon line detection for uav navigation
Chen et al. Metric localization for lunar rovers via cross-view image matching
Allik et al. Localization for aerial systems in GPS denied environments using recognition
Arai et al. Fast vision-based localization for a mars airplane

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20190826

Address after: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee after: Xi'an Northwestern Polytechnical University Asset Management Co.,Ltd.

Address before: 710072 Xi'an friendship West Road, Shaanxi, No. 127

Patentee before: Northwestern Polytechnical University

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20191113

Address after: 710072 floor 19, building B, innovation and technology building, northwest Polytechnic University, No.127, Youyi West Road, Beilin District, Xi'an, Shaanxi Province

Patentee after: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

Address before: Room 404, Material Building, Northwest Polytechnic University, 127 Youyi West Road, Xi'an City, Shaanxi Province, 710072

Patentee before: Xi'an Northwestern Polytechnical University Asset Management Co.,Ltd.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee after: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi CISCO Rudi Network Security Technology Co.,Ltd.

CP01 Change in the name or title of a patent holder
TR01 Transfer of patent right

Effective date of registration: 20231008

Address after: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee after: Shenzhen Onoan Technology Co.,Ltd.

Address before: Floor 19, block B, innovation and technology building, Northwest University of technology, 127 Youyi West Road, Beilin District, Xi'an City, Shaanxi Province, 710072

Patentee before: Shaanxi University of technology Ruidi Information Technology Co.,Ltd.

TR01 Transfer of patent right
TR01 Transfer of patent right

Effective date of registration: 20231220

Address after: 710000, No. 581, East Zone, National E-commerce Demonstration Base, No. 528 Tianguba Road, Software New City, High tech Zone, Xi'an City, Shaanxi Province

Patentee after: Xi'an Chenxiang Zhuoyue Technology Co.,Ltd.

Address before: 518000 Unit 204, Xingyuanju 2, Xilihu Road, Xili Street, Nanshan District, Shenzhen, Guangdong Province

Patentee before: Shenzhen Onoan Technology Co.,Ltd.

TR01 Transfer of patent right