CN106802149B - Rapid sequence image matching navigation method based on high-dimensional combination characteristics - Google Patents

Rapid sequence image matching navigation method based on high-dimensional combination characteristics

Info

Publication number
CN106802149B
CN106802149B (application CN201611078011.7A)
Authority
CN
China
Prior art keywords
real
image
time image
straight line
matching
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611078011.7A
Other languages
Chinese (zh)
Other versions
CN106802149A (en
Inventor
冷雪飞
巩哲
刘杨
茹江涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201611078011.7A priority Critical patent/CN106802149B/en
Publication of CN106802149A publication Critical patent/CN106802149A/en
Application granted granted Critical
Publication of CN106802149B publication Critical patent/CN106802149B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Analysis (AREA)
  • Navigation (AREA)

Abstract

The invention discloses a rapid sequence image matching navigation method based on high-dimensional combined features, belonging to the technical field of flight navigation. First, the first frame of a real-time image sequence is matched against a reference image. Then, two adjacent frames of real-time images are denoised and thinned, and stable branch feature points are extracted. Intersecting straight line pairs are constructed from the branch feature points to form high-dimensional combined features, which serve as matching primitives for determining the positional relationship between the two adjacent frames and, in turn, the position of the current frame relative to the reference image. Finally, a control command is formed to dynamically correct the accumulated error of the inertial navigation system. By exploiting sequence images, the method improves the accuracy and real-time performance of the navigation system, and the high-dimensional combined features give it strong robustness to rotation and scale change.

Description

Rapid sequence image matching navigation method based on high-dimensional combination characteristics
Technical Field
The invention relates to a sequence image matching navigation method, in particular to a rapid sequence image matching navigation method based on high-dimensional combined features constructed from branch feature points, and belongs to the technical field of flight navigation.
Background
Image matching is an extremely important technology in modern information processing. In a navigation system, image matching navigation uses airborne equipment to acquire scene images of the ground in real time and compares them with a reference digital map stored in the onboard computer to determine the position of the aircraft. However, differences in imaging equipment, climate, noise, and other imaging conditions cause the real-time image and the reference image to differ, so image matching navigation places high demands on the accuracy, robustness, and speed of the image matching algorithm.
Traditional single-image matching must process a large amount of image data, consumes considerable hardware time, and has difficulty meeting the real-time requirements of a navigation system; it is therefore usually used only as an auxiliary means to eliminate the accumulated error of long-running inertial navigation. Sequence image matching navigation has consequently become a research hotspot. It realizes navigation by matching consecutively captured images: two adjacent frames in the sequence come from the same sensor, are acquired under the same imaging conditions and within the same time period, and thus share the same noise distribution with little size change or geometric deformation. In general, consecutive frames have the same scale and no proportional change, only translation and a small rotation. The motion trend between two adjacent frames can therefore be estimated from their overlapping area and the flight parameters solved, which improves the efficiency and accuracy of matching against the reference image and makes autonomous image navigation feasible.
In feature-based image matching, features with geometric attributes such as points, lines, and regions are generally selected as matching primitives. Point features are relatively simple and easy to extract, but they are numerous and difficult to match. In July 2007, the paper by Leng Xuefei, Liu Jianye, and Xiong Zhi on a real-time image matching algorithm for navigation based on branch feature points (Acta Automatica Sinica, Vol. 33, No. 7, pp. 678-682) introduced branch feature points into image matching; a March 2015 master's thesis on accurate real-time image matching algorithms in navigation systems likewise used branch feature points and distinguished pseudo branch points, obtaining stable branch feature points and improving matching accuracy. Straight-line features are more accurate, more stable, and more representative of the information content of an image, but their extraction strongly influences the matching result, and no current algorithm extracts all line segments accurately. The invention therefore provides a rapid sequence image matching navigation method that constructs intersecting straight line pairs from the extracted stable branch feature points and matches them as primitives, which effectively improves the matching speed of the algorithm.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a rapid sequence image matching navigation method that accurately constructs straight-line features from branch feature points, thereby effectively improving the accuracy of the navigation system.
In order to solve the technical problem, the fast sequence image matching navigation method based on the high-dimensional combination features provided by the invention comprises the following steps:
1) acquiring current position information of the aircraft, and taking a reference image which is stored in advance and corresponds to the current position information of the aircraft;
2) acquiring a real-time image sequence of the current position of the aircraft, comparing the first frame of real-time image with a reference image, and determining the current position and attitude of the aircraft;
3) comparing the second frame of real-time image with the previous frame of real-time image, and respectively extracting edge binary features of two adjacent frames of real-time images to obtain edge binary feature images of the two adjacent frames of real-time images;
4) denoising and thinning the edges of the edge binary feature images of the two adjacent frames of real-time images, then extracting branch feature points respectively, removing pseudo branch feature points, and storing the positions of the branch feature points in two sets respectively;
5) constructing an intersecting straight line pair as a matching primitive by taking three branch points in a branch feature point set of a current frame real-time image and storing the intersecting straight line pair in a set 1; constructing an intersecting straight line pair as a matching primitive by taking three branch points in a branch feature point set of a previous frame of real-time image and storing the intersecting straight line pair in a set 2;
6) sequentially traversing the intersecting straight line pairs in the two sets and judging whether they match; if they match, adding 1 to the accumulator at the matching position; finally taking the position with the largest accumulator value as the matching position of the current frame real-time image and the previous frame real-time image, obtaining the distance from the center of the current frame real-time image to the center of the reference image, and forming a control instruction to correct the error of the inertial navigation system;
7) repeating the steps 3) to 6); when the frame number of the current real-time image is a multiple of 5 to 10, taking the current real-time image as the new first frame and repeating the steps 2) to 7), thereby eliminating the accumulated error of relative matching between sequence image frames.
In the invention, the step 5) comprises the following specific steps:
51) sequentially traversing the stable branch feature points of the current frame real-time image three at a time: a1(x1, y1), a2(x2, y2), a3(x3, y3);
52) selecting, from a1, a2, a3, the point with the smallest x coordinate as the vertex z(zx, zy) of the intersecting straight line pair; if the x coordinates are equal, selecting the point with the smallest y coordinate as the vertex z;
53) selecting, from a1, a2, a3, the point with the largest x coordinate as the first endpoint z1(z1x, z1y) describing the intersecting straight line pair; if the x coordinates are equal, selecting the point with the largest y coordinate as the first endpoint z1; the remaining point is taken as the second endpoint z2(z2x, z2y) describing the intersecting straight line pair;
54) letting
L1 = √((z1x − zx)² + (z1y − zy)²)
L2 = √((z2x − zx)² + (z2y − zy)²)
where L1 and L2 are the lengths of the two intersecting line segments of the pair; setting a specified threshold τ, and if L1 ≤ τ and L2 ≤ τ, storing the matching primitive formed by the three feature points in set 1; otherwise rejecting the primitive and re-entering the loop traversal;
55) for the previous frame of real-time image, all matching primitives meeting the condition are found in the same way and stored in the set 2.
In the invention, the step 6) comprises the following specific steps:
Let L1, L2, and α be the lengths of the two intersecting line segments and the vertex angle of an intersecting straight line pair in set 1, and L'1, L'2, and α' the corresponding quantities for a pair in set 2; ε1 and ε2 are thresholds. Take an array of the same size as the edge binary feature image for accumulation counting, and set the initial values of all accumulators in the array to 0;
61) if |α − α'| ≤ ε1, continuing with the judgment of step 62); otherwise terminating the matching;
62) if |L1 − L'1| ≤ ε2 and |L2 − L'2| ≤ ε2, continuing with the judgment of step 63); otherwise terminating the matching;
63) letting ξ1 = L1/L'1, ξ2 = L2/L'2, ξ3 = L1/L2, ξ4 = L'1/L'2; if ξ1 = ξ2 and ξ3 = ξ4 (within tolerance), the two intersecting straight line pairs are considered to match; ξ1 and ξ2 judge whether the two line segments of the pair have the same scale-change trend, and ξ3 and ξ4 judge whether the length relationship of the two line segments is consistent;
64) if the two pairs match, solving the formula
x' = x·cos θ − y·sin θ + x0
y' = x·sin θ + y·cos θ + y0
where (x, y) and (x', y') are the coordinates of corresponding points of the two matched intersecting straight line pairs, θ is the rotation angle between the two pairs, and x0 and y0 are the translation distances along the x and y axes; adding 1 to the accumulator count value at the corresponding position in the counting array;
65) repeating the steps 61) to 64) until all the intersecting straight line pairs in the set 1 and the set 2 are matched;
66) and taking the position with the largest counting value of the accumulator as the position of the center of the current frame real-time image relative to the center of the previous frame real-time image to obtain the distance between the center of the current frame real-time image and the center of the reference image, and forming a control instruction to carry out error correction on the inertial navigation system.
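The overall flow of steps 1) to 7) can be sketched as a driver loop. This is a minimal sketch, not the patent's implementation: `match_to_reference`, `match_adjacent`, and `correct_ins` are hypothetical callables standing in for the absolute matching of step 2), the pair-based inter-frame matching of steps 3) to 6), and the inertial-navigation correction, respectively.

```python
def navigate(frames, reference, match_to_reference, match_adjacent, correct_ins,
             rekey_interval=5):
    """Sequential matching loop with periodic re-anchoring to the reference image."""
    # Step 2): match the first frame against the reference image.
    offset = match_to_reference(frames[0], reference)
    for i in range(1, len(frames)):
        # Steps 3)-6): relative offset between adjacent frames.
        dx, dy = match_adjacent(frames[i], frames[i - 1])
        offset = (offset[0] + dx, offset[1] + dy)
        correct_ins(offset)  # form the INS correction command
        # Step 7): re-anchor against the reference every few frames so that
        # inter-frame matching errors do not accumulate.
        if i % rekey_interval == 0:
            offset = match_to_reference(frames[i], reference)
    return offset
```

Injecting the three matching functions keeps the loop testable and makes explicit which parts are inter-frame (cheap, every frame) and which are absolute (expensive, every `rekey_interval` frames).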
The invention has the following beneficial effects. (1) The proposed image matching navigation method constructs high-dimensional combined features for matching from stable branch points extracted from the denoised edge binary feature image. It combines the simplicity and easy extraction of point features with the stability, accuracy, and easy matching of straight-line features, greatly reduces the computational complexity of the algorithm, and eliminates the adverse effect of inaccurate line extraction on the matching result. At the same time, it fully exploits the characteristics of sequence images by matching adjacent frames, so the algorithm is fast and simple with good real-time performance; after a certain interval the current frame is matched against the reference image, eliminating the accumulated error of inter-frame matching and improving the accuracy of the navigation system. (2) The method can still obtain an accurate matching position in image regions where straight-line features are not obvious, so it has a wide application range, and the intersecting-straight-line-pair matching gives it strong robustness to rotation and scale change.
Drawings
FIG. 1 is a schematic diagram of sequence image matching;
FIG. 2 is a flow chart of the fast sequential image matching navigation method based on high-dimensional combined features according to the present invention;
FIG. 3 is a schematic diagram of branch feature points;
FIG. 4 is a diagram of a matching primitive.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
As shown in fig. 1, the sequence images are captured at equal time intervals. In general, two adjacent frames have a large overlapping area and are captured under the same conditions and within the same time period, so matching adjacent frames effectively avoids the gray-scale differences and large geometric deformations that arise between the reference image and the real-time image when capture time and conditions change.
As shown in FIG. 2, the fast sequential image matching navigation method based on high-dimensional combined features of the present invention comprises the following steps:
step 1: the method comprises the steps of obtaining current position information of an aircraft by using an inertial navigation system of aircraft equipment, and selecting a certain-size area corresponding to the current position information of the aircraft from a digital map database prestored by an aircraft onboard computer of the aircraft as a reference image A of a sequence image matching navigation system.
Step 2: the imaging sensor is used for acquiring a real-time image sequence of the current position of the aircraft, the first frame of real-time image B1 is compared with the reference image A, and the position of the center of the real-time image B1 relative to the reference image A is accurately determined by using the existing image matching technology.
And step 3: and comparing the second frame real-time image B2 with the previous frame real-time image B1, and respectively extracting the EDGE features of the real-time images B1 and B2 to obtain EDGE binary feature images EDGE _ B1 and EDGE _ B2 of the real-time images B1 and B2.
Step 4: perform edge denoising on the edge binary feature images EDGE_B1 and EDGE_B2, removing burrs, holes, and isolated 1-valued points, and thin the denoised edge binary images to obtain edge binary images EDGE_DT_1 and EDGE_DT_2 whose edge contours are one pixel wide. The specific steps use denoising and thinning techniques conventional in the field and are not described further here.
Step 5: extract the branch feature points in EDGE_DT_1 and EDGE_DT_2 (a schematic diagram of branch feature points is shown in FIG. 3) by traversing each whole image, find all branch feature points, and eliminate pseudo branch feature points to obtain the stable branch feature point sets POINT_B1 and POINT_B2. Extracting stable branch points is prior art and is not described further; see, for example, Acta Automatica Sinica, Vol. 33, No. 7, pp. 678-682, July 2007.
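As a concrete illustration of step 5, the sketch below detects candidate branch points on a thinned edge image using the common skeleton rule that a branch point has at least three skeleton neighbours. The function name and the simple 8-neighbourhood test are assumptions for illustration; the patent's pseudo-branch-point rejection step is not reproduced here.

```python
import numpy as np

def branch_points(skeleton):
    """Find candidate branch feature points in a thinned binary edge image.

    A pixel is treated as a branch point when it lies on the skeleton and has
    three or more skeleton pixels in its 8-neighbourhood. Returns (x, y) pairs.
    """
    sk = (np.asarray(skeleton) > 0).astype(np.uint8)
    points = []
    h, w = sk.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Sum over the 3x3 window, minus the centre pixel itself.
            if sk[y, x] and sk[y - 1:y + 2, x - 1:x + 2].sum() - 1 >= 3:
                points.append((x, y))
    return points
```

On a Y-shaped skeleton only the junction pixel passes the test; near dense crossings the naive neighbour count can over-fire, which is one reason the patent adds a pseudo-branch-point rejection step.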
Step 6: any three branch points in POINT_B1 are used to construct an intersecting straight line pair as a matching primitive; a schematic diagram of the matching primitive is shown in fig. 4. The specific steps are as follows:
(1) Sequentially traverse the stable branch feature points in POINT_B1 three at a time: a1(x1, y1), a2(x2, y2), a3(x3, y3).
(2) Select, from a1, a2, a3, the point with the smallest x coordinate as the vertex z(zx, zy) of the intersecting straight line pair; if the x coordinates are equal, select the point with the smallest y coordinate as the vertex z.
(3) Select, from a1, a2, a3, the point with the largest x coordinate as the first endpoint z1(z1x, z1y) describing the intersecting straight line pair; if the x coordinates are equal, select the point with the largest y coordinate as the first endpoint z1; the remaining point is taken as the second endpoint z2(z2x, z2y) describing the intersecting straight line pair.
(4) Let
L1 = √((z1x − zx)² + (z1y − zy)²)
L2 = √((z2x − zx)² + (z2y − zy)²)
where L1 and L2 are the lengths of the two intersecting line segments of the pair. Set τ = 20 (τ is a threshold that can be adjusted for the specific experimental environment; for example, where the distribution of feature points is sparse, τ can be increased appropriately). If L1 ≤ τ and L2 ≤ τ, store the matching primitive formed by the three feature points in the set HVT1; otherwise reject the primitive and re-enter the loop traversal.
For POINT_B2, find all matching primitives satisfying the condition in the same way and store them in the set HVT2.
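The selection rules of step 6 can be sketched as follows. `make_pair` and `build_primitive_set` are hypothetical names, and the law-of-cosines vertex angle is computed here because step 7 needs it; this is a sketch of the rules under those assumptions, not the patent's implementation.

```python
import math
from itertools import combinations

def make_pair(p1, p2, p3, tau=20.0):
    """Build one intersecting-straight-line-pair primitive from three branch points.

    The point with the smallest x (ties: smallest y) becomes the vertex z,
    the point with the largest x (ties: largest y) becomes endpoint z1, and
    the remaining point becomes endpoint z2. Returns None when either
    segment exceeds the length threshold tau (tau = 20 in the embodiment).
    """
    pts = [p1, p2, p3]
    z = min(pts, key=lambda p: (p[0], p[1]))   # vertex
    z1 = max(pts, key=lambda p: (p[0], p[1]))  # first endpoint
    (z2,) = [p for p in pts if p is not z and p is not z1]
    L1 = math.hypot(z1[0] - z[0], z1[1] - z[1])
    L2 = math.hypot(z2[0] - z[0], z2[1] - z[1])
    if L1 > tau or L2 > tau:
        return None  # reject over-long primitives
    # Vertex angle alpha via the law of cosines (used by the matching step).
    d = math.hypot(z1[0] - z2[0], z1[1] - z2[1])
    alpha = math.acos(max(-1.0, min(1.0, (L1 * L1 + L2 * L2 - d * d) / (2 * L1 * L2))))
    return z, z1, z2, L1, L2, alpha

def build_primitive_set(points, tau=20.0):
    """Enumerate all triples of branch points and keep the valid primitives."""
    return [q for a, b, c in combinations(points, 3)
            if (q := make_pair(a, b, c, tau)) is not None]
```

The tie-breaking rules make the vertex/endpoint assignment deterministic, so the same three points always yield the same primitive in both frames.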
Step 7: solve the vertex angle α of each intersecting straight line pair constructed from the stable branch feature point sets POINT_B1 and POINT_B2 by the law of cosines, and match the primitives in sets HVT1 and HVT2 according to the following rules. During matching, take an array D of the same size as the edge binary feature image for accumulation counting, and set the initial values of all accumulators in the array to 0.
Let L1, L2, and α be the lengths of the two intersecting line segments and the vertex angle of an intersecting straight line pair in set HVT1, and L'1, L'2, and α' the corresponding quantities for a pair in set HVT2.
(1) If |α − α'| ≤ ε1 (set ε1 = 2°; this threshold can be adjusted for the specific experimental environment, and when the geometric deformation of the image is larger, ε1 can be increased appropriately), continue with the judgment of the second step; otherwise terminate the matching.
(2) If |L1 − L'1| ≤ ε2 and |L2 − L'2| ≤ ε2 (set ε2 = 2; this threshold can be adjusted for the specific experimental environment, and when the geometric deformation of the image is larger, ε2 can be increased appropriately), continue with the judgment of the third step; otherwise terminate the matching.
(3) Let ξ1 = L1/L'1, ξ2 = L2/L'2, ξ3 = L1/L2, ξ4 = L'1/L'2. If ξ1 = ξ2 and ξ3 = ξ4 (within tolerance), the two intersecting straight line pairs are considered to match; here ξ1 and ξ2 judge whether the two line segments of the pair have the same scale-change trend, and ξ3 and ξ4 judge whether the length relationship of the two line segments is consistent.
(4) If the two pairs match, solve the formula
x' = x·cos θ − y·sin θ + x0
y' = x·sin θ + y·cos θ + y0
where (x, y) and (x', y') are the coordinates of corresponding points of the two matched intersecting straight line pairs, and θ, x0, y0 are the parameters to be determined: θ is the rotation angle between the two pairs, and x0 and y0 are the translation distances along the x and y axes. Therefore (x0, y0) is a possible position of the center of the current frame real-time image B2 relative to the center of the previous frame real-time image B1; add 1 to the accumulator at the same position in array D.
(5) Repeat operations (1) to (4) until all intersecting straight line pairs in sets HVT1 and HVT2 have been compared.
(6) Take the position with the largest accumulator count as the position of the center of the current frame real-time image B2 relative to the center of the previous frame real-time image B1, then calculate the distance by which the center of B2 deviates from the center of the reference image A, and form a control command to correct the error of the inertial navigation system.
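The matching rules (1) to (6) amount to an accumulator vote, sketched below. Each primitive is a tuple (z, z1, z2, L1, L2, alpha) of vertex, endpoints, segment lengths, and vertex angle. The ratio tolerance `eps_ratio`, the rotation estimate from segment directions, and the clamping of votes to the array bounds are assumptions filling in details the patent's formula images leave unrecoverable.

```python
import math
import numpy as np

def match_pairs(set1, set2, shape,
                eps_angle=math.radians(2), eps_len=2.0, eps_ratio=0.1):
    """Vote for the inter-frame offset by matching intersecting line pairs."""
    acc = np.zeros(shape, dtype=np.int32)
    for z, z1, z2, L1, L2, a in set1:
        for zp, zp1, zp2, Lp1, Lp2, ap in set2:
            if abs(a - ap) > eps_angle:        # rule (1): vertex angles agree
                continue
            if abs(L1 - Lp1) > eps_len or abs(L2 - Lp2) > eps_len:  # rule (2)
                continue
            xi1, xi2 = L1 / Lp1, L2 / Lp2      # rule (3): consistent ratios
            xi3, xi4 = L1 / L2, Lp1 / Lp2
            if abs(xi1 - xi2) > eps_ratio or abs(xi3 - xi4) > eps_ratio:
                continue
            # Rule (4): rotation from the vertex-to-endpoint directions,
            # then the translation that maps vertex z onto vertex zp.
            theta = (math.atan2(zp1[1] - zp[1], zp1[0] - zp[0])
                     - math.atan2(z1[1] - z[1], z1[0] - z[0]))
            x0 = zp[0] - (z[0] * math.cos(theta) - z[1] * math.sin(theta))
            y0 = zp[1] - (z[0] * math.sin(theta) + z[1] * math.cos(theta))
            ix, iy = int(round(x0)), int(round(y0))
            if 0 <= iy < shape[0] and 0 <= ix < shape[1]:
                acc[iy, ix] += 1               # vote in array D
    # Rules (5)-(6): the most-voted cell is the inter-frame offset.
    iy, ix = np.unravel_index(np.argmax(acc), acc.shape)
    return (ix, iy), acc
```

Because every candidate match casts one vote, a handful of wrong correspondences cannot outvote the consistent majority, which is what gives the scheme its robustness.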
Step 8: repeat steps 3 to 7; when the frame number of the current real-time image B2 is a multiple of 5 to 10, take the current real-time image B2 as the new first frame and repeat steps 2 to 7, eliminating the accumulated error of relative matching between sequence image frames and improving the accuracy of the sequence image matching navigation system.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention.

Claims (1)

1. A fast sequence image matching navigation method based on high-dimensional combined features is characterized by comprising the following steps:
1) acquiring current position information of the aircraft, and taking a reference image which is stored in advance and corresponds to the current position information of the aircraft;
2) acquiring a real-time image sequence of the current position of the aircraft, comparing the first frame of real-time image with a reference image, and determining the current position and attitude of the aircraft;
3) comparing the second frame of real-time image with the previous frame of real-time image, and respectively extracting edge binary features of two adjacent frames of real-time images to obtain edge binary feature images of the two adjacent frames of real-time images;
4) denoising and thinning the edges of the edge binary feature images of the two adjacent frames of real-time images, then extracting branch feature points respectively, removing pseudo branch feature points, and storing the positions of the branch feature points in two sets respectively;
5) constructing an intersecting straight line pair as a matching primitive by taking three branch points in a branch feature point set of a current frame real-time image and storing the intersecting straight line pair in a set 1; constructing an intersecting straight line pair as a matching primitive by taking three branch points in a branch feature point set of a previous frame of real-time image and storing the intersecting straight line pair in a set 2; the method comprises the following specific steps:
51) sequentially traversing the stable branch feature points of the current frame real-time image three at a time: a1(x1, y1), a2(x2, y2), a3(x3, y3);
52) selecting, from a1, a2, a3, the point with the smallest x coordinate as the vertex z(zx, zy) of the intersecting straight line pair; if the x coordinates are equal, selecting the point with the smallest y coordinate as the vertex z;
53) selecting, from a1, a2, a3, the point with the largest x coordinate as the first endpoint z1(z1x, z1y) describing the intersecting straight line pair; if the x coordinates are equal, selecting the point with the largest y coordinate as the first endpoint z1; the remaining point is taken as the second endpoint z2(z2x, z2y) describing the intersecting straight line pair;
54) letting
L1 = √((z1x − zx)² + (z1y − zy)²)
L2 = √((z2x − zx)² + (z2y − zy)²)
where L1 and L2 are the lengths of the two intersecting line segments of the pair; setting a specified threshold τ, and if L1 ≤ τ and L2 ≤ τ, storing the matching primitive formed by the three feature points in set 1; otherwise rejecting the primitive and re-entering the loop traversal;
55) for the previous frame of real-time image, all matched primitives meeting the conditions are found in the same method and stored in the set 2;
6) sequentially traversing the intersecting straight line pairs in the two sets and judging whether they match; if they match, adding 1 to the accumulator at the matching position; finally taking the position with the largest accumulator value as the matching position of the current frame real-time image and the previous frame real-time image, obtaining the distance from the center of the current frame real-time image to the center of the reference image, and forming a control instruction to correct the error of the inertial navigation system; the specific steps are as follows:
Let L1, L2, and α be the lengths of the two intersecting line segments and the vertex angle of an intersecting straight line pair in set 1, and L'1, L'2, and α' the corresponding quantities for a pair in set 2; ε1 and ε2 are thresholds. Take an array of the same size as the edge binary feature image for accumulation counting, and set the initial values of all accumulators in the array to 0;
61) if |α − α'| ≤ ε1, continuing with the judgment of step 62); otherwise terminating the matching;
62) if |L1 − L'1| ≤ ε2 and |L2 − L'2| ≤ ε2, continuing with the judgment of step 63); otherwise terminating the matching;
63) letting ξ1 = L1/L'1, ξ2 = L2/L'2, ξ3 = L1/L2, ξ4 = L'1/L'2; if ξ1 = ξ2 and ξ3 = ξ4 (within tolerance), the two intersecting straight line pairs are considered to match; ξ1 and ξ2 judge whether the two line segments of the pair have the same scale-change trend, and ξ3 and ξ4 judge whether the length relationship of the two line segments is consistent;
64) if the two pairs match, solving the formula
x' = x·cos θ − y·sin θ + x0
y' = x·sin θ + y·cos θ + y0
where (x, y) and (x', y') are the coordinates of corresponding points of the two matched intersecting straight line pairs, θ is the rotation angle between the two pairs, and x0 and y0 are the translation distances along the x and y axes; adding 1 to the accumulator count value at the corresponding position in the counting array;
65) repeating the steps 61) to 64) until all the intersecting straight line pairs in the set 1 and the set 2 are matched;
66) taking the position with the largest counting value of the accumulator as the position of the center of the current frame real-time image relative to the center of the previous frame real-time image to obtain the distance between the center of the current frame real-time image and the center of the reference image, and forming a control instruction to carry out error correction on the inertial navigation system;
7) repeating the steps 3) to 6); when the frame number of the current real-time image is a multiple of 5 to 10, taking the current real-time image as the new first frame and repeating the steps 2) to 6), thereby eliminating the accumulated error of relative matching between sequence image frames.
CN201611078011.7A 2016-11-29 2016-11-29 Rapid sequence image matching navigation method based on high-dimensional combination characteristics Active CN106802149B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611078011.7A CN106802149B (en) 2016-11-29 2016-11-29 Rapid sequence image matching navigation method based on high-dimensional combination characteristics


Publications (2)

Publication Number Publication Date
CN106802149A CN106802149A (en) 2017-06-06
CN106802149B true CN106802149B (en) 2020-02-21

Family

ID=58984590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611078011.7A Active CN106802149B (en) 2016-11-29 2016-11-29 Rapid sequence image matching navigation method based on high-dimensional combination characteristics

Country Status (1)

Country Link
CN (1) CN106802149B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108648172B (en) * 2018-03-30 2021-08-03 四川元匠科技有限公司 CT (computed tomography) map pulmonary nodule detection system based on 3D-Unet
CN110288620B (en) * 2019-05-07 2023-06-23 南京航空航天大学 Image matching method based on line segment geometric features and aircraft navigation method
CN116518981B (en) * 2023-06-29 2023-09-22 中国人民解放军国防科技大学 Aircraft visual navigation method based on deep learning matching and Kalman filtering

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5146228A (en) * 1990-01-24 1992-09-08 The Johns Hopkins University Coherent correlation addition for increasing match information in scene matching navigation systems
CN101046387A (en) * 2006-08-07 2007-10-03 南京航空航天大学 Scene matching method for raising navigation precision and simulating combined navigation system
CN102506867A (en) * 2011-11-21 2012-06-20 清华大学 SINS (strap-down inertia navigation system)/SMANS (scene matching auxiliary navigation system) combined navigation method based on Harris comer matching and combined navigation system
CN102829785A (en) * 2012-08-30 2012-12-19 中国人民解放军国防科学技术大学 Air vehicle full-parameter navigation method based on sequence image and reference image matching
CN104679011A (en) * 2015-01-30 2015-06-03 南京航空航天大学 Image matching navigation method based on stable branch characteristic point


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Navigation real-time image matching algorithm based on genetic algorithm; Leng Xuefei et al.; Journal on Communications; 2008-02-29; Vol. 29, No. 2; pp. 17-21 *


Similar Documents

Publication Publication Date Title
CN107301654B (en) Multi-sensor high-precision instant positioning and mapping method
CN107590827A An indoor mobile robot visual SLAM method based on Kinect
CN106683137B (en) Artificial mark based monocular and multiobjective identification and positioning method
CN108921895B (en) Sensor relative pose estimation method
CN108022262A A point cloud registration method based on point neighborhood centroid vector features
CN104933434A Image matching method combining LBP feature extraction and SURF feature extraction methods
CN108305277B (en) Heterogeneous image matching method based on straight line segments
CN106340010B A corner detection method based on second-order contour difference
CN110310331B (en) Pose estimation method based on combination of linear features and point cloud features
CN109711321B (en) Structure-adaptive wide baseline image view angle invariant linear feature matching method
CN106802149B (en) Rapid sequence image matching navigation method based on high-dimensional combination characteristics
CN108269274B (en) Image registration method based on Fourier transform and Hough transform
CN102938147A (en) Low-altitude unmanned aerial vehicle vision positioning method based on rapid robust feature
CN106780309A A synthetic aperture radar image stitching method
CN111739071A (en) Rapid iterative registration method, medium, terminal and device based on initial value
CN109872343B (en) Weak texture object posture tracking method, system and device
CN113838069B (en) Point cloud segmentation method and system based on flatness constraint
CN108447084B (en) Stereo matching compensation method based on ORB characteristics
CN109215118B (en) Incremental motion structure recovery optimization method based on image sequence
Kallasi et al. Computer vision in underwater environments: A multiscale graph segmentation approach
Qi et al. Research of image matching based on improved SURF algorithm
CN107730543A A fast iterative computation method for semi-dense stereo matching
CN109816710B (en) Parallax calculation method for binocular vision system with high precision and no smear
WO2023130842A1 (en) Camera pose determining method and apparatus
Xu et al. ESD-SLAM: An efficient semantic visual SLAM towards dynamic environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant