CN115049847B - ORB descriptor-based feature point local neighborhood feature matching method - Google Patents


Info

Publication number
CN115049847B
CN115049847B (application CN202210703901.1A)
Authority
CN
China
Prior art keywords
feature point
feature
image
neighborhood
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210703901.1A
Other languages
Chinese (zh)
Other versions
CN115049847A (en)
Inventor
袁建军 (Yuan Jianjun)
黄一鸣 (Huang Yiming)
章弘凯 (Zhang Hongkai)
鲍晟 (Bao Sheng)
杜亮 (Du Liang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN202210703901.1A priority Critical patent/CN115049847B/en
Publication of CN115049847A publication Critical patent/CN115049847A/en
Application granted granted Critical
Publication of CN115049847B publication Critical patent/CN115049847B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an ORB descriptor-based feature point local neighborhood feature matching method, comprising the following steps: S1: extracting feature points in an image f_p and an image f_q; S2: searching a neighborhood of each feature point in the image f_p and the image f_q obtained in S1; S3: selecting, for each feature point in the image f_p from S2, the corresponding adjacent feature point in the image f_q to form a putative feature point pair, the set of putative feature point pairs being S; S4: inputting the putative feature point pair set S. According to the ORB descriptor-based feature point local neighborhood feature matching method, establishing an initial grid matrix G_0 effectively increases the feature point matching speed; true-false matching is then performed on the feature point set and a secondary grid matrix G_1 is established; the feature point set is processed repeatedly through the secondary grid matrix G_1 and the mismatched point set is removed, which effectively reduces mismatches, improves the success rate of feature point matching, and further improves the robustness of various feature matching algorithms.

Description

ORB descriptor-based feature point local neighborhood feature matching method
Technical Field
The invention relates to the technical field of image recognition algorithms, in particular to a feature point local neighborhood feature matching method based on ORB descriptors.
Background
Image matching is an important task in image processing and is applied in scenarios such as image registration, target motion detection and tracking, target recognition and positioning, and map matching. Feature-based image matching methods are widely used because of their high matching efficiency and strong adaptability to the environment. When two frames of images are recognized, feature point matching between the two frames is a key link, and the accuracy and precision of feature point matching determine whether the visual task is realizable.
Patent publication CN109697692A, published 2019-04-30, discloses a feature matching method based on local structural similarity, which comprises the following steps: step 1, extracting features from the two images to be matched and performing initial matching; step 2, establishing a neighborhood affine coefficient matrix for the feature points; step 3, for each match in the initial matching set, calculating the difference of the neighborhood affine coefficient matrices of the feature points associated with the match; step 4, optimizing the neighborhood affine coefficient matrix to obtain the local structure difference degree; and step 5, setting a comparison threshold according to the local structure difference value of the feature points associated with each match, and determining the final feature matching pairs as the matching result of the images to be matched. That invention addresses the complex optimization process and slow convergence of the prior art and effectively improves matching efficiency.
The existing random sample consensus (RANSAC) method relies on a hypothesis model and becomes inefficient when the outlier ratio is large; when the image change is relatively complex it frequently mismatches local neighborhood feature point pairs, and the locality preserving matching method frequently produces mismatches in scenes with symmetric textures.
Disclosure of Invention
The invention aims to provide a feature point local neighborhood feature matching method based on ORB descriptors, which aims to solve the problem of incorrect matching during image matching.
In order to achieve the above object, the present invention provides the following technical solution: an ORB descriptor-based feature point local neighborhood feature matching method, comprising the following steps:
S1: detecting and describing the features of two frames of images f_p and f_q of the same or similar scene through the ORB fast feature point extraction and description algorithm, and extracting the feature points in images f_p and f_q;
S2: calculating pixel distances under the Euclidean distance, searching the neighborhood N_i of each feature point in the image f_p and the image f_q from step S1, setting a threshold r, and excluding feature points beyond the threshold r from the neighborhood;
S3: selecting, for each feature point in the image f_p from step S2, the corresponding adjacent feature point in the corresponding neighborhood in the image f_q to form a putative feature point pair, the set of putative feature point pairs being S;
S4: inputting the putative feature point pair set S into the algorithm, and simultaneously establishing two point sets: a truly matched feature point set Y and a mismatched feature point set M;
S5: searching the nearest-neighbor feature points of all feature points in the image f_p and the image f_q by the GMS grid method, and dividing non-overlapping initial grid matrices G_0 on the image f_p and the image f_q respectively;
S6: for the feature points from step S1, according to the grid matrix G_0 divided in step S5, searching the nearest-neighbor feature points a_j and b_j of the current feature points a_i and b_i in the grid matrix G_0 according to the Euclidean distance, and calculating whether the similarity under the Hamming distance is greater than a set threshold;
If yes, the two feature points have low similarity, the pair is a mismatch, and the current feature points are merged into the initially established mismatched feature point set M;
If not, entering S7;
S7: searching the 3 nearest-neighbor feature points around the currently compared feature point according to the Euclidean distance, and constructing a local neighborhood constraint by calculating the k-dimensional tests and the Hamming distances of the neighborhood;
S8: setting a comparison threshold, comparing the similarity of the local neighborhood constraints of the corresponding feature points in the image f_p and the image f_q by a similarity function, and judging whether the result of the similarity function is smaller than the threshold:
If yes, the feature point pair is a putative true match, and it is merged into the initially established truly matched point set Y;
If not, the feature point pair is a putative mismatch, and it is merged into the initially established mismatched point set M;
S9: in the image f_p and the image f_q respectively, searching for feature points for which the line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses an edge line of the grid G_0, recording the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j as L_1, recording the pixel distance between the current feature point a_i or b_i and the intersection point with the grid edge line of G_0 as L_2, and comparing L_1 and L_2, the comparison formula being L_1 > L_2:
If yes, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in another grid cell, the grid G_1 needs to be re-divided, and S10 is entered;
If not, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in the same grid cell, and the truly matched feature point set Y is output directly;
S10: defining a group of variable neighborhoods R through a variable neighborhood search algorithm, inputting the variable neighborhoods R into the algorithm, and re-dividing a second grid matrix G_1;
S11: repeating steps S7 to S10, and finally outputting the truly matched feature point set Y.
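As a concrete reference for step S1, the following is a minimal Python/OpenCV sketch of ORB feature extraction for the two frames f_p and f_q; the file names and the nfeatures parameter are illustrative placeholders, not values specified by the invention.

```python
import cv2

# Step S1 (illustrative): ORB fast feature point extraction and description
# for two frames f_p and f_q of the same or a similar scene.
img_p = cv2.imread("frame_p.png", cv2.IMREAD_GRAYSCALE)   # placeholder file names
img_q = cv2.imread("frame_q.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)                       # FAST corners + rBRIEF binary descriptors
kp_p, des_p = orb.detectAndCompute(img_p, None)            # keypoints and 256-bit (32-byte) descriptors
kp_q, des_q = orb.detectAndCompute(img_q, None)

print(len(kp_p), "features in f_p,", len(kp_q), "features in f_q")
```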
Preferably, in step S2, the neighborhood N_i is expressed as:
N_i = {n_j | n_j ≠ n_i, L(n_j, n_i) < r};
where n_i is the current feature point, n_j is a feature point different from n_i, L(n_j, n_i) is the pixel distance between the two feature points, and r is the comparison threshold of the set neighborhood.
Preferably, in step S3, the putative feature point pair set S is defined as:
S = {s_i = (p_i, q_i) | d(p_i, q_i) < d_H, i = 1, ..., N};
where s_i is a primary feature point pair correspondence in the set S, p_i is a feature point in image f_p, q_i is the corresponding feature point in image f_q, d(·,·) is the Hamming distance between the two feature descriptors, d_H is the distance threshold for a true correspondence, and N is the number of feature point pairs.
Preferably, in step S5, a series of grid matrix sizes is set, an algorithm execution speed test is performed, and the grid matrix size with the highest execution speed is finally selected as the size of the grid matrix G_0.
Preferably, in step S6, the similarity calculation formula is:
D(a_j, b_j) = Σ_k f_k(a_j) ⊕ f_k(b_j);
where ⊕ is the exclusive-or operation, and f_k(a_j) and f_k(b_j) are the k-dimensional tests of the ORB descriptors, which satisfy a Gaussian distribution, with the calculation formula:
f_k(a) = Σ_{1≤j≤k} 2^(j-1) τ(a; x_j, y_j);
where τ(a; x_j, y_j) is the binary intensity test based on the gray-scale image, with the formula:
τ(a; x_j, y_j) = 1 if p(x_j) < p(y_j), otherwise 0.
Preferably, in step S7, the local neighborhood is constructed by calculating the k-dimensional tests between the currently compared feature point in f_p and f_q and its 3 nearest-neighbor feature points, and the Hamming distances of the neighborhood.
Preferably, in step S10, the variable neighborhood R is specifically defined as:
R = {R_m | m = 1, 2, ..., M};
where R is the neighborhood of the current feature point, R_m is a preset group of neighborhoods, and M is the number of the preset neighborhoods.
In the above technical solution, the ORB descriptor-based feature point local neighborhood feature matching method provided by the invention has the following beneficial effects:
By establishing an initial grid matrix G_0, the feature point matching speed can be effectively increased; true-false matching is then performed on the feature point set and a secondary grid matrix G_1 is established; the feature point set is processed repeatedly through the secondary grid matrix G_1 and the mismatched point set is removed, which effectively reduces mismatches, improves the success rate of feature point matching, and further improves the robustness of various feature matching algorithms.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required for the embodiments are briefly described below. It is apparent that the drawings in the following description are only some embodiments described in the present application, and a person of ordinary skill in the art may derive other drawings from these drawings without inventive effort.
FIG. 1 shows a schematic flow diagram of the present invention;
FIG. 2 shows a partial algorithm flow chart of the present invention;
FIG. 3 shows a flowchart of the algorithm of steps S9-S11 of the present invention;
FIG. 4 illustrates a nearest neighbor local constraint schematic of the present invention;
FIG. 5 shows a schematic representation of a local neighborhood structure of the present invention;
FIG. 6 shows the initial matching result of two images from a pipeline dataset in an embodiment provided by the present invention;
FIG. 7 shows the result after the mismatched feature points are removed in the embodiment provided by the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
As shown in fig. 1-7, a feature point local neighborhood feature matching method based on ORB descriptors includes the following steps:
S1: detecting and describing the features of two frames of images f_p and f_q of the same or similar scene through the ORB fast feature point extraction and description algorithm, and extracting the feature points in images f_p and f_q;
S2: calculating the pixel distance under the Euclidean distance, searching the neighborhood N i of each characteristic point in the image f p and the image f q in the step S1, setting a threshold r, excluding the characteristic points exceeding the threshold r from the neighborhood, and expressing the neighborhood N i as:
Ni={nj|nj≠ni,L(nj,ni)<r};
Wherein n i is the current feature point, n j is a feature point different from n i, L (n j,ni) is the pixel distance between two feature points, a threshold r is selected through multiple groups of experiments, the number of error matching removal is the largest under the threshold r, the largest number of correct matching can be reserved under the threshold r, and the accuracy and recall rate of the correct matching are both the best;
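As an illustration of this neighborhood definition, the sketch below collects, for each keypoint, the indices of all other keypoints whose Euclidean pixel distance is below r; the radius r = 40.0 in the usage comment is only an assumed placeholder for the experimentally selected threshold.

```python
import numpy as np

def neighborhoods(points: np.ndarray, r: float) -> list:
    """N_i = {n_j | n_j != n_i, L(n_j, n_i) < r} for an (N, 2) array of pixel coordinates."""
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)  # pairwise L(n_j, n_i)
    result = []
    for i in range(len(points)):
        idx = np.where((dist[i] < r) & (np.arange(len(points)) != i))[0]
        result.append(idx.tolist())
    return result

# usage with the ORB keypoints of step S1 (r is an assumed value):
# pts_p = np.float32([kp.pt for kp in kp_p]); N_p = neighborhoods(pts_p, r=40.0)
```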
S3: selecting each feature point in the image f p in the step S2 to form a pair of assumed feature points corresponding to adjacent feature points in the corresponding neighboring domain in the image f q, where the pair of assumed feature points is S, and the pair of assumed feature points is defined as:
wherein S i is the correspondence of the primary feature point pairs in the S set, p i is the feature point in image f p, q i is the feature point in image f q corresponding to image f p, d (,) is the hamming distance between two feature descriptors, d H is the true corresponding distance threshold, and N is the number of feature point pairs;
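A minimal sketch of building the putative pair set S: each feature point of f_p is matched to the feature point of f_q with the closest ORB descriptor under the Hamming distance, and only pairs below the distance threshold d_H are kept; d_H = 40 is an assumed placeholder value.

```python
import cv2

def putative_pairs(des_p, des_q, d_h=40):
    """Return index pairs (i, j) whose descriptor Hamming distance d(p_i, q_j) is below d_H."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_p, des_q)                  # nearest descriptor in f_q for each p_i
    return [(m.queryIdx, m.trainIdx) for m in matches if m.distance < d_h]

# S = putative_pairs(des_p, des_q)   # des_p, des_q from step S1
```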
S4: as shown in fig. 2, the putative feature point pair set S is input into the algorithm, and two point sets are simultaneously established: a truly matched feature point set Y and a mismatched feature point set M;
S5: searching the nearest-neighbor feature points of all feature points in the image f_p and the image f_q by the GMS grid method, and dividing non-overlapping initial grid matrices G_0 on the image f_p and the image f_q respectively; when searching the nearest-neighbor feature points, the global search is turned into a local search within the grid cells, which reduces complexity and improves the running speed of the algorithm; a series of grid matrix sizes is set, an algorithm execution speed test is performed, and the grid matrix size with the highest execution speed is finally selected as the size of the grid matrix G_0; in the embodiment provided by the invention, the initial grid matrix G_0 is defined as 8 × 9;
S6: searching the nearest neighbor feature point a j、bj of the current feature point a i、bi in the grid matrix G 0 according to the Euclidean distance for the feature point in the step S1 according to the grid matrix G 0 divided in the step S4 and the feature point extracted in the step S1, and calculating whether the similarity is larger than a set threshold value or not under the Hamming distance;
The characteristic point currently compared in the image f p is a i, the nearest neighbor characteristic point is a j, the characteristic point currently compared in the image f q is b i, the nearest neighbor characteristic point is b j, and the similarity of the two is calculated according to a similarity formula ;
Wherein is an exclusive-or operation, f k(aj) and f k(bj) are k-dimensional tests of ORB descriptors, which satisfy gaussian distribution, and the calculation formula is:
Wherein τ (a; x j,yj) is a binary intensity test based on a gray scale map, the idea of the ORB feature point descriptor is to select a pixel point pair around a FAST corner in a certain mode, then compare the pixel intensities of the point pair, and the pixel intensity is 1 in the calculation of the pair of pixel point pairs, otherwise, is 0, and the formula is expressed as:
Wherein p (x j) is the pixel intensity of x j at x= [ u, v ], p (y j) is the pixel intensity of y j at y= [ w, t ], and [ u, v ] and [ w, t ] are the coordinates of pixel point pairs selected in a certain pattern around the FAST corner;
Setting a threshold lambda for similarity comparison, wherein the threshold lambda is selected through a plurality of groups of experimental experience, the number of error matching removal is the largest under the threshold lambda, and meanwhile, the maximum number of correct matching is reserved, and the accuracy and recall rate of the correct matching are both the best;
judging whether the value is larger than a threshold lambda;
If yes, the similarity of the two feature points is lower, the current feature points are combined into an initially set error matching feature point set M for error matching, namely M=M U-s i;
If not, entering S7;
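The binary intensity test τ and the XOR-based similarity can be written out directly; the sketch below is a didactic re-implementation (OpenCV's ORB descriptors already pack these bits, and their Hamming distance is the popcount of the byte-wise XOR, as shown in the last comment). The sampling pattern `pairs` stands in for the fixed point-pair pattern around the FAST corner and is an assumption of this sketch.

```python
import numpy as np

def tau(patch, x, y):
    """Binary intensity test tau(a; x_j, y_j): 1 if p(x_j) < p(y_j), otherwise 0."""
    return 1 if patch[x[1], x[0]] < patch[y[1], y[0]] else 0

def k_dim_test(patch, pairs):
    """k-dimensional test f_k(a): one bit per point pair (x_j, y_j) of the sampling pattern."""
    return np.array([tau(patch, x, y) for x, y in pairs], dtype=np.uint8)

def similarity(fa, fb):
    """Similarity under the Hamming distance: sum over k of f_k(a_j) XOR f_k(b_j)."""
    return int(np.count_nonzero(np.bitwise_xor(fa, fb)))

# Equivalent popcount for packed OpenCV ORB descriptors (rows of uint8):
# dist = int(np.unpackbits(np.bitwise_xor(des_p[i], des_q[j])).sum())
```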
S7: searching the 3 nearest-neighbor feature points around the currently compared feature point according to the Euclidean distance, and constructing a local neighborhood constraint by calculating the k-dimensional tests and the Hamming distances of the neighborhood;
the local neighborhood is constructed by calculating the k-dimensional tests between the currently compared feature point in the image f_p and the image f_q and its 3 nearest-neighbor feature points, and the Hamming distances of the neighborhood;
the three nearest-neighbor feature points around the feature point a_i are searched in the corresponding grid cell in f_p according to the Euclidean distance, namely m_a,0, m_a,1 and m_a,2; the three nearest-neighbor feature points around b_i are searched in the corresponding grid cell in f_q, namely m_b,0, m_b,1 and m_b,2; by calculating the k-dimensional tests and the Hamming distances of the neighborhood, local neighborhood constraints are constructed for a_i and b_i in f_p and f_q respectively, as shown in fig. 5, expressed as:
C(a_i) = {cost(a_i, m_a,j) | j = 0, 1, 2} and C(b_i) = {cost(b_i, m_b,j) | j = 0, 1, 2};
where a_i is the currently compared feature point of the putative feature point pair in f_p, m_a,j are the three nearest-neighbor feature points found in the neighborhood around the feature point a_i, and cost(·,·) is the similarity of the ORB descriptors, represented by the Hamming distance between the current feature point and the nearest-neighbor feature point;
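A sketch of the local neighborhood constraint around one feature point: its three nearest neighbors inside the same grid cell are selected, and cost(·,·) is taken as the Hamming distance between packed ORB descriptors. Sorting the resulting cost vector is only a convenience of this sketch so that the two constraints can be compared element-wise later.

```python
import numpy as np

def neighborhood_constraint(i, cell_indices, points, descriptors):
    """Cost vector {cost(a_i, m_a,j) | j = 0, 1, 2} for feature point i.

    cell_indices : indices of keypoints lying in the same G_0 cell as i.
    cost(., .)   : Hamming distance between the packed ORB descriptors.
    """
    others = sorted((j for j in cell_indices if j != i),
                    key=lambda j: np.linalg.norm(points[j] - points[i]))
    nn3 = others[:3]                                        # three nearest neighbors m_{.,0..2}
    costs = [int(np.unpackbits(np.bitwise_xor(descriptors[i], descriptors[j])).sum())
             for j in nn3]
    return sorted(costs)
```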
S8: setting a comparison threshold, comparing the similarity of the local neighborhood constraints of the corresponding feature points in the image f_p and the image f_q by a similarity function, and judging whether the result of the similarity function is smaller than the threshold;
for the local neighborhood constraints constructed in S7, the similarity of the local neighborhood constraints of the corresponding feature points in the image f_p and the image f_q is compared, and true matches and mismatches are distinguished by the similarity function F(a_i, b_i) of the local neighborhood constraints;
a threshold ε is set for the similarity function comparison through multiple groups of experiments, such that the number of removed mismatches is the largest while the largest number of correct matches is retained, and the precision and recall of correct matching are both optimal; the calculation result of the similarity function is then compared with the threshold to judge whether F(a_i, b_i) is smaller than the threshold ε; the similarity function value of a true match is small and that of a mismatch is large:
If yes, the feature point pair is a putative true match, and it is merged into the initially established truly matched point set Y, i.e., Y = Y ∪ s_i;
If not, the feature point pair is a putative mismatch, and it is merged into the initially established mismatched point set M, i.e., M = M ∪ s_i;
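One plausible realisation of the similarity function F(a_i, b_i), used here only for illustration, is the summed absolute difference between the two cost vectors of the local neighborhood constraints; both this form and the threshold eps = 30 are assumptions of this sketch, not values specified by the invention.

```python
def constraint_similarity(costs_a, costs_b):
    """Illustrative F(a_i, b_i): true matches should give a small value, mismatches a large one.
    The summed absolute difference of the cost vectors is an assumption of this sketch."""
    return sum(abs(ca - cb) for ca, cb in zip(costs_a, costs_b))

def classify(pair, costs_a, costs_b, Y, M, eps=30):
    """Merge the putative pair into Y (true match) or M (mismatch) by comparing F with eps."""
    if constraint_similarity(costs_a, costs_b) < eps:
        Y.add(pair)      # Y = Y ∪ {s_i}
    else:
        M.add(pair)      # M = M ∪ {s_i}
```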
S9: as shown in fig. 2, after the true-false matching of the putative feature point set in the previous steps, the point set is divided into two parts: the truly matched feature point set Y and the mismatched feature point set M. In this step, to address the problem that nearest-neighbor feature points near grid edges may fall into different grid cells and form mismatches, the mismatched point set M assigned in the previous step is first input; in the image f_p and the image f_q, feature points are searched for which the line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses a grid edge line of G_0; the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j is recorded as L_1, the pixel distance between the current feature point a_i or b_i and the intersection point with the grid edge line of G_0 is recorded as L_2, and L_1 and L_2 are compared, the comparison formula being L_1 > L_2:
If yes, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in another grid cell, the grid G_1 needs to be re-divided, and S10 is entered;
If not, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in the same grid cell, and the truly matched feature point set Y is output directly;
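The check of step S9 compares L_1 (distance to the nearest neighbor) with L_2 (distance to where that segment meets the G_0 cell border); L_1 > L_2 exactly when the two endpoints fall in different cells, so the sketch below implements the test by comparing cell indices instead of computing the intersection point explicitly.

```python
def cell_of(pt, cell_w, cell_h, rows, cols):
    """Grid cell (row, col) of a pixel coordinate under the G_0 layout."""
    x, y = pt
    return (min(int(y // cell_h), rows - 1), min(int(x // cell_w), cols - 1))

def neighbor_in_other_cell(p_i, p_j, cell_w, cell_h, rows=8, cols=9):
    """True when the segment from p_i to its nearest neighbor p_j crosses a G_0 cell border
    (equivalently, when L_1 > L_2), i.e. the neighbor lies in a different grid cell."""
    return cell_of(p_i, cell_w, cell_h, rows, cols) != cell_of(p_j, cell_w, cell_h, rows, cols)
```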
S10: using the idea of variable neighborhood search, a group of variable neighborhoods R is defined and input into the algorithm, and a second grid matrix G_1 is re-divided;
the variable neighborhood R is specifically defined as:
R = {R_m | m = 1, 2, ..., M};
where R is the neighborhood of the current feature point, R_m is a preset group of neighborhoods, and M is the number of the preset neighborhoods;
after the group of neighborhoods is set, the neighborhood defined in S2 is replaced by the group of variable neighborhoods, which is input into the algorithm; the size of the second grid matrix is finally determined through multiple groups of experimental tests, such that the number of removed mismatches is the largest while the largest number of correct matches is retained, and the precision and recall of correct matching are both optimal; in the embodiment provided by the invention, the size of the second grid matrix G_1 is 6 × 7;
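A sketch of step S10: a preset group of variable neighborhoods R = {R_m} is tried in turn while the points of M are re-examined on the re-divided 6 × 7 grid G_1 (build_grid is the helper from the G_0 sketch above); the radii listed below are assumed example values, not the experimentally selected group.

```python
# Assumed example of the preset group of variable neighborhoods R = {R_1, ..., R_M} (pixel radii).
R = [30.0, 50.0, 80.0]

def variable_neighborhood_passes(points, img_w, img_h):
    """Re-divide the grid as the second matrix G_1 (6 x 7) and yield one radius R_m per pass,
    so that steps S7-S8 can be repeated on the mismatched set M with each neighborhood size."""
    cells_g1, cw, ch = build_grid(points, img_w, img_h, rows=6, cols=7)  # build_grid from the G_0 sketch
    for r_m in R:
        yield r_m, cells_g1, (cw, ch)
```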
S11: repeating steps S7 to S10, and finally outputting the truly matched feature point set Y;
the correspondence of a primary feature point pair in the putative mismatched feature point set M output by S9 is denoted h_i; for every h_i ∈ M, the nearest-neighbor feature point of each feature point of the pair is found; the currently compared feature point in f_p is denoted p_i with nearest-neighbor feature point p_j, and the currently compared feature point in f_q is denoted q_i with nearest-neighbor feature point q_j;
the similarity of the two is calculated according to the formula D(p_j, q_j) = Σ_k f_k(p_j) ⊕ f_k(q_j);
a threshold λ is set for similarity comparison; the threshold λ is selected through multiple groups of experiments such that the number of removed mismatches is the largest while the largest number of correct matches is retained, and the precision and recall of correct matching are both optimal; it is judged whether the similarity is greater than λ:
If yes, the two feature points have low similarity and the pair is a mismatch, and it is merged into the initially established mismatched feature point set M, i.e., M = M ∪ s_i;
If not, the three nearest-neighbor feature points around the feature point p_i are searched in the corresponding grid cell in f_p, namely m_p,0, m_p,1 and m_p,2, and the three nearest-neighbor feature points around q_i are searched in the corresponding grid cell in f_q, namely m_q,0, m_q,1 and m_q,2; by calculating the k-dimensional tests and the Hamming distances of the neighborhood, local neighborhood constraints are constructed for the two feature points in the two images, expressed as:
C(p_i) = {cost(p_i, m_p,j) | j = 0, 1, 2} and C(q_i) = {cost(q_i, m_q,j) | j = 0, 1, 2};
the similarity function F(p_i, q_i) of the two local neighborhood constraints is then calculated;
a threshold ε is set for the similarity function comparison; the threshold ε is selected through multiple groups of experiments such that the number of removed mismatches is the largest while the largest number of correct matches is retained, and the precision and recall of correct matching are both optimal; the calculation result of the similarity function is then compared with the threshold to judge whether F(p_i, q_i) is smaller than the threshold ε; the similarity function value of a true match is small and that of a mismatch is large:
If yes, the local neighborhood structures around the feature points in the two images are a true match, and the pair is merged into the initially established truly matched point set Y, i.e., Y = Y ∪ s_i;
If not, the local neighborhood structures around the feature points in the two images are a mismatch, and the pair is merged into the initially established mismatched point set M, i.e., M = M ∪ s_i.
The invention was tested on a pipeline dataset; the white circles in fig. 6 represent feature points, the white lines represent true feature point matching pairs between the two images, and the black lines represent mismatched feature point pairs between the two images; the white circles and lines in fig. 7 represent the result of the two images after the mismatches are removed.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and embodiments of the present invention have been described in detail with reference to specific examples, which are provided to facilitate understanding of the method and core ideas of the present invention; meanwhile, as those skilled in the art will have variations in the specific embodiments and application scope in accordance with the ideas of the present invention, the present description should not be construed as limiting the present invention in view of the above.
An embodiment of the present application further provides an electronic device capable of implementing all the steps of the method in the above embodiment, the electronic device specifically comprising:
A processor (processor), a memory (memory), a communication interface (Communications Interface), and a bus;
The processor, the memory and the communication interface complete communication with each other through the bus;
The processor is configured to invoke the computer program in the memory; when executing the computer program, the processor implements all the steps of the method in the foregoing embodiment, namely steps S1 to S11 of the ORB descriptor-based feature point local neighborhood feature matching method described above, from extracting the feature points of the images f_p and f_q through to outputting the truly matched feature point set Y.
An embodiment of the present application further provides a computer-readable storage medium capable of implementing all the steps of the method in the above embodiment; the computer-readable storage medium stores a computer program which, when executed by a processor, implements all the steps of the method in the above embodiment, namely steps S1 to S11 of the ORB descriptor-based feature point local neighborhood feature matching method described above, from extracting the feature points of the images f_p and f_q through to outputting the truly matched feature point set Y.
In this specification, each embodiment is described in a progressive manner; identical and similar parts of the embodiments refer to each other, and each embodiment mainly describes its differences from the other embodiments. In particular, for the hardware-plus-program class embodiments, since they are substantially similar to the method embodiments, the description is relatively simple and reference is made to the description of the method embodiments where relevant.

Although the present description provides method operational steps as described in the examples or flowcharts, more or fewer operational steps may be included based on conventional or non-inventive means. The order of steps recited in the embodiments is merely one way of performing the steps and does not represent a unique order of execution. When implemented in an actual device or end product, the steps may be executed sequentially or in parallel as illustrated by the embodiments or figures (for example, in a parallel-processor or multi-threaded processing environment, or even in a distributed data processing environment).

The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, it is not excluded that additional identical or equivalent elements may be present in a process, method, article, or apparatus that comprises a described element.

For convenience of description, the above devices are described as being functionally divided into various modules. Of course, when implementing the embodiments of the present disclosure, the functions of the modules may be implemented in one or multiple pieces of software and/or hardware, or a module that implements one function may be implemented by a combination of multiple sub-modules or sub-units. The above-described apparatus embodiments are merely illustrative; for example, the division of the units is merely a logical function division, and there may be other divisions in actual implementation, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed between components may be an indirect coupling or communication connection via some interfaces, devices or units, and may be in electrical, mechanical or other form.
It will be appreciated by those skilled in the art that embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, the present specification embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present description embodiments may take the form of a computer program product on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein. In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system embodiments, since they are substantially similar to method embodiments, the description is relatively simple, as relevant to see a section of the description of method embodiments. In the description of the present specification, a description referring to terms "one embodiment," "some embodiments," "examples," "specific examples," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the embodiments of the present specification.
In this specification, schematic representations of the above terms are not necessarily directed to the same embodiment or example. Furthermore, the different embodiments or examples described in this specification and the features of the different embodiments or examples may be combined and combined by those skilled in the art without contradiction. The foregoing is merely an example of an embodiment of the present disclosure and is not intended to limit the embodiment of the present disclosure. Various modifications and variations of the illustrative embodiments will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, or the like, which is within the spirit and principles of the embodiments of the present specification, should be included in the scope of the claims of the embodiments of the present specification.

Claims (7)

1. An ORB descriptor-based feature point local neighborhood feature matching method, characterized by comprising the following steps:
S1: detecting and describing the features of two frames of images f_p and f_q of the same or similar scene through the ORB fast feature point extraction and description algorithm, and extracting the feature points in images f_p and f_q;
S2: calculating pixel distances under the Euclidean distance, searching the neighborhood N_i of each feature point in the image f_p and the image f_q from step S1, setting a threshold r, and excluding feature points beyond the threshold r from the neighborhood;
S3: selecting, for each feature point in the image f_p from step S2, the corresponding adjacent feature point in the corresponding neighborhood in the image f_q to form a putative feature point pair, the set of putative feature point pairs being S;
S4: inputting the putative feature point pair set S into the algorithm, and simultaneously establishing two point sets: a truly matched feature point set Y and a mismatched feature point set M;
S5: searching the nearest-neighbor feature points of all feature points in the image f_p and the image f_q by the GMS grid method, and dividing non-overlapping initial grid matrices G_0 on the image f_p and the image f_q respectively;
S6: for the feature points from step S1, according to the grid matrix G_0 divided in step S5, searching the nearest-neighbor feature points a_j and b_j of the current feature points a_i and b_i in the grid matrix G_0 according to the Euclidean distance, and calculating whether the similarity under the Hamming distance is greater than a set threshold;
If yes, the two feature points have low similarity, the pair is a mismatch, and the current feature points are merged into the initially established mismatched feature point set M;
If not, entering S7;
S7: searching the 3 nearest-neighbor feature points around the currently compared feature point according to the Euclidean distance, and constructing a local neighborhood constraint by calculating the k-dimensional tests and the Hamming distances of the neighborhood;
S8: setting a comparison threshold, comparing the similarity of the local neighborhood constraints of the corresponding feature points in the image f_p and the image f_q by a similarity function, and judging whether the result of the similarity function is smaller than the threshold:
If yes, the feature point pair is a putative true match, and it is merged into the initially established truly matched point set Y;
If not, the feature point pair is a putative mismatch, and it is merged into the initially established mismatched point set M;
S9: in the image f_p and the image f_q respectively, searching for feature points for which the line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses an edge line of the grid G_0, recording the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j as L_1, recording the pixel distance between the current feature point a_i or b_i and the intersection point with the grid edge line of G_0 as L_2, and comparing L_1 and L_2, the comparison formula being L_1 > L_2:
If yes, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in another grid cell, the grid G_1 needs to be re-divided, and S10 is entered;
If not, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies in the same grid cell, and the truly matched feature point set Y is output directly;
S10: defining a group of variable neighborhoods R through a variable neighborhood search algorithm, inputting the variable neighborhoods R into the algorithm, and re-dividing a second grid matrix G_1;
S11: repeating steps S7 to S10, and finally outputting the truly matched feature point set Y.
2. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S2, the neighborhood N_i is expressed as:
N_i = {n_j | n_j ≠ n_i, L(n_j, n_i) < r};
where n_i is the current feature point, n_j is a feature point different from n_i, L(n_j, n_i) is the pixel distance between the two feature points, and r is the comparison threshold of the set neighborhood.
3. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S3, the putative feature point pair set S is defined as:
S = {s_i = (p_i, q_i) | d(p_i, q_i) < d_H, i = 1, ..., N};
where s_i is a primary feature point pair correspondence in the set S, p_i is a feature point in image f_p, q_i is the corresponding feature point in image f_q, d(·,·) is the Hamming distance between the two feature descriptors, d_H is the distance threshold for a true correspondence, and N is the number of feature point pairs.
4. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S5, a series of grid matrix sizes is set, an algorithm execution speed test is performed, and the grid matrix size with the highest execution speed is finally selected as the size of the grid matrix G_0.
5. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S8, the similarity calculation formula is:
D(a_j, b_j) = Σ_k f_k(a_j) ⊕ f_k(b_j);
where ⊕ is the exclusive-or operation, and f_k(a_j) and f_k(b_j) are the k-dimensional tests of the ORB descriptors, which satisfy a Gaussian distribution, with the calculation formula:
f_k(a) = Σ_{1≤j≤k} 2^(j-1) τ(a; x_j, y_j);
where τ(a; x_j, y_j) is the binary intensity test based on the gray-scale image, with the formula:
τ(a; x_j, y_j) = 1 if p(x_j) < p(y_j), otherwise 0.
6. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S7, the local neighborhood is constructed by calculating the k-dimensional tests between the currently compared feature point in f_p and f_q and its 3 nearest-neighbor feature points, and the Hamming distances of the neighborhood.
7. The ORB descriptor-based feature point local neighborhood feature matching method according to claim 1, characterized in that in step S10, the variable neighborhood R is specifically defined as:
R = {R_m | m = 1, 2, ..., M};
where R is the neighborhood of the current feature point, R_m is a preset group of neighborhoods, and M is the number of the preset neighborhoods.
CN202210703901.1A 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method Active CN115049847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210703901.1A CN115049847B (en) 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210703901.1A CN115049847B (en) 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method

Publications (2)

Publication Number Publication Date
CN115049847A CN115049847A (en) 2022-09-13
CN115049847B true CN115049847B (en) 2024-04-16

Family

ID=83163829

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210703901.1A Active CN115049847B (en) 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method

Country Status (1)

Country Link
CN (1) CN115049847B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697692A (en) * 2018-12-29 2019-04-30 安徽大学 One kind being based on the similar feature matching method of partial structurtes
WO2021035988A1 (en) * 2019-08-30 2021-03-04 长安大学 Method and apparatus for quickly matching and extracting feature of unmanned aerial vehicle visual image
CN110675437A (en) * 2019-09-24 2020-01-10 重庆邮电大学 Image matching method based on improved GMS-ORB characteristics and storage medium
WO2021212874A1 (en) * 2020-04-24 2021-10-28 平安科技(深圳)有限公司 Palm print mismatching point elimination method, apparatus, and device, and storage medium
WO2022002039A1 (en) * 2020-06-30 2022-01-06 杭州海康机器人技术有限公司 Visual positioning method and device based on visual map

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ORB-SLAM3: An accurate open-source library for visual, visual-inertial and multimap SLAM; C. Campos et al.; IEEE Transactions on Robotics; 2021; Vol. 37, No. 6; pp. 1874-1890 *
An improved ORB feature matching algorithm; Liao Hongzhen et al.; Journal of Beijing University of Aeronautics and Astronautics; 2021; Vol. 47, No. 10; pp. 2149-2154 *
A survey of image feature detection and matching methods; Tang Can; Tang Lianggui; Liu Bo; Journal of Nanjing University of Information Science & Technology (Natural Science Edition); 2020-05-28 (No. 03); pp. 261-273 *
A fast feature point matching method based on improved grid partition statistics; Chen Fangjie et al.; Computer Measurement & Control; 2019; Vol. 27, No. 08; pp. 231-235 *

Also Published As

Publication number Publication date
CN115049847A (en) 2022-09-13

Similar Documents

Publication Publication Date Title
US9619733B2 (en) Method for generating a hierarchical structured pattern based descriptor and method and device for recognizing object using the same
CN109697692B (en) Feature matching method based on local structure similarity
CN109146963B (en) Image position offset detection method based on rapid feature matching
KR101618996B1 (en) Sampling method and image processing apparatus for estimating homography
CN105160686B (en) A kind of low latitude various visual angles Remote Sensing Images Matching Method based on improvement SIFT operators
JP6997369B2 (en) Programs, ranging methods, and ranging devices
CN105809651A (en) Image saliency detection method based on edge non-similarity comparison
CN109740606A (en) A kind of image-recognizing method and device
CN104867137A (en) Improved RANSAC algorithm-based image registration method
CN108256454B (en) Training method based on CNN model, and face posture estimation method and device
CN107038432B (en) Fingerprint image direction field extraction method based on frequency information
CN114581646A (en) Text recognition method and device, electronic equipment and storage medium
CN111241924A (en) Face detection and alignment method and device based on scale estimation and storage medium
CN112365511A (en) Point cloud segmentation method based on overlapped region retrieval and alignment
CN110163894B (en) Sub-pixel level target tracking method based on feature matching
CN112102384A (en) Non-rigid medical image registration method and system
CN112330699B (en) Three-dimensional point cloud segmentation method based on overlapping region alignment
CN115049847B (en) ORB descriptor-based feature point local neighborhood feature matching method
CN110969640A (en) Video image segmentation method, terminal device and computer-readable storage medium
CN113704276A (en) Map updating method and device, electronic equipment and computer readable storage medium
CN116403010A (en) Medical image matching method based on FAST algorithm
CN112801045B (en) Text region detection method, electronic equipment and computer storage medium
US20230360262A1 (en) Object pose recognition method based on triangulation and probability weighted ransac algorithm
CN104021534A (en) Shredded paper splicing method
CN114022434A (en) Automatic extraction method and system for upper and lower lines of guardrail

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant