CN115049847A - Characteristic point local neighborhood characteristic matching method based on ORB descriptor - Google Patents

Characteristic point local neighborhood characteristic matching method based on ORB descriptor

Info

Publication number
CN115049847A
CN115049847A (application CN202210703901.1A)
Authority
CN
China
Prior art keywords
feature
image
feature point
point
characteristic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210703901.1A
Other languages
Chinese (zh)
Other versions
CN115049847B (en)
Inventor
袁建军
黄一鸣
章弘凯
鲍晟
杜亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Shanghai for Science and Technology
Original Assignee
University of Shanghai for Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Shanghai for Science and Technology filed Critical University of Shanghai for Science and Technology
Priority to CN202210703901.1A priority Critical patent/CN115049847B/en
Publication of CN115049847A publication Critical patent/CN115049847A/en
Application granted granted Critical
Publication of CN115049847B publication Critical patent/CN115049847B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a feature point local neighborhood feature matching method based on an ORB descriptor, which comprises the following steps. S1: extract the feature points of image f_p and image f_q. S2: search the neighborhood N_i of each feature point of image f_p and image f_q from S1. S3: pair each feature point of image f_p from S2 with the adjacent feature points in the corresponding neighborhood of image f_q to form hypothesized feature point pairs, the set of which is S. S4: input the hypothesized feature point pair set S. By establishing an initial grid matrix G_0, the method effectively increases the speed of feature point matching; it separates true from false matches in the feature point set, establishes a quadratic grid matrix G_1, processes the feature point set repeatedly through G_1, and removes the set of mismatched points. This effectively reduces mismatches and improves the success rate of feature point matching, thereby improving the robustness of feature matching algorithms.

Description

Characteristic point local neighborhood characteristic matching method based on ORB descriptor
Technical Field
The invention relates to the technical field of image recognition algorithms, in particular to a feature point local neighborhood feature matching method based on an ORB descriptor.
Background
Image matching is an important task in image processing, applied in occasions such as image registration, target motion detection and tracking, target identification and positioning, and map matching. Feature-based image matching methods are widely used owing to their high matching efficiency and strong environmental adaptability. When two images are recognized, matching the feature points between them is a key link, and the accuracy and precision of feature point matching determine the realizability of the visual task.
The invention patent application with publication number CN109697692A, published 2019-04-30, discloses a feature matching method based on local structure similarity, comprising the following steps: step 1, performing feature extraction and initial matching on two images to be matched; step 2, establishing a neighborhood affine coefficient matrix of the feature points; step 3, calculating the difference of the neighborhood affine coefficient matrices of the feature points associated with each match in the initial matching set; step 4, optimizing the neighborhood affine coefficient matrix to obtain the degree of difference of the local structure; and step 5, setting a comparison threshold according to the local structure difference value of each matched and associated feature point, and determining the final feature matching pairs as the matching relation result of the images to be matched. That invention overcomes the complex optimization process and slow convergence of the prior art and effectively improves matching efficiency.
The existing random sample consensus (RANSAC) method depends on a hypothesis model and becomes inefficient when outliers account for a large proportion of the data; when image changes are relatively complex, local neighborhood feature point pairs are often matched incorrectly; and locality-preserving matching methods often produce false matches in scenes with symmetric texture.
Disclosure of Invention
The invention aims to provide a feature point local neighborhood feature matching method based on an ORB descriptor, so as to solve the problem of false matching during image matching.
In order to achieve the above purpose, the invention provides the following technical scheme: a feature point local neighborhood feature matching method based on ORB descriptors comprises the following steps:
s1: two frames of images f under the same or similar scenes are subjected to ORB (object oriented bounding box) rapid feature point extraction and description algorithm p And image f q Detecting and describing the characteristics thereof, and extracting an image f p And image f q The feature point of (1);
s2: calculating the pixel distance at the Euclidean distance and searching the image f in the step S1 p And image f q Neighborhood N of each feature point in i Setting a threshold r, and excluding the feature points exceeding the threshold r from the neighborhood;
s3: the image f in step S2 is selected p Where each feature point corresponds to an image f q The adjacent characteristic points in the corresponding adjacent domains form an assumed characteristic point pair, and the assumed characteristic point pair set is S;
s4: inputting an assumed characteristic point pair combination set S in the algorithm, and simultaneously establishing two point sets: a real matching feature point set Y and an error matching feature point set M;
s5: finding image f by GMS grid method p And image f q The nearest neighbor feature point of the sum of all feature points and in the image f p And image f q Divide the non-overlapping initial grid matrix G 0
S6: according to the grid matrix G divided in the step S4 0 And the characteristic points extracted in the step S1, for the characteristic points in the step S1, the grid matrix G is formed according to Euclidean distance 0 Searching for the current feature point a i 、b i Nearest neighbor feature point a of j 、b j And calculating whether the similarity is greater than a set threshold value or not under the Hamming distance;
if yes, the similarity of the two feature points is low and the match is false; the current feature points are merged into the initially set mismatch feature point set M;
otherwise, go to S7;
s7: searching 3 nearest neighbor feature points around the feature points currently being compared according to the Euclidean distance, and constructing local neighborhood constraint by calculating k-dimensional inspection and the Hamming distance of the neighborhood;
s8: setting a comparison threshold, comparing f according to the similarity function p And image f q Judging whether the result of the similarity function is smaller than the threshold value or not according to the local neighborhood constraint similarity of the corresponding characteristic points:
if so, the characteristic point pair is a characteristic point pair which is supposed to be actually matched, and the characteristic point pair is merged into an initially set actually matched point set Y;
if not, the characteristic point pair is assumed to be a characteristic point pair in error matching, and the characteristic point pair is merged into an initially set error matching point set M;
s9: searching for images f separately p And image f q Feature point a currently being compared with i 、b i With their respective nearest neighbor feature points a j 、b j A connecting line between and G 0 The feature points of the interlaced grid edge lines are the current feature points a i 、b i And its nearest neighbor feature point a j 、b j Pixel distance therebetween is noted as L 1 The current feature point a i 、b i And G 0 The pixel distance between the intersections of the grid edge lines is recorded as L 2 And comparing L 1 And L 2 The comparative formula is as follows:
L_1 > L_2;
if yes, the nearest neighbor a_j, b_j of the current feature point a_i, b_i lies within another grid cell, the grid G_1 must be re-divided, and the method proceeds to S10;
if not, the nearest neighbor a_j, b_j of the current feature point a_i, b_i lies within the same grid cell, and the true-match feature point set Y is output directly;
s10: defining a group of variable neighborhoods R by a variable neighborhood searching algorithm, inputting the variable neighborhoods R into the algorithm, and re-determining a secondary grid matrix G 1
S11: repeating the steps S7 to S10, and finally outputting the feature point set Y that is truly matched.
Preferably, in step S2, the neighborhood N_i is expressed as:

N_i = {n_j | n_j ≠ n_i, L(n_j, n_i) < r};

where n_i is the current feature point, n_j is a feature point different from n_i, L(n_j, n_i) is the pixel distance between the two feature points, and r is the comparison threshold of the set neighborhood.
Preferably, in step S3, the hypothesized feature point pair set S is defined as:

S = {s_i | s_i = (p_i, q_i), d(p_i, q_i) < d_H, i = 1, 2, …, N};

where s_i is the correspondence of one feature point pair within the set S, p_i is a feature point of image f_p, q_i is the feature point of image f_q corresponding to p_i in image f_p, d(·,·) is the Hamming distance between two feature descriptors, d_H is the distance threshold of a true correspondence, and N is the number of feature point pairs.
Preferably, in step S5, a series of grid matrix scales is set, an algorithm execution speed test is performed, and the grid matrix scale with the highest execution speed is finally selected as the size of grid matrix G_0.
Preferably, in step S6, the similarity calculation formula is:

D(a_j, b_j) = Σ_k f_k(a_j) ⊕ f_k(b_j);

where ⊕ is the XOR operation, and f_k(a_j) and f_k(b_j) are k-dimensional tests of the ORB descriptor, which satisfy a Gaussian distribution and are calculated as:

f_k(a) = Σ_{1≤j≤k} 2^(j−1) τ(a; x_j, y_j);

where τ(a; x_j, y_j) is the binary intensity test on the grayscale image, with the formula:

τ(a; x_j, y_j) = 1 if p(x_j) < p(y_j), and 0 otherwise.
preferably, in step S7, the local neighborhood structure method calculates f separately p And f q The hamming distance of k-dimensional test and its neighborhood between the feature point currently being compared and its nearest neighbor 3 feature points.
Preferably, in step S10, the variable neighborhood R is specifically defined as:

R ∈ {R_m | m = 1, 2, …, M};

where R is the neighborhood of the current feature point, R_m is a group of neighborhoods set in advance, and M is the number of neighborhoods in the group.
In the above technical solution, the feature point local neighborhood feature matching method based on the ORB descriptor provided by the present invention has the following beneficial effects: by establishing an initial grid matrix G_0, the method effectively increases the speed of feature point matching; it separates true from false matches in the feature point set, establishes a quadratic grid matrix G_1, processes the feature point set repeatedly through G_1, and removes the mismatched point set, which effectively reduces mismatches, improves the success rate of feature point matching, and thereby improves the robustness of feature matching algorithms.
Drawings
In order to more clearly illustrate the embodiments of the present application or technical solutions in the prior art, the drawings needed to be used in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments described in the present invention, and other drawings can be obtained by those skilled in the art according to the drawings.
FIG. 1 shows a schematic flow diagram of the present invention;
FIG. 2 shows a partial algorithm flow diagram of the present invention;
FIG. 3 shows a flowchart of the steps S9-S11 algorithm of the present invention;
FIG. 4 illustrates a nearest neighbor local constraint diagram of the present invention;
FIG. 5 is a diagram illustrating the local neighborhood structure of the present invention;
FIG. 6 is a graph illustrating the initial matching results of two images in a pipeline dataset in an embodiment provided by the present invention;
fig. 7 is a diagram illustrating a result of removing the mismatched feature points in the embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1-7, a feature point local neighborhood feature matching method based on ORB descriptors includes the following steps:
s1: two frames of images f under the same or similar scenes are subjected to ORB (object oriented bounding box) rapid feature point extraction and description algorithm p And image f q Detecting and describing the characteristics thereof, and extracting an image f p And image f q The feature point of (1);
s2: calculating the pixel distance at the Euclidean distance and searching the image f in the step S1 p And image f q Neighborhood N of each feature point in i Setting a threshold r, excluding the feature points exceeding the threshold r from the neighborhood, the neighborhood N i Expressed as:
N i ={n j |n j ≠n i ,L(n j ,n i )<r};
wherein n is i As the current feature point, n j Is equal to n i Different characteristic points, L (n) j ,n i ) For the pixel distance between two feature points, a threshold value r is selected through a plurality of groups of experiments, the number of wrong matching removal under the threshold value r is the largest, the number of correct matching can be kept to be the largest under the threshold value r, and the accuracy and the recall rate of the correct matching are the best;
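As an illustrative sketch (not part of the patent text), the neighborhood search of step S2 can be written in plain Python; the point list, the helper name and the radius value below are assumptions chosen for the example:

```python
from math import hypot

def neighborhood(points, i, r):
    """N_i = {n_j | n_j != n_i, L(n_j, n_i) < r}: indices of all feature
    points within pixel distance r of points[i], excluding i itself."""
    xi, yi = points[i]
    return [j for j, (xj, yj) in enumerate(points)
            if j != i and hypot(xj - xi, yj - yi) < r]

pts = [(0, 0), (3, 4), (10, 10), (1, 1)]
print(neighborhood(pts, 0, 6.0))  # -> [1, 3]
```

Points (3, 4) and (1, 1) lie within 6 pixels of the origin, while (10, 10) is excluded by the threshold r.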
s3: the image f in step S2 is selected p Where each feature point corresponds to an image f q The adjacent characteristic points in the corresponding neighborhood form an assumed characteristic point pair, the assumed characteristic point pair set is S, and the assumed characteristic point pair S is defined as:
Figure BDA0003705466890000061
wherein s is i For correspondence of pairs of primary characteristic points within the S set, p i As an image f p Characteristic point of (1), q i As an image f q Neutralization image f p The corresponding feature point, d (,) is the Hamming distance between two feature descriptors, d H N is the number of the characteristic point pairs as the actual corresponding distance threshold;
s4: as shown in fig. 2, a feature point pair set S is input in the algorithm, and two point sets are simultaneously established: a real matching feature point set Y and an error matching feature point set M;
s5: finding image f by GMS grid method p And image f q The nearest neighbor feature point of all feature points in the image f p And image f q Divide the non-overlapping initial grid matrix G 0 When searching nearest neighbor characteristic points, the global search can be changed into local search in grids, so that the complexity is reduced, the algorithm operation speed is increased, a series of grid matrix scales are set, the algorithm execution speed test is carried out, and finally, one grid matrix scale with the highest algorithm execution speed is selected as a grid matrix G 0 The size of the grid matrix of (2), as shown in fig. 4, in the embodiment provided by the present invention, the initial grid matrix G0 is defined as a size of 8 × 9;
s6: according to the grid matrix G divided in the step S4 0 And the characteristic points extracted in the step S1, for the characteristic points in the step S1, the grid matrix G is formed according to Euclidean distance 0 Searching for the current feature point a i 、b i Nearest neighbor feature point a of j 、b j And calculating whether the similarity is greater than a set threshold value or not under the Hamming distance;
is set in the image f p The feature point currently being compared is a i The nearest neighbor feature point is a j At f q The feature point currently being compared in (a) is (b) i Its nearest neighbor feature point is b j According to the similarity formula
D(a_j, b_j) = Σ_k f_k(a_j) ⊕ f_k(b_j)

the similarity of the two is calculated;
where ⊕ is the XOR operation, and f_k(a_j) and f_k(b_j) are k-dimensional tests of the ORB descriptor, which satisfy a Gaussian distribution and are calculated as:

f_k(a) = Σ_{1≤j≤k} 2^(j−1) τ(a; x_j, y_j);
where τ(a; x_j, y_j) is the binary intensity test on the grayscale image. The idea of the ORB feature point descriptor is to select pixel point pairs around the FAST corner in a certain pattern and then compare the pixel intensities of each pair: the test yields 1 when one pixel of the pair is more intense and 0 otherwise. The formula is expressed as:

τ(a; x_j, y_j) = 1 if p(x_j) < p(y_j), and 0 otherwise;

where p(x_j) is the pixel intensity at x_j = [u, v], p(y_j) is the pixel intensity at y_j = [w, t], and [u, v] and [w, t] are the coordinates of the pixel point pairs selected in a certain pattern around the FAST corner;
A threshold λ for the similarity comparison is set; the threshold λ is selected through multiple groups of experimental experience such that the number of false matches removed under it is largest while the number of correct matches is kept largest, and the precision and recall of correct matching are both best;
judgment of
Figure BDA0003705466890000072
Whether it is greater than a threshold λ;
if yes, the similarity of the two feature points is low and the match is false; the pair is merged into the initially set mismatch feature point set M, i.e. M = M ∪ s_i;
Otherwise, go to S7;
s7: searching 3 nearest neighbor feature points around the feature points currently being compared according to the Euclidean distance, and constructing local neighborhood constraint by calculating k-dimensional inspection and the Hamming distance of the neighborhood;
the local neighborhood construction method is to respectively calculate an image f p And image f q K-dimensional test between the feature points currently being compared and the 3 feature points nearest to the feature points and the Hamming distance of the neighborhood of the feature points;
at f p A is searched out in the corresponding grid according to Euclidean distance i Three nearest neighbor feature points around the feature point are respectively: m is a,0 、m a,1 And m a,2 . At f q B is searched out in the corresponding grid in i Three surrounding nearest neighbor feature points, each being m b,0 、m b,1 And m b,2 . By computing the Hamming distance of the k-dimensional test and its neighborhood, pair f p And f q A in i And b i Local neighborhood constraints are constructed separately, as shown in FIG. 5, and are expressed as:
Figure BDA0003705466890000073
and
Figure BDA0003705466890000074
where a_i is the feature point of f_p currently being compared in the hypothesized feature point pair, m_{a,j} are the three nearest-neighbor feature points found in the neighborhood of a_i, and cost(·,·) is the similarity of ORB descriptors, represented by the Hamming distance between the current feature point and a nearest-neighbor feature point;
s8: setting a comparison threshold, comparing f according to the similarity function p And image f q Judging whether the result of the similarity function is smaller than the threshold value or not according to the local neighborhood constraint similarity of the corresponding characteristic points;
contrast image f for the local neighborhood constraint constructed in S7 p And image f q And (3) distinguishing real matching and error matching through a similarity function according to the similarity of local neighborhood constraints of the corresponding feature points, and defining the similarity function of the local neighborhood constraints as:
F(a_i, b_i) = Σ_{j=0}^{2} |cost(a_i, m_{a,j}) − cost(b_i, m_{b,j})|;
setting a threshold value epsilon for comparing similar functions through a plurality of groups of experiments, wherein the threshold value epsilon is selected through a plurality of groups of experimental experiencesThe number of false matches removed is the largest at the threshold, while the number of correct matches is kept the largest, and the accuracy and recall rate of correct matches are both the best, and then the result of the similarity function is compared with the threshold, and F (a) is judged i ,b i ) If the similarity function value is smaller than the threshold epsilon, the similarity function value of the true match is smaller, and the similarity function value of the false match is larger, and the comparison formula is as follows:
Figure BDA0003705466890000082
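The constraint and the similarity function F appear in the source only as unrendered images, so the sketch below assumes a set-of-costs constraint and an accumulated absolute-difference form for F; these forms, the helper names and the threshold value are illustrative assumptions consistent with the surrounding text, not the patent's verified formulas:

```python
def cost(d1, d2):
    """cost(.,.): Hamming distance between two packed ORB-style descriptors."""
    return bin(d1 ^ d2).count("1")

def constraint(desc, center, neighbors):
    """Local neighborhood constraint (assumed form): costs from the current
    point to its three nearest neighbours."""
    return [cost(desc[center], desc[n]) for n in neighbors]

def similarity_F(c_a, c_b):
    """Assumed form of F(a_i, b_i): accumulated |cost difference| of the two
    constraints; small for a true match, large for a false one."""
    return sum(abs(u - v) for u, v in zip(c_a, c_b))

def classify(c_a, c_b, eps):
    """Route a putative pair into the true-match set Y or the mismatch set M."""
    return "Y" if similarity_F(c_a, c_b) < eps else "M"
```

Under this assumed form, geometrically and photometrically consistent neighborhoods yield a small F and the pair lands in Y.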
if yes, the hypothesized feature point pair is a true match, and it is merged into the initially set true-match point set Y, i.e. Y = Y ∪ s_i;
otherwise, the hypothesized feature point pair is a false match, and it is merged into the initially set mismatch point set M, i.e. M = M ∪ s_i;
S9: as shown in fig. 2, after the matching of the true and false feature points of the assumed feature point set in the previous step is completed, the point set is divided into two parts, which are: a set of true matching feature points Y and a set of false matching feature points M. In the step, a solution is taken to the problem that the nearest neighbor feature points close to the edge of the grid possibly fall into different grids to form mismatching, firstly, a wrong point set M distributed in the previous step is input, and the images f are respectively searched p And image f q Feature point a currently being compared with i 、b i With their respective nearest neighbor feature points a j 、b j A connecting line between and G 0 The feature points of the interlaced grid edge lines are the current feature points a i 、b i And its nearest neighbor feature point a j 、b j Pixel distance therebetween is noted as L 1 The current feature point a i 、b i And G 0 The pixel distance between the intersections of the grid edge lines is recorded as L 2 And comparing L 1 And L 2 The comparative formula is as follows:
L_1 > L_2;
if yes, the nearest neighbor a_j, b_j of the current feature point a_i, b_i lies within another grid cell, the grid G_1 must be re-divided, and the method proceeds to S10;
if not, the nearest neighbor a_j, b_j of the current feature point a_i, b_i lies within the same grid cell, and the true-match feature point set Y is output directly;
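A hedged sketch of the edge-crossing test (the cell size, coordinates and helper names are assumptions for the example; the patent defines the test only geometrically):

```python
from math import hypot

def crosses_cell_boundary(p, q, cell_w, cell_h):
    """True if p and q fall in different grid cells, i.e. the segment p-q
    crosses a grid edge line (the situation in which L_1 > L_2)."""
    return (int(p[0] // cell_w), int(p[1] // cell_h)) != \
           (int(q[0] // cell_w), int(q[1] // cell_h))

def l1_l2(p, q, edge_x):
    """L_1: pixel distance p-q; L_2: distance from p to the crossing point of
    the segment with the vertical grid edge x = edge_x (assumes a crossing)."""
    l1 = hypot(q[0] - p[0], q[1] - p[1])
    t = (edge_x - p[0]) / (q[0] - p[0])  # fraction of the segment before the edge
    return l1, t * l1
```

When the neighbor sits beyond the first edge crossed, L_1 exceeds L_2, which is exactly the condition that triggers the re-division in S10.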
s10: defining a group of variable neighborhoods R by using the idea of variable neighborhoods, inputting the variable neighborhoods R into an algorithm, and re-determining a second grid matrix G 1
The variable neighborhood R is specifically defined as:
R ∈ {R_m | m = 1, 2, …, M};
where R is the neighborhood of the current feature point, R_m is a group of neighborhoods set in advance, and M is the number of neighborhoods in the group;
after a set of neighborhoods is set, the neighborhood defined in S2 becomes the neighborhood
N_i = {n_j | n_j ≠ n_i, L(n_j, n_i) < R}, R ∈ {R_m | m = 1, 2, …, M};

This neighborhood is input into the algorithm, and after multiple groups of experimental tests the scale of the quadratic grid matrix is finally determined as the one under which the number of false matches removed is largest while the number of correct matches is kept largest, with the best precision and recall of correct matching; the scale of G_1 is 6 × 7;
s11: repeating the steps S7 to S10, and finally outputting a feature point set Y which is truly matched;
let the correspondence of the primary feature point pair in the assumed mismatching feature point set M output at S9 be h i For all h i Finding out respective nearest neighbor characteristic points of characteristic point pairs belonging to the M, and setting the nearest neighbor characteristic points at f p Of the characters currently being comparedCharacteristic point is p i Its nearest neighbor feature point is p j ,f q The feature point currently being compared is q i Its nearest neighbor feature point is q j
According to the formula
D(p_j, q_j) = Σ_k f_k(p_j) ⊕ f_k(q_j)

the similarity of the two is calculated;
A threshold λ for similarity comparison is set through multiple groups of experiments: under it the number of false matches removed is largest, the number of correct matches is kept largest, and the precision and recall of correct matching are both best; it is then judged whether
D(p_j, q_j) > λ:
if yes, the similarity of the two feature points is low and the match is false; the pair is merged into the initially set mismatch feature point set M, i.e. M = M ∪ h_i;
otherwise, the three nearest-neighbor feature points around p_i are searched out within the corresponding grid cell of f_p: m_{p,0}, m_{p,1} and m_{p,2}; and the three nearest-neighbor feature points around q_i within the corresponding grid cell of f_q: m_{q,0}, m_{q,1} and m_{q,2}. Local neighborhood constraints are constructed for the two feature points in the two images by calculating the k-dimensional tests and the Hamming distance of the neighborhood, expressed as:
C(p_i) = {cost(p_i, m_{p,0}), cost(p_i, m_{p,1}), cost(p_i, m_{p,2})}

and

C(q_i) = {cost(q_i, m_{q,0}), cost(q_i, m_{q,1}), cost(q_i, m_{q,2})};
then, the similarity function of the two local neighborhood constraints is calculated, and the formula is as follows:
F(p_i, q_i) = Σ_{j=0}^{2} |cost(p_i, m_{p,j}) − cost(q_i, m_{q,j})|;
setting a threshold epsilon for comparing similar functions, wherein the threshold epsilon is selected through a plurality of groups of experimental experiences, the number of wrong matching removal under the threshold is the largest, the number of correct matching is kept the largest, the accuracy and the recall rate of the correct matching are both the best, and then comparing the calculation result of the similar functions with the threshold to judge: f (p) i ,q i ) If the similarity function value is smaller than the threshold epsilon, the similarity function value of the true match is smaller, and the similarity function value of the false match is larger, and the comparison formula is as follows:
F(p_i, q_i) < ε;
if yes, the local neighborhood structures around the feature points in the two images are a true match, and the pair is merged into the initially set true-match point set Y, i.e. Y = Y ∪ h_i;
otherwise, the local neighborhood structures around the feature points in the two images are a false match, and the pair is merged into the initially set mismatch point set M, i.e. M = M ∪ h_i.
The invention was tested on a pipeline dataset. White circles in fig. 6 represent feature points, white lines represent true feature point matching pairs between the two images, and black lines represent false feature point matching pairs; the white circles and connecting lines in fig. 7 represent the two images after mismatches have been removed.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principle and implementation of the invention have been explained herein through specific embodiments; the description of the embodiments is only intended to help understand the method and core idea of the invention. Meanwhile, a person skilled in the art may, following the idea of the present invention, modify the specific embodiments and their scope of application. In summary, the content of this specification should not be construed as limiting the present invention.
An embodiment of the present application further provides a specific implementation of an electronic device capable of carrying out all steps of the method in the foregoing embodiment, the electronic device specifically including:
a processor (processor), a memory (memory), a communication Interface (Communications Interface), and a bus;
the processor, the memory and the communication interface complete mutual communication through the bus;
the processor is configured to call a computer program in the memory, and the processor implements all the steps of the method in the above embodiments when executing the computer program, for example, the processor implements the following steps when executing the computer program:
S1: applying the ORB (Oriented FAST and Rotated BRIEF) fast feature point extraction and description algorithm to two frames of images f_p and f_q captured in the same or a similar scene, detecting and describing their features, and extracting the feature points of image f_p and image f_q;
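ORB's descriptor side is built from BRIEF-style binary intensity tests. The following is a minimal, self-contained sketch of such a descriptor — not the patent's implementation: the 31×31 patch size, the 256 random test pairs, and the fixed random seed are conventional ORB-style defaults assumed here for illustration.

```python
import random

random.seed(0)
# 256 random point-pair offsets inside a 31x31 patch (BRIEF-style; values assumed).
PAIRS = [(random.randint(-15, 15), random.randint(-15, 15),
          random.randint(-15, 15), random.randint(-15, 15)) for _ in range(256)]

def binary_descriptor(img, x, y):
    """img: 2D list of gray values. One bit per test:
    tau = 1 if the intensity at the first offset is below that at the second."""
    return [1 if img[y + ay][x + ax] < img[y + by][x + bx] else 0
            for ax, ay, bx, by in PAIRS]

# A random 64x64 gray image stands in for a real frame.
img = [[random.randint(0, 255) for _ in range(64)] for _ in range(64)]
desc = binary_descriptor(img, 32, 32)
```

In practice the extraction of step S1 would run such tests at FAST keypoints after orientation compensation; this sketch only shows the binary-test structure that the later Hamming-distance comparisons rely on.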
S2: computing pixel distances under the Euclidean metric, searching for the neighborhood N_i of each feature point in image f_p and image f_q from step S1, setting a threshold r, and excluding from the neighborhood the feature points whose distance exceeds r;
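The S2 neighborhood construction can be sketched as a radius query over pixel coordinates — a hedged illustration assuming plain (x, y) tuples and the strict inequality used in the N_i definition of claim 2:

```python
import math

def neighborhood(points, i, r):
    """N_i = {n_j | n_j != n_i, L(n_j, n_i) < r}: indices of feature points
    within pixel radius r of point i, point i itself excluded."""
    xi, yi = points[i]
    return [j for j, (x, y) in enumerate(points)
            if j != i and math.hypot(x - xi, y - yi) < r]

pts = [(0, 0), (3, 4), (10, 0), (1, 1)]
```

For example, `neighborhood(pts, 0, 6)` keeps only the points at distance 5 and √2 from the origin and drops the point at distance 10.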
S3: forming hypothesized feature point pairs from each feature point of image f_p in step S2 and the adjacent feature points in the corresponding neighborhood of image f_q, the set of hypothesized feature point pairs being S;
S4: inputting the set S of hypothesized feature point pairs into the algorithm, and simultaneously establishing two point sets: a true-match feature point set Y and a false-match feature point set M;
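Steps S3–S4 build the hypothesized pair set S by descriptor similarity. A minimal sketch, assuming binary descriptors as bit lists and a true-correspondence Hamming threshold named `d_h` (the name and the brute-force pairing loop are illustrative, not the patent's exact procedure, which restricts candidates to the S2 neighborhoods):

```python
def hamming(d1, d2):
    """Hamming distance between two equal-length binary descriptors."""
    return sum(b1 != b2 for b1, b2 in zip(d1, d2))

def hypothesized_pairs(desc_p, desc_q, d_h):
    """Set S: pair index i in f_p with index j in f_q whenever the descriptor
    Hamming distance is below the threshold d_h."""
    return [(i, j)
            for i, dp in enumerate(desc_p)
            for j, dq in enumerate(desc_q)
            if hamming(dp, dq) < d_h]

S = hypothesized_pairs([[0, 1, 1], [1, 1, 0]], [[0, 1, 0], [1, 0, 0]], 2)
```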
S5: finding all feature points in images f_p and f_q and their nearest-neighbor feature points by the GMS (Grid-based Motion Statistics) grid method, and dividing images f_p and f_q into a non-overlapping initial grid matrix G_0;
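The non-overlapping grid G_0 of step S5 amounts to bucketing feature points by cell index. A sketch under the assumption of square cells of side `cell` (claim 4 chooses this scale by timing the algorithm at several candidate scales):

```python
def divide_grid(points, cell):
    """Assign each feature point to a cell of the non-overlapping grid G_0.
    Returns {(col, row): [point indices]}; `cell` is the square cell side."""
    cells = {}
    for idx, (x, y) in enumerate(points):
        cells.setdefault((int(x // cell), int(y // cell)), []).append(idx)
    return cells

G0 = divide_grid([(5, 5), (7, 2), (25, 5)], 10)
```

Points (5, 5) and (7, 2) fall into cell (0, 0), while (25, 5) falls into cell (2, 0), so nearest-neighbor searches in S6 can be confined to one cell at a time.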
S6: according to the grid matrix G_0 divided in step S5 and the feature points extracted in step S1, searching within G_0 by Euclidean distance for the nearest-neighbor feature points a_j and b_j of the current feature points a_i and b_i, and calculating whether their Hamming distance exceeds a set threshold:
if yes, the similarity of the two feature points is low and they form a false match; the current feature points are merged into the initially set false-match feature point set M;
otherwise, go to S7;
S7: searching for the 3 nearest-neighbor feature points around the feature points currently being compared according to the Euclidean distance, and constructing the local neighborhood constraint by computing the k-dimensional tests and the neighborhood Hamming distances;
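The local neighborhood constraint of step S7 can be sketched as follows — a simplified reading in which the constraint is the vector of Hamming distances from the current point's descriptor to those of its 3 nearest neighbors (the exact form of the constraint in the patent also involves the k-dimensional tests of claim 5; the function names are illustrative):

```python
import math

def three_nearest(points, i):
    """Indices of the 3 nearest-neighbor feature points of point i (Euclidean)."""
    xi, yi = points[i]
    d = sorted((math.hypot(x - xi, y - yi), j)
               for j, (x, y) in enumerate(points) if j != i)
    return [j for _, j in d[:3]]

def local_constraint(points, descs, i):
    """Hamming distances from point i's descriptor to the descriptors of its
    3 nearest neighbors -- one simple form of a local neighborhood signature."""
    return [sum(b1 != b2 for b1, b2 in zip(descs[i], descs[j]))
            for j in three_nearest(points, i)]

pts = [(0, 0), (1, 0), (0, 1), (5, 5), (2, 0)]
descs = [[0, 0], [0, 1], [1, 1], [0, 0], [0, 0]]
```

Comparing such signatures between f_p and f_q (step S8) then reduces to comparing small vectors rather than single descriptors, which is what makes the match decision structural rather than point-wise.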
S8: setting a comparison threshold, comparing the local-neighborhood-constraint similarity of the corresponding feature points of f_p and f_q according to the similarity function, and judging whether the result of the similarity function is smaller than the threshold:
if yes, the hypothesized feature point pair is a true match and is merged into the initially set true-match point set Y;
if not, the hypothesized feature point pair is a false match and is merged into the initially set false-match point set M;
S9: searching images f_p and f_q respectively for feature points whose connecting line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses a grid edge line of G_0; the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j is denoted L_1, the pixel distance between the current feature point a_i or b_i and the intersection with the grid edge line of G_0 is denoted L_2, and L_1 and L_2 are compared by the following formula:
L_1 > L_2;
if the inequality holds, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies within another grid cell, the grid G_1 must be re-divided, and the method proceeds to S10;
if not, the nearest-neighbor feature point a_j or b_j lies within the same grid cell, and the true feature point set Y is output directly;
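The S9 test can be sketched as follows — a hedged reading in which L_1 is the distance to the nearest neighbor and L_2 the distance to the first grid-edge crossing along the connecting line, so that L_1 > L_2 exactly when the neighbor lies in another cell (square cells and the function name are assumptions):

```python
import math

def l1_l2_check(p_cur, p_nn, cell_w, cell_h):
    """Return True if the segment from p_cur to its nearest neighbor p_nn
    crosses a grid edge line (i.e. L1 > L2), meaning p_nn is in another cell."""
    dx, dy = p_nn[0] - p_cur[0], p_nn[1] - p_cur[1]
    l1 = math.hypot(dx, dy)
    ts = []  # fractions of the segment at which a grid line is crossed
    if dx:
        lo, hi = sorted((p_cur[0], p_nn[0]))
        k = math.floor(lo / cell_w) + 1
        while k * cell_w < hi:
            ts.append((k * cell_w - p_cur[0]) / dx)
            k += 1
    if dy:
        lo, hi = sorted((p_cur[1], p_nn[1]))
        k = math.floor(lo / cell_h) + 1
        while k * cell_h < hi:
            ts.append((k * cell_h - p_cur[1]) / dy)
            k += 1
    ts = [t for t in ts if 0 < t < 1]
    if not ts:
        return False        # no crossing: L2 would exceed L1
    l2 = min(ts) * l1       # distance to the first grid-edge intersection
    return l1 > l2
```

With 10×10 cells, a point at (5, 5) and a neighbor at (15, 5) cross the edge at x = 10 (L_2 = 5 < L_1 = 10), while (2, 2) and (8, 8) share a cell and trigger no re-division.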
S10: defining a group of variable neighborhoods R through a variable neighborhood search algorithm, inputting them into the algorithm, and re-determining the secondary grid matrix G_1;
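Claim 7 defines R as a pre-set family {R_m | m = 1, …, M}. A minimal sketch of such a family and the variable-neighborhood-search scan over it — the linear radius progression r0 · m and both function names are assumptions for illustration only:

```python
def variable_neighborhoods(r0, M):
    """R = {R_m | m = 1, ..., M}: a pre-set family of neighborhood radii
    (the linear progression r0 * m is an assumed concrete choice)."""
    return [r0 * m for m in range(1, M + 1)]

def first_feasible(radii, feasible):
    """VNS-style scan: return the first radius in R whose neighborhood satisfies
    the feasibility predicate, falling back to the largest radius."""
    for r in radii:
        if feasible(r):
            return r
    return radii[-1]

R = variable_neighborhoods(2, 3)
```

In the context of S10, the predicate would encode whether the candidate cell size of G_1 keeps each point's nearest neighbor inside one cell, so the scan picks the smallest workable neighborhood.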
S11: repeating the steps S7 to S10, and finally outputting the feature point set Y that is truly matched.
Embodiments of the present application also provide a computer-readable storage medium capable of implementing all the steps of the method in the above embodiments, where the computer-readable storage medium stores thereon a computer program, and the computer program when executed by a processor implements all the steps of the method in the above embodiments, for example, the processor implements the following steps when executing the computer program:
S1: applying the ORB (Oriented FAST and Rotated BRIEF) fast feature point extraction and description algorithm to two frames of images f_p and f_q captured in the same or a similar scene, detecting and describing their features, and extracting the feature points of image f_p and image f_q;
S2: computing pixel distances under the Euclidean metric, searching for the neighborhood N_i of each feature point in image f_p and image f_q from step S1, setting a threshold r, and excluding from the neighborhood the feature points whose distance exceeds r;
S3: forming hypothesized feature point pairs from each feature point of image f_p in step S2 and the adjacent feature points in the corresponding neighborhood of image f_q, the set of hypothesized feature point pairs being S;
S4: inputting the set S of hypothesized feature point pairs into the algorithm, and simultaneously establishing two point sets: a true-match feature point set Y and a false-match feature point set M;
S5: finding all feature points in images f_p and f_q and their nearest-neighbor feature points by the GMS grid method, and dividing images f_p and f_q into a non-overlapping initial grid matrix G_0;
S6: according to the grid matrix G_0 divided in step S5 and the feature points extracted in step S1, searching within G_0 by Euclidean distance for the nearest-neighbor feature points a_j and b_j of the current feature points a_i and b_i, and calculating whether their Hamming distance exceeds a set threshold:
if yes, the similarity of the two feature points is low and they form a false match; the current feature points are merged into the initially set false-match feature point set M;
otherwise, go to S7;
S7: searching for the 3 nearest-neighbor feature points around the feature points currently being compared according to the Euclidean distance, and constructing the local neighborhood constraint by computing the k-dimensional tests and the neighborhood Hamming distances;
S8: setting a comparison threshold, comparing the local-neighborhood-constraint similarity of the corresponding feature points of f_p and f_q according to the similarity function, and judging whether the result of the similarity function is smaller than the threshold:
if yes, the hypothesized feature point pair is a true match and is merged into the initially set true-match point set Y;
if not, the hypothesized feature point pair is a false match and is merged into the initially set false-match point set M;
S9: searching images f_p and f_q respectively for feature points whose connecting line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses a grid edge line of G_0; the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j is denoted L_1, the pixel distance between the current feature point a_i or b_i and the intersection with the grid edge line of G_0 is denoted L_2, and L_1 and L_2 are compared by the following formula:
L_1 > L_2;
if the inequality holds, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies within another grid cell, the grid G_1 must be re-divided, and the method proceeds to S10;
if not, the nearest-neighbor feature point a_j or b_j lies within the same grid cell, and the true feature point set Y is output directly;
S10: defining a group of variable neighborhoods R through a variable neighborhood search algorithm, inputting them into the algorithm, and re-determining the secondary grid matrix G_1;
S11: repeating steps S7 to S10, and finally outputting the truly matched feature point set Y.
The embodiments in the present specification are described in a progressive manner; for the same or similar parts among the embodiments, reference may be made to each other, and each embodiment focuses on its differences from the others. In particular, for the hardware-plus-program class of embodiment, since it is basically similar to the method embodiment, the description is relatively brief, and the relevant points can be found in the partial description of the method embodiment.

Although the embodiments of this description present method steps as described in the embodiments or flowcharts, more or fewer steps may be included based on conventional or non-inventive means. The order of steps recited in the embodiments is merely one of many ways of performing the steps and does not represent the only order of execution. When an actual apparatus or end product executes, it may execute sequentially or in parallel according to the methods shown in the embodiments or figures (e.g., in parallel-processor or multi-threaded environments, or even distributed data processing environments).

The terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, the presence of additional identical or equivalent elements in a process, method, article, or apparatus that comprises the recited elements is not excluded.

For convenience of description, the above devices are described as being divided into various modules by function, each described separately.
Of course, in implementing the embodiments of the present description, the functions of each module may be implemented in one or more software and/or hardware, or a module implementing the same function may be implemented by a combination of multiple sub-modules or sub-units, and the like. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
As will be appreciated by one skilled in the art, embodiments of the present description may be provided as a method, system, or computer program product. Accordingly, embodiments of the present description may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present description may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and so forth) having computer-usable program code embodied therein. The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment. In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of an embodiment of the specification.
In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction. The above description is only an example of the embodiments of the present disclosure, and is not intended to limit the embodiments of the present disclosure. Various modifications and variations to the embodiments described herein will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the embodiments of the present specification should be included in the scope of the claims of the embodiments of the present specification.

Claims (7)

1. A feature point local neighborhood feature matching method based on ORB descriptors is characterized by comprising the following steps:
S1: applying the ORB (Oriented FAST and Rotated BRIEF) fast feature point extraction and description algorithm to two frames of images f_p and f_q captured in the same or a similar scene, detecting and describing their features, and extracting the feature points of image f_p and image f_q;
S2: computing pixel distances under the Euclidean metric, searching for the neighborhood N_i of each feature point in image f_p and image f_q from step S1, setting a threshold r, and excluding from the neighborhood the feature points whose distance exceeds r;
S3: forming hypothesized feature point pairs from each feature point of image f_p in step S2 and the adjacent feature points in the corresponding neighborhood of image f_q, the set of hypothesized feature point pairs being S;
S4: inputting the hypothesized feature point pair set S into the algorithm, and simultaneously establishing two point sets: a true-match feature point set Y and a false-match feature point set M;
S5: finding all feature points in images f_p and f_q and their nearest-neighbor feature points by the GMS grid method, and dividing images f_p and f_q into a non-overlapping initial grid matrix G_0;
S6: according to the grid matrix G_0 divided in step S5 and the feature points extracted in step S1, searching within G_0 by Euclidean distance for the nearest-neighbor feature points a_j and b_j of the current feature points a_i and b_i, and calculating whether their Hamming distance exceeds a set threshold:
if yes, the similarity of the two feature points is low and they form a false match; the current feature points are merged into the initially set false-match feature point set M;
otherwise, go to S7;
S7: searching for the 3 nearest-neighbor feature points around the feature points currently being compared according to the Euclidean distance, and constructing the local neighborhood constraint by computing the k-dimensional tests and the neighborhood Hamming distances;
S8: setting a comparison threshold, comparing the local-neighborhood-constraint similarity of the corresponding feature points of f_p and f_q according to the similarity function, and judging whether the result of the similarity function is smaller than the threshold:
if yes, the hypothesized feature point pair is a true match and is merged into the initially set true-match point set Y;
if not, the hypothesized feature point pair is a false match and is merged into the initially set false-match point set M;
S9: searching images f_p and f_q respectively for feature points whose connecting line between the currently compared feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j crosses a grid edge line of G_0; the pixel distance between the current feature point a_i or b_i and its nearest-neighbor feature point a_j or b_j is denoted L_1, the pixel distance between the current feature point a_i or b_i and the intersection with the grid edge line of G_0 is denoted L_2, and L_1 and L_2 are compared by the following formula:
L_1 > L_2;
if the inequality holds, the nearest-neighbor feature point a_j or b_j of the current feature point a_i or b_i lies within another grid cell, the grid G_1 must be re-divided, and the method proceeds to S10;
if not, the nearest-neighbor feature point a_j or b_j lies within the same grid cell, and the true feature point set Y is output directly;
S10: defining a group of variable neighborhoods R through a variable neighborhood search algorithm, inputting them into the algorithm, and re-determining the secondary grid matrix G_1;
S11: repeating steps S7 to S10, and finally outputting the truly matched feature point set Y.
2. The ORB-descriptor-based feature point local neighborhood feature matching method according to claim 1, wherein in step S2 the neighborhood N_i is expressed as:
N_i = {n_j | n_j ≠ n_i, L(n_j, n_i) < r};
wherein n_i is the current feature point, n_j is a feature point different from n_i, L(n_j, n_i) is the pixel distance between the two feature points, and r is the set neighborhood comparison threshold.
3. The method according to claim 1, wherein in step S3 the hypothesized feature point pair set S is defined as:
S = {s_i | s_i = (p_i, q_i), d(p_i, q_i) < d_H, i = 1, 2, …, N};
wherein s_i is the correspondence of one feature point pair within the set S, p_i is a feature point of image f_p, q_i is the feature point of image f_q corresponding to p_i, d(·,·) is the Hamming distance between two feature descriptors, d_H is the distance threshold for a true correspondence, and N is the number of feature point pairs.
4. The method according to claim 1, wherein in step S5 a series of grid matrix scales is set and the algorithm execution speed is tested for each, and the grid matrix scale giving the highest algorithm execution speed is finally selected as the size of the grid matrix G_0.
5. The method according to claim 1, wherein in step S8 the similarity calculation formula is:
D(a_j, b_j) = Σ_k f_k(a_j) ⊕ f_k(b_j);
wherein ⊕ is the XOR operation, and f_k(a_j) and f_k(b_j) are the k-dimensional tests of the ORB descriptors, which satisfy a Gaussian distribution and are calculated as:
f_k(a_j) = Σ_{1≤i≤k} 2^(i-1) τ(a_j; x_i, y_i);
f_k(b_j) = Σ_{1≤i≤k} 2^(i-1) τ(b_j; x_i, y_i);
wherein τ(a; x_j, y_j) is the binary intensity test based on the gray-scale image, given by:
τ(a; x_j, y_j) = 1 if I(x_j) < I(y_j), and 0 otherwise;
where I(·) is the image intensity.
6. The ORB-descriptor-based feature point local neighborhood feature matching method according to claim 1, wherein in step S7 the local neighborhood is constructed by separately calculating, for f_p and f_q, the k-dimensional tests and the neighborhood Hamming distances between the feature point currently being compared and its 3 nearest-neighbor feature points.
7. The method according to claim 1, wherein in step S10 the group of variable neighborhoods R is specifically defined as:
R = {R_m | m = 1, 2, …, M};
wherein R is the neighborhood of the current feature point, R_m is a neighborhood from the pre-set group of neighborhoods, and M is the number of neighborhoods in the group.
CN202210703901.1A 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method Active CN115049847B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210703901.1A CN115049847B (en) 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method


Publications (2)

Publication Number Publication Date
CN115049847A true CN115049847A (en) 2022-09-13
CN115049847B CN115049847B (en) 2024-04-16

Family

ID=83163829

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210703901.1A Active CN115049847B (en) 2022-06-21 2022-06-21 ORB descriptor-based feature point local neighborhood feature matching method

Country Status (1)

Country Link
CN (1) CN115049847B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109697692A (en) * 2018-12-29 2019-04-30 安徽大学 One kind being based on the similar feature matching method of partial structurtes
CN110675437A (en) * 2019-09-24 2020-01-10 重庆邮电大学 Image matching method based on improved GMS-ORB characteristics and storage medium
WO2021035988A1 (en) * 2019-08-30 2021-03-04 长安大学 Method and apparatus for quickly matching and extracting feature of unmanned aerial vehicle visual image
WO2021212874A1 (en) * 2020-04-24 2021-10-28 平安科技(深圳)有限公司 Palm print mismatching point elimination method, apparatus, and device, and storage medium
WO2022002039A1 (en) * 2020-06-30 2022-01-06 杭州海康机器人技术有限公司 Visual positioning method and device based on visual map


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
C. Campos et al.: "ORB-SLAM3: An accurate open-source library for visual, visual-inertial, and multimap SLAM", IEEE Transactions on Robotics, vol. 37, no. 6, 31 December 2021 (2021-12-31), pages 1874-1890, XP011890943, DOI: 10.1109/TRO.2021.3075644 *
Tang Can, Tang Lianggui, Liu Bo: "A survey of image feature detection and matching methods", Journal of Nanjing University of Information Science and Technology (Natural Science Edition), no. 03, 28 May 2020 (2020-05-28), pages 261-273 *
Liao Hongzhen et al.: "An improved ORB feature matching algorithm", Journal of Beijing University of Aeronautics and Astronautics, vol. 47, no. 10, 31 December 2021 (2021-12-31), pages 2149-2154 *
Chen Fangjie et al.: "A fast feature point matching method based on improved grid-partition statistics", Computer Measurement & Control, vol. 27, no. 08, 31 December 2019 (2019-12-31), pages 231-235 *

Also Published As

Publication number Publication date
CN115049847B (en) 2024-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant