CN109934298B - Progressive graph matching method and device of deformation graph based on clustering - Google Patents
- Publication number
- CN109934298B (grant); application CN201910209027.4A
- Authority
- CN
- China
- Prior art keywords
- image
- image feature
- matched
- matching
- feature cluster
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Image Analysis (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The invention discloses a clustering-based progressive graph matching method and device for deformation graphs. The method comprises the following steps: 1) acquiring the image features of the images to be matched; 2) merging adjacent image features whose difference value is smaller than a preset threshold into one image feature, to obtain image feature clusters; 3) acquiring the similarity values of the candidate matches according to the matching relations among the image feature clusters, and acquiring the indication vector of each candidate match according to the maximum of its similarity value; 4) judging whether the similarity value corresponding to the indication vector has converged; 5) if not, obtaining the confidence of the indication vector according to the indication vector of each candidate match, updating the matching matrix accordingly, and returning to step 3) until the similarity value corresponding to the indication vector converges; 6) if so, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched. By applying the embodiment of the invention, the efficiency of image feature matching can be improved.
Description
Technical Field
The invention relates to an image matching method and device, in particular to a clustering-based progressive graph matching method and device for deformation graphs.
Background
With the rapid development of image matching technology, its applications have gradually expanded into new fields such as medical imaging, surveying and mapping, remote-sensing signal processing, industrial detection, and target recognition and tracking. Moreover, in many image-related studies and engineering projects, the choice of image matching algorithm can have a significant impact on the overall result, so research on image matching algorithms has very important practical significance. Image matching compares two images to be matched, judges the degree of similarity between them, and thereby determines whether the two images contain the same or similar content. In general, the image matching process can be divided into two parts: first, selecting the image matching features; then matching those features, with a similarity measurement algorithm used during the matching process. The many existing image matching algorithms can generally be classified into two categories: 1) Methods based on the pixel gray values of the images to be matched: features are described using all pixel information in a selected region of the image, so the matching precision is high; however, because statistical analysis is performed over the region, the number of pixels to be processed is large and the computational cost is correspondingly high. 2) Feature-based image matching methods: the features are generally geometric, and only features such as points, lines, and regions (for example edges, corner points, and contours) are extracted from the image.
The similarity measure for the features is chosen according to the features extracted previously; commonly used measures include the normalized correlation measure, distance-based measures, and mutual-information-based measures.
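As an illustration of the normalized correlation measure named above, here is a minimal Python sketch; the function name and the patch-based interface are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def normalized_correlation(patch_a, patch_b):
    """Normalized cross-correlation of two equally sized patches.

    Both patches are mean-centred; the result lies in [-1, 1], with 1
    for identical patches and -1 for inverted ones."""
    a = np.asarray(patch_a, dtype=float)
    b = np.asarray(patch_b, dtype=float)
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0
```

Because both patches are mean-centred before correlating, a uniform brightness offset does not change the score, which is why this measure is popular for gray-value matching.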
The image matching algorithms most commonly used at present are feature-based. In these algorithms the number of feature points is far smaller than the number of pixels, so the amount of calculation is greatly reduced. The image can be preprocessed before the features are extracted to reduce the influence of noise, and the feature points adapt well to gray-scale changes, image scaling, and occlusion. In addition, feature points are sensitive to position changes during matching, so matching points whose positions do not correspond can be filtered out, which improves matching precision and effectively avoids the problems caused by matching on gray-level information.
Feature-based image matching methods use local features, which represent local information in an image. Features commonly used in such methods include corner points, lines, contours, and edges. Corner detection algorithms include the Harris corner detection algorithm and the SUSAN corner detection algorithm, among others. After image corner features are extracted by such a method, specific techniques are used to relate the corner information between the two images and thereby judge whether corner pairs match. Straight-line features are also widely used in image matching; the common Hough transform can extract straight-line features from an image, and Medioni and Nevatia used straight-line features for image matching. Contour features are likewise common, and there are many shape-based matching algorithms: Shi and Kaick use the contour as a feature and match by calculating the consistency of shapes; Yang et al. combine shape characteristics with position information to construct a low-dimensional image descriptor; and Shu et al. propose a contour feature based on the distribution of contour points in polar coordinates, the contour point distribution histogram, which accords with human visual perception and has low computational complexity.
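As a concrete illustration of the Harris corner detection algorithm mentioned above, the following minimal sketch computes the response R = det(M) - k*trace(M)^2 from the structure tensor summed over a plain 3x3 box window; the function name, the box window standing in for the usual Gaussian, and the absence of non-maximum suppression are all simplifications:

```python
import numpy as np

def harris_response(img, k=0.04):
    """Harris corner response R = det(M) - k*trace(M)^2, where M is the
    structure tensor of image gradients summed over a 3x3 window."""
    iy, ix = np.gradient(img.astype(float))   # gradients along rows, cols

    def box3(a):                              # 3x3 box filter, zero-padded
        p = np.pad(a, 1)
        n = a.shape
        return sum(p[i:i + n[0], j:j + n[1]] for i in range(3) for j in range(3))

    sxx, syy, sxy = box3(ix * ix), box3(iy * iy), box3(ix * iy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace ** 2
```

Flat regions give R = 0, straight edges give negative R, and true corners (where gradients in the window point in two directions) give positive R.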
The feature-based matching algorithm only needs to compute information within a single feature point or its neighborhood, so the amount of calculation is greatly reduced and the operation is simpler; consequently, feature-based image matching is becoming increasingly widespread in practical applications. In the conventional image matching method, feature point matching is performed directly after feature extraction is completed. However, the inventor finds that the more objects an image contains, the richer its content and the greater the number of extracted features, and these feature points are all high-dimensional data. Searching for features among a large amount of high-dimensional data therefore requires heavy computation and is inefficient.
Disclosure of Invention
The invention aims to provide a clustering-based progressive graph matching method and device for deformation graphs, so as to improve the efficiency of image matching.
The invention solves the technical problems through the following technical scheme:
The embodiment of the invention provides a clustering-based progressive graph matching method for deformation graphs, which comprises the following steps:
1) Acquiring the image features of the images to be matched, wherein the images to be matched comprise a first image to be matched and a second image to be matched, and the image features comprise image geometric features;
2) Combining two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster;
3) Acquiring candidate matches between the image to be matched and the other image to be matched according to the matching relation between their image feature clusters; for each candidate match, acquiring the similarity value of the candidate match according to its assignment matrix and symmetric similarity matrix, and acquiring the indication vector of the candidate match according to the maximum of the similarity value;
4) Judging whether the similarity value corresponding to the indication vector is converged;
5) If not, obtaining the confidence of the indication vector according to the indication vector of each candidate match, and updating the element values in the matching matrix when the confidence of the indication vector is not less than a preset threshold, to obtain an updated matching matrix; then returning to step 3) until the similarity value corresponding to the indication vector converges, and, once the similarity values corresponding to all indication vectors have converged, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched;
6) If so, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched, given that the similarity value corresponding to each indication vector has converged.
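The iterate-until-convergence flow of steps 3) to 6) can be sketched as follows. The power-iteration update of the indicator vector and the greedy row-wise readout of correspondences are simplifying assumptions for illustration, not the patent's exact procedure:

```python
import numpy as np

def progressive_match(W, n_p, n_q, max_iter=50, tol=1e-6):
    """Iterate steps 3)-5): score candidates against the symmetric
    similarity matrix W (shape (n_p*n_q, n_p*n_q)), update the indicator
    vector, and stop when the similarity value S = m^T W m converges."""
    m = np.full(n_p * n_q, 1.0 / (n_p * n_q))   # uniform initial indicator
    prev_score = -np.inf
    for _ in range(max_iter):
        m = W @ m                               # propagate pairwise support
        m /= np.linalg.norm(m)
        score = float(m @ W @ m)                # S(M_t) = M_t^T W M_t
        if abs(score - prev_score) < tol:       # step 4): convergence check
            break
        prev_score = score
    # step 6): read off correspondences row by row
    M = m.reshape(n_p, n_q)
    return {i: int(np.argmax(M[i])) for i in range(n_p)}
```

With a W that strongly supports the matches (0, 0) and (1, 1), the loop converges to exactly that assignment.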
Optionally, the step 2) includes:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
K is the number of all possible element-pair dissimilarity points in the image feature cluster pair; k_AP is a preset first control parameter; |C_a| is the number of elements in image feature cluster a; |C_b| is the number of elements in image feature cluster b; r_AP is a preset second control parameter; |·| is the Euclidean distance function; |C_a||C_b| is the number of possible element pairs between the image feature cluster pair;
c: by means of the formula (I) and (II),and acquiring difference values of the image feature cluster pairs, wherein,
D_kNN(k, C_a, C_b) is the difference value between image feature cluster a and image feature cluster b in the pair; γ is the number of paired element pairs between image feature cluster a and image feature cluster b in the pair; min is the minimum-value evaluation function; Σ is the summation function; d(m_i, m_j) is the similarity between element m_i in image feature cluster a and element m_j in image feature cluster b; i is the serial number of element m_i; j is the serial number of element m_j;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
Optionally, the obtaining the similarity value of the candidate match according to the distribution matrix and the symmetric similarity matrix of the candidate match includes:
obtaining the constraint condition corresponding to the candidate match, wherein
m_t is the initial indication vector corresponding to the candidate match; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} and 1_{n_Q} are all-ones vectors of sizes n_P and n_Q, respectively;
using a formula, calculating the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein
d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is a norm function; x_b is image feature cluster b in the second image to be matched; T_{ia}(x_j) is the affine homography transformation result of image feature cluster j in the first image to be matched mapped toward image feature cluster b in the second image to be matched; x_j is image feature cluster j in the first image to be matched;
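Under the definitions above, d_{jb|ia} can be read as the distance between the observed position of cluster b and the position of cluster j after the transformation induced by the match (i, a). A hedged sketch, assuming the transformation is represented as a 2x3 affine matrix acting on homogeneous coordinates:

```python
import numpy as np

def transfer_error(x_b, A_ia, x_j):
    """Hedged reading of d_{jb|ia}: the norm of the difference between the
    position x_b of cluster b in the second image and the position of
    cluster j transformed by the affine map A_ia (a 2x3 matrix, assumed)
    induced by the candidate match (i, a)."""
    x_j_h = np.append(np.asarray(x_j, dtype=float), 1.0)   # homogeneous coordinates
    return float(np.linalg.norm(np.asarray(x_b, dtype=float) - A_ia @ x_j_h))
```

For a pure translation by (2, 3), a point at (1, 1) lands exactly on (3, 4), so the transfer error against that target is zero.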
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, calculating, using a formula, the symmetric similarity value corresponding to the candidate match, wherein
W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match being one that includes an edge of the first image and an edge of the second image; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; f(·) is the second-order similarity function of the symmetric transfer errors corresponding to the matched image feature clusters of the candidate match; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset image-feature-cluster similarity threshold; max() is the maximum-value evaluation function;
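One hedged reading of how α and max() combine the four transfer errors into W_{ia;jb} is to average the errors and clamp the thresholded result at zero; the exact weighting in the patent's formula image may differ:

```python
def symmetric_similarity(d_jb_ia, d_bj_ai, d_ia_jb, d_ai_bj, alpha):
    """Hedged sketch of W_{ia;jb}: average the four transfer errors listed
    in the definitions and clamp with max() against the threshold alpha."""
    symmetric_error = (d_jb_ia + d_bj_ai + d_ia_jb + d_ai_bj) / 4.0
    return max(0.0, alpha - symmetric_error)
```

Geometrically consistent candidate pairs (small transfer errors) get a similarity near α, while inconsistent pairs are clamped to zero and contribute nothing to S(M_t).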
constructing a symmetric similarity matrix using the symmetric similarity values corresponding to the candidate matches as its elements;
using a formula, S (M), based on a constraint matrix corresponding to the candidate match and a symmetric similarity matrix corresponding to the candidate match t )=M t T WM t Obtaining a similarity value for the candidate match, wherein,
S(M_t) is the similarity value of the candidate match at the t-th iteration with assignment matrix M_t; M_t^T is the transpose of the assignment matrix M_t at the t-th iteration; W is the symmetric similarity matrix corresponding to the candidate match.
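The quadratic score S(M_t) = M_t^T W M_t and the one-to-one assignment constraint are straightforward to express. A minimal sketch with a flattened indicator vector (one entry per candidate match (i, a)); the function names are illustrative:

```python
import numpy as np

def similarity_value(m, W):
    """S(M_t) = M_t^T W M_t for a flattened assignment indicator m and the
    symmetric similarity matrix W (one row/column per candidate match)."""
    m = np.asarray(m, dtype=float)
    return float(m @ W @ m)

def is_valid_assignment(m, n_p, n_q):
    """One-to-one constraint: each cluster of the first image is matched to
    at most one cluster of the second image, and vice versa."""
    M = np.asarray(m, dtype=float).reshape(n_p, n_q)
    return bool((M.sum(axis=1) <= 1).all() and (M.sum(axis=0) <= 1).all())
```

With W as the identity, the score simply counts the selected matches, and an indicator that assigns two clusters of Q to one cluster of P violates the constraint.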
Optionally, the obtaining the candidate matching indication vector according to the maximum value of the similarity value includes:
using a formula, computing the indication vector corresponding to the candidate match, wherein
m_t is the indication vector corresponding to the candidate match; argmax(·) is the function that returns the argument maximizing its operand; S(M_t) is the similarity value of the candidate match.
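Maximizing S(M_t) exactly over all assignments is combinatorial. A common relaxation (assumed here; the patent's exact discretization is not reproduced in this text) takes the principal eigenvector of W and greedily discretizes it under the one-to-one constraint:

```python
import numpy as np

def best_indicator(W, n_p, n_q):
    """Approximate argmax of m^T W m over one-to-one assignments: rank
    candidate matches by their weight in the principal eigenvector of W,
    then greedily accept the strongest non-conflicting candidates."""
    assert W.shape[0] == n_p * n_q
    vals, vecs = np.linalg.eigh(W)
    v = np.abs(vecs[:, -1])            # principal eigenvector (largest eigenvalue)
    m = np.zeros_like(v)
    used_p, used_q = set(), set()
    for idx in np.argsort(-v):         # strongest candidates first
        i, a = divmod(int(idx), n_q)
        if i not in used_p and a not in used_q and v[idx] > 0:
            m[idx] = 1.0               # accept candidate match (i, a)
            used_p.add(i)
            used_q.add(a)
    return m
```

For the W used earlier that supports (0, 0) and (1, 1), the discretized indicator selects exactly those two candidates.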
Optionally, the obtaining the confidence level of the indication vector includes:
using a formula, obtaining the probability that the set of image feature clusters V_Q and V_P in Match_t is associated with the intermediate variable m_i, wherein
p(V_P, V_Q | M = m_i, Match_t) is the probability that the image feature cluster sets V_Q and V_P in Match_t are associated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; v_b^Q is image feature cluster b in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; v_b^P is image feature cluster b in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix, comprising an edge between image feature cluster p and image feature cluster i in the first image to be matched and an edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbor image feature function; NN_{m_i}(·) is the nearest-neighbor feature of an image feature cluster when m_i is the intermediate variable; k is the number of minimum dissimilarity points of the image feature cluster pair; k_2 is the second kNN clustering parameter of the image feature cluster in the current iteration; Z is the normalization function; exp() is the exponential function with the natural constant as its base; d_{jb|m_i} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable;
using a formula, obtaining the probability of selecting m_i as the intermediate variable in Match_t, wherein
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i as the intermediate variable in Match_t; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first kNN clustering parameter of the image feature cluster in the current iteration;
using a formula, obtaining the probability that image feature cluster V_Q in Match_t is associated with the intermediate variable m_i, wherein
p(M = m_i | Match_t) is the probability that image feature cluster V_Q in Match_t is associated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
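The definitions above mention an exponential of transfer errors together with a normalization function Z. A hedged sketch of such a confidence computation follows; the function name and the single `scale` parameter are assumptions, not from the patent:

```python
import numpy as np

def match_confidence(transfer_errors, scale=1.0):
    """Hedged sketch of the confidence computation: convert the transfer
    errors associated with a candidate match into probabilities with a
    normalized exponential, so that Z plays the role of the normalization
    constant in the definitions above."""
    e = np.asarray(transfer_errors, dtype=float)
    w = np.exp(-e / scale)    # smaller transfer error -> larger weight
    return w / w.sum()        # divide by Z so the probabilities sum to 1
```

Candidates with small transfer errors receive most of the probability mass, which is what lets low-confidence indication vectors be filtered out before the matching matrix is updated.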
The embodiment of the invention also provides a clustering-based progressive graph matching device for deformation graphs, which comprises:
the first acquisition module, used for acquiring the image features of the images to be matched, wherein the images to be matched comprise a first image to be matched and a second image to be matched, and the image features comprise image geometric features;
the merging module is used for merging two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster;
the second acquisition module, used for acquiring candidate matches between the image to be matched and the other image to be matched according to the matching relation between their image feature clusters, and, for each candidate match, acquiring the similarity value of the candidate match according to its assignment matrix and symmetric similarity matrix, and acquiring the indication vector of the candidate match according to the maximum of the similarity value;
the judging module is used for judging whether the similarity value corresponding to the indication vector is converged;
a third acquisition module, used for, if the judgment result of the judging module is negative, obtaining the confidence of the indication vector according to the indication vector of each candidate match, updating the element values in the matching matrix when the confidence of the indication vector is not less than a preset threshold to obtain an updated matching matrix, and triggering the second acquisition module until the similarity value corresponding to the indication vector converges; once the similarity values corresponding to all indication vectors have converged, the correspondence of each image feature in the matching matrix is taken as the matching result of the images to be matched;
and the setting module, used for taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched when the judgment result of the judging module is affirmative and the similarity values corresponding to the indication vectors have all converged.
Optionally, the merging module is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
K is the number of all possible element-pair dissimilarity points in the image feature cluster pair; k_AP is a preset first control parameter; |C_a| is the number of elements in image feature cluster a; |C_b| is the number of elements in image feature cluster b; r_AP is a preset second control parameter; |·| is the Euclidean distance function; |C_a||C_b| is the number of possible element pairs between the image feature cluster pair;
c: by means of the formula (I) and (II),and obtaining the difference value of the image characteristic cluster pair, wherein,
D_kNN(k, C_a, C_b) is the difference value between image feature cluster a and image feature cluster b in the pair; γ is the number of paired element pairs between image feature cluster a and image feature cluster b in the pair; min is the minimum-value evaluation function; Σ is the summation function; d(m_i, m_j) is the similarity between element m_i in image feature cluster a and element m_j in image feature cluster b; i is the serial number of element m_i; j is the serial number of element m_j;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
Optionally, the second obtaining module is configured to:
obtaining the constraint condition corresponding to the candidate match, wherein
m_t is the initial indication vector corresponding to the candidate match; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} and 1_{n_Q} are all-ones vectors of sizes n_P and n_Q, respectively;
using a formula, calculating the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein
d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is a norm function; x_b is image feature cluster b in the second image to be matched; T_{ia}(x_j) is the affine homography transformation result of image feature cluster j in the first image to be matched mapped toward image feature cluster b in the second image to be matched; x_j is image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, calculating, using a formula, the symmetric similarity value corresponding to the candidate match, wherein
W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match being one that includes an edge of the first image and an edge of the second image; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; f(·) is the second-order similarity function of the symmetric transfer errors corresponding to the matched image feature clusters of the candidate match; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset image-feature-cluster similarity threshold; max() is the maximum-value evaluation function;
constructing a symmetric similarity matrix using the symmetric similarity values corresponding to the candidate matches as its elements;
using a formula, S (M), based on a constraint matrix corresponding to the candidate match and a symmetric similarity matrix corresponding to the candidate match t )=M t T WM t Obtaining a similarity value for the candidate match, wherein,
S(M t ) For the t-th iteration, the distribution matrix is M t A similarity value of the candidate match; m t T For the t-th iteration, a matrix M is allocated t Transposing; w is the symmetric transmission error corresponding to the candidate match.
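As an illustrative sketch (not the patent's implementation), the score S(M_t) = M_t^T W M_t can be evaluated by flattening the assignment matrix into an indicator vector m and computing m^T W m; the matrix sizes and toy similarity values below are assumptions for demonstration only:

```python
import numpy as np

def matching_score(W: np.ndarray, m: np.ndarray) -> float:
    """S(m) = m^T W m for a candidate assignment.

    W: (nP*nQ, nP*nQ) symmetric similarity matrix over candidate matches.
    m: (nP*nQ,) 0/1 indicator vector, the assignment matrix M_t
       flattened row-major.
    """
    return float(m @ W @ m)

# Toy example: candidate matches (0,0) and (1,1) are mutually compatible.
W = np.zeros((4, 4))
W[0, 3] = W[3, 0] = 0.9     # pairwise similarity of matches (0,0) and (1,1)
m = np.array([1, 0, 0, 1])  # identity assignment M, flattened
score = matching_score(W, m)  # the 0.9 entry is counted in both directions
```

Because W is symmetric, each compatible pair of selected matches contributes twice to the quadratic form.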
Optionally, the second obtaining module is configured to:
using the formula m_t = argmax S(M_t), the indication vector corresponding to the candidate matches is computed, wherein
m_t is the indication vector corresponding to the candidate matches; argmax is the variable evaluation function returning the argument that maximizes the function; S(M_t) is the similarity value of the candidate matches.
Optionally, the third obtaining module is configured to:
using a formula, the probability p(V_Q | V_P, M = m_i, Match_t) that the image feature clusters V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i is obtained, wherein
p(V_Q | V_P, M = m_i, Match_t) is the probability that the image feature clusters V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; v_Q is an image feature cluster in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix, consisting of an edge between image feature cluster p and image feature cluster i in the first image to be matched and an edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbor image feature function, which, given m_i as the intermediate variable, returns the nearest-neighbor feature of an image feature cluster; k is the number of minimum dissimilarity points of an image feature cluster pair; k_2 is the second kNN clustering parameter for the image feature clusters in the current iteration; Z is a normalization function; exp() is the exponential function with the natural base e; d_{jb|m_i} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable;
using a formula, the probability p(V_P = v_P | M = m_i, Match_t) of selecting m_i in Match_t as the intermediate variable is obtained, wherein
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i in Match_t as the intermediate variable; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first kNN clustering parameter for the image feature clusters in the current iteration;
using a formula, the probability p(M = m_i | Match_t) that the image feature clusters V_Q in Match_t are correlated with the intermediate variable m_i is obtained, wherein
p(M = m_i | Match_t) is the probability that the image feature clusters V_Q in Match_t are correlated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
Compared with the prior art, the invention has the following advantages:
By applying the embodiment of the invention, the multiple features extracted from an image to be matched carry more or less correlated information, and this correlated information can be used to label the features. Therefore, after the image features are extracted, they are clustered: features with similar attributes in the feature set are mined, and the feature set is partitioned organically. This reduces the data volume of the features, reduces the amount of computation in the feature matching process, and improves the efficiency of image feature matching.
Drawings
Fig. 1 is a schematic flowchart of a progressive graph matching method based on clustering deformation graphs according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a matching relationship between image feature clusters in a cluster-based progressive graph matching method for a deformation graph according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a progressive graph matching apparatus based on a cluster deformation graph according to an embodiment of the present invention.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
The embodiment of the invention provides a cluster-based progressive graph matching method and a cluster-based progressive graph matching device for a deformation graph, and firstly introduces the cluster-based progressive graph matching method for the deformation graph provided by the embodiment of the invention.
Fig. 1 is a schematic flowchart of a progressive graph matching method based on a cluster deformation graph according to an embodiment of the present invention, as shown in fig. 1, the method includes:
s101: acquiring image characteristics of an image to be matched, wherein the image to be matched comprises: the image matching method comprises the following steps of obtaining a first image to be matched and a second image to be matched, wherein the image characteristics comprise: image geometry characteristics.
Illustratively, the acquired images to be matched are an image P and an image Q, wherein one of the image P and the image Q is an unknown image, and the other image is a known image. For example, in the field of video surveillance, when a video frame corresponding to a target person is to be acquired from a large number of video images, the known image is an image of the target person acquired in advance; the unknown image is each frame image in the video image.
In practical applications, each image to be matched includes several geometric features, for example, points, lines, and regions in the image to be matched, such as edges, corners, contours, and the like. In general, each feature can be obtained by using a feature extraction algorithm, for example, the corner detection algorithm includes: harris corner detection algorithm, SUSAN corner detection algorithm, etc.
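As an illustration of the kind of corner detector mentioned above, a bare-bones Harris response can be sketched as follows (a minimal version using a 3×3 box window instead of the usual Gaussian weighting; the constant k and the toy image are assumptions, not choices made by the patent):

```python
import numpy as np

def harris_response(img: np.ndarray, k: float = 0.05) -> np.ndarray:
    """Minimal Harris corner response with a 3x3 box window."""
    Iy, Ix = np.gradient(img.astype(float))          # image gradients

    def box3(a: np.ndarray) -> np.ndarray:           # 3x3 window sum
        p = np.pad(a, 1)
        h, w = a.shape
        return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3))

    Sxx, Syy, Sxy = box3(Ix * Ix), box3(Iy * Iy), box3(Ix * Iy)
    return Sxx * Syy - Sxy ** 2 - k * (Sxx + Syy) ** 2  # det(M) - k*tr(M)^2

# A white square on a black background: the response peaks at its corners,
# is negative along its edges, and is zero in flat regions.
img = np.zeros((16, 16))
img[4:12, 4:12] = 1.0
R = harris_response(img)
peak = np.unravel_index(np.argmax(R), R.shape)
```

In practice one would use a tuned library implementation (e.g. an OpenCV-style `cornerHarris`) rather than this sketch.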
Further, in addition to geometric features, the image features may include, but are not limited to, grayscale features of the image, color features of the image, brightness features of the image, and the like.
Finally, the obtained set of image features of the image to be matched is used as a candidate response set; for example, the candidate response set of image P is C = {c_i} (i = 1, …, n), wherein C is the candidate response set; c_i is the i-th image feature; n is the number of features in the candidate response set.
S102: and combining two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster.
Specifically, the step S102 may include the following steps:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster.
For example, at the first iteration, based on the candidate response set in step S101, the set of aggregated active clusters is initialized as θ_0 = {C_i = {c_i} (i = 1, …, n)}, wherein
θ_0 is the initialized set of aggregated activated image feature clusters; C_i is the i-th activated image feature cluster.
Taking the t-th iteration as an example, the set of aggregated active clusters at the t-th iteration is:
θ_t = {C_i = {c_i} (i = 1, …, n)}
in this step, each image feature is taken as an active image feature cluster.
B: performing pairing processing on the image feature clusters in the activated image, for example, pairing each image feature cluster in the image P to be matched with every other image feature cluster in the same image P. The following steps are then performed for each image feature cluster pair.
For each of the image feature cluster pairs, using a formula, the number k of minimum dissimilarity points of the pair of image feature clusters is obtained, wherein
k is the number of dissimilarity points taken over all possible element pairs of the image feature cluster pair; k_AP is a preset first control parameter; |C_a| is the number of elements in image feature cluster a; |C_b| is the number of elements in image feature cluster b; r_AP is a preset second control parameter; |·| is the Euclidean distance function; |C_a||C_b| is the number of possible element pairs between the two image feature clusters.
By applying this calculation and adjusting the preset first control parameter and the preset second control parameter, the progressive linking (chaining) effect of the kNN (k-Nearest Neighbor) clustering model can be effectively avoided, and object deformation can be handled effectively.
C: using a formula, the difference value D_kNN(k, C_a, C_b) of the image feature cluster pair is obtained, wherein
D_kNN(k, C_a, C_b) is the difference value between image feature cluster a and image feature cluster b of the pair; F is the number of paired element pairs between image feature cluster a and image feature cluster b; min is the minimum-value evaluation function; Σ is the summation function; d(m_i, m_j) is the dissimilarity between element m_i in image feature cluster a and element m_j in image feature cluster b; i is the index of element m_i; j is the index of element m_j.
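The kNN linkage described in the surrounding text (the mean of the k smallest pairwise dissimilarities between two clusters) can be sketched as follows; Euclidean distance is assumed here as the element dissimilarity d(m_i, m_j):

```python
import numpy as np

def d_knn(k: int, A: np.ndarray, B: np.ndarray) -> float:
    """kNN linkage D_kNN(k, C_a, C_b): the mean of the k smallest
    Euclidean distances among all |C_a|*|C_b| element pairs."""
    dists = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2).ravel()
    k = min(k, dists.size)          # never request more pairs than exist
    return float(np.sort(dists)[:k].mean())

A = np.array([[0.0, 0.0], [1.0, 0.0]])   # cluster C_a
B = np.array([[0.0, 1.0], [5.0, 5.0]])   # cluster C_b
d = d_knn(2, A, B)   # mean of the two smallest of the four pair distances
```

With k = 1 this reduces to single linkage; larger k averages over k supporting pairs, which is the robustness property the next paragraph describes.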
With the above embodiment of the present invention, the kNN connection model uses the average of the k smallest dissimilarities among all possible element-pair dissimilarities between two image feature clusters. It is robust when recovering elongated or connected clusters, because it considers k supporting element pairs rather than relying on a single one.
D: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not; if yes, executing step E; if not, executing step F.
Illustratively, it is judged whether the difference value between image feature cluster a and image feature cluster b is less than the difference threshold δ_D;
if the difference value is less than the difference threshold δ_D, the similarity between image feature cluster a and image feature cluster b is relatively high, and they can be merged into one image feature cluster; step E is then executed.
if the difference value is not less than the difference threshold δ_D, i.e., greater than or equal to δ_D, the similarity between image feature cluster a and image feature cluster b is relatively low, and they cannot be merged into one image feature cluster; step F is then executed.
E: and D, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not less than the difference threshold value.
Illustratively, in this step, image feature cluster a and image feature cluster b are merged into one image feature cluster, obtaining a new image feature cluster C_q; the new cluster C_q is then added to the active cluster set θ_t, and clusters a and b are deleted from θ_t, namely:
θ_t = (θ_t − {C_a, C_b}) ∪ {C_q}.
It will be appreciated that at the first iteration, the new image feature cluster C_q is added to the active cluster set θ_0, and clusters a and b are deleted from θ_0, namely:
θ_0 = (θ_0 − {C_a, C_b}) ∪ {C_q}.
After the active cluster set has been updated, step B is repeated, and the loop continues until the difference value of every image feature cluster pair is not less than the difference threshold δ_D, at which point the loop ends.
F: and under the condition that the difference values of other image feature cluster pairs are not smaller than a preset difference threshold value, taking the image feature cluster as a combined image feature cluster.
Illustratively, if and only if the difference value between image feature cluster a and image feature cluster b is greater than or equal to the difference threshold δ_D, and at the same time the difference values between all other image feature cluster pairs are also greater than or equal to δ_D, the iteration ends, and the clustering result after this iteration is taken as the final clustering result, yielding a number of clustered image feature clusters.
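Steps B through F amount to a greedy agglomerative loop; the following is a compact sketch under assumed inputs (lists of feature coordinate arrays, Euclidean dissimilarity, and the mean-of-k-smallest linkage), not the patent's exact procedure:

```python
import numpy as np

def d_knn_pair(k, A, B):
    """Mean of the k smallest pairwise distances between clusters A and B."""
    d = np.linalg.norm(A[:, None] - B[None, :], axis=2).ravel()
    return float(np.sort(d)[:min(k, d.size)].mean())

def aggregate(clusters, k, delta_D):
    """Greedy bottom-up aggregation (steps B-F): repeatedly merge the most
    similar cluster pair until every pair's kNN dissimilarity is >= delta_D.
    `clusters` is a list of (n_i, dim) coordinate arrays."""
    clusters = [np.asarray(c, float) for c in clusters]
    while len(clusters) > 1:
        best, pair = None, None
        for i in range(len(clusters)):          # step B: all pairs
            for j in range(i + 1, len(clusters)):
                dv = d_knn_pair(k, clusters[i], clusters[j])
                if best is None or dv < best:
                    best, pair = dv, (i, j)
        if best >= delta_D:
            break                               # step F: nothing to merge
        i, j = pair                             # step E: merge and repeat
        merged = np.vstack([clusters[i], clusters[j]])
        clusters = [c for t, c in enumerate(clusters) if t not in (i, j)]
        clusters.append(merged)
    return clusters

# Three 1-D "features": the first two are close, the third is far away.
out = aggregate([[[0.0]], [[0.2]], [[9.0]]], k=1, delta_D=1.0)
```

The close pair merges into one cluster and the distant feature survives on its own, so two clusters remain.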
In step S102, the kNN clustering algorithm reflects the connectivity between deformable object parts and the compactness of the object parts. In each aggregation step, image feature clusters with high similarity may be merged into a larger image feature cluster, while image feature clusters with low similarity may still be smaller image feature clusters, thereby reducing the number of image feature clusters.
In embodiments of the present invention, clustering image features is analogous to giving each image feature a label. In recent years, robust feature correspondence methods have been proposed to account for geometric distortion of objects between images. They formulated visual correspondence as an image matching problem by defining an objective function based on photometric similarity and corresponding geometric compatibility. Although these methods show good performance, they all deal with the weakly supervised case where the outlier ratio is relatively low, i.e. two images have a common object or a model image is used. However, in the real world, image pairs may have significant clutter, multiple common objects, and even correspondence between multiple objects.
Therefore, the feature correspondence problem requires searching, in an unsupervised manner, across multiple objects with significant outliers and multiple corresponding clusters. The method aims to establish feature correspondences and their object-based clusters under the significant clutter and deformation of arbitrary images, based on the following points:
in the embodiment of the invention, a bottom-up aggregation strategy is applied in step S101: starting from reliable initial matches and progressively merging them with reliable neighbors, inlier matches can be collected efficiently despite the presence of a large number of distracting outliers. For example, seed-based detection methods show that such bottom-up aggregation with iterative match propagation can improve target recognition performance.
In addition, the embodiment of the invention also considers the connectivity among all parts: for deformable objects, feature correspondences do not form global compactness on two-by-two of their geometric similarities due to deformation, with deformed portions locally connected by some intervening portions. Therefore, for the clusters corresponding to the deformed object features, the connectivity criterion needs to be considered.
In summary, the embodiment of the present invention forms compact corresponding clusters in the early stage, gradually merges local connected clusters adapted to the object deformation part, and finally performs progressive graph matching optimization by constructing a graph model, thereby achieving a better matching effect.
S103: according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, candidate matching aiming at the image to be matched and other images to be matched is obtained, for each candidate matching, the similarity value of the candidate matching is obtained according to the distribution matrix and the symmetrical similarity matrix of the candidate matching, and the indication vector of the candidate matching is obtained according to the maximum value of the similarity value.
The specific step S103 may include the following steps:
g: after the image P and the image Q are respectively processed in step S102, clustered image feature clusters are respectively obtained. And respectively constructing corresponding activation maps for the image P and the image Q:
when the indication vector of the candidate match is obtained, several iterations are also required, and the following description will be given by taking the t-th iteration as an example:
in the t-th iteration, the activation graph corresponding to image P is G_t^P = (V_t^P, E_t^P), wherein
V_t^P is the set of nodes (image feature clusters) contained in the activation graph corresponding to image P, i.e., the image feature set of image P at the t-th iteration; E_t^P is the set of edges contained in the activation graph corresponding to image P;
in the t-th iteration, the activation graph corresponding to image Q is G_t^Q = (V_t^Q, E_t^Q), wherein
V_t^Q is the set of nodes (image feature clusters) contained in the activation graph corresponding to image Q, i.e., the image feature set of image Q at the t-th iteration; E_t^Q is the set of edges contained in the activation graph corresponding to image Q.
Then, for each image feature cluster in the activation graph G_t^P, candidate matches are established with the image feature clusters in G_t^Q, where
C_t is the set of candidate matches at the t-th iteration.
It is emphasized that in constructing the candidate matches, the following constraints need to be followed:
obtaining the constraint corresponding to the candidate matches: M_t 1_{n_Q} ≤ 1_{n_P} and M_t^T 1_{n_P} ≤ 1_{n_Q}, wherein
m_t is the initial indication vector corresponding to the candidate matches; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_n denotes the all-ones vector of size n, and the inequalities hold element-wise.
The meaning of the above formula is a bidirectional constraint: it enforces a one-to-one matching between V_t^P and V_t^Q, so that M_t is an assignment matrix.
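The bidirectional constraint (each cluster matched at most once in either direction) can be checked with a short helper; the function name and the toy matrices are illustrative only:

```python
import numpy as np

def is_assignment(M) -> bool:
    """Check the bidirectional matching constraint: M is 0/1 valued and
    every row sum and every column sum is at most 1, i.e. each cluster of
    P matches at most one cluster of Q and vice versa."""
    M = np.asarray(M)
    binary = np.isin(M, (0, 1)).all()
    return bool(binary
                and (M.sum(axis=1) <= 1).all()
                and (M.sum(axis=0) <= 1).all())

ok = is_assignment([[1, 0, 0], [0, 0, 1]])    # one-to-one: valid
bad = is_assignment([[1, 1, 0], [0, 0, 1]])   # a row sum of 2: invalid
```

Note that all-zero rows or columns are permitted: the constraint is "at most one", not "exactly one", so some clusters may remain unmatched.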
The following description will be given of the instruction vector acquisition process taking one of several candidate matches between the image P and the image Q as an example.
H: fig. 2 is a schematic diagram of a matching relationship between image feature clusters in a cluster-based progressive graph matching method for a deformation graph according to an embodiment of the present invention, as shown in fig. 2,
In fig. 2, v_i^P is the i-th image feature cluster in image P; v_j^P is the j-th image feature cluster in image P; v_a^Q is the a-th image feature cluster in image Q; v_b^Q is the b-th image feature cluster in image Q; Γ_jb is the affine homography from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched; Γ_ia is the affine homography from image feature cluster i in the first image to be matched to image feature cluster a in the second image to be matched.
The affine region feature i centered at v_i^P can be represented by an elliptical region whose orientation is estimated from the dominant direction of the gradient histogram of the local region. With these features, the affine homography Γ_ia(·) from an image feature cluster i in the first image to be matched P to another image feature cluster a in the second image to be matched Q can be derived, so that for the neighborhoods m_P and m_Q of the two points there holds: m_Q = Γ_ia(m_P). Then, given two matches (i, a) and (j, b), as shown in fig. 2, the transfer error of (j, b) with respect to (i, a) is denoted d_{jb|ia}.
I: using the formula d_{jb|ia} = ||x_b − Γ_ia(x_j)||, the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched is calculated, wherein
d_{jb|ia} is the transfer error between the candidate match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the candidate match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is the norm function; x_b is the center of image feature cluster b in the second image to be matched; Γ_ia(x_j) is the result of applying the affine homography from image feature cluster i to image feature cluster a to x_j; x_j is the center of image feature cluster j in the first image to be matched. It will be appreciated that in this step an image feature cluster can serve as a node for iterative processing.
If Γ_ia maps the center of feature v_j close to the center of image feature cluster v_b, then the value of d_{jb|ia} is small.
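A minimal sketch of the transfer error d_{jb|ia} = ||x_b − Γ_ia(x_j)||, assuming Γ_ia is given as an affine map (matrix A plus translation t); the specific numbers are illustrative only:

```python
import numpy as np

def transfer_error(x_j, x_b, A_ia, t_ia) -> float:
    """d_{jb|ia} = || x_b - Gamma_ia(x_j) ||, where Gamma_ia is the affine
    map estimated from the region match (i, a): Gamma(x) = A x + t."""
    x_j, x_b = np.asarray(x_j, float), np.asarray(x_b, float)
    return float(np.linalg.norm(x_b - (A_ia @ x_j + t_ia)))

# If (j, b) is geometrically consistent with the affine map of (i, a),
# the transfer error is exactly zero.
A = np.array([[2.0, 0.0], [0.0, 2.0]])   # toy affine part (pure scaling)
t = np.array([1.0, 1.0])                 # toy translation
x_j = np.array([3.0, 4.0])               # center of cluster j in P
x_b = A @ x_j + t                        # perfectly transferred center
err = transfer_error(x_j, x_b, A, t)     # -> 0.0
```

A large error means (j, b) disagrees with the geometry implied by (i, a), which is exactly what the second-order similarity below penalizes.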
J: according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, a formula is used to calculate the symmetric similarity value corresponding to the candidate matches, wherein
W_{ia;jb} is the symmetric similarity value corresponding to the candidate matches, the candidate matches comprising the edge (i, j) in P and the edge (a, b) in Q; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; the candidate matches correspond to a second-order similarity function of symmetric transfer errors between the matched image feature clusters; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset image-feature-cluster similarity threshold; max() is the maximum-value evaluation function.
The above function may also be referred to as the STE (symmetric transfer error) second-order similarity function.
K: constructing a symmetric similarity matrix using the symmetric similarity values corresponding to the candidate matches as its elements.
Illustratively, W_{ia;jb} may be used as an element of the symmetric similarity matrix, thereby constructing the symmetric similarity matrix between image P and image Q.
L: based on the constraint matrix corresponding to the candidate matches and the symmetric similarity matrix corresponding to the candidate matches, the similarity value of the candidate matches is obtained using the formula S(M_t) = M_t^T W M_t, wherein
S(M_t) is the similarity value of the candidate matches at the t-th iteration with assignment matrix M_t; M_t^T is the transpose of the assignment matrix M_t at the t-th iteration; W is the symmetric similarity matrix corresponding to the candidate matches.
The second-order similarity function is encoded in the symmetric similarity matrix W; each of its elements W_{ia;jb} involves two matching nodes, (i, a) and (j, b).
m: obtaining the indication vector of the candidate match according to the maximum value of the similarity value, including:
using the formula m_t = argmax S(M_t), the indication vector corresponding to the candidate matches is computed, wherein
m_t is the indication vector corresponding to the candidate matches; argmax is the variable evaluation function returning the argument that maximizes the function; S(M_t) is the similarity value of the candidate matches.
In this step, the matching relationship between the image feature clusters corresponding to the candidate match with the largest similarity value is used as an indication vector.
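Maximizing m^T W m over binary indicator vectors is combinatorial; a common continuous relaxation (spectral matching via power iteration, shown here as an illustration rather than the patent's exact solver) takes the principal eigenvector of W as a soft indication vector:

```python
import numpy as np

def principal_match_vector(W: np.ndarray, iters: int = 100) -> np.ndarray:
    """Power iteration on the similarity matrix W: its principal
    eigenvector is a standard continuous relaxation of
    argmax_m m^T W m subject to ||m|| = 1 (spectral matching)."""
    m = np.ones(W.shape[0]) / np.sqrt(W.shape[0])  # uniform start
    for _ in range(iters):
        m = W @ m
        m /= np.linalg.norm(m)                     # renormalize each step
    return m

# Candidate matches 0 and 3 reinforce each other, so the mass of the
# eigenvector concentrates on those two entries.
W = np.zeros((4, 4))
W[0, 3] = W[3, 0] = 1.0
m = principal_match_vector(W)
```

The soft vector is then discretized (e.g. thresholded or greedily assigned) to recover a binary indication vector satisfying the one-to-one constraint.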
After hierarchical clustering is completed, a graph-model-based progressive graph matching method is used. This stage consists of two alternating processes: graph matching and graph progression. Graph matching operates on the activation graph converted from the results of the hierarchical clustering algorithm, which contains multi-target matches with fewer features; graph progression updates the activation graph and its similarity matrix so as to improve the score of the next round of graph matching. The goal of progressive graph matching is to further maximize the graph matching score by adapting and reconstructing the graphs G_P and G_Q.
S104: judging whether the similarity value corresponding to the indication vector is converged; if not, executing S105; if yes, go to step S106.
Exemplarily, it is judged whether the difference between the maximum similarity value corresponding to the indication vector at the t-th iteration and the maximum similarity value at the (t−1)-th iteration is smaller than a preset threshold;
if yes, the similarity value corresponding to the indication vector is considered converged, and step S106 is executed;
if not, the similarity value corresponding to the indication vector has not converged, and step S105 is executed.
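The convergence test in S104 (score gain between consecutive iterations below a preset threshold) reduces to a one-liner; the threshold value used here is an assumption:

```python
def converged(scores, eps: float = 1e-6) -> bool:
    """S104-style stopping rule: the similarity score is considered
    converged once the gain over the previous iteration is below eps."""
    return len(scores) >= 2 and abs(scores[-1] - scores[-2]) < eps

still_improving = converged([1.0, 1.5])        # gain 0.5   -> not converged
stalled = converged([1.0, 1.5, 1.5 + 1e-9])    # gain 1e-9  -> converged
```

Because the progression step enforces a non-decreasing score (see step R below, per the text), this stopping rule is guaranteed to trigger eventually for a bounded score.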
S105: according to the candidate matched indication vectors, obtaining the confidence degrees of the indication vectors, and under the condition that the confidence degrees of the indication vectors are not smaller than a preset threshold value, updating element values in the matching matrix to obtain an updated matching matrix; and returning to execute the step S103 until the similarity values corresponding to the indication vectors are converged, and taking the corresponding relationship of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity values corresponding to each indication vector are all converged.
The specific step S105 may include the following steps:
N: using a formula, the probability p(V_Q | V_P, M = m_i, Match_t) that the image feature clusters V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i can be obtained, wherein
p(V_Q | V_P, M = m_i, Match_t) is the probability that the image feature clusters V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; v_Q is an image feature cluster in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix, consisting of an edge between image feature cluster p and image feature cluster i in the first image to be matched and an edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbor image feature function, which, given m_i as the intermediate variable, returns the nearest-neighbor feature of an image feature cluster; k is the number of minimum dissimilarity points of an image feature cluster pair; k_2 is the second kNN clustering parameter for the image feature clusters in the current iteration, representing the k_2-nearest neighbors;
Z is a normalization function; exp() is the exponential function with the natural base e; d_{jb|m_i} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable.
In graph progression, given the current graph matching M_t (of dimension h × w, where graph P has h features and graph Q has w features), the entries of M_t equal to 1 are examined, and the matching edges Match_t between graph P and graph Q are obtained.
O: using a formula, the probability p(V_P = v_P | M = m_i, Match_t) of selecting m_i in Match_t as the intermediate variable is obtained, wherein
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i in Match_t as the intermediate variable; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first kNN clustering parameter for the image feature clusters in the current iteration, representing the k_1-nearest neighbors;
P: using a formula, the probability p(M = m_i | Match_t) that the image feature clusters V_Q in Match_t are correlated with the intermediate variable m_i is obtained, wherein
p(M = m_i | Match_t) is the probability that the image feature clusters V_Q in Match_t are correlated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
Q: a formula can be used to obtain p(V_P, V_Q | M_t), wherein
p(V_P, V_Q | M_t) is the confidence of the indication vector, i.e., the conditional probability of the candidate matches between the two activation graphs; M_t is the matching matrix; p(V_Q | V_P, M = m_i, Match_t) is the probability obtained in step N; p(V_P | M = m_i, Match_t) is the probability obtained in step O; p(M = m_i | Match_t) is the probability obtained in step P.
R: when the confidence of the indication vector is not less than the preset threshold, the element values in the matching matrix are updated to obtain an updated matching matrix:
it is judged whether the confidence is greater than or equal to the preset threshold μ; the pairs (v_P, v_Q) whose confidence exceeds μ, together with the edges they construct, are taken as elements of the new candidate matching set C_{t+1}, and the new activation graphs are represented by the nodes and edges in C_{t+1}.
According to the correspondence between the image feature clusters corresponding to the indication vector, the corresponding element value in the matching matrix is found and updated to the confidence of the indication vector, which completes the update of the matching matrix; step S103 is then executed again until the similarity values corresponding to the indication vectors converge. In progressive graph matching, the constraint guarantees a non-decreasing score, so that an optimal graph matching is achieved at each step. The iteration continues until the similarity values converge.
And under the condition that the similarity values corresponding to the indication vectors are all converged, taking the corresponding relation of the image features in the matching matrix as the matching result of the image to be matched.
If the nearest neighbor of a candidate already exists in the current matching Match_t, i.e. in the graph, the matching is considered very reliable and achieves the maximum score;
otherwise, when that condition does not hold, its k nearest neighbors obtain the maximum score.
In practical application, p(V_P, V_Q | Match_t) can be used to determine the value of the threshold μ; in addition, the intermediate variables are introduced for convenience of calculation.
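To make step R concrete, the thresholded expansion of the candidate set can be sketched as follows; the function name expand_candidates and the default value of mu are illustrative, not taken from the patent.

```python
def expand_candidates(candidates, confidences, mu=0.5):
    """Step R, sketched: keep every candidate match (v_P, v_Q) whose
    confidence p(V_P, V_Q | Match_t) reaches the threshold mu; the
    survivors form the next candidate matching set C_{t+1}."""
    return [c for c, p in zip(candidates, confidences) if p >= mu]
```

The new activation graphs are then rebuilt from the nodes and edges of the returned set.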
S106: and under the condition that the similarity values corresponding to the indication vectors are all converged, taking the corresponding relation of the image features in the matching matrix as the matching result of the image to be matched.
Exemplarily, in the t-th iteration, the other candidate-match indication vectors are processed as in steps H to S104, and the iteration ends only when the similarity values of all indication vectors have converged;
if the similarity value of the indication vector of any candidate match does not converge, step S105 needs to be performed.
By applying the embodiment of the invention shown in fig. 1: among the features extracted from the image to be matched, many features carry more or less related information, and this related information can be used to mark the features. Therefore, after the image features are extracted, they are clustered, features with similar attributes in the feature set are mined, and the feature set is divided organically, so that the data volume of the features is reduced, the amount of computation in the feature matching process is reduced, and the efficiency of image feature matching is improved.
Corresponding to the embodiment of the invention shown in fig. 1, the embodiment of the invention also provides a progressive graph matching device of the deformation graph based on clustering.
Fig. 3 is a schematic structural diagram of a progressive graph matching apparatus based on clustering deformation maps according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes:
a first obtaining module 301, configured to obtain an image feature of an image to be matched, where the image to be matched includes: the image matching method comprises the following steps of obtaining a first image to be matched and a second image to be matched, wherein the image characteristics comprise: image geometry features;
a merging module 302, configured to merge two adjacent image features with a difference value smaller than a preset threshold into one image feature, so as to obtain an image feature cluster;
a second obtaining module 303, configured to obtain candidate matches for the image to be matched and other images to be matched according to matching relationships between image feature clusters of the image to be matched and image feature clusters of other images to be matched, obtain a similarity value of the candidate matches according to a distribution matrix and a symmetric similarity matrix of the candidate matches for each candidate match, and obtain an indication vector of the candidate match according to a maximum value of the similarity value;
a judging module 304, configured to judge whether a similarity value corresponding to the indication vector converges;
a third obtaining module 305, configured to, if the determination result of the determining module 304 is negative, obtain the confidence of the indication vector of each candidate match and, if the confidence of an indication vector is not smaller than a preset threshold, update the element values in the matching matrix to obtain an updated matching matrix; and to trigger the second obtaining module 303 until the similarity values corresponding to the indication vectors converge, and, once the similarity values corresponding to all indication vectors have converged, take the correspondence of the image features in the matching matrix as the matching result of the images to be matched;
a setting module 306, configured to, when the determination result of the determining module 304 is yes, take the corresponding relationship of each image feature in the matching matrix as the matching result of the image to be matched when the similarity value corresponding to each indication vector is converged.
By applying the embodiment of the invention shown in fig. 3: among the features extracted from the image to be matched, many features carry more or less related information, and this related information can be used to mark the features. Therefore, after the image features are extracted, they are clustered, features with similar attributes in the feature set are mined, and the feature set is divided organically, so that the data volume of the features is reduced, the amount of computation in the feature matching process is reduced, and the efficiency of image feature matching is improved.
In a specific implementation manner of the embodiment of the present invention, the merging module 302 is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
k is the number of all possible element pair dissimilarity points in the image feature cluster pair; k is a radical of AP Is a preset first control parameter; c a The number of elements in the image feature cluster a; c b The number of elements in the image feature cluster b; r is AP Is a preset second control parameter; | is a Euclidean distance function; i C a ||C b L is the number of possible pairs of elements between pairs of image feature clusters;
c: by means of the formula (I) and (II),and obtaining the difference value of the image characteristic cluster pair, wherein,
D kNN (k,C a ,C b ) The difference value between the image feature cluster a and the image feature cluster b in the image feature cluster pair is obtained; f is the number of pairing element pairs between the image feature cluster a and the image feature cluster b in the image feature cluster pair; min is a minimum evaluation function; sigma is a summation function; d (m) i ,m j ) For the element m in the image feature cluster a i With the element m in the image feature cluster b j The similarity between them; i is the element m i The serial number of (2); j is an element m j The serial number of (2);
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
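The merging steps a to f above can be sketched as a greedy agglomerative loop. This is an illustrative reading only: the control-parameter defaults (k_ap, r_ap), the Euclidean element distance, and the mean-of-k-smallest dissimilarity are assumptions, since the patent's formula images are not reproduced here.

```python
import numpy as np

def knn_dissimilarity(Ca, Cb, k_ap=3, r_ap=0.5):
    """Difference value of a cluster pair: the mean of the k smallest
    pairwise element distances, with k derived from the control
    parameters k_ap and r_ap (hypothetical defaults)."""
    # all pairwise Euclidean distances between elements of the two clusters
    d = np.linalg.norm(Ca[:, None, :] - Cb[None, :, :], axis=2).ravel()
    # number of minimum-dissimilarity pairs, capped by the available pairs
    k = min(len(d), max(1, int(k_ap + r_ap * min(len(Ca), len(Cb)))))
    return np.sort(d)[:k].mean()

def merge_clusters(clusters, threshold):
    """Steps b-f, sketched: repeatedly merge the closest cluster pair
    while its difference value stays below the threshold."""
    clusters = [np.asarray(c, dtype=float) for c in clusters]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                dv = knn_dissimilarity(clusters[i], clusters[j])
                if best is None or dv < best[0]:
                    best = (dv, i, j)
        if best[0] >= threshold:        # step f: no pair is similar enough
            break
        dv, i, j = best                 # step e: merge and repeat
        merged = np.vstack([clusters[i], clusters[j]])
        clusters = [c for t, c in enumerate(clusters) if t not in (i, j)]
        clusters.append(merged)
    return clusters
```

Two nearby single-element clusters are merged while a distant one survives, which matches the intended behavior of the loop.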
In a specific implementation manner of the embodiment of the present invention, the second obtaining module 303 is configured to:
obtaining the constraint condition corresponding to the candidate match, wherein:
m_t is the initial indication vector corresponding to the candidate match; M_t is the allocation matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} is an all-ones vector of size n_P; 1_{n_Q} is an all-ones vector of size n_Q;
using the formula, the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched is calculated, wherein:
d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is the norm function; the terms of the formula denote, respectively, image feature cluster b in the second image to be matched, the affine homography transformation result of image feature cluster j in the first image to be matched toward image feature cluster b in the second image to be matched, and image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, using the formula, the symmetric similarity value corresponding to the candidate match is calculated, wherein:
W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match containing the two corresponding edges; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; the second-order similarity function of symmetric transfer errors corresponds to the pair of matched image feature clusters of the candidate match; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset threshold for image feature cluster similarity; max() is the maximum value evaluation function;
according to the symmetrical similarity value corresponding to the candidate matching, the symmetrical similarity value is used as an element to construct a symmetrical similarity matrix;
using the formula S(M_t) = M_t^T W M_t, based on the constraint matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, the similarity value of the candidate match is obtained, wherein:
S(M_t) is the similarity value of the candidate match whose allocation matrix is M_t at the t-th iteration; M_t^T is the transpose of the allocation matrix M_t at the t-th iteration; W is the symmetric similarity matrix corresponding to the candidate match.
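A minimal sketch of this similarity computation follows. The score S(M_t) = M_t^T W M_t is taken directly from the text, with the allocation matrix flattened to a vector m; the 2x3 affine map inside the transfer error and the averaging of the four transfer errors inside W_{ia;jb} are assumptions, since the patent's formula images are not reproduced here.

```python
import numpy as np

def transfer_error(p_j, q_b, A_ia):
    """d_{jb|ia}: distance between feature q_b and the image of feature
    p_j under the affine map A_ia (a 2x3 matrix, assumed form) estimated
    from the anchor match (i, a)."""
    return float(np.linalg.norm(q_b - A_ia @ np.append(p_j, 1.0)))

def similarity_entry(d_jb_ia, d_ia_jb, d_bj_ai, d_ai_bj, alpha=5.0):
    """W_{ia;jb}: symmetrised second-order similarity, thresholded at
    alpha so that dissimilar pairs contribute zero (assumed form)."""
    return max(alpha - (d_jb_ia + d_ia_jb + d_bj_ai + d_ai_bj) / 4.0, 0.0)

def match_score(m, W):
    """S(M_t) = M_t^T W M_t with the allocation matrix flattened to m."""
    return float(m @ W @ m)
```

Filling W entry by entry with similarity_entry and scoring a candidate indication vector with match_score reproduces the quadratic objective described above.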
In a specific implementation manner of the embodiment of the present invention, the second obtaining module 303 is configured to:
using the formula, the indication vector corresponding to the candidate match is computed, wherein:
the left-hand side is the indication vector corresponding to the candidate match; argmax(·) is the evaluation function returning the variable that maximizes the function; S(M_t) is the similarity value of the candidate match.
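The patent does not spell out how this argmax is computed. A standard relaxation of maximizing m^T W m (spectral matching in the style of Leordeanu and Hebert) finds the principal eigenvector of W by power iteration and then binarises it; the sketch below follows that reading, and the binarisation rule (keep entries above half the largest component) is illustrative.

```python
import numpy as np

def indication_vector(W, iters=100):
    """Relaxed argmax of S(m) = m^T W m over unit-norm m, via power
    iteration on the symmetric similarity matrix W."""
    n = W.shape[0]
    m = np.full(n, 1.0 / np.sqrt(n))    # uniform start vector
    for _ in range(iters):
        nxt = W @ m
        nrm = np.linalg.norm(nxt)
        if nrm == 0.0:                  # degenerate W: keep current m
            break
        m = nxt / nrm
    # binarise: keep components above half the maximum as selected matches
    return (m >= 0.5 * m.max()).astype(int)
```

For a diagonal W the iteration concentrates all mass on the dominant candidate, as expected of a power method.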
In a specific implementation manner of the embodiment of the present invention, the third obtaining module 305 is configured to:
using the formula, the probability that the image feature cluster V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i is obtained, wherein:
p(V_Q | V_P, M = m_i, Match_t) is the probability that the image feature cluster V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix; NN(·) is the nearest-neighbor image feature function; k is the number of minimum-dissimilarity points of the image feature cluster pair; k_2 is the second parameter of the image feature clustering in the current kNN iteration, representing the k_2 nearest neighbors; Z is a normalization function; exp() is the exponential function with the natural base e; the formula further involves the image feature cluster b in the set of image feature clusters in the second image to be matched, the image feature cluster b in the set of image feature clusters in the first image to be matched, the edge between image feature cluster p and image feature cluster i in the first image to be matched, the edge between image feature cluster q and image feature cluster i in the second image to be matched, the nearest-neighbor feature of the image feature cluster when m_i is the intermediate variable, and the transfer error between the intermediate variable and the match (j, b) formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched;
using the formula, the probability p(V_P = v_P | M = m_i, Match_t) of selecting m_i in Match_t as the intermediate variable is obtained, wherein:
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i in Match_t as the intermediate variable; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first parameter of the image feature clustering in the current kNN iteration, representing the k_1 nearest neighbors;
using the formula, the probability p(M = m_i | Match_t) that the image feature cluster V_Q in Match_t is correlated with the intermediate variable m_i is obtained, wherein:
p(M = m_i | Match_t) is the probability that the image feature cluster V_Q in Match_t is correlated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
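The confidence combines the three probabilities computed above by summing over the intermediate variables m_i; a sketch, assuming each probability is supplied as an array with one entry per matching edge m_i:

```python
import numpy as np

def confidence(p_vq_given_vp_mi, p_vp_given_mi, p_mi):
    """p(V_P, V_Q | M_t) as the sum over intermediate variables m_i of
    p(V_Q | V_P, m_i) * p(V_P | m_i) * p(m_i), marginalising m_i out."""
    terms = (np.asarray(p_vq_given_vp_mi)
             * np.asarray(p_vp_given_mi)
             * np.asarray(p_mi))
    return float(terms.sum())
```

The resulting scalar is what is compared against the preset threshold μ when deciding whether to admit the candidate into the next matching set.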
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.
Claims (10)
1. A progressive graph matching method for a cluster-based deformation graph, the method comprising:
1) The image characteristics of the image to be matched are obtained, wherein the image to be matched comprises: the image matching method comprises the following steps of obtaining a first image to be matched and a second image to be matched, wherein the image characteristics comprise: an image geometry feature;
2) Combining two adjacent image features of which the difference value is smaller than a preset threshold value into one image feature to obtain an image feature cluster;
3) Acquiring candidate matching aiming at the image to be matched and other images to be matched according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, acquiring a similarity value of the candidate matching according to a distribution matrix and a symmetrical similarity matrix of the candidate matching aiming at each candidate matching, and acquiring an indication vector of the candidate matching according to the maximum value of the similarity value;
4) Judging whether the similarity value corresponding to the indication vector is converged;
5) If not, acquiring the confidence coefficient of the indication vector according to the indication vector of each candidate matching, and updating the element value in the matching matrix under the condition that the confidence coefficient of the indication vector is not less than a preset threshold value to obtain an updated matching matrix; and returning to execute the step 3) until the similarity values corresponding to the indication vectors are converged, and taking the corresponding relation of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity values corresponding to each indication vector are converged;
6) And if so, taking the corresponding relation of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity value corresponding to each indication vector is converged.
2. The progressive graph matching method based on clustering of deformation maps according to claim 1, wherein the step 2) comprises:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
k is the number of all possible element pair dissimilarity points in the image feature cluster pair; k is a radical of AP Is a preset first control parameter;C a the number of elements in the image feature cluster a; c b The number of elements in the image feature cluster b; r is AP Is a preset second control parameter; | · | is the euclidean distance function; i C a ||C b L is the number of possible pairs of elements between pairs of image feature clusters;
c: by means of the formula (I) and (II),and obtaining the difference value of the image characteristic cluster pair, wherein,
D kNN (k,C a ,C b ) The difference value between the image feature cluster a and the image feature cluster b in the image feature cluster pair is obtained; gamma is the number of the pairing element pairs between the image feature cluster a and the image feature cluster b in the image feature cluster pair; min is a minimum evaluation function; sigma is a summation function; d (m) i ,m j ) For the element m in the image feature cluster a i With element m in image feature cluster b j The similarity between them; i is the element m i The serial number of (2); j is the element m j The serial number of (2);
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
3. The progressive graph matching method based on clustering of deformation graphs according to claim 1, wherein the obtaining of similarity values of the candidate matches according to the distribution matrix and the symmetric similarity matrix of the candidate matches comprises:
obtaining the constraint condition corresponding to the candidate match, wherein:
m_t is the initial indication vector corresponding to the candidate match; M_t is the allocation matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} is an all-ones vector of size n_P; 1_{n_Q} is an all-ones vector of size n_Q;
using the formula, calculating the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein:
d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is the norm function; the terms of the formula denote, respectively, image feature cluster b in the second image to be matched, the affine homography transformation result of image feature cluster j in the first image to be matched toward image feature cluster b in the second image to be matched, and image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, using the formula, calculating the symmetric similarity value corresponding to the candidate match, wherein:
W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match containing the two corresponding edges; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; the second-order similarity function of symmetric transfer errors corresponds to the pair of matched image feature clusters of the candidate match; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset threshold for image feature cluster similarity; max() is the maximum value evaluation function;
according to the symmetric similarity value corresponding to the candidate match, constructing the symmetric similarity matrix with the symmetric similarity values as elements;
using the formula S(M_t) = M_t^T W M_t, based on the constraint matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, obtaining the similarity value of the candidate match, wherein:
S(M_t) is the similarity value of the candidate match whose allocation matrix is M_t at the t-th iteration; M_t^T is the transpose of the allocation matrix M_t at the t-th iteration; W is the symmetric similarity matrix corresponding to the candidate match.
4. The progressive graph matching method based on cluster deformation graph according to claim 1, wherein said obtaining the indication vector of the candidate match according to the maximum value of the similarity value comprises:
using the formula, computing the indication vector corresponding to the candidate match, wherein the left-hand side is the indication vector corresponding to the candidate match, argmax(·) is the evaluation function returning the variable that maximizes the function, and S(M_t) is the similarity value of the candidate match.
5. The progressive graph matching method based on clustering deformation graphs according to claim 1, wherein the obtaining the confidence of the indication vector comprises:
using the formula, obtaining the probability that the image feature cluster V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i, wherein:
p(V_Q | V_P, M = m_i, Match_t) is the probability that the image feature cluster V_Q and the set of image feature clusters V_P in Match_t are correlated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix; NN(·) is the nearest-neighbor image feature function; k is the number of minimum-dissimilarity points of the image feature cluster pair; k_2 is the second parameter of the image feature clustering in the current kNN iteration; Z is a normalization function; exp() is the exponential function with the natural base e; the formula further involves the image feature cluster b in the set of image feature clusters in the second image to be matched, the image feature cluster b in the set of image feature clusters in the first image to be matched, the edge between image feature cluster p and image feature cluster i in the first image to be matched, the edge between image feature cluster q and image feature cluster i in the second image to be matched, the nearest-neighbor feature of the image feature cluster when m_i is the intermediate variable, and the transfer error between the intermediate variable and the match (j, b) formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched;
using the formula, obtaining the probability p(V_P = v_P | M = m_i, Match_t) of selecting m_i in Match_t as the intermediate variable, wherein:
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i in Match_t as the intermediate variable; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first parameter of the image feature clustering in the current kNN iteration;
using the formula, obtaining the probability p(M = m_i | Match_t) that the image feature cluster V_Q in Match_t is correlated with the intermediate variable m_i, wherein:
p(M = m_i | Match_t) is the probability that the image feature cluster V_Q in Match_t is correlated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
6. An apparatus for progressive graph matching of clustering based deformation graphs, the apparatus comprising:
the first acquisition module is used for acquiring image characteristics of an image to be matched, wherein the image to be matched comprises: the image matching method comprises the following steps of obtaining a first image to be matched and a second image to be matched, wherein the image characteristics comprise: image geometry features;
the merging module is used for merging two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster;
the second acquisition module is used for acquiring candidate matching aiming at the image to be matched and other images to be matched according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, acquiring the similarity value of the candidate matching according to the distribution matrix and the symmetrical similarity matrix of the candidate matching aiming at each candidate matching, and acquiring the indication vector of the candidate matching according to the maximum value of the similarity value;
the judging module is used for judging whether the similarity value corresponding to the indication vector is converged;
a third obtaining module, configured to, if the determination result of the determining module is negative, obtain the confidence of the indication vector of each candidate match and, if the confidence of an indication vector is not smaller than a preset threshold, update the element values in the matching matrix to obtain an updated matching matrix; and to trigger the second obtaining module until the similarity values corresponding to the indication vectors converge, and, once the similarity values corresponding to all indication vectors have converged, take the correspondence of the image features in the matching matrix as the matching result of the images to be matched;
and the setting module is used for taking the corresponding relation of each image characteristic in the matching matrix as the matching result of the image to be matched under the condition that the similarity value corresponding to each indication vector is converged under the condition that the judgment result of the judging module is yes.
7. The progressive graph matching apparatus based on clustering deformation graphs according to claim 6, wherein the merging module is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: pairing the image feature clusters in the activated image, and for each image feature cluster pair, obtaining the number of minimum dissimilarity points of the pair by using a formula, wherein,
k is the number of all possible element-pair dissimilarity points in the image feature cluster pair; k_AP is a preset first control parameter; C_a is the number of elements in image feature cluster a; C_b is the number of elements in image feature cluster b; r_AP is a preset second control parameter; ‖·‖ is the Euclidean distance function; |C_a||C_b| is the number of possible element pairs between the two clusters of the pair;
c: obtaining the difference value of the image feature cluster pair by using a formula, wherein,
D_kNN(k, C_a, C_b) is the difference value between image feature cluster a and image feature cluster b in the pair; γ is the number of paired element pairs between image feature cluster a and image feature cluster b; min is the minimum-value evaluation function; Σ is the summation function; d(m_i, m_j) is the similarity between element m_i in image feature cluster a and element m_j in image feature cluster b; i is the serial number of element m_i; j is the serial number of element m_j;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: if not, and if the difference value of every other image feature cluster pair is also not smaller than the preset difference threshold value, taking each current image feature cluster as a merged image feature cluster.
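Steps a-f amount to agglomerative clustering of image features under a difference threshold. Below is a minimal sketch under stated assumptions: the k-nearest-pair dissimilarity is a plain stand-in for the claim's D_kNN difference value (whose exact formula is not reproduced in the text), and all function names are hypothetical.

```python
import numpy as np

def cluster_dissimilarity(Ca, Cb, k=3):
    """k-nearest-pair dissimilarity: average of the k smallest pairwise
    Euclidean distances between elements of clusters Ca and Cb (a plain
    stand-in for the claim's D_kNN difference value)."""
    d = np.linalg.norm(Ca[:, None, :] - Cb[None, :, :], axis=-1).ravel()
    k = min(k, d.size)                       # at most |Ca||Cb| element pairs
    return float(np.mean(np.sort(d)[:k]))

def merge_clusters(features, thresh, k=3):
    """Steps a-f: start with one cluster per image feature (step a) and
    greedily merge the most similar pair (steps b-e) until no pair's
    difference value falls below the threshold (steps d/f)."""
    clusters = [np.atleast_2d(f) for f in features]
    while True:
        best, pair = np.inf, None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                dij = cluster_dissimilarity(clusters[i], clusters[j], k)
                if dij < best:
                    best, pair = dij, (i, j)
        if pair is None or best >= thresh:   # no pair below the threshold
            break
        i, j = pair
        clusters[i] = np.vstack([clusters[i], clusters[j]])
        del clusters[j]                      # merged pair becomes one cluster
    return clusters
```

For example, two nearby features merge into one cluster while a distant feature stays its own cluster.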
8. The progressive graph matching apparatus based on clustering of deformation maps according to claim 6, wherein the second obtaining module is configured to:
obtaining the constraint condition corresponding to the candidate match, wherein,
m_t is the initial indication vector corresponding to the candidate match; M_t is the allocation matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; the two remaining terms are all-ones vectors of sizes n_P and n_Q, respectively;
calculating, by using a formula, the transfer error corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein,
d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ‖·‖ is a norm function; one term denotes image feature cluster b in the second image to be matched; another denotes the result of the affine transformation carrying image feature cluster j in the first image to be matched onto image feature cluster b in the second image to be matched; and the last denotes image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, computing, by using a formula, the symmetric similarity value corresponding to the candidate match, wherein,
W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match being one that contains the corresponding edges in the two images; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the i-th point in the first image to be matched; j is the j-th point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched; one term is the second-order similarity function of symmetric transfer errors for the candidate match's paired image feature clusters; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched, and the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched; d_{ai|bj} is the transfer error between the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched, and the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched; α is a preset image-feature-cluster similarity threshold; max() is the maximum-value evaluation function;
according to the symmetric similarity values corresponding to the candidate matches, constructing a symmetric similarity matrix with the symmetric similarity values as elements;
using a formula, S (M), based on a constraint matrix corresponding to the candidate match and a symmetric similarity matrix corresponding to the candidate match t )=M t T WM t Obtaining a similarity value for the candidate match, wherein,
S(M t ) For the t-th iteration, the distribution matrix is M t A similarity value of the candidate match; m t T For the t-th iteration, a matrix M is allocated t Transposing; w is the symmetric transmission error corresponding to the candidate match.
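The transfer error and symmetric similarity of claim 8 can be sketched directly. In this hypothetical sketch, averaging the four directional transfer errors inside `symmetric_similarity` is an assumption (the claim lists the four errors, α and max() but its exact formula is not reproduced), and the affine map `T`, `A`, `t` are illustrative values.

```python
import numpy as np

def transfer_error(x_j, x_b, T_ia):
    """d_{jb|ia}: distance between feature cluster b and the image of
    feature cluster j under the affine transform T_ia fitted to the
    match (i, a)."""
    return float(np.linalg.norm(x_b - T_ia(x_j)))

def symmetric_similarity(d_jb_ia, d_ia_jb, d_bj_ai, d_ai_bj, alpha=5.0):
    """W_{ia;jb} = max(alpha - mean of the four directional transfer
    errors, 0). Averaging the four directions is an assumption."""
    return max(alpha - (d_jb_ia + d_ia_jb + d_bj_ai + d_ai_bj) / 4.0, 0.0)

# Usage: a correspondence that the affine map explains exactly.
A = np.array([[2.0, 0.0], [0.0, 2.0]])   # hypothetical affine part
t = np.array([1.0, -1.0])                # hypothetical translation
T = lambda x: A @ x + t
x_j = np.array([1.0, 1.0])
x_b = T(x_j)                             # b sits exactly where T sends j
d = transfer_error(x_j, x_b, T)          # zero transfer error
w = symmetric_similarity(d, d, d, d)     # similarity saturates at alpha
```

When the transform carries j exactly onto b, the transfer error is zero and the similarity reaches its maximum α; as the errors grow past α, the similarity is clipped to zero by max().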
9. The progressive graph matching apparatus based on clustering of deformation maps according to claim 6, wherein the second obtaining module is configured to:
computing, by using a formula, the indication vector corresponding to the candidate match.
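Claim 9 extracts the indication vector that maximises the similarity value S(M_t) = M_t^T W M_t under the one-to-one constraints of claim 8. A common way to approximate this, assumed here since the claim's exact formula is not reproduced, is spectral relaxation: power iteration for the leading eigenvector of W, then greedy discretisation into a 0/1 indication vector.

```python
import numpy as np

def indication_vector(W, n_p, n_q, iters=100):
    """Spectral-style sketch (a hypothetical stand-in for the claimed
    formula): take the leading eigenvector of the symmetric similarity
    matrix W by power iteration, then greedily binarise it under
    one-to-one constraints to obtain the 0/1 indication vector."""
    v = np.ones(n_p * n_q)
    for _ in range(iters):                 # power iteration
        v = W @ v
        n = np.linalg.norm(v)
        if n == 0:
            break
        v /= n
    x = np.zeros_like(v)
    order = np.argsort(-v)                 # strongest candidates first
    used_p, used_q = set(), set()
    for idx in order:                      # greedy one-to-one discretisation
        i, a = divmod(idx, n_q)
        if v[idx] <= 0 or i in used_p or a in used_q:
            continue
        x[idx] = 1.0
        used_p.add(i)
        used_q.add(a)
    return x
```

On a similarity matrix that couples the matches (0, 0) and (1, 1), the leading eigenvector concentrates on those two entries and the discretised indication vector selects exactly them.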
10. The progressive graph matching apparatus based on clustering of deformation graphs according to claim 6, wherein the third obtaining module is configured to:
obtaining, by using a formula, the probability that the image feature cluster set V_Q and the image feature cluster set V_P in Match_t are correlated with the intermediate variable m_i, wherein,
the left-hand side of the formula is the probability that the image feature cluster set V_Q and the image feature cluster set V_P in Match_t are correlated with the intermediate variable m_i; V_Q is the set of image feature clusters in the second image to be matched; one term denotes image feature cluster b in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; another term denotes image feature cluster b in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the i-th matching edge in the matching matrix; one edge term denotes the edge between image feature cluster p and image feature cluster i in the first image to be matched; another edge term denotes the edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbour image feature function; a further term denotes, when m_i is the intermediate variable, the nearest-neighbour feature of the corresponding image feature cluster; k is the number of minimum dissimilarity points of the image feature cluster pair; k_2 is the second parameter in the kNN image-feature-cluster clustering of the current iteration; Z is a normalization function; exp() is the exponential function with the natural constant as base; and the final term is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable;
obtaining, by using a formula, the probability of selecting m_i as the intermediate variable in Match_t, wherein,
p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i as the intermediate variable in Match_t; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first parameter in the kNN image-feature-cluster clustering of the current iteration;
obtaining, by using a formula, the probability that the image feature cluster set V_Q in Match_t is correlated with the intermediate variable m_i, wherein,
p(M = m_i | Match_t) is the probability that the image feature cluster set V_Q in Match_t is correlated with the intermediate variable m_i; |Match_t| is the number of matching edges corresponding to the matching matrix;
p(V_P, V_Q | M_t) is the confidence level of the indication vector; M_t is the matching matrix.
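The three probabilities of claim 10 combine into the confidence p(V_P, V_Q | M_t) of an indication vector: each matched edge m_i contributes its own likelihood, mixed under a uniform prior p(M = m_i | Match_t) = 1/|Match_t| over the current matching edges. A toy sketch under stated assumptions — the exp(−d) per-edge likelihood is illustrative, not the patented density.

```python
import numpy as np

def match_confidence(transfer_errors):
    """Confidence p(V_P, V_Q | M_t) sketched as a uniform mixture over
    the current matched edges Match_t: each edge m_i contributes an
    illustrative likelihood exp(-d_i) built from its transfer error,
    weighted by the uniform prior p(M = m_i | Match_t) = 1/|Match_t|."""
    d = np.asarray(transfer_errors, dtype=float)   # one d_i per edge in Match_t
    prior = 1.0 / len(d)                           # uniform prior over edges
    return float(np.sum(prior * np.exp(-d)))
```

Edges with zero transfer error yield confidence 1; growing transfer errors drive the confidence toward 0, which is what lets the third obtaining module reject low-confidence indication vectors against the preset threshold.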
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910209027.4A CN109934298B (en) | 2019-03-19 | 2019-03-19 | Progressive graph matching method and device of deformation graph based on clustering |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109934298A CN109934298A (en) | 2019-06-25 |
CN109934298B true CN109934298B (en) | 2022-10-28 |
Family
ID=66987741
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910209027.4A Expired - Fee Related CN109934298B (en) | 2019-03-19 | 2019-03-19 | Progressive graph matching method and device of deformation graph based on clustering |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109934298B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112686880B (en) * | 2021-01-06 | 2021-09-14 | 哈尔滨市科佳通用机电股份有限公司 | Method for detecting abnormity of railway locomotive component |
CN112991408B (en) * | 2021-04-19 | 2021-07-30 | 湖南大学 | Large-scene high-resolution remote sensing image self-adaptive area multi-feature registration method and system |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104766084A (en) * | 2015-04-10 | 2015-07-08 | 南京大学 | Nearly copied image detection method based on multi-target matching |
CN106886794A (en) * | 2017-02-14 | 2017-06-23 | 湖北工业大学 | Take the heterologous remote sensing image homotopy mapping method of high-order structures feature into account |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10198858B2 (en) * | 2017-03-27 | 2019-02-05 | 3Dflow Srl | Method for 3D modelling based on structure from motion processing of sparse 2D images |
Non-Patent Citations (2)
Title |
---|
An image matching algorithm based on singular value decomposition; Zhao Feng et al.; Journal of Computer Research and Development; 2010-01-15 (Issue 01); full text *
Robust verification of image feature matching based on consensus random sampling; Liu Yi; Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition); 2010-06-15 (Issue 03); full text *
Also Published As
Publication number | Publication date |
---|---|
CN109934298A (en) | 2019-06-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11670071B2 (en) | Fine-grained image recognition | |
Novotny et al. | Semi-convolutional operators for instance segmentation | |
CN106682598B (en) | Multi-pose face feature point detection method based on cascade regression | |
Jiang et al. | Robust feature matching for remote sensing image registration via linear adaptive filtering | |
Jiang et al. | Multiscale locality and rank preservation for robust feature matching of remote sensing images | |
CN110674866A (en) | Method for detecting X-ray breast lesion images by using transfer learning characteristic pyramid network | |
US9984280B2 (en) | Object recognition system using left and right images and method | |
Xia et al. | Loop closure detection for visual SLAM using PCANet features | |
Lee et al. | Place recognition using straight lines for vision-based SLAM | |
Liu et al. | A review of keypoints’ detection and feature description in image registration | |
CN109934298B (en) | Progressive graph matching method and device of deformation graph based on clustering | |
CN111199558A (en) | Image matching method based on deep learning | |
Cheung et al. | On deformable models for visual pattern recognition | |
Kumar et al. | A novel approach for multi-cue feature fusion for robust object tracking | |
CN109255043B (en) | Image retrieval method based on scene understanding | |
CN114358166A (en) | Multi-target positioning method based on self-adaptive k-means clustering | |
Tang et al. | Random walks with efficient search and contextually adapted image similarity for deformable registration | |
Zhao et al. | Learning probabilistic coordinate fields for robust correspondences | |
Zhou et al. | Retrieval and localization with observation constraints | |
Cai et al. | A target tracking method based on KCF for omnidirectional vision | |
Cheung et al. | Bidirectional deformable matching with application to handwritten character extraction | |
Zhao et al. | Remote sensing image registration based on dynamic threshold calculation strategy and multiple-feature distance fusion | |
CN115731576A (en) | Unsupervised pedestrian re-identification method based on key shielding area | |
Yamashita et al. | Facial point detection using convolutional neural network transferred from a heterogeneous task | |
Tang et al. | A GMS-guided approach for 2D feature correspondence selection |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20221028 ||