CN109934298B - Progressive graph matching method and device of deformation graph based on clustering - Google Patents

Progressive graph matching method and device of deformation graph based on clustering

Info

Publication number: CN109934298B
Authority: CN (China)
Prior art keywords: image, image feature, matched, matching, feature cluster
Legal status: Expired - Fee Related
Application number: CN201910209027.4A
Other languages: Chinese (zh)
Other versions: CN109934298A
Inventors: 张悦, 江波
Assignee: Anhui University (current and original)
Events: application filed by Anhui University; publication of application CN109934298A; application granted; publication of grant CN109934298B; anticipated expiration
Landscapes

  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a progressive graph matching method and device of a deformation graph based on clustering, wherein the method comprises the following steps: 1) acquiring the image features of the images to be matched; 2) merging two adjacent image features whose difference value is smaller than a preset threshold into one image feature to obtain image feature clusters; 3) acquiring the similarity values of the candidate matches according to the matching relation among the image feature clusters, and acquiring the indication vector of the candidate matches according to the maximum of the similarity values; 4) judging whether the similarity value corresponding to the indication vector has converged; 5) if not, obtaining the confidence of the indication vector according to the indication vector of each candidate match so as to obtain an updated matching matrix, and returning to step 3) until the similarity value corresponding to the indication vector converges; 6) if so, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched. By applying the embodiments of the invention, the efficiency of image feature matching can be improved.

Description

Progressive graph matching method and device of deformation graph based on clustering
Technical Field
The invention relates to an image matching method and device, in particular to a progressive graph matching method and device based on a clustering deformation graph.
Background
With the rapid development of image matching technology, its applications have gradually expanded into new fields such as medical imaging, mapping, remote sensing signal processing, industrial inspection, and target recognition and tracking. Moreover, in many image-related studies and engineering projects, the choice of image matching algorithm can have a significant impact on the overall outcome, so research on image matching algorithms has very important practical significance. Image matching compares two images to be matched, judges the degree of similarity between them, and thereby determines whether the two images contain the same or similar content. In general, the image matching process can be divided into two parts: first, selecting the image matching features; then matching those features, using a similarity measurement algorithm during the matching process. There are many image matching algorithms, which can generally be classified into the following two categories: 1) Image matching methods based on the pixel gray values of the images to be matched: features are described using all pixel information in a selected region of the image, so the matching precision is high; however, because statistical analysis is performed over the region, the number of pixels to be processed is large, and the computational cost is correspondingly high. 2) Feature-based image matching methods: the features can generally be understood as geometric features, and only features such as points, lines, and regions in the image (for example edges, corner points, and contours) are extracted. The similarity measure is selected according to the features extracted beforehand; normalized correlation measures, distance-based measures, and mutual-information-based measures are commonly used.
The image matching algorithms commonly used at present are feature-based; in such algorithms, the number of feature points in an image is far smaller than the number of pixels, so the amount of computation is greatly reduced. The image can be preprocessed before the features are extracted to reduce the influence of noise, and feature points adapt well to gray-scale changes, image scaling, and occlusion. In addition, feature points are sensitive to position changes during matching, so matching points whose positions do not correspond can be filtered out, which improves matching precision and effectively avoids the problems caused by matching with gray-level information.
Local features, which are representative local information in an image, are used in feature-based image matching methods. Features commonly used in such methods include corner points, lines, contours, and edges. Corner detection algorithms include the Harris corner detection algorithm, the SUSAN corner detection algorithm, and so on. After the image corner features are extracted by such a method, the relationship of the corner information between two images is obtained with some specific method, so as to judge whether the corner pairs match. Straight-line features are also widely used in image matching; the common Hough transform can extract straight-line features from an image, and Medioni and Nevatia used straight-line features for image matching. Contour features are likewise common, and there are many shape-based matching algorithms: Shi and Kaick use the contour as a feature and match by computing the consistency of shapes; Yang et al. combine shape features with position information to construct a low-dimensional image descriptor; Shu et al. propose a contour feature based on the distribution of contour points in polar coordinates, the contour point distribution histogram, which accords with human visual perception and has low computational complexity.
Feature-based matching algorithms only need to compute information within a single feature point or its neighborhood, which greatly reduces the amount of computation and simplifies the operation; consequently, feature-based image matching methods are becoming increasingly widespread in practical applications. However, the inventor finds that in conventional image matching methods, feature point matching is performed directly after feature extraction. The more objects an image contains and the richer its content, the larger the number of extracted features, and the feature points are all high-dimensional data. Therefore, when feature searching is carried out over a large amount of high-dimensional data, the required computation is large and the efficiency is low.
Disclosure of Invention
The invention aims to provide a progressive graph matching method and device based on a clustering deformation graph so as to improve the efficiency of image matching.
The invention solves the technical problems through the following technical scheme:
the embodiment of the invention provides a cluster-based progressive graph matching method of a deformation graph, which comprises the following steps:
1) Acquiring the image features of the images to be matched, wherein the images to be matched comprise a first image to be matched and a second image to be matched, and the image features comprise image geometric features;
2) Merging two adjacent image features whose difference value is smaller than a preset threshold into one image feature to obtain image feature clusters;
3) Acquiring the candidate matches between the image to be matched and the other image to be matched according to the matching relation between the image feature clusters of the image to be matched and those of the other image to be matched; for each candidate match, acquiring the similarity value of the candidate match according to the assignment matrix and the symmetric similarity matrix of the candidate match, and acquiring the indication vector of the candidate match according to the maximum of the similarity values;
4) Judging whether the similarity value corresponding to the indication vector has converged;
5) If not, obtaining the confidence of the indication vector according to the indication vector of each candidate match, and updating the element values in the matching matrix when the confidence of the indication vector is not less than a preset threshold, to obtain an updated matching matrix; and returning to step 3) until the similarity values corresponding to the indication vectors converge; when the similarity values corresponding to all the indication vectors have converged, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched;
6) If so, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched when the similarity values corresponding to all the indication vectors have converged.
Optionally, the step 2) includes:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,
Figure BDA0001999929070000031
obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
k is the number of all possible element pair dissimilarity points in the image feature cluster pair; k is a radical of AP Is a preset first control parameter; c a The number of elements in the image feature cluster a; c b The number of elements in the image feature cluster b; r is AP The preset second control parameter; | is a Euclidean distance function; i C a ||C b Is a drawing |The number of possible pairs of elements between pairs of image feature clusters;
c: by means of the formula (I) and (II),
Figure BDA0001999929070000032
and acquiring difference values of the image feature cluster pairs, wherein,
D kNN (k,C a ,C b ) The difference value between the image feature cluster a and the image feature cluster b in the image feature cluster pair is obtained; gamma is the number of the pairing element pairs between the image feature cluster a and the image feature cluster b in the image feature cluster pair; min is a minimum evaluation function; sigma is a summation function; d (m) i ,m j ) For the element m in the image feature cluster a i With the element m in the image feature cluster b j The similarity between them; i is the element m i The serial number of (2); j is the element m j The serial number of (2);
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
Optionally, the obtaining the similarity value of the candidate match according to the assignment matrix and the symmetric similarity matrix of the candidate match includes:
obtaining the constraint condition corresponding to the candidate matches:

m_t ∈ {0,1}^(n_P·n_Q),  M_t·1_{n_Q} ≤ 1_{n_P},  M_t^T·1_{n_P} ≤ 1_{n_Q}

wherein m_t is the initial indication vector corresponding to the candidate matches; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} is the all-ones vector of size n_P; 1_{n_Q} is the all-ones vector of size n_Q;
using the formula

d_{jb|ia} = ||x_b^Q − Γ_{ia}(x_j^P)||

calculating the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is a norm function; x_b^Q is image feature cluster b in the second image to be matched; Γ_{ia}(x_j^P) is the affine homography transformation result of image feature cluster j in the first image to be matched into the second image to be matched; x_j^P is image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, using the formula

W_{ia;jb} = max(α − (d_{jb|ia} + d_{ia|jb} + d_{bj|ai} + d_{ai|bj})/4, 0)

calculating the symmetric similarity value corresponding to the candidate match, wherein W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match involving the edges (v_i^P, v_j^P) ∈ ε_P and (v_a^Q, v_b^Q) ∈ ε_Q; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the ith point in the first image to be matched; j is the jth point in the first image to be matched; a is the ath point in the second image to be matched; b is the bth point in the second image to be matched; W_{ia;jb} is also the second-order similarity function of the symmetric transfer errors of the matched image feature clusters; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a) and the match (j, b); d_{ai|bj} is the transfer error between the match (a, i) and the match (b, j); α is a preset image feature cluster similarity threshold; max() is the maximum evaluation function;
according to the symmetric similarity values corresponding to the candidate matches, constructing a symmetric similarity matrix with the symmetric similarity values as elements;
according to the constraint condition corresponding to the candidate matches and the symmetric similarity matrix corresponding to the candidate matches, using the formula S(M_t) = M_t^T·W·M_t to obtain the similarity value of the candidate matches, wherein S(M_t) is the similarity value of the candidate matches at the tth iteration when the assignment matrix is M_t; M_t^T is the transpose of the assignment matrix M_t at the tth iteration; W is the symmetric similarity matrix corresponding to the candidate matches.
Optionally, the obtaining the indication vector of the candidate matches according to the maximum of the similarity values includes:
using the formula

m̂_t = argmax_{M_t} S(M_t)

computing the indication vector corresponding to the candidate matches, wherein m̂_t is the indication vector corresponding to the candidate matches; argmax is the evaluation function returning the variable value at which the function attains its maximum; S(M_t) is the similarity value of the candidate matches.
Optionally, the obtaining the confidence of the indication vector includes:
using the formula

p(V_Q = v_Q | V_P = v_P, M = m_i, Match_t) = (1/Z)·exp(−d_{jb|m_i})  if v_Q ∈ NN_{k_2}(Γ_{m_i}(v_P)), and 0 otherwise,

obtaining the probability that, in Match_t, image feature cluster V_Q is correlated with the set of image feature clusters V_P and the intermediate variable m_i, wherein p(V_Q = v_Q | V_P = v_P, M = m_i, Match_t) is that probability; V_Q is the set of image feature clusters in the second image to be matched; v_b^Q is image feature cluster b in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; v_b^P is image feature cluster b in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the ith matching edge in the matching matrix, and m_i = (e_{pi}^P, e_{qi}^Q), wherein e_{pi}^P is the edge between image feature cluster p and image feature cluster i in the first image to be matched and e_{qi}^Q is the edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbor image feature function; NN_{k_2}(Γ_{m_i}(v_P)) is, given m_i as the intermediate variable, the set of nearest-neighbor features of the transformed image feature cluster Γ_{m_i}(v_P); k is the number of minimum dissimilarity points of the image feature cluster pair; k_2 is the second kNN clustering parameter of the image feature clusters in the current iteration; Z is a normalization function, with Z = Σ_{v_Q ∈ NN_{k_2}(Γ_{m_i}(v_P))} exp(−d_{jb|m_i}); exp() is the exponential function with the natural constant as its base; d_{jb|m_i} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable m_i;
using the formula

p(V_P = v_P | M = m_i, Match_t) = 1/k_1  if v_P ∈ NN_{k_1}(v_{m_i}^P), and 0 otherwise,

obtaining the probability of selecting m_i as the intermediate variable in Match_t, wherein p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i as the intermediate variable in Match_t; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first kNN clustering parameter of the image feature clusters in the current iteration;
using the formula

p(M = m_i | Match_t) = 1/|Match_t|

obtaining the probability that, in Match_t, image feature cluster V_Q is correlated with the intermediate variable m_i, wherein p(M = m_i | Match_t) is that probability; |Match_t| is the number of matching edges corresponding to the matching matrix;
using the formula

p(V_P, V_Q | M_t) = Σ_{m_i ∈ Match_t} p(V_Q | V_P, M = m_i, Match_t) · p(V_P | M = m_i, Match_t) · p(M = m_i | Match_t)

obtaining the confidence of the indication vector, wherein p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
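Under the reconstruction above, the confidence p(V_P, V_Q | M_t) mixes, over the current matching edges m_i, a transfer-error-weighted vote restricted to the k_2 nearest neighbors on the Q side and a uniform vote over the k_1 nearest neighbors on the P side. The following Python sketch illustrates this combination; the helpers knn_P, knn_Q_mapped, and d are assumed inputs with hypothetical names, and the sketch follows the formulas as reconstructed here, not a reference implementation from the patent.

```python
import numpy as np

def match_confidence(vp, vq, matches, knn_P, knn_Q_mapped, d):
    """Sketch of the confidence term p(V_P = vp, V_Q = vq | Match_t).

    knn_P(m)            -> the k_1 nearest neighbors of match m's P-side cluster
    knn_Q_mapped(m, vp) -> the k_2 nearest Q-side neighbors of Gamma_m(vp)
    d(vp, vq, m)        -> transfer error of the pair (vp, vq) w.r.t. match m
    All three helpers are assumed inputs, not part of the patent text.
    """
    total = 0.0
    for m in matches:
        neighbors_p = knn_P(m)
        if vp not in neighbors_p:            # p(vp | m_i) = 1/k_1 inside NN, else 0
            continue
        candidates = knn_Q_mapped(m, vp)     # NN_{k_2}(Gamma_{m_i}(vp))
        if vq not in candidates:             # p(vq | vp, m_i) = 0 outside NN
            continue
        Z = sum(np.exp(-d(vp, c, m)) for c in candidates)
        p_q = np.exp(-d(vp, vq, m)) / Z      # transfer-error-weighted vote, 1/Z normalized
        p_p = 1.0 / len(neighbors_p)
        total += p_q * p_p / len(matches)    # p(m_i | Match_t) = 1/|Match_t|
    return total
```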
The embodiment of the invention also provides a progressive graph matching device of a deformation graph based on clustering, which comprises:
a first acquisition module, used for acquiring the image features of the images to be matched, wherein the images to be matched comprise a first image to be matched and a second image to be matched, and the image features comprise image geometric features;
a merging module, used for merging two adjacent image features whose difference value is smaller than a preset threshold into one image feature to obtain image feature clusters;
a second acquisition module, used for acquiring the candidate matches between the image to be matched and the other image to be matched according to the matching relation between the image feature clusters of the image to be matched and those of the other image to be matched, and, for each candidate match, acquiring the similarity value of the candidate match according to the assignment matrix and the symmetric similarity matrix of the candidate match and acquiring the indication vector of the candidate match according to the maximum of the similarity values;
a judging module, used for judging whether the similarity value corresponding to the indication vector has converged;
a third acquisition module, used for, if the judgment result of the judging module is negative, obtaining the confidence of the indication vector according to the indication vector of each candidate match, updating the element values in the matching matrix when the confidence of the indication vector is not less than a preset threshold to obtain an updated matching matrix, and triggering the second acquisition module again until the similarity values corresponding to the indication vectors converge; when the similarity values corresponding to all the indication vectors have converged, the correspondence of each image feature in the matching matrix is taken as the matching result of the images to be matched;
and a setting module, used for, if the judgment result of the judging module is affirmative, taking the correspondence of each image feature in the matching matrix as the matching result of the images to be matched when the similarity values corresponding to all the indication vectors have converged.
Optionally, the merging module is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: matching each image feature cluster in the activated image, and aiming at each image feature cluster pair in each image feature cluster pair, using a formula,
Figure BDA0001999929070000071
obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
k is the number of dissimilarity points of all possible element pairs in the image feature cluster pair; k is a radical of formula AP Is a preset first control parameter; c a The number of elements in the image feature cluster a; c b The number of elements in the image feature cluster b; r is AP The preset second control parameter; | · | is the euclidean distance function; i C a ||C b L is the number of possible pairs of elements between pairs of image feature clusters;
c: by means of the formula (I) and (II),
Figure BDA0001999929070000072
and obtaining the difference value of the image characteristic cluster pair, wherein,
D kNN (k,C a ,C b ) The difference value between the image feature cluster a and the image feature cluster b in the image feature cluster pair is obtained; f is the number of pairing element pairs between the image feature cluster a and the image feature cluster b in the image feature cluster pair; min is a minimum evaluation function; sigma is a summation function; d (m) i ,m j ) For the element m in the image feature cluster a i With the element m in the image feature cluster b j The similarity between them; i is the element m i The serial number of (2); j is an element m j The serial number of (2);
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
Optionally, the second acquisition module is configured to:
obtain the constraint condition corresponding to the candidate matches:

m_t ∈ {0,1}^(n_P·n_Q),  M_t·1_{n_Q} ≤ 1_{n_P},  M_t^T·1_{n_P} ≤ 1_{n_Q}

wherein m_t is the initial indication vector corresponding to the candidate matches; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} is the all-ones vector of size n_P; 1_{n_Q} is the all-ones vector of size n_Q;
use the formula

d_{jb|ia} = ||x_b^Q − Γ_{ia}(x_j^P)||

to calculate the transfer error, corresponding to the candidate match, from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, wherein d_{jb|ia} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the match (i, a), formed by image feature cluster i in the first image to be matched and image feature cluster a in the second image to be matched; ||·|| is a norm function; x_b^Q is image feature cluster b in the second image to be matched; Γ_{ia}(x_j^P) is the affine homography transformation result of image feature cluster j in the first image to be matched into the second image to be matched; x_j^P is image feature cluster j in the first image to be matched;
according to the transfer error from image feature cluster j in the first image to be matched to image feature cluster b in the second image to be matched, use the formula

W_{ia;jb} = max(α − (d_{jb|ia} + d_{ia|jb} + d_{bj|ai} + d_{ai|bj})/4, 0)

to calculate the symmetric similarity value corresponding to the candidate match, wherein W_{ia;jb} is the symmetric similarity value corresponding to the candidate match, the candidate match involving the edges (v_i^P, v_j^P) ∈ ε_P and (v_a^Q, v_b^Q) ∈ ε_Q; P is the first image to be matched; Q is the second image to be matched; ε_P is the set of edges contained in the first image to be matched; ε_Q is the set of edges contained in the second image to be matched; i is the ith point in the first image to be matched; j is the jth point in the first image to be matched; a is the ath point in the second image to be matched; b is the bth point in the second image to be matched; W_{ia;jb} is also the second-order similarity function of the symmetric transfer errors of the matched image feature clusters; d_{bj|ai} is the transfer error between the match (b, j), formed by image feature cluster b in the second image to be matched and image feature cluster j in the first image to be matched, and the match (a, i), formed by image feature cluster a in the second image to be matched and image feature cluster i in the first image to be matched; d_{ia|jb} is the transfer error between the match (i, a) and the match (j, b); d_{ai|bj} is the transfer error between the match (a, i) and the match (b, j); α is a preset image feature cluster similarity threshold; max() is the maximum evaluation function;
according to the symmetric similarity values corresponding to the candidate matches, construct a symmetric similarity matrix with the symmetric similarity values as elements;
according to the constraint condition corresponding to the candidate matches and the symmetric similarity matrix corresponding to the candidate matches, use the formula S(M_t) = M_t^T·W·M_t to obtain the similarity value of the candidate matches, wherein S(M_t) is the similarity value of the candidate matches at the tth iteration when the assignment matrix is M_t; M_t^T is the transpose of the assignment matrix M_t at the tth iteration; W is the symmetric similarity matrix corresponding to the candidate matches.
Optionally, the second acquisition module is configured to:
use the formula

m̂_t = argmax_{M_t} S(M_t)

to compute the indication vector corresponding to the candidate matches, wherein m̂_t is the indication vector corresponding to the candidate matches; argmax is the evaluation function returning the variable value at which the function attains its maximum; S(M_t) is the similarity value of the candidate matches.
Optionally, the third acquisition module is configured to:
use the formula

p(V_Q = v_Q | V_P = v_P, M = m_i, Match_t) = (1/Z)·exp(−d_{jb|m_i})  if v_Q ∈ NN_{k_2}(Γ_{m_i}(v_P)), and 0 otherwise,

to obtain the probability that, in Match_t, image feature cluster V_Q is correlated with the set of image feature clusters V_P and the intermediate variable m_i, wherein p(V_Q = v_Q | V_P = v_P, M = m_i, Match_t) is that probability; V_Q is the set of image feature clusters in the second image to be matched; v_b^Q is image feature cluster b in the set of image feature clusters in the second image to be matched; V_P is the set of image feature clusters in the first image to be matched; v_b^P is image feature cluster b in the set of image feature clusters in the first image to be matched; M is the set of intermediate variables in the current iteration; Match_t is the set of matching edges corresponding to the matching matrix; m_i is the ith matching edge in the matching matrix, and m_i = (e_{pi}^P, e_{qi}^Q), wherein e_{pi}^P is the edge between image feature cluster p and image feature cluster i in the first image to be matched and e_{qi}^Q is the edge between image feature cluster q and image feature cluster i in the second image to be matched; NN(·) is the nearest-neighbor image feature function; NN_{k_2}(Γ_{m_i}(v_P)) is, given m_i as the intermediate variable, the set of nearest-neighbor features of the transformed image feature cluster Γ_{m_i}(v_P); k is the number of minimum dissimilarity points of the image feature cluster pair; k_2 is the second kNN clustering parameter of the image feature clusters in the current iteration; Z is a normalization function, with Z = Σ_{v_Q ∈ NN_{k_2}(Γ_{m_i}(v_P))} exp(−d_{jb|m_i}); exp() is the exponential function with the natural constant as its base; d_{jb|m_i} is the transfer error between the match (j, b), formed by image feature cluster j in the first image to be matched and image feature cluster b in the second image to be matched, and the intermediate variable m_i;
use the formula

p(V_P = v_P | M = m_i, Match_t) = 1/k_1  if v_P ∈ NN_{k_1}(v_{m_i}^P), and 0 otherwise,

to obtain the probability of selecting m_i as the intermediate variable in Match_t, wherein p(V_P = v_P | M = m_i, Match_t) is the probability of selecting m_i as the intermediate variable in Match_t; v_P is an image feature cluster in the set of image feature clusters in the first image to be matched; k_1 is the first kNN clustering parameter of the image feature clusters in the current iteration;
use the formula

p(M = m_i | Match_t) = 1/|Match_t|

to obtain the probability that, in Match_t, image feature cluster V_Q is correlated with the intermediate variable m_i, wherein p(M = m_i | Match_t) is that probability; |Match_t| is the number of matching edges corresponding to the matching matrix;
use the formula

p(V_P, V_Q | M_t) = Σ_{m_i ∈ Match_t} p(V_Q | V_P, M = m_i, Match_t) · p(V_P | M = m_i, Match_t) · p(M = m_i | Match_t)

to obtain the confidence of the indication vector, wherein p(V_P, V_Q | M_t) is the confidence of the indication vector; M_t is the matching matrix.
Compared with the prior art, the invention has the following advantages:
By applying the embodiments of the invention, because the features extracted from an image to be matched carry more or less related information, this information can be used to label the features. Therefore, after the image features are extracted, they are clustered, features with similar attributes in the feature set are mined, and the features in the feature set are divided organically, so that the data volume of the features is reduced, the amount of computation in the feature matching process is reduced, and the efficiency of image feature matching is improved.
Drawings
Fig. 1 is a schematic flowchart of a progressive graph matching method based on clustering deformation graphs according to an embodiment of the present invention;
fig. 2 is a schematic diagram of a matching relationship between image feature clusters in a cluster-based progressive graph matching method for a deformation graph according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a progressive graph matching apparatus based on a cluster deformation graph according to an embodiment of the present invention.
Detailed Description
The following examples are given for the detailed implementation and specific operation of the present invention, but the scope of the present invention is not limited to the following examples.
The embodiment of the invention provides a cluster-based progressive graph matching method and a cluster-based progressive graph matching device for a deformation graph, and firstly introduces the cluster-based progressive graph matching method for the deformation graph provided by the embodiment of the invention.
Fig. 1 is a schematic flowchart of a progressive graph matching method based on a cluster deformation graph according to an embodiment of the present invention, as shown in fig. 1, the method includes:
s101: acquiring image characteristics of an image to be matched, wherein the image to be matched comprises: the image matching method comprises the following steps of obtaining a first image to be matched and a second image to be matched, wherein the image characteristics comprise: image geometry characteristics.
Illustratively, the acquired images to be matched are an image P and an image Q, wherein one of the image P and the image Q is an unknown image, and the other image is a known image. For example, in the field of video surveillance, when a video frame corresponding to a target person is to be acquired from a large number of video images, the known image is an image of the target person acquired in advance; the unknown image is each frame image in the video image.
In practical applications, each image to be matched includes several geometric features, for example, points, lines, and regions in the image to be matched, such as edges, corners, contours, and the like. In general, each feature can be obtained by using a feature extraction algorithm, for example, the corner detection algorithm includes: harris corner detection algorithm, SUSAN corner detection algorithm, etc.
Further, the image features may include, but are not limited to, geometric features of the image, such as grayscale features of the image, color features of the image, brightness features of the image, and the like.
Finally, the obtained set of image features of the image to be matched is taken as a candidate response set; for example, the candidate response set of image P is {c_i} (i = 1, …, n), wherein c_i is the ith image feature and n is the number of features in the candidate response set.
S102: and combining two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster.
Specifically, the step S102 may include the following steps:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster.
For example, in the first iteration, the set of aggregated active clusters is initialized on the basis of the candidate response set from step S101, letting θ_0 = {C_i = {c_i} (i = 1, …, n)}, wherein θ_0 is the initialized set of aggregated activated image feature clusters and C_i is the ith activated image feature cluster.
Taking the tth iteration as an example, the set of active clusters at the tth iteration is:

θ_t = {C_i = {c_i} (i = 1, …, n)}

In this step, each image feature is taken as an activated image feature cluster.
B: and performing pairing processing on each image feature cluster in the activated image, for example, pairing each image feature cluster in the image P to be matched with each image feature cluster in the image P to be matched. The following is then performed for each image feature cluster pair.
For each of the respective pairs of image feature clusters, using a formula,
Figure BDA0001999929070000131
obtaining a number of minimum dissimilarity points of the pair of image feature clusters, wherein,
k is the number of all possible element pair dissimilarity points in the image feature cluster pair; k is a radical of AP Is a preset first controlA parameter; c a The number of elements in the image feature cluster a; c b The number of elements in the image feature cluster b; r is AP The preset second control parameter; | · | is the euclidean distance function; i C a ||C b And | is the number of possible pairs of elements between pairs of image feature clusters.
By applying the calculation method, the progressive link effect of a kNN (k-Nearest Neighbor algorithm) clustering model can be effectively avoided by adjusting the preset first control parameter and the preset second control parameter, and the deformation of the object can be effectively coped with.
C: by means of the formula (I) and (II),
Figure BDA0001999929070000132
and obtaining the difference value of the image characteristic cluster pair, wherein,
D kNN (k,C a ,C b ) The difference value between the image feature cluster a and the image feature cluster b in the image feature cluster pair is obtained; f is the number of pairing element pairs between the image feature cluster a and the image feature cluster b in the image feature cluster pair; min is a minimum evaluation function; sigma is a summation function; d (m) i ,m j ) For the element m in the image feature cluster a i With the element m in the image feature cluster b j Similarity between them; i is an element m i The serial number of (2); j is the element m j The serial number of (2).
With the above embodiment of the invention, the kNN linkage model uses the average of the k minimum dissimilarities among all possible element-pair dissimilarities between two image feature clusters. It is robust in recovering elongated or connected clusters, because it considers k supporting element pairs rather than just one, as single linkage does.
D: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not; if yes, executing step E; if not, executing step F.
Illustratively, whether the difference value between the image feature cluster a and the image feature cluster b is less than the difference threshold δ D
If the difference value is less than the difference threshold delta D And D, the similarity between the image feature cluster a and the image feature cluster b is relatively high, the image feature cluster a and the image feature cluster b can be combined into one image feature cluster, and the step E is executed at this moment.
If the difference threshold is less than delta D I.e. greater than or equal to the difference threshold delta D And D, explaining that the similarity between the image feature cluster a and the image feature cluster b is relatively small, and the image feature cluster a and the image feature cluster b cannot be combined into one image feature cluster, wherein the step F is executed at this moment.
E: and D, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not less than the difference threshold value.
Illustratively, in this step, the image feature cluster a and the image feature cluster b are merged into one image feature cluster, and a new image feature cluster C is obtained q Then, new image feature cluster C is clustered q Adding to the active cluster set θ t In the method, an image feature cluster a and an image feature cluster b are selected from an active cluster set theta t Deletion, namely:
θ t =(θ t -{C a ,C b })∪{C q }。
it will be appreciated that on the first iteration, a new image feature cluster C will be clustered q Adding to the active cluster set θ 0 In the method, an image feature cluster a and an image feature cluster b are selected from an active cluster set theta 0 Deletion, namely:
θ 0 =(θ 0 -{C a ,C b })∪{C q }。
and repeating the step B after the updating of the activated cluster set is finished, and continuously executing the circulating operation until the difference value of each image feature cluster pair is not less than the difference threshold value delta D The loop is ended.
F: and under the condition that the difference values of other image feature cluster pairs are not smaller than a preset difference threshold value, taking the image feature cluster as a combined image feature cluster.
Illustratively, if and only if the difference between the image feature cluster a and the image feature cluster b is largeIs equal to or greater than a difference threshold δ D Meanwhile, the difference value between other image feature cluster pairs is greater than or equal to the difference threshold value delta D In this case, the iteration is ended, and the clustering result after the iteration is used as the final clustering result, so that a plurality of clustered image feature clusters can be obtained.
In step S102, the kNN clustering algorithm reflects the connectivity between deformable object parts and the compactness of the object parts. In each aggregation step, image feature clusters with high similarity are merged into a larger image feature cluster, while image feature clusters with low similarity remain as smaller image feature clusters, thereby reducing the number of image feature clusters.
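The aggregation loop of steps A through F can be sketched as follows. This is a minimal Python illustration assuming d(m_i, m_j) is the Euclidean distance between feature positions and using the k = min(k_AP, ⌈r_AP·|C_a||C_b|⌉) reading of the linkage reconstructed above; the parameter values and the choice to merge the lowest-difference pair first are illustrative assumptions, not the patent's reference implementation.

```python
import math
import numpy as np

def d_knn(Ca, Cb, k_ap=5, r_ap=0.2):
    """kNN linkage: average of the k smallest pairwise dissimilarities
    between clusters Ca and Cb (lists of 2-D feature positions)."""
    dists = sorted(np.linalg.norm(np.array(mi) - np.array(mj))
                   for mi in Ca for mj in Cb)
    k = max(1, min(k_ap, math.ceil(r_ap * len(Ca) * len(Cb))))  # assumed form
    return sum(dists[:k]) / k

def aggregate(features, delta_d=20.0):
    """Steps A-F: start from singleton clusters (step A) and repeatedly
    merge pairs whose kNN-linkage difference is below the threshold."""
    clusters = [[f] for f in features]                    # step A
    while True:
        best = None
        for a in range(len(clusters)):                    # step B: all pairs
            for b in range(a + 1, len(clusters)):
                dv = d_knn(clusters[a], clusters[b])      # step C
                if dv < delta_d and (best is None or dv < best[0]):
                    best = (dv, a, b)                     # step D
        if best is None:                                  # step F: nothing mergeable
            return clusters
        _, a, b = best                                    # step E: merge, repeat
        merged = clusters[a] + clusters[b]
        clusters = [c for i, c in enumerate(clusters) if i not in (a, b)]
        clusters.append(merged)
```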
In the embodiments of the invention, clustering image features is analogous to giving each image feature a label. In recent years, robust feature correspondence methods have been proposed to account for geometric deformation of objects between images. They formulate visual correspondence as an image matching problem by defining an objective function based on photometric similarity and the geometric compatibility of correspondences. Although these methods show good performance, they all address the weakly supervised case where the outlier ratio is relatively low, i.e., two images share a common object or a model image is used. However, in the real world, image pairs may contain significant clutter, multiple common objects, and even correspondences between multiple objects.
Therefore, the feature correspondence problem requires searching, in an unsupervised manner, across multiple objects and multiple corresponding clusters amid significant outliers. The method aims to establish feature correspondences, and their object-based clustering, in the presence of significant clutter and deformation in arbitrary images. It is based on the following points:
In the embodiments of the invention, a bottom-up aggregation strategy is applied in step S101: starting from reliable initial matches and gradually merging them with reliable neighbors, inlier matches can be collected efficiently despite the presence of a large number of distracting outliers. For example, seed-based detection methods show that such bottom-up aggregation with iterative match propagation can improve target recognition performance.
In addition, the embodiments of the invention also consider the connectivity among parts: for deformable objects, because of deformation, feature correspondences do not form global compactness under their pairwise geometric similarities; deformed parts are locally connected through some intervening parts. Therefore, for the clusters corresponding to deformed object features, a connectivity criterion needs to be considered.
In summary, the embodiments of the invention form compact corresponding clusters in the early stage, gradually merge locally connected clusters adapted to the deformed parts of the object, and finally perform progressive graph matching optimization by constructing a graph model, thereby achieving a better matching effect.
S103: according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, candidate matching aiming at the image to be matched and other images to be matched is obtained, for each candidate matching, the similarity value of the candidate matching is obtained according to the distribution matrix and the symmetrical similarity matrix of the candidate matching, and the indication vector of the candidate matching is obtained according to the maximum value of the similarity value.
The specific step S103 may include the following steps:
g: after the image P and the image Q are respectively processed in step S102, clustered image feature clusters are respectively obtained. And respectively constructing corresponding activation maps for the image P and the image Q:
when the indication vector of the candidate match is obtained, several iterations are also required, and the following description will be given by taking the t-th iteration as an example:
in the t-th iteration, the activation map corresponding to the image P is:
Figure BDA0001999929070000151
wherein the content of the first and second substances,
V t P is a set of edges contained in the activation map corresponding to the image P;
Figure BDA0001999929070000152
contained in activation maps corresponding to picture PA set of points;
Figure BDA0001999929070000153
the image characteristic set of the image P at the t iteration is obtained;
at the t-th iteration, the activation map corresponding to the image Q is:
Figure BDA0001999929070000154
wherein, the first and the second end of the pipe are connected with each other,
Figure BDA0001999929070000155
a set of edges contained in the activation map corresponding to the image P;
Figure BDA0001999929070000156
a set of points contained in the activation map corresponding to the image P;
Figure BDA0001999929070000157
is the image feature set of image Q at the t-th iteration.
And then to the activation map
Figure BDA0001999929070000158
Each image feature cluster in the image, and establishing the image feature cluster and activation map
Figure BDA0001999929070000159
Candidate matches between the respective image feature clusters in (a), namely:
Figure BDA0001999929070000161
wherein the content of the first and second substances,
C t is the set of candidate matches at the t-th iteration.
It is emphasized that, in constructing the candidate matches, the following constraint needs to be followed:
the constraint condition corresponding to the candidate matches is obtained as

m_t ∈ {0,1}^(n_P·n_Q),  M_t·1_{n_Q} ≤ 1_{n_P},  M_t^T·1_{n_P} ≤ 1_{n_Q}

wherein m_t is the initial indication vector corresponding to the candidate matches; M_t is the assignment matrix; n_P is the number of image feature clusters in the first image to be matched; n_Q is the number of image feature clusters in the second image to be matched; 1_{n_P} is the all-ones vector of size n_P; 1_{n_Q} is the all-ones vector of size n_Q; and the inequalities hold element-wise, i.e., each element of M_t·1_{n_Q} is less than or equal to the corresponding element of 1_{n_P}, and likewise for M_t^T·1_{n_P} and 1_{n_Q}.
The meaning of the above formula is a bidirectional constraint: the image feature clusters in V_t^P and those in V_t^Q are matched one-to-one, so that M_t is an assignment matrix.
The following describes the indication vector acquisition process, taking one of the several candidate matches between image P and image Q as an example.
H: fig. 2 is a schematic diagram of a matching relationship between image feature clusters in a cluster-based progressive graph matching method for a deformation graph according to an embodiment of the present invention, as shown in fig. 2,
in the context of figure 2 of the drawings,
Figure BDA0001999929070000169
is the ith image feature cluster in the image P;
Figure BDA00019999290700001610
is the jth image feature cluster in the image P;
Figure BDA00019999290700001611
is the a-th image feature cluster in the image Q;
Figure BDA00019999290700001612
is the b-th image feature cluster in the image Q;
Figure BDA00019999290700001613
affine homodyne transformation results of the image feature cluster j in the first image to be matched to the image feature cluster b in the second image to be matched are obtained; gamma-shaped ia The affine homography transformation result from the image feature cluster i in the first image to be matched to the image feature cluster a in the second image to be matched is obtained.
To be provided with
Figure BDA00019999290700001614
The centered affine region feature i can be represented by an elliptical region, the direction of which is estimated by the dominant direction of the gradient histogram of the local region. With these features, affine homostrain Γ from an image feature cluster i in the first image to be matched P to another image feature cluster feature a in the second image to be matched Q can be derived ia (. O) so that two points are
Figure BDA00019999290700001615
And
Figure BDA00019999290700001616
neighborhood m of P And m Q The following steps are involved: m is a unit of Q =Γ ia (m P ). Then, given two matches (i, a) and (j, b), as shown in FIG. 2, the transfer error of (j, b) to (i, a) is given by d jb|ia And (4) showing.
I: by means of the formula (I) and (II),
Figure BDA00019999290700001617
calculating image features in the first image to be matched corresponding to the candidate matchThe transfer error of the cluster j to the cluster b of image features in the second image to be matched, wherein,
d jb|ia a transfer error between a candidate matching (j, b) pair formed by an image feature cluster j in the first image to be matched and an image feature cluster b in the second image to be matched and a candidate matching (i, a) pair formed by an image feature cluster i in the first image to be matched and an image feature cluster a in the second image to be matched; the integer is a norm function;
Figure BDA0001999929070000171
the image feature cluster b in the second image to be matched is obtained;
Figure BDA0001999929070000172
affine homodyne transformation results of the image feature cluster j in the first image to be matched to the image feature cluster b in the second image to be matched are obtained;
Figure BDA0001999929070000173
is the image feature cluster j in the first image to be matched. It will be appreciated that the image feature cluster is a cluster of edges in this step. An image feature cluster can be used as a node for iterative processing.
Γ ia Can better map the feature v j Is shifted to the image feature cluster v b A central point of (2), then even d jb|ia The value of (c) is minimal.
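Under the reading d_{jb|ia} = ||x_b^Q − Γ_{ia}(x_j^P)|| reconstructed above, the transfer error can be computed as below, assuming each affine homography is supplied as a 2x3 matrix (this parameterization is an assumption; the patent does not fix one):

```python
import numpy as np

def transfer_error(x_j_P, x_b_Q, gamma_ia):
    """d_{jb|ia}: distance between the center of cluster b in Q and the
    image of the center of cluster j in P under the affine map Gamma_ia,
    given here as a 2x3 matrix [A | t] so that Gamma_ia(x) = A @ x + t."""
    g = np.asarray(gamma_ia, dtype=float)
    x = np.asarray(x_j_P, dtype=float)
    mapped = g[:, :2] @ x + g[:, 2]
    return float(np.linalg.norm(np.asarray(x_b_Q, dtype=float) - mapped))
```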
J: according to the transfer error from the image characteristic cluster j in the first image to be matched to the image characteristic cluster b in the second image to be matched, by using a formula,
Figure BDA0001999929070000174
and calculating a symmetric similarity value corresponding to the candidate match, wherein,
W ia;jb matching the corresponding symmetric similarity value for the candidate, and the candidate matching is including the edge
Figure BDA0001999929070000175
And
Figure BDA0001999929070000176
the candidate match of (2); p is a first image to be matched; q is a second image to be matched; epsilon P Is a set of edges contained in the first image to be matched; epsilon Q Is a set of edges contained in the second image to be matched; i is the ith point in the first image to be matched; j is the jth point in the first image to be matched; a is the a-th point in the second image to be matched; b is the b-th point in the second image to be matched;
Figure BDA00019999290700001710
matching corresponding clusters of image features for candidate matches
Figure BDA0001999929070000178
And matching image feature clusters
Figure BDA0001999929070000179
A corresponding second order similarity function of symmetric transmission errors; d bj|ai A transmission error between a matching (b, j) pair formed by an image feature cluster b in the second image to be matched and an image feature cluster j in the first image to be matched and a matching (a, i) formed by an image feature cluster a in the second image to be matched and an image feature cluster i in the first image to be matched; d ia|jb A transfer error between a matching (i, a) pair formed by an image feature cluster j in the first image to be matched and an image feature cluster a in the second image to be matched and a matching (j, b) formed by the image feature cluster j in the first image to be matched and an image feature cluster b in the second image to be matched; d is a radical of ai|bj The transfer error between the matching (a, i) formed by the image feature cluster a in the second image to be matched to the image feature cluster i in the first image to be matched and the matching (b, j) formed by the image feature cluster b in the second image to be matched to the image feature cluster j in the first image to be matched is obtained; alpha is a preset threshold value similar to the image feature cluster; max () is a maximum value evaluation function.
The above function may also be referred to as the STE (symmetric transfer error) second-order similarity function.
K: according to the symmetric similarity values corresponding to the candidate matches, a symmetric similarity matrix is constructed with the symmetric similarity values as its elements.
Illustratively, $W_{ia;jb}$ may be taken as the elements of the symmetric similarity matrix, thereby constructing a symmetric similarity matrix between the image P and the image Q.
L: based on the distribution matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, by means of the formula $S(M_t) = M_t^{T} W M_t$, a similarity value for the candidate match is obtained, wherein $S(M_t)$ is the similarity value of the candidate match when the distribution matrix in the $t$-th iteration is $M_t$; $M_t^{T}$ is the transpose of the distribution matrix $M_t$ in the $t$-th iteration; $W$ is the symmetric similarity matrix corresponding to the candidate match.
The second-order similarity function is encoded in the symmetric similarity matrix $W$, whose element $W_{ia;jb}$ comprises the two matching nodes $e^{P}_{ij}$ and $e^{Q}_{ab}$.
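Illustratively, steps J to L can be combined into a short sketch; the averaged four-term form of $W_{ia;jb}$ follows the reconstruction above, and the callable `d` as well as the list indexing of candidate matches are assumptions:

```python
import numpy as np

def build_W(matches, d, alpha):
    """Step K: assemble the symmetric similarity matrix. `matches` lists
    the candidate matches (i, a); d(m1, m2) is assumed to return the
    transfer error of match m2 under the transform induced by anchor
    match m1 (reversed pairs cover the Q-to-P direction)."""
    n = len(matches)
    W = np.zeros((n, n))
    for x, (i, a) in enumerate(matches):
        for y, (j, b) in enumerate(matches):
            errs = (d((i, a), (j, b)), d((a, i), (b, j)),
                    d((j, b), (i, a)), d((b, j), (a, i)))
            W[x, y] = max(alpha - 0.25 * sum(errs), 0.0)  # W_{ia;jb}
    return W

def match_score(W, m):
    """Step L: S(M_t) = m^T W m, with m a 0/1 vector over the candidates."""
    return m @ W @ m
```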
m: obtaining the indication vector of the candidate match according to the maximum value of the similarity value, including:
by means of the formula

$$m_t^{*} = \operatorname*{arg\,max}_{M_t} S(M_t),$$

the indication vector corresponding to the candidate match is computed, wherein $m_t^{*}$ is the indication vector corresponding to the candidate match; $\operatorname{arg\,max}(\cdot)$ is the variable evaluation function of the maximum of the function, i.e. it returns the argument at which the function attains its maximum; $S(M_t)$ is the similarity value of the candidate match.
In this step, the matching relationship between the image feature clusters corresponding to the candidate match with the largest similarity value is used as an indication vector.
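The patent does not prescribe a particular solver for maximizing $S(M_t) = m^{T} W m$ under one-to-one assignment constraints; a common choice in the graph matching literature is spectral relaxation (power iteration on $W$) followed by greedy discretization, and the sketch below is that assumption rather than the method's mandated solver:

```python
import numpy as np

def spectral_match(W, n_p, n_q, iters=100):
    """Approximate argmax of m^T W m over one-to-one assignments; the row
    index of W encodes the candidate match (i, a) as i * n_q + a."""
    m = np.ones(n_p * n_q)
    for _ in range(iters):                 # power iteration toward the
        m = W @ m                          # principal eigenvector of W
        m /= np.linalg.norm(m) + 1e-12
    M = np.zeros((n_p, n_q))
    scores = m.reshape(n_p, n_q).copy()
    for _ in range(min(n_p, n_q)):         # greedy one-to-one rounding
        i, a = np.unravel_index(np.argmax(scores), scores.shape)
        if scores[i, a] <= 0:
            break
        M[i, a] = 1.0
        scores[i, :] = -np.inf             # enforce row exclusivity
        scores[:, a] = -np.inf             # enforce column exclusivity
    return M
```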
After hierarchical clustering is completed, a graph-model-based progressive graph matching method is used. This stage consists of two alternating processes: graph matching and graph progression. Graph matching operates on an activation graph converted from the results of the hierarchical clustering algorithm; the activation graph supports multi-target matching with fewer features. Graph progression updates the activation graph and its similarity matrix so as to improve the score of the next round of graph matching. The goal of progressive graph matching is to further maximize the graph matching score by adaptively reconstructing the graphs $G_P$ and $G_Q$, as sketched below.
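By way of illustration, the alternation can be written as the loop below; `graph_progression` is a hypothetical placeholder for the voting-and-update of step S105, `spectral_match` is the sketch above, and the convergence test mirrors step S104:

```python
def progressive_graph_matching(W, n_p, n_q, eps=1e-6, max_iters=50):
    """Alternate graph matching (S103) and graph progression (S105) until
    the similarity score S(M_t) = m^T W m stops improving (S104); sketch."""
    prev_score = float("-inf")
    M = None
    for t in range(max_iters):
        M = spectral_match(W, n_p, n_q)       # graph matching step
        m = M.ravel()
        score = m @ W @ m                     # S(M_t)
        if abs(score - prev_score) < eps:     # convergence test
            break
        prev_score = score
        W = graph_progression(W, M)           # hypothetical progression step
    return M
```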
S104: judging whether the similarity value corresponding to the indication vector is converged; if not, executing S105; if yes, go to step S106.
Exemplarily, it is judged whether the difference between the maximum similarity value corresponding to the indication vector in the t-th iteration and the maximum similarity value in the (t-1)-th iteration is smaller than a preset threshold;
if yes, the similarity value corresponding to the indication vector is considered to have converged, and step S106 is executed;
if not, the similarity value corresponding to the indication vector has not converged, and step S105 is executed.
S105: obtaining the confidence of the indication vector according to the indication vector of each candidate match, and, when the confidence of the indication vector is not smaller than a preset threshold, updating the element values in the matching matrix to obtain an updated matching matrix; then returning to step S103 until the similarity values corresponding to the indication vectors converge, and, when the similarity values corresponding to all the indication vectors have converged, taking the correspondence of each image feature in the matching matrix as the matching result of the image to be matched.
Specifically, step S105 may include the following steps:
n: can utilizeThe formula is shown in the figure,
Figure BDA0001999929070000191
obtaining Match t Middle image feature cluster V Q And image feature cluster V P Set of (2) and intermediate variable m i The probability of correlation, wherein,
Figure BDA0001999929070000192
is Match t Middle image feature cluster V Q And image feature cluster V P Set of (2) and intermediate variable m i A probability of correlation; v Q The set of image feature clusters in the second image to be matched is obtained;
Figure BDA0001999929070000193
an image feature cluster b in an image feature cluster set in a second image to be matched is obtained; v P The method comprises the steps of collecting image feature clusters in a first image to be matched;
Figure BDA0001999929070000194
the image feature cluster b in the set of image feature clusters in the first image to be matched; m is a set of intermediate variables in the current iteration; match t A set of matching edges corresponding to the matching matrix; m is i Is the ith matching edge in the matching matrix, an
Figure BDA0001999929070000195
Figure BDA0001999929070000196
The edge between the image feature cluster p and the image feature cluster i in the first image to be matched;
Figure BDA0001999929070000197
an edge between an image feature cluster q and an image feature cluster i in a second image to be matched is defined; NN (·) is a nearest neighbor image characteristic function;
Figure BDA0001999929070000198
is given by m i When intermediate variables are present, image feature clusters
Figure BDA0001999929070000199
The nearest neighbor feature of (a); k is the number of the minimum dissimilarity points of the image feature cluster pair; k is a radical of 2 For the second parameter in the clustering of image features in the current iteration of kNN, k is represented 2 Neighbor;
z is a normalization function; and is
Figure BDA00019999290700001910
exp () is an exponential function with a natural base number as the base;
Figure BDA00019999290700001911
and transmitting errors between a matching (j, b) pair formed by the image feature cluster j in the first image to be matched and the image feature cluster b in the second image to be matched and the intermediate variable.
In graph progression, given the current graph matching $M_t$ (of dimension $h \times w$, where graph P has $h$ features and graph Q has $w$ features), the entries of $M_t$ that equal 1 are identified, yielding the matching edges $\mathrm{Match}_t = \{m_i\}$ between graph P and graph Q.
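Illustratively, collecting $\mathrm{Match}_t$ from a 0/1 matching matrix is a one-liner (the function name is assumed):

```python
import numpy as np

def extract_matches(M):
    """Match_t = {(i, j) : M_t(i, j) = 1} from an h-by-w 0/1 matrix."""
    return [(int(i), int(j)) for i, j in np.argwhere(M == 1)]
```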
O: by means of the formula

$$p\!\left(V_P = v_P \mid M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{k_1}, & v_P \in \mathrm{NN}_{k_1}\!\left(v^P_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable is obtained, wherein $p(V_P = v_P \mid M = m_i, \mathrm{Match}_t)$ is the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable; $v_P$ is an image feature cluster in the set of image feature clusters in the first image to be matched; $k_1$ is the first kNN image-feature-clustering parameter in the current iteration, denoting $k_1$ neighbors;

P: by means of the formula

$$p\!\left(M = m_i \mid \mathrm{Match}_t\right) = \frac{1}{\left|\mathrm{Match}_t\right|},$$

the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the intermediate variable $m_i$ is obtained, wherein $p(M = m_i \mid \mathrm{Match}_t)$ is that probability; $\left|\mathrm{Match}_t\right|$ is the number of matching edges corresponding to the matching matrix;

Q: the formula

$$p\!\left(V_P, V_Q \mid M_t\right) = \sum_{m_i \in \mathrm{Match}_t} p\!\left(V_Q \mid V_P, M = m_i, M_t\right)\,p\!\left(V_P \mid M = m_i, M_t\right)\,p\!\left(M = m_i \mid M_t\right)$$

may be used to obtain the confidence of the indication vector, wherein $p(V_P, V_Q \mid M_t)$ is the confidence of the indication vector, i.e. the conditional probability of the candidate matches between the two graphs; $M_t$ is the matching matrix; $p(V_Q \mid V_P, M = m_i, M_t)$ is the probability, obtained in step N, that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the image feature cluster set $V_P$ and the intermediate variable $m_i$; $p(V_P \mid M = m_i, M_t)$ is the probability, obtained in step O, of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable; $p(M = m_i \mid M_t)$ is the probability, obtained in step P, that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the intermediate variable $m_i$.
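Putting steps N to Q together, the confidence can be sketched as below; the cluster-center arrays, the per-match transforms `gammas`, and the kNN parameters are assumptions of this sketch:

```python
import numpy as np

def knn_indices(points, query, k):
    """Indices of the k rows of `points` nearest to `query` (Euclidean)."""
    return np.argsort(np.linalg.norm(points - query, axis=1))[:k]

def vote_confidence(vp, vq, matches, P_pts, Q_pts, gammas, k1, k2):
    """p(V_P, V_Q | M_t): sum over matching edges m_i of
    p(v_Q | v_P, m_i) * p(v_P | m_i) * p(m_i | Match_t)  (steps N-Q)."""
    total = 0.0
    for i, (pi, qi) in enumerate(matches):
        # step O: p(v_P | m_i) is uniform over the k1 clusters nearest
        # to the P-side anchor of m_i
        if vp not in knn_indices(P_pts, P_pts[pi], k1):
            continue
        # step N: p(v_Q | v_P, m_i) = exp(-d) / Z over the k2 clusters
        # nearest to Gamma_{m_i}(v_P)
        proj = gammas[i](P_pts[vp])
        nn = knn_indices(Q_pts, proj, k2)
        if vq not in nn:
            continue
        Z = np.exp(-np.linalg.norm(Q_pts[nn] - proj, axis=1)).sum()
        p_vq = np.exp(-np.linalg.norm(Q_pts[vq] - proj)) / (Z + 1e-12)
        total += p_vq * (1.0 / k1) / len(matches)  # step P: 1/|Match_t|
    return total
```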
R: under the condition that the confidence of the indication vector is not less than a preset threshold, updating element values in the matching matrix to obtain an updated matching matrix:
judging whether the confidence is greater than or equal to a preset threshold $\mu$, and taking the pairs $(v_P, v_Q)$ whose confidence is higher than the preset threshold $\mu$, together with the edges they construct, as the elements of the new candidate matching set $C_{t+1}$; the new activation graphs $G^{t+1}_P$ and $G^{t+1}_Q$ are represented by the nodes and edges in $C_{t+1}$.
According to the correspondence between the image feature clusters corresponding to the indication vector, the corresponding element value in the matching matrix is found and updated to the confidence of the indication vector, thereby completing the update of the matching matrix; step S103 is then executed again until the similarity value corresponding to the indication vector converges. In the progressive graph matching, the constraint $\mathrm{Match}_t \subseteq C_{t+1}$ is used to guarantee a non-decreasing score $S(M_t)$, so that an optimal graph matching is achieved at each step. The iteration continues until the similarity values converge.
And under the condition that the similarity values corresponding to the indication vectors are all converged, taking the corresponding relation of the image features in the matching matrix as the matching result of the image to be matched.
If the nearest neighbor $\mathrm{NN}\!\left(\Gamma_{m_i}(v_P)\right)$ of the transformed feature already exists in the current matching $\mathrm{Match}_t$, the corresponding match is considered very reliable and achieves the maximum score; otherwise, the maximum score is obtained among the $k$ nearest neighbors of $\Gamma_{m_i}(v_P)$.
In practical application, $p(V_P, V_Q \mid \mathrm{Match}_t)$ can be used to determine the value of the threshold $\mu$; in addition, the intermediate variables are introduced for convenience of calculation.
S106: and under the condition that the similarity values corresponding to the indication vectors are all converged, taking the corresponding relation of the image features in the matching matrix as the matching result of the image to be matched.
Exemplarily, in the t-th iteration, the indication vectors of the other candidate matches are processed as shown in steps H through S104, and the iteration ends only when the similarity values of all indication vectors have converged;
if the similarity value of the indication vector of any candidate match does not converge, step S105 needs to be performed.
By applying the embodiment of the invention shown in fig. 1, since many of the features extracted from the image to be matched carry more or less related information, that information can be used to mark the features. Therefore, after the image features are extracted, they are clustered, features with similar attributes in the feature set are mined, and the feature set is partitioned organically, which reduces the data volume of the features and the amount of calculation in the feature matching process, and thus improves the efficiency of image feature matching.
Corresponding to the embodiment of the invention shown in fig. 1, the embodiment of the invention also provides a progressive graph matching device of the deformation graph based on clustering.
Fig. 3 is a schematic structural diagram of a progressive graph matching apparatus based on clustering deformation maps according to an embodiment of the present invention, as shown in fig. 3, the apparatus includes:
a first obtaining module 301, configured to obtain the image features of an image to be matched, where the image to be matched includes: a first image to be matched and a second image to be matched, and the image features include: image geometric features;
a merging module 302, configured to merge two adjacent image features with a difference value smaller than a preset threshold into one image feature, so as to obtain an image feature cluster;
a second obtaining module 303, configured to obtain candidate matches for the image to be matched and other images to be matched according to matching relationships between image feature clusters of the image to be matched and image feature clusters of other images to be matched, obtain a similarity value of the candidate matches according to a distribution matrix and a symmetric similarity matrix of the candidate matches for each candidate match, and obtain an indication vector of the candidate match according to a maximum value of the similarity value;
a judging module 304, configured to judge whether a similarity value corresponding to the indication vector converges;
a third obtaining module 305, configured to, if the determination result of the determining module 304 is negative, obtain a confidence level of each candidate matching indication vector according to the indication vector, and update an element value in the matching matrix to obtain an updated matching matrix if the confidence level of the indication vector is not smaller than a preset threshold; triggering a second obtaining module 303 until the similarity values corresponding to the indication vectors are converged, and taking the corresponding relation of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity values corresponding to each indication vector are all converged;
a setting module 306, configured to, when the determination result of the determining module 304 is yes, take the corresponding relationship of each image feature in the matching matrix as the matching result of the image to be matched when the similarity value corresponding to each indication vector is converged.
By applying the embodiment of the invention shown in fig. 3, since many of the features extracted from the image to be matched carry more or less related information, that information can be used to mark the features. Therefore, after the image features are extracted, they are clustered, features with similar attributes in the feature set are mined, and the feature set is partitioned organically, which reduces the data volume of the features and the amount of calculation in the feature matching process, and thus improves the efficiency of image feature matching.
In a specific implementation manner of the embodiment of the present invention, the merging module 302 is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: pairing the image feature clusters in the activation graph, and, for each image feature cluster pair, by means of the formula

$$k = \min\!\left(k_{AP},\ \left\lceil r_{AP}\,\left|C_a\right|\left|C_b\right| \right\rceil\right),$$

obtaining the number of minimum-dissimilarity points of the image feature cluster pair, wherein $k$ is the number of dissimilarity points over all possible element pairs in the image feature cluster pair; $k_{AP}$ is a preset first control parameter; $\left|C_a\right|$ is the number of elements in the image feature cluster $a$; $\left|C_b\right|$ is the number of elements in the image feature cluster $b$; $r_{AP}$ is a preset second control parameter; $\|\cdot\|$ is the Euclidean distance function; $\left|C_a\right|\left|C_b\right|$ is the number of possible element pairs between the image feature cluster pair;

c: by means of the formula

$$D_{kNN}(k, C_a, C_b) = \frac{1}{k}\,\min_{\left|\Gamma\right| = k}\ \sum_{(m_i,\, m_j) \in \Gamma} d(m_i, m_j),$$

obtaining the difference value of the image feature cluster pair, wherein $D_{kNN}(k, C_a, C_b)$ is the difference value between the image feature cluster $a$ and the image feature cluster $b$ in the image feature cluster pair; $\Gamma$ is a set of $k$ paired element pairs between the image feature cluster $a$ and the image feature cluster $b$; $\min$ is the minimum-value evaluation function; $\Sigma$ is the summation function; $d(m_i, m_j)$ is the dissimilarity between the element $m_i$ in the image feature cluster $a$ and the element $m_j$ in the image feature cluster $b$; $i$ is the serial number of the element $m_i$; $j$ is the serial number of the element $m_j$;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
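Illustratively, steps A to F amount to greedy agglomerative merging driven by the adaptive-kNN dissimilarity; the concrete form of $k$, the Euclidean $d(\cdot,\cdot)$, and the merge order below are assumptions of this sketch:

```python
import numpy as np
from itertools import product
from math import ceil

def d_knn(Ca, Cb, k_ap, r_ap):
    """Steps B-C: average of the k smallest pairwise dissimilarities
    between two clusters of feature vectors (assumed form of k)."""
    pair_d = sorted(float(np.linalg.norm(a - b)) for a, b in product(Ca, Cb))
    k = min(k_ap, ceil(r_ap * len(Ca) * len(Cb)))
    return sum(pair_d[:k]) / k

def agglomerate(features, k_ap=5, r_ap=0.1, tau=0.5):
    """Steps A, D-F: start with one cluster per feature, then repeatedly
    merge the closest pair while its difference value stays below tau."""
    clusters = [[f] for f in features]
    while len(clusters) > 1:
        pairs = [(d_knn(clusters[i], clusters[j], k_ap, r_ap), i, j)
                 for i in range(len(clusters))
                 for j in range(i + 1, len(clusters))]
        dmin, i, j = min(pairs)
        if dmin >= tau:              # step F: no pair below the threshold
            break
        clusters[i] += clusters[j]   # step E: merge the pair
        del clusters[j]
    return clusters
```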
In a specific implementation manner of the embodiment of the present invention, the second obtaining module 303 is configured to:
obtaining the constraint condition corresponding to the candidate match,

$$m_t = \mathrm{vec}(M_t),\qquad M_t \in \{0,1\}^{n_P \times n_Q},\qquad M_t\,\mathbf{1}_{n_Q} \le \mathbf{1}_{n_P},\qquad M_t^{T}\,\mathbf{1}_{n_P} \le \mathbf{1}_{n_Q},$$

wherein $m_t$ is the initial indication vector corresponding to the candidate match; $M_t$ is the distribution matrix; $n_P$ is the number of image feature clusters in the first image to be matched; $n_Q$ is the number of image feature clusters in the second image to be matched; $\mathbf{1}_{n_P}$ is the all-ones vector of size $n_P$; $\mathbf{1}_{n_Q}$ is the all-ones vector of size $n_Q$;

by means of the formula

$$d_{jb|ia} = \left\| v_b - \Gamma_{ia}(v_j) \right\|,$$

calculating the transfer error, corresponding to the candidate match, from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, wherein $d_{jb|ia}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched; $\|\cdot\|$ is the norm function; $v_b$ is the image feature cluster $b$ in the second image to be matched; $\Gamma_{ia}(v_j)$ is the affine transformation result of the image feature cluster $j$ in the first image to be matched toward the image feature cluster $b$ in the second image to be matched; $v_j$ is the image feature cluster $j$ in the first image to be matched;

according to the transfer error from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, by means of the formula

$$W_{ia;jb} = \max\!\left(\alpha - \frac{1}{4}\left(d_{jb|ia} + d_{bj|ai} + d_{ia|jb} + d_{ai|bj}\right),\ 0\right),\qquad e^{P}_{ij} \in \varepsilon_P,\ e^{Q}_{ab} \in \varepsilon_Q,$$

calculating the symmetric similarity value corresponding to the candidate match, wherein $W_{ia;jb}$ is the symmetric similarity value corresponding to the candidate match containing the edges $e^{P}_{ij}$ and $e^{Q}_{ab}$, and is the second-order similarity function of symmetric transfer errors for those edges; $P$ is the first image to be matched; $Q$ is the second image to be matched; $\varepsilon_P$ is the set of edges contained in the first image to be matched; $\varepsilon_Q$ is the set of edges contained in the second image to be matched; $i$ is the $i$-th point in the first image to be matched; $j$ is the $j$-th point in the first image to be matched; $a$ is the $a$-th point in the second image to be matched; $b$ is the $b$-th point in the second image to be matched; $d_{bj|ai}$ is the transfer error between the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched, and the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched; $d_{ia|jb}$ is the transfer error between the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched, and the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched; $d_{ai|bj}$ is the transfer error between the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched, and the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched; $\alpha$ is a preset threshold for image feature cluster similarity; $\max(\cdot)$ is the maximum-value evaluation function;

according to the symmetric similarity values corresponding to the candidate matches, constructing a symmetric similarity matrix with the symmetric similarity values as elements;

based on the distribution matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, by means of the formula $S(M_t) = M_t^{T} W M_t$, obtaining the similarity value of the candidate match, wherein $S(M_t)$ is the similarity value of the candidate match when the distribution matrix in the $t$-th iteration is $M_t$; $M_t^{T}$ is the transpose of the distribution matrix $M_t$ in the $t$-th iteration; $W$ is the symmetric similarity matrix corresponding to the candidate match.
In a specific implementation manner of the embodiment of the present invention, the second obtaining module 303 is configured to:
by means of the formula

$$m_t^{*} = \operatorname*{arg\,max}_{M_t} S(M_t),$$

computing the indication vector corresponding to the candidate match, wherein $m_t^{*}$ is the indication vector corresponding to the candidate match; $\operatorname{arg\,max}(\cdot)$ is the variable evaluation function of the maximum of the function, i.e. it returns the argument at which the function attains its maximum; $S(M_t)$ is the similarity value of the candidate match.
In a specific implementation manner of the embodiment of the present invention, the third obtaining module 305 is configured to:
by means of the formula

$$p\!\left(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{Z}\,\exp\!\left(-d_{jb|m_i}\right), & v^Q_b \in \mathrm{NN}_{k_2}\!\left(v^Q_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the image feature cluster set $V_P$ and the intermediate variable $m_i$, wherein $p(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t)$ is that probability; $V_Q$ is the set of image feature clusters in the second image to be matched; $v^Q_b$ is the image feature cluster $b$ in the set of image feature clusters in the second image to be matched; $V_P$ is the set of image feature clusters in the first image to be matched; $v^P_b$ is the image feature cluster $b$ in the set of image feature clusters in the first image to be matched; $M$ is the set of intermediate variables in the current iteration; $\mathrm{Match}_t$ is the set of matching edges corresponding to the matching matrix; $m_i$ is the $i$-th matching edge in the matching matrix, with $m_i = \left(e^P_{pi},\ e^Q_{qi}\right)$, wherein $e^P_{pi}$ is the edge between the image feature cluster $p$ and the image feature cluster $i$ in the first image to be matched, and $e^Q_{qi}$ is the edge between the image feature cluster $q$ and the image feature cluster $i$ in the second image to be matched; $\mathrm{NN}(\cdot)$ is the nearest-neighbor image feature function, and $v^Q_{m_i} = \mathrm{NN}\!\left(v^P_b \mid m_i\right)$ is the nearest-neighbor feature of the image feature cluster $v^P_b$ when $m_i$ is the intermediate variable; $k$ is the number of minimum-dissimilarity points of an image feature cluster pair; $k_2$ is the second kNN image-feature-clustering parameter in the current iteration, denoting $k_2$ neighbors; $Z$ is the normalization function, with $Z = \sum_{v \in \mathrm{NN}_{k_2}(v^Q_{m_i})} \exp\!\left(-d_{jv|m_i}\right)$; $\exp(\cdot)$ is the exponential function with the natural base; $d_{jb|m_i}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the intermediate variable;

by means of the formula

$$p\!\left(V_P = v_P \mid M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{k_1}, & v_P \in \mathrm{NN}_{k_1}\!\left(v^P_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable, wherein $p(V_P = v_P \mid M = m_i, \mathrm{Match}_t)$ is the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable; $v_P$ is an image feature cluster in the set of image feature clusters in the first image to be matched; $k_1$ is the first kNN image-feature-clustering parameter in the current iteration, denoting $k_1$ neighbors;

by means of the formula

$$p\!\left(M = m_i \mid \mathrm{Match}_t\right) = \frac{1}{\left|\mathrm{Match}_t\right|},$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the intermediate variable $m_i$, wherein $p(M = m_i \mid \mathrm{Match}_t)$ is that probability; $\left|\mathrm{Match}_t\right|$ is the number of matching edges corresponding to the matching matrix;

by means of the formula

$$p\!\left(V_P, V_Q \mid M_t\right) = \sum_{m_i \in \mathrm{Match}_t} p\!\left(V_Q \mid V_P, M = m_i, M_t\right)\,p\!\left(V_P \mid M = m_i, M_t\right)\,p\!\left(M = m_i \mid M_t\right),$$

obtaining the confidence of the indication vector, wherein $p(V_P, V_Q \mid M_t)$ is the confidence of the indication vector; $M_t$ is the matching matrix.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (10)

1. A progressive graph matching method for a cluster-based deformation graph, the method comprising:
1) Obtaining the image features of the image to be matched, wherein the image to be matched comprises: a first image to be matched and a second image to be matched, and the image features comprise: image geometric features;
2) Combining two adjacent image features of which the difference value is smaller than a preset threshold value into one image feature to obtain an image feature cluster;
3) Acquiring candidate matching aiming at the image to be matched and other images to be matched according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, acquiring a similarity value of the candidate matching according to a distribution matrix and a symmetrical similarity matrix of the candidate matching aiming at each candidate matching, and acquiring an indication vector of the candidate matching according to the maximum value of the similarity value;
4) Judging whether the similarity value corresponding to the indication vector is converged;
5) If not, acquiring the confidence coefficient of the indication vector according to the indication vector of each candidate matching, and updating the element value in the matching matrix under the condition that the confidence coefficient of the indication vector is not less than a preset threshold value to obtain an updated matching matrix; and returning to execute the step 3) until the similarity values corresponding to the indication vectors are converged, and taking the corresponding relation of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity values corresponding to each indication vector are converged;
6) And if so, taking the corresponding relation of each image feature in the matching matrix as the matching result of the image to be matched under the condition that the similarity value corresponding to each indication vector is converged.
2. The progressive graph matching method based on clustering of deformation maps according to claim 1, wherein the step 2) comprises:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: pairing the image feature clusters in the activation graph, and, for each image feature cluster pair, by means of the formula

$$k = \min\!\left(k_{AP},\ \left\lceil r_{AP}\,\left|C_a\right|\left|C_b\right| \right\rceil\right),$$

obtaining the number of minimum-dissimilarity points of the image feature cluster pair, wherein $k$ is the number of dissimilarity points over all possible element pairs in the image feature cluster pair; $k_{AP}$ is a preset first control parameter; $\left|C_a\right|$ is the number of elements in the image feature cluster $a$; $\left|C_b\right|$ is the number of elements in the image feature cluster $b$; $r_{AP}$ is a preset second control parameter; $\|\cdot\|$ is the Euclidean distance function; $\left|C_a\right|\left|C_b\right|$ is the number of possible element pairs between the image feature cluster pair;

c: by means of the formula

$$D_{kNN}(k, C_a, C_b) = \frac{1}{k}\,\min_{\left|\Gamma\right| = k}\ \sum_{(m_i,\, m_j) \in \Gamma} d(m_i, m_j),$$

obtaining the difference value of the image feature cluster pair, wherein $D_{kNN}(k, C_a, C_b)$ is the difference value between the image feature cluster $a$ and the image feature cluster $b$ in the image feature cluster pair; $\Gamma$ is a set of $k$ paired element pairs between the image feature cluster $a$ and the image feature cluster $b$; $\min$ is the minimum-value evaluation function; $\Sigma$ is the summation function; $d(m_i, m_j)$ is the dissimilarity between the element $m_i$ in the image feature cluster $a$ and the element $m_j$ in the image feature cluster $b$; $i$ is the serial number of the element $m_i$; $j$ is the serial number of the element $m_j$;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
3. The progressive graph matching method based on clustering of deformation graphs according to claim 1, wherein the obtaining of similarity values of the candidate matches according to the distribution matrix and the symmetric similarity matrix of the candidate matches comprises:
obtaining the constraint condition corresponding to the candidate match,

$$m_t = \mathrm{vec}(M_t),\qquad M_t \in \{0,1\}^{n_P \times n_Q},\qquad M_t\,\mathbf{1}_{n_Q} \le \mathbf{1}_{n_P},\qquad M_t^{T}\,\mathbf{1}_{n_P} \le \mathbf{1}_{n_Q},$$

wherein $m_t$ is the initial indication vector corresponding to the candidate match; $M_t$ is the distribution matrix; $n_P$ is the number of image feature clusters in the first image to be matched; $n_Q$ is the number of image feature clusters in the second image to be matched; $\mathbf{1}_{n_P}$ is the all-ones vector of size $n_P$; $\mathbf{1}_{n_Q}$ is the all-ones vector of size $n_Q$;

by means of the formula

$$d_{jb|ia} = \left\| v_b - \Gamma_{ia}(v_j) \right\|,$$

calculating the transfer error, corresponding to the candidate match, from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, wherein $d_{jb|ia}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched; $\|\cdot\|$ is the norm function; $v_b$ is the image feature cluster $b$ in the second image to be matched; $\Gamma_{ia}(v_j)$ is the affine transformation result of the image feature cluster $j$ in the first image to be matched toward the image feature cluster $b$ in the second image to be matched; $v_j$ is the image feature cluster $j$ in the first image to be matched;

according to the transfer error from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, by means of the formula

$$W_{ia;jb} = \max\!\left(\alpha - \frac{1}{4}\left(d_{jb|ia} + d_{bj|ai} + d_{ia|jb} + d_{ai|bj}\right),\ 0\right),\qquad e^{P}_{ij} \in \varepsilon_P,\ e^{Q}_{ab} \in \varepsilon_Q,$$

calculating the symmetric similarity value corresponding to the candidate match, wherein $W_{ia;jb}$ is the symmetric similarity value corresponding to the candidate match containing the edges $e^{P}_{ij}$ and $e^{Q}_{ab}$, and is the second-order similarity function of symmetric transfer errors for those edges; $P$ is the first image to be matched; $Q$ is the second image to be matched; $\varepsilon_P$ is the set of edges contained in the first image to be matched; $\varepsilon_Q$ is the set of edges contained in the second image to be matched; $i$ is the $i$-th point in the first image to be matched; $j$ is the $j$-th point in the first image to be matched; $a$ is the $a$-th point in the second image to be matched; $b$ is the $b$-th point in the second image to be matched; $d_{bj|ai}$ is the transfer error between the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched, and the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched; $d_{ia|jb}$ is the transfer error between the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched, and the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched; $d_{ai|bj}$ is the transfer error between the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched, and the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched; $\alpha$ is a preset threshold for image feature cluster similarity; $\max(\cdot)$ is the maximum-value evaluation function;

according to the symmetric similarity values corresponding to the candidate matches, constructing a symmetric similarity matrix with the symmetric similarity values as elements;

based on the distribution matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, by means of the formula $S(M_t) = M_t^{T} W M_t$, obtaining the similarity value of the candidate match, wherein $S(M_t)$ is the similarity value of the candidate match when the distribution matrix in the $t$-th iteration is $M_t$; $M_t^{T}$ is the transpose of the distribution matrix $M_t$ in the $t$-th iteration; $W$ is the symmetric similarity matrix corresponding to the candidate match.
4. The progressive graph matching method based on cluster deformation graph according to claim 1, wherein said obtaining the indication vector of the candidate match according to the maximum value of the similarity value comprises:
by means of the formula

$$m_t^{*} = \operatorname*{arg\,max}_{M_t} S(M_t),$$

computing the indication vector corresponding to the candidate match, wherein $m_t^{*}$ is the indication vector corresponding to the candidate match; $\operatorname{arg\,max}(\cdot)$ is the variable evaluation function of the maximum of the function, i.e. it returns the argument at which the function attains its maximum; $S(M_t)$ is the similarity value of the candidate match.
5. The progressive graph matching method based on clustering deformation graphs according to claim 1, wherein the obtaining the confidence of the indication vector comprises:
by means of the formula

$$p\!\left(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{Z}\,\exp\!\left(-d_{jb|m_i}\right), & v^Q_b \in \mathrm{NN}_{k_2}\!\left(v^Q_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the image feature cluster set $V_P$ and the intermediate variable $m_i$, wherein $p(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t)$ is that probability; $V_Q$ is the set of image feature clusters in the second image to be matched; $v^Q_b$ is the image feature cluster $b$ in the set of image feature clusters in the second image to be matched; $V_P$ is the set of image feature clusters in the first image to be matched; $v^P_b$ is the image feature cluster $b$ in the set of image feature clusters in the first image to be matched; $M$ is the set of intermediate variables in the current iteration; $\mathrm{Match}_t$ is the set of matching edges corresponding to the matching matrix; $m_i$ is the $i$-th matching edge in the matching matrix, with $m_i = \left(e^P_{pi},\ e^Q_{qi}\right)$, wherein $e^P_{pi}$ is the edge between the image feature cluster $p$ and the image feature cluster $i$ in the first image to be matched, and $e^Q_{qi}$ is the edge between the image feature cluster $q$ and the image feature cluster $i$ in the second image to be matched; $\mathrm{NN}(\cdot)$ is the nearest-neighbor image feature function, and $v^Q_{m_i} = \mathrm{NN}\!\left(v^P_b \mid m_i\right)$ is the nearest-neighbor feature of the image feature cluster $v^P_b$ when $m_i$ is the intermediate variable; $k$ is the number of minimum-dissimilarity points of an image feature cluster pair; $k_2$ is the second kNN image-feature-clustering parameter in the current iteration; $Z$ is the normalization function, with $Z = \sum_{v \in \mathrm{NN}_{k_2}(v^Q_{m_i})} \exp\!\left(-d_{jv|m_i}\right)$; $\exp(\cdot)$ is the exponential function with the natural base; $d_{jb|m_i}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the intermediate variable;

by means of the formula

$$p\!\left(V_P = v_P \mid M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{k_1}, & v_P \in \mathrm{NN}_{k_1}\!\left(v^P_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable, wherein $p(V_P = v_P \mid M = m_i, \mathrm{Match}_t)$ is the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable; $v_P$ is an image feature cluster in the set of image feature clusters in the first image to be matched; $k_1$ is the first kNN image-feature-clustering parameter in the current iteration;

by means of the formula

$$p\!\left(M = m_i \mid \mathrm{Match}_t\right) = \frac{1}{\left|\mathrm{Match}_t\right|},$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the intermediate variable $m_i$, wherein $p(M = m_i \mid \mathrm{Match}_t)$ is that probability; $\left|\mathrm{Match}_t\right|$ is the number of matching edges corresponding to the matching matrix;

by means of the formula

$$p\!\left(V_P, V_Q \mid M_t\right) = \sum_{m_i \in \mathrm{Match}_t} p\!\left(V_Q \mid V_P, M = m_i, M_t\right)\,p\!\left(V_P \mid M = m_i, M_t\right)\,p\!\left(M = m_i \mid M_t\right),$$

obtaining the confidence of the indication vector, wherein $p(V_P, V_Q \mid M_t)$ is the confidence of the indication vector; $M_t$ is the matching matrix.
6. An apparatus for progressive graph matching of clustering based deformation graphs, the apparatus comprising:
the first acquisition module is used for acquiring the image features of an image to be matched, wherein the image to be matched comprises: a first image to be matched and a second image to be matched, and the image features comprise: image geometric features;
the merging module is used for merging two adjacent image features with the difference value smaller than a preset threshold value into one image feature to obtain an image feature cluster;
the second acquisition module is used for acquiring candidate matching aiming at the image to be matched and other images to be matched according to the matching relation between the image feature cluster of the image to be matched and the image feature clusters of other images to be matched, acquiring the similarity value of the candidate matching according to the distribution matrix and the symmetrical similarity matrix of the candidate matching aiming at each candidate matching, and acquiring the indication vector of the candidate matching according to the maximum value of the similarity value;
the judging module is used for judging whether the similarity value corresponding to the indication vector is converged;
a third obtaining module, configured to, if the determination result of the determining module is negative, obtain the confidence of the indication vector according to the indication vector of each candidate match, and update the element values in the matching matrix to obtain an updated matching matrix if the confidence of the indication vector is not smaller than a preset threshold; and to trigger the second obtaining module until the similarity values corresponding to the indication vectors converge, wherein, when the similarity values corresponding to all the indication vectors have converged, the correspondence of each image feature in the matching matrix is taken as the matching result of the image to be matched;
and the setting module is used for taking the corresponding relation of each image characteristic in the matching matrix as the matching result of the image to be matched under the condition that the similarity value corresponding to each indication vector is converged under the condition that the judgment result of the judging module is yes.
7. The progressive graph matching apparatus based on clustering deformation graphs according to claim 6, wherein the merging module is configured to:
a: converting an image to be matched into an activated image, and taking each image feature in the activated image as an image feature cluster;
b: pairing the image feature clusters in the activation graph, and, for each image feature cluster pair, by means of the formula

$$k = \min\!\left(k_{AP},\ \left\lceil r_{AP}\,\left|C_a\right|\left|C_b\right| \right\rceil\right),$$

obtaining the number of minimum-dissimilarity points of the image feature cluster pair, wherein $k$ is the number of dissimilarity points over all possible element pairs in the image feature cluster pair; $k_{AP}$ is a preset first control parameter; $\left|C_a\right|$ is the number of elements in the image feature cluster $a$; $\left|C_b\right|$ is the number of elements in the image feature cluster $b$; $r_{AP}$ is a preset second control parameter; $\|\cdot\|$ is the Euclidean distance function; $\left|C_a\right|\left|C_b\right|$ is the number of possible element pairs between the image feature cluster pair;

c: by means of the formula

$$D_{kNN}(k, C_a, C_b) = \frac{1}{k}\,\min_{\left|\Gamma\right| = k}\ \sum_{(m_i,\, m_j) \in \Gamma} d(m_i, m_j),$$

obtaining the difference value of the image feature cluster pair, wherein $D_{kNN}(k, C_a, C_b)$ is the difference value between the image feature cluster $a$ and the image feature cluster $b$ in the image feature cluster pair; $\Gamma$ is a set of $k$ paired element pairs between the image feature cluster $a$ and the image feature cluster $b$; $\min$ is the minimum-value evaluation function; $\Sigma$ is the summation function; $d(m_i, m_j)$ is the dissimilarity between the element $m_i$ in the image feature cluster $a$ and the element $m_j$ in the image feature cluster $b$; $i$ is the serial number of the element $m_i$; $j$ is the serial number of the element $m_j$;
d: judging whether the difference value of the image feature cluster pair is smaller than a preset difference threshold value or not;
e: if yes, combining the image feature cluster pairs into an image feature cluster, and returning to execute the step B until the difference value of any image feature cluster pair is not smaller than the difference threshold value;
f: and if not, taking the image feature cluster as a combined image feature cluster under the condition that the difference value of other image feature cluster pairs is not smaller than a preset difference threshold value.
8. The progressive graph matching apparatus based on clustering of deformation maps according to claim 6, wherein the second obtaining module is configured to:
obtaining the constraint condition corresponding to the candidate match,

$$m_t = \mathrm{vec}(M_t),\qquad M_t \in \{0,1\}^{n_P \times n_Q},\qquad M_t\,\mathbf{1}_{n_Q} \le \mathbf{1}_{n_P},\qquad M_t^{T}\,\mathbf{1}_{n_P} \le \mathbf{1}_{n_Q},$$

wherein $m_t$ is the initial indication vector corresponding to the candidate match; $M_t$ is the distribution matrix; $n_P$ is the number of image feature clusters in the first image to be matched; $n_Q$ is the number of image feature clusters in the second image to be matched; $\mathbf{1}_{n_P}$ is the all-ones vector of size $n_P$; $\mathbf{1}_{n_Q}$ is the all-ones vector of size $n_Q$;

by means of the formula

$$d_{jb|ia} = \left\| v_b - \Gamma_{ia}(v_j) \right\|,$$

calculating the transfer error, corresponding to the candidate match, from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, wherein $d_{jb|ia}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched; $\|\cdot\|$ is the norm function; $v_b$ is the image feature cluster $b$ in the second image to be matched; $\Gamma_{ia}(v_j)$ is the affine transformation result of the image feature cluster $j$ in the first image to be matched toward the image feature cluster $b$ in the second image to be matched; $v_j$ is the image feature cluster $j$ in the first image to be matched;

according to the transfer error from the image feature cluster $j$ in the first image to be matched to the image feature cluster $b$ in the second image to be matched, by means of the formula

$$W_{ia;jb} = \max\!\left(\alpha - \frac{1}{4}\left(d_{jb|ia} + d_{bj|ai} + d_{ia|jb} + d_{ai|bj}\right),\ 0\right),\qquad e^{P}_{ij} \in \varepsilon_P,\ e^{Q}_{ab} \in \varepsilon_Q,$$

calculating the symmetric similarity value corresponding to the candidate match, wherein $W_{ia;jb}$ is the symmetric similarity value corresponding to the candidate match containing the edges $e^{P}_{ij}$ and $e^{Q}_{ab}$, and is the second-order similarity function of symmetric transfer errors for those edges; $P$ is the first image to be matched; $Q$ is the second image to be matched; $\varepsilon_P$ is the set of edges contained in the first image to be matched; $\varepsilon_Q$ is the set of edges contained in the second image to be matched; $i$ is the $i$-th point in the first image to be matched; $j$ is the $j$-th point in the first image to be matched; $a$ is the $a$-th point in the second image to be matched; $b$ is the $b$-th point in the second image to be matched; $d_{bj|ai}$ is the transfer error between the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched, and the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched; $d_{ia|jb}$ is the transfer error between the match $(i, a)$, formed by the image feature cluster $i$ in the first image to be matched and the image feature cluster $a$ in the second image to be matched, and the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched; $d_{ai|bj}$ is the transfer error between the match $(a, i)$, formed by the image feature cluster $a$ in the second image to be matched and the image feature cluster $i$ in the first image to be matched, and the match $(b, j)$, formed by the image feature cluster $b$ in the second image to be matched and the image feature cluster $j$ in the first image to be matched; $\alpha$ is a preset threshold for image feature cluster similarity; $\max(\cdot)$ is the maximum-value evaluation function;

according to the symmetric similarity values corresponding to the candidate matches, constructing a symmetric similarity matrix with the symmetric similarity values as elements;

based on the distribution matrix corresponding to the candidate match and the symmetric similarity matrix corresponding to the candidate match, by means of the formula $S(M_t) = M_t^{T} W M_t$, obtaining the similarity value of the candidate match, wherein $S(M_t)$ is the similarity value of the candidate match when the distribution matrix in the $t$-th iteration is $M_t$; $M_t^{T}$ is the transpose of the distribution matrix $M_t$ in the $t$-th iteration; $W$ is the symmetric similarity matrix corresponding to the candidate match.
9. The progressive graph matching apparatus based on clustering of deformation maps according to claim 6, wherein the second obtaining module is configured to:
by means of the formula

$$m_t^{*} = \operatorname*{arg\,max}_{M_t} S(M_t),$$

computing the indication vector corresponding to the candidate match, wherein $m_t^{*}$ is the indication vector corresponding to the candidate match; $\operatorname{arg\,max}(\cdot)$ is the variable evaluation function of the maximum of the function, i.e. it returns the argument at which the function attains its maximum; $S(M_t)$ is the similarity value of the candidate match.
10. The progressive graph matching apparatus based on clustering of deformation graphs according to claim 6, wherein the third obtaining module is configured to:
by means of the formula

$$p\!\left(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{Z}\,\exp\!\left(-d_{jb|m_i}\right), & v^Q_b \in \mathrm{NN}_{k_2}\!\left(v^Q_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the image feature cluster set $V_P$ and the intermediate variable $m_i$, wherein $p(V_Q = v^Q_b \mid V_P = v^P_b, M = m_i, \mathrm{Match}_t)$ is that probability; $V_Q$ is the set of image feature clusters in the second image to be matched; $v^Q_b$ is the image feature cluster $b$ in the set of image feature clusters in the second image to be matched; $V_P$ is the set of image feature clusters in the first image to be matched; $v^P_b$ is the image feature cluster $b$ in the set of image feature clusters in the first image to be matched; $M$ is the set of intermediate variables in the current iteration; $\mathrm{Match}_t$ is the set of matching edges corresponding to the matching matrix; $m_i$ is the $i$-th matching edge in the matching matrix, with $m_i = \left(e^P_{pi},\ e^Q_{qi}\right)$, wherein $e^P_{pi}$ is the edge between the image feature cluster $p$ and the image feature cluster $i$ in the first image to be matched, and $e^Q_{qi}$ is the edge between the image feature cluster $q$ and the image feature cluster $i$ in the second image to be matched; $\mathrm{NN}(\cdot)$ is the nearest-neighbor image feature function, and $v^Q_{m_i} = \mathrm{NN}\!\left(v^P_b \mid m_i\right)$ is the nearest-neighbor feature of the image feature cluster $v^P_b$ when $m_i$ is the intermediate variable; $k$ is the number of minimum-dissimilarity points of an image feature cluster pair; $k_2$ is the second kNN image-feature-clustering parameter in the current iteration; $Z$ is the normalization function, with $Z = \sum_{v \in \mathrm{NN}_{k_2}(v^Q_{m_i})} \exp\!\left(-d_{jv|m_i}\right)$; $\exp(\cdot)$ is the exponential function with the natural base; $d_{jb|m_i}$ is the transfer error between the match $(j, b)$, formed by the image feature cluster $j$ in the first image to be matched and the image feature cluster $b$ in the second image to be matched, and the intermediate variable;

by means of the formula

$$p\!\left(V_P = v_P \mid M = m_i, \mathrm{Match}_t\right) = \begin{cases} \dfrac{1}{k_1}, & v_P \in \mathrm{NN}_{k_1}\!\left(v^P_{m_i}\right), \\ 0, & \text{otherwise}, \end{cases}$$

obtaining the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable, wherein $p(V_P = v_P \mid M = m_i, \mathrm{Match}_t)$ is the probability of selecting $m_i$ in $\mathrm{Match}_t$ as the intermediate variable; $v_P$ is an image feature cluster in the set of image feature clusters in the first image to be matched; $k_1$ is the first kNN image-feature-clustering parameter in the current iteration;

by means of the formula

$$p\!\left(M = m_i \mid \mathrm{Match}_t\right) = \frac{1}{\left|\mathrm{Match}_t\right|},$$

obtaining the probability that the image feature cluster set $V_Q$ in $\mathrm{Match}_t$ is correlated with the intermediate variable $m_i$, wherein $p(M = m_i \mid \mathrm{Match}_t)$ is that probability; $\left|\mathrm{Match}_t\right|$ is the number of matching edges corresponding to the matching matrix;

by means of the formula

$$p\!\left(V_P, V_Q \mid M_t\right) = \sum_{m_i \in \mathrm{Match}_t} p\!\left(V_Q \mid V_P, M = m_i, M_t\right)\,p\!\left(V_P \mid M = m_i, M_t\right)\,p\!\left(M = m_i \mid M_t\right),$$

obtaining the confidence of the indication vector, wherein $p(V_P, V_Q \mid M_t)$ is the confidence of the indication vector; $M_t$ is the matching matrix.
CN201910209027.4A 2019-03-19 2019-03-19 Progressive graph matching method and device of deformation graph based on clustering Expired - Fee Related CN109934298B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910209027.4A CN109934298B (en) 2019-03-19 2019-03-19 Progressive graph matching method and device of deformation graph based on clustering

Publications (2)

Publication Number Publication Date
CN109934298A CN109934298A (en) 2019-06-25
CN109934298B true CN109934298B (en) 2022-10-28

Family

ID=66987741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910209027.4A Expired - Fee Related CN109934298B (en) 2019-03-19 2019-03-19 Progressive graph matching method and device of deformation graph based on clustering

Country Status (1)

Country Link
CN (1) CN109934298B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112686880B (en) * 2021-01-06 2021-09-14 哈尔滨市科佳通用机电股份有限公司 Method for detecting abnormity of railway locomotive component
CN112991408B (en) * 2021-04-19 2021-07-30 湖南大学 Large-scene high-resolution remote sensing image self-adaptive area multi-feature registration method and system


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10198858B2 (en) * 2017-03-27 2019-02-05 3Dflow Srl Method for 3D modelling based on structure from motion processing of sparse 2D images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104766084A (en) * 2015-04-10 2015-07-08 南京大学 Nearly copied image detection method based on multi-target matching
CN106886794A (en) * 2017-02-14 2017-06-23 湖北工业大学 Take the heterologous remote sensing image homotopy mapping method of high-order structures feature into account

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
An Image Matching Algorithm Based on Singular Value Decomposition; Zhao Feng et al.; Journal of Computer Research and Development; 2010-01-15 (No. 01); full text *
Robust Verification of Image Feature Matching Based on Consensus Random Sampling; Liu Yi; Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition); 2010-06-15 (No. 03); full text *

Also Published As

Publication number Publication date
CN109934298A (en) 2019-06-25


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20221028