CN111783834B - Heterogeneous image matching method based on joint graph spectrum feature analysis - Google Patents

Heterogeneous image matching method based on joint graph spectrum feature analysis

Info

Publication number
CN111783834B
CN111783834B (application CN202010493597.3A; also published as CN111783834A)
Authority
CN
China
Prior art keywords
point
matching
feature
points
characteristic
Prior art date
Legal status
Active
Application number
CN202010493597.3A
Other languages
Chinese (zh)
Other versions
CN111783834A (en
Inventor
王鑫 (Wang Xin)
张丽荷 (Zhang Lihe)
张之露 (Zhang Zhilu)
严勤 (Yan Qin)
Current Assignee
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202010493597.3A
Publication of CN111783834A
Application granted
Publication of CN111783834B
Status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75: Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/751: Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/46: Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462: Salient features, e.g. scale invariant feature transforms [SIFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a heterogeneous image matching method based on joint-graph spectral feature analysis. First, corner points of a visible-light image and an infrared image are extracted, and an adjacency matrix is constructed from the corner-point relations of the two images. Second, the eigenvectors of the joint graph are obtained by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and feature-function pairs are constructed by three-dimensional reconstruction. Third, a SUSAN-MSER-SURF algorithm is proposed to detect the extremal region of each feature-function pair, and the region features are matched with the proposed Euclidean-Hamming algorithm. Finally, the extremal regions of the feature-spectrum pairs reconstructed from the K smallest eigenvectors are matched according to these steps to obtain the final matching result.

Description

Heterogeneous image matching method based on joint graph spectrum feature analysis
Technical Field
The invention belongs to the field of image processing, and particularly relates to a heterogeneous image matching method based on joint graph spectrum feature analysis.
Background
In heterogeneous image matching, the traditional grey-level-based matching mode can hardly achieve accurate matches, so feature-based matching is the common approach. As numerous scholars at home and abroad have proposed many feature-based algorithms of excellent performance, heterogeneous image matching technology has made remarkable breakthroughs, yet matching accuracy remains insufficient for images shot under some special conditions. Visible-light images taken in strong daylight or in weak light at night tend to be blurred; in such cases feature-based matching suffers from the blurring of local feature descriptors, which lowers the matching accuracy.
For these scenes, a method based on spectral feature analysis of the joint graph is proposed: the structural relations of corner points in the visible-light and infrared images are used to build the joint graph under the K-nearest-neighbour rule, the eigenvectors of the joint graph are obtained by defining the normalized Laplacian and decomposing it into eigenvalues, and three-dimensional reconstruction yields the feature-function pairs. Since the extrema of the feature-function pairs represent persistent regions of the image, the invention proposes a SUSAN-MSER-SURF maximally-stable-extremal-region detector to locate the extrema and match the maximally stable extremal regions. Experimental results show that the scheme achieves an excellent matching rate in these scenes.
Disclosure of Invention
Purpose of the invention: in view of the above problems, the invention provides a heterogeneous image matching method based on joint-graph spectral feature analysis, to solve the problem of matching visible-light and infrared images taken under too-high or too-low light intensity.
The technical scheme is as follows: to achieve the purpose of the invention, the technical scheme adopted by the invention is a heterogeneous image matching method based on joint-graph spectral feature analysis, comprising the following steps (an illustrative pipeline sketch follows this list):
(1) extracting corner points of the visible-light and infrared images, and constructing an adjacency matrix from the corner-point relations of the two images;
(2) obtaining the eigenvectors of the joint graph by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and constructing feature-function pairs by three-dimensional reconstruction;
(3) proposing a SUSAN-MSER-SURF algorithm to detect the extremal region of each feature-function pair, and matching the region features with the proposed Euclidean-Hamming algorithm;
(4) matching the extremal regions of the feature-spectrum pairs reconstructed from the K smallest eigenvectors according to the above steps to obtain the final matching result.
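Read together, steps (1)-(4) form a simple pipeline. The Python sketch below only illustrates the data flow; every helper name in it (extract_corners, build_joint_adjacency, reconstruct_feature_spectra, detect_stable_regions, match_regions) is a hypothetical placeholder for the procedure detailed in the corresponding step, not an API defined by the invention.

```python
# Illustrative data-flow sketch of steps (1)-(4); all helper names are
# hypothetical placeholders for the procedures described below.

def match_heterogeneous(visible_img, infrared_img, K=5):
    # Step (1): corner extraction and joint adjacency matrix
    pts_vis = extract_corners(visible_img)
    pts_ir = extract_corners(infrared_img)
    W = build_joint_adjacency(pts_vis, pts_ir)       # (n1+n2) x (n1+n2)

    # Step (2): Laplacian eigendecomposition and 3-D reconstruction
    spectra = reconstruct_feature_spectra(
        W, pts_vis, pts_ir, visible_img.shape, infrared_img.shape, K)

    # Steps (3)-(4): detect and match stable extremal regions of each
    # of the K feature-spectrum pairs, then pool the matches
    matches = []
    for F_vis, F_ir in spectra:
        regs_vis = detect_stable_regions(F_vis)      # SUSAN-MSER-SURF
        regs_ir = detect_stable_regions(F_ir)
        matches += match_regions(regs_ir, regs_vis)  # Euclidean-Hamming
    return matches
```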
Further, in step (1), the corner points of the visible-light image and the infrared image are extracted and an adjacency matrix is constructed from the corner-point relations of the two images, as follows:
(1.1) Given a visible-light image and an infrared image to be matched, their corner points are extracted with the SIFT algorithm and collected into corner-point sets $V_1$ and $V_2$; each such $V$ is a set of corner points, called the node set $\{v_1, v_2, \dots, v_n\}$, where $n$ denotes the number of nodes;
(1.2) For the corner-point sets $V_1$ and $V_2$, the associated graphs are $G_1(V_1, E_1, W_1)$ and $G_2(V_2, E_2, W_2)$, where $E_1$ is the set of edges between any two nodes of $V_1$, $E_2$ is the set of edges between any two nodes of $V_2$, and $W_1$ is the $n_1 \times n_1$ adjacency matrix containing the weights between all points:

$$W_1 = \begin{pmatrix} w_{11} & \cdots & w_{1 n_1} \\ \vdots & \ddots & \vdots \\ w_{n_1 1} & \cdots & w_{n_1 n_1} \end{pmatrix}$$

Here $n_1$ is the number of nodes in the node set $V_1$, $W_1$ is the $n_1 \times n_1$ adjacency matrix formed by the weights of the edges connecting any two nodes of $V_1$, and $w_{ij}$ is a weight expressing the correlation between the pixels represented by the corner points $v_i$ and $v_j$.
The joint graph $G(V, E, W)$ is defined by $V = V_1 \cup V_2$ and $E = E_1 \cup E_2 \cup E_{12}$, where $E_{12}$ denotes the set of edges connecting any node of $V_1$ with any node of $V_2$. The adjacency matrix $W$ of the joint graph $G$ is defined as

$$W = \begin{pmatrix} W_1 & C \\ C^{T} & W_2 \end{pmatrix}$$

where $C$ is the $n_1 \times n_2$ adjacency matrix formed by the weights of the edges connecting any node of $V_1$ with any node of $V_2$, and $C^{T}$ is the transpose of $C$.
(1.3) Assigning the adjacency matrix: the basic rule of adjacency-matrix construction is that the shorter the distance between two points, the larger the weight of the edge connecting them, and conversely the smaller. All points in the corner-point set are traversed with the KNN algorithm; for each sample point, the $k$ nearest points are taken as its neighbours, and only the weights $w_{ij} > 0$ between the sample point and these $k$ nearest points are kept, the weights to all other points being set to 0. To ensure symmetry, $s_{ij}$ is retained as long as one point lies in the K-neighbourhood of the other. The formula measuring the weight of the edge connecting two corner points is

$$w_{ij} = \begin{cases} s_{ij} = \exp\left(-\dfrac{\|x_i - x_j\|^{2}}{2\sigma^{2}}\right), & x_i \in \mathrm{KNN}(x_j)\ \text{or}\ x_j \in \mathrm{KNN}(x_i) \\ 0, & \text{otherwise} \end{cases}$$

where $x_i, x_j$ are the descriptors of the corner points $v_i, v_j$.
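As a concrete reading of (1.3), the sketch below builds the symmetric KNN adjacency matrix over corner descriptors and assembles the joint block matrix. The Gaussian kernel form, the kernel width sigma, and k are assumptions consistent with the "closer points get larger weights" rule, not parameters fixed by the source.

```python
import numpy as np

def knn_adjacency(X, k=8, sigma=1.0):
    """Symmetric KNN adjacency over descriptors X of shape (n, d).

    w_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)) when x_i is among the k
    nearest neighbours of x_j or vice versa, else 0 (Gaussian kernel,
    with sigma and k assumed free parameters).
    """
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    S = np.exp(-d2 / (2.0 * sigma ** 2))
    n = X.shape[0]
    W = np.zeros((n, n))
    nn = np.argsort(d2, axis=1)[:, 1:k + 1]    # k nearest, skipping self
    for i in range(n):
        W[i, nn[i]] = S[i, nn[i]]
    return np.maximum(W, W.T)                  # keep s_ij if either side is a KNN

def joint_adjacency(X1, X2, k=8, sigma=1.0):
    """Block matrix W = [[W1, C], [C^T, W2]] over both corner sets."""
    W1 = knn_adjacency(X1, k, sigma)
    W2 = knn_adjacency(X2, k, sigma)
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    C = np.exp(-d2 / (2.0 * sigma ** 2))       # cross-image weights
    return np.block([[W1, C], [C.T, W2]])      # (KNN sparsification of C omitted)
```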
Further, in step (2), the eigenvectors of the joint graph are obtained by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and feature-function pairs are then constructed by three-dimensional reconstruction, as follows:
(2.1) Using the adjacency matrix W and its degree matrix D, the Laplacian matrix is computed as $L = D - W$ and normalized;
(2.2) Using the eigenvalue decomposition

$$L\,U_i = \lambda_i\,U_i$$

the eigenvectors are computed. The eigenvectors $U_1, U_2, \dots, U_K$ correspond to the K smallest eigenvalues, each eigenvector having dimension $n_1 + n_2$; these eigenvectors are closely related to the structure of the images.
(2.3) Each of the K eigenvectors is split into two vectors of dimensions $n_1$ and $n_2$ respectively.
(2.4) The $n_1$-dimensional vector extracted from each eigenvector is reconstructed to the size of the visible image by assigning its component values to the sampling positions (the positions where the feature points were extracted) and linearly interpolating the values between the sampling positions; the reconstructed feature function is called the feature spectrum and is denoted $F_v^{(k)}$. The $n_2$-dimensional vector extracted from each eigenvector is likewise reconstructed to the size of the infrared image, giving the feature spectrum $F_r^{(k)}$.
(2.5) Following the above method, from the eigenvectors $U_1, U_2, \dots, U_K$ of the K smallest eigenvalues we obtain K feature-spectrum pairs $(F_v^{(k)}, F_r^{(k)})$, $k = 1, \dots, K$.
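A minimal sketch of (2.1)-(2.5), assuming the symmetric normalization $L_{\mathrm{sym}} = I - D^{-1/2} W D^{-1/2}$ and SciPy's griddata for the linear interpolation of (2.4); the grid handling and the fill value outside the convex hull of the sample points are assumptions.

```python
import numpy as np
from scipy.interpolate import griddata

def reconstruct_feature_spectra(W, pts_vis, pts_ir, shape_vis, shape_ir, K=5):
    """K feature-spectrum pairs from the joint adjacency matrix W.

    pts_vis: (n1, 2) corner coordinates (x, y) in the visible image;
    pts_ir : (n2, 2) corner coordinates in the infrared image.
    """
    n1 = len(pts_vis)
    d = W.sum(axis=1)
    dis = 1.0 / np.sqrt(np.maximum(d, 1e-12))          # D^{-1/2}
    L = np.eye(len(d)) - dis[:, None] * W * dis[None, :]
    _, U = np.linalg.eigh(L)                           # eigenvalues ascending
    pairs = []
    for kk in range(K):                                # K smallest eigenvalues
        u1, u2 = U[:n1, kk], U[n1:, kk]                # split per image (2.3)
        pairs.append((rasterize(pts_vis, u1, shape_vis),
                      rasterize(pts_ir, u2, shape_ir)))
    return pairs

def rasterize(pts, vals, shape):
    """Assign eigenvector components to the sampling positions and
    linearly interpolate between them (2.4)."""
    gy, gx = np.mgrid[0:shape[0], 0:shape[1]]
    return griddata(pts[:, ::-1], vals, (gy, gx),      # points given as (y, x)
                    method='linear', fill_value=0.0)
```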
Further, in the step (3), a specific implementation manner of detecting the extremum regions of each group of feature function pairs by using the SUSAN-MSER-SURF algorithm is provided as follows:
(3.1) checking all pixel points in the image, if the gray value of the pixel point is larger than or equal to the pixel of the threshold value, storing the pixel point, otherwise, ignoring the pixel point. In the stored points, the adjacent points form an extremum area;
(3.2) the stability of these along with the components is checked: starting from 0 to 255, the threshold value is gradually increased to the size of delta, the above steps are repeated again, and for the areas which have passed through several threshold changes and have no change in size, the areas are reserved as the maximum stable extremum areas.
(3.3) Corner points in the feature function are extracted with the SUSAN operator, and an MSER is retained only when the number of corner points inside it is at least m;
(3.4) Each detected MSER ellipse is affine-normalized; the centre of the normalized MSER ellipse is taken as an interest point, a circular region is taken around it, and a SURF descriptor is computed for that region by partitioning it into sub-regions and computing gradients on each;
(3.5) This method is used to detect all MSER regions of each feature-function pair $(F_v^{(k)}, F_r^{(k)})$, normalize them to interest points, and compute SURF descriptors for them.
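The sketch below approximates (3.1)-(3.5) with OpenCV. OpenCV ships MSER but neither SUSAN nor, in default builds, SURF; Shi-Tomasi corners (cv2.goodFeaturesToTrack) therefore stand in for the SUSAN operator and ORB descriptors for SURF, and the corner-count threshold m is an assumed parameter.

```python
import cv2
import numpy as np

def detect_stable_regions(spectrum, m=3):
    """MSERs of a feature spectrum, kept only when they enclose at least
    m corner points (corner filter standing in for SUSAN), returned as
    interest points with descriptors (ORB in lieu of SURF)."""
    img = cv2.normalize(spectrum, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    regions, _ = cv2.MSER_create().detectRegions(img)
    corners = cv2.goodFeaturesToTrack(img, 500, 0.01, 5)
    corners = (corners.reshape(-1, 2) if corners is not None
               else np.empty((0, 2)))

    keypoints = []
    for reg in regions:
        if len(reg) < 5:                       # fitEllipse needs >= 5 points
            continue
        x, y, w, h = cv2.boundingRect(reg)     # count corners inside the region
        inside = np.sum((corners[:, 0] >= x) & (corners[:, 0] < x + w) &
                        (corners[:, 1] >= y) & (corners[:, 1] < y + h))
        if inside >= m:
            (cx, cy), (ma, mi), _ = cv2.fitEllipse(reg)
            # centre of the fitted (affine-normalized) ellipse as interest point
            keypoints.append(cv2.KeyPoint(float(cx), float(cy),
                                          float(max(ma, mi))))
    return cv2.ORB_create().compute(img, keypoints)    # (keypoints, descriptors)
```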
Further, in the step (3), the region features are matched by using the proposed Euclidean-Hamming algorithm. The specific implementation mode is as follows:
(4.1) for one feature point in the feature point set of the infrared image, calculating the improved Euclidean distance from the feature point to all the feature points on the visible light image so as to obtain a set of distances, easily obtaining the minimum distance through comparison, wherein the visible light feature point with the minimum distance to the infrared feature point is the matching point of the infrared feature point, and if the minimum distance exceeds the threshold value MatchThreshold, excluding the matching point pair. Finding out all matching point pairs of the infrared image and the visible light image according to the rule;
(4.2) A comparison parameter MaxRatio is set; if the ratio of the minimum distance to the second-smallest distance between a feature point of the visible image and the feature points of the infrared image is smaller than this parameter, the matching pair is retained. The improved Euclidean distance adds one extra addition-and-subtraction operation to the ordinary Euclidean distance $d(a, b) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$ between the vectors $a = (x_1, x_2, \dots, x_n)$ and $b = (y_1, y_2, \dots, y_n)$.
(4.3) The coarse matching points produced by the nearest-neighbour rule are screened. The minimum Euclidean distance over all matching pairs is denoted $d_{omin}$, and the Hamming distance of each matching pair is computed. Comparing the Hamming distances of all matching pairs, a pair whose Hamming distance is less than $2\,d_{omin}$ is considered a mismatch; otherwise, if it is greater than this value, the pair is considered a correct match.
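A sketch of (4.1)-(4.3) under two stated assumptions: the descriptors are binary (e.g. the ORB stand-ins above), so that both Euclidean and Hamming distances apply to them, and the plain Euclidean distance replaces the "improved" variant, whose exact formula is not recoverable here; MatchThreshold and MaxRatio are free parameters.

```python
import numpy as np

def euclidean_hamming_match(desc_ir, desc_vis, match_threshold=64.0,
                            max_ratio=0.8):
    """Match uint8 binary descriptor rows of desc_ir against desc_vis
    following (4.1)-(4.3)."""
    a = desc_ir.astype(np.float32)
    b = desc_vis.astype(np.float32)
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))  # pairwise L2

    coarse = []
    for i in range(len(a)):                    # (4.1) nearest neighbour +
        order = np.argsort(d[i])               # (4.2) ratio test
        best, second = d[i, order[0]], d[i, order[1]]
        if best <= match_threshold and best / second < max_ratio:
            coarse.append((i, int(order[0]), best))
    if not coarse:
        return []

    d_omin = min(c[2] for c in coarse)         # (4.3) smallest L2 distance
    kept = []
    for i, j, _ in coarse:
        ham = int(np.unpackbits(np.bitwise_xor(desc_ir[i], desc_vis[j])).sum())
        # pairs whose Hamming distance falls below 2 * d_omin are treated
        # as mismatches, as stated in (4.3)
        if ham >= 2.0 * d_omin:
            kept.append((i, j))
    return kept
```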
Beneficial effects: compared with the prior art, the technical scheme of the invention has the following beneficial technical effects:
(1) Unlike existing matching methods based on geometric features, the invention is a matching method based on image-structure analysis. It does not rely on the descriptors of features detected separately in the two images, but on analysing the feature spectrum of a joint graph built from the structural relations of the corner points of the two images.
(2) The joint graph combines the structural relations of the corner points of the two images into one adjacency matrix. The method computes features from the structures of both images simultaneously, obtains eigenvectors by Laplacian decomposition, and splits them into corresponding vector pairs, dispensing with the complicated search strategy otherwise needed to find correspondences between eigenvectors. The matching method is fault-tolerant and has good time performance.
(3) Since the extrema of the feature-spectrum pairs represent persistent regions of the image, the invention extracts and matches the extrema of the feature-function pairs, exploiting not only the structural information of the graph but also the stable region features of the image.
(4) MSER regions with a dense set of corner points are the ones of interest, so a SUSAN-MSER-SURF algorithm is proposed to detect the extremal regions of each feature-function pair. Unlike the traditional detector of maximally stable extremal regions, the detected regions are screened after extraction. The MSER algorithm, while avoiding unusually small and large regions, returns only a few ordinary smooth regions; when such regions enclose corner points they are not completely smooth and are therefore more distinctive.
Drawings
FIG. 1 is an overall flow chart of the present invention;
FIG. 2 is a comparison of matching results for visible-light and infrared images under low-exposure conditions;
FIG. 3 is a comparison of matching results for visible-light and infrared images under high-exposure conditions according to the present invention;
FIG. 4 is a line-graph comparison of the accuracy of various heterogeneous image matching methods;
FIG. 5 is a comparison graph of RMSE statistics;
FIG. 6 is a graph comparing matching times.
Detailed Description
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
As shown in fig. 1, the invention provides a heterogeneous image matching method based on joint-graph spectral feature analysis, comprising the following steps:
(1) extracting corner points of the visible-light image and the infrared image, and constructing an adjacency matrix from the corner-point relations of the two images;
(2) obtaining the eigenvectors of the joint graph by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and constructing feature-function pairs by three-dimensional reconstruction;
(3) detecting the extremal region of each feature-function pair with the proposed SUSAN-MSER-SURF algorithm, and matching the region features with the proposed Euclidean-Hamming algorithm;
(4) matching the extremal regions of the feature-spectrum pairs reconstructed from the K smallest eigenvectors according to the above steps to obtain the final matching result.
Further, in step (1), the corner points of the visible-light image and the infrared image are extracted and an adjacency matrix is constructed from the corner-point relations of the two images, as follows:
(1.1) Given a visible-light image and an infrared image to be matched, their corner points are extracted with the SIFT algorithm and collected into corner-point sets $V_1$ and $V_2$; each such $V$ is a set of corner points, called the node set $\{v_1, v_2, \dots, v_n\}$, where $n$ denotes the number of nodes;
(1.2) For the corner-point sets $V_1$ and $V_2$, the associated graphs are $G_1(V_1, E_1, W_1)$ and $G_2(V_2, E_2, W_2)$, where $E_1$ is the set of edges between any two nodes of $V_1$, $E_2$ is the set of edges between any two nodes of $V_2$, and $W_1$ is the $n_1 \times n_1$ adjacency matrix containing the weights between all points:

$$W_1 = \begin{pmatrix} w_{11} & \cdots & w_{1 n_1} \\ \vdots & \ddots & \vdots \\ w_{n_1 1} & \cdots & w_{n_1 n_1} \end{pmatrix}$$

Here $n_1$ is the number of nodes in the node set $V_1$, $W_1$ is the $n_1 \times n_1$ adjacency matrix formed by the weights of the edges connecting any two nodes of $V_1$, and $w_{ij}$ is a weight expressing the correlation between the pixels represented by the corner points $v_i$ and $v_j$.
The joint graph $G(V, E, W)$ is defined by $V = V_1 \cup V_2$ and $E = E_1 \cup E_2 \cup E_{12}$, where $E_{12}$ denotes the set of edges connecting any node of $V_1$ with any node of $V_2$. The adjacency matrix $W$ of the joint graph $G$ is defined as

$$W = \begin{pmatrix} W_1 & C \\ C^{T} & W_2 \end{pmatrix}$$

where $C$ is the $n_1 \times n_2$ adjacency matrix formed by the weights of the edges connecting any node of $V_1$ with any node of $V_2$, and $C^{T}$ is the transpose of $C$.
(1.3) Assigning the adjacency matrix: the basic rule of adjacency-matrix construction is that the shorter the distance between two points, the larger the weight of the edge connecting them, and conversely the smaller. All points in the corner-point set are traversed with the KNN algorithm; for each sample point, the $k$ nearest points are taken as its neighbours, and only the weights $w_{ij} > 0$ between the sample point and these $k$ nearest points are kept, the weights to all other points being set to 0. To ensure symmetry, $s_{ij}$ is retained as long as one point lies in the K-neighbourhood of the other. The formula measuring the weight of the edge connecting two corner points is

$$w_{ij} = \begin{cases} s_{ij} = \exp\left(-\dfrac{\|x_i - x_j\|^{2}}{2\sigma^{2}}\right), & x_i \in \mathrm{KNN}(x_j)\ \text{or}\ x_j \in \mathrm{KNN}(x_i) \\ 0, & \text{otherwise} \end{cases}$$
Further, in step (2), the eigenvectors of the joint graph are obtained by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and feature-function pairs are then constructed by three-dimensional reconstruction, as follows:
(2.1) Using the adjacency matrix W and its degree matrix D, the Laplacian matrix is computed as $L = D - W$ and normalized. The degree matrix D is an $n \times n$ diagonal matrix (with $n = n_1 + n_2$) whose main diagonal holds the column sums of W:

$$D = \mathrm{diag}(d_1, \dots, d_n), \qquad d_i = \sum_{j=1}^{n} w_{ij}.$$

Using the adjacency matrix W and the degree matrix D, the graph Laplacian is computed simply as $L = D - W$. The normalization operation yields

$$L_{\mathrm{sym}} = D^{-1/2} L D^{-1/2} = I - D^{-1/2} W D^{-1/2}.$$
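As a quick check of the definitions in (2.1), consider a toy joint graph of two nodes joined by a single edge of weight $w$ (a hypothetical example, not from the source):

```latex
% Toy check: two nodes joined by one edge of weight w.
W = \begin{pmatrix} 0 & w \\ w & 0 \end{pmatrix}, \quad
D = \begin{pmatrix} w & 0 \\ 0 & w \end{pmatrix}, \quad
L = D - W = \begin{pmatrix} w & -w \\ -w & w \end{pmatrix}, \quad
L_{\mathrm{sym}} = I - D^{-1/2} W D^{-1/2}
                 = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}.
```

The eigenvalues of $L_{\mathrm{sym}}$ here are 0 and 2, the extreme values attainable by a normalized Laplacian, with the constant vector attaining 0.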
(2.2) Using the eigenvalue decomposition

$$L\,U_i = \lambda_i\,U_i$$

the eigenvectors are computed. The eigenvectors $U_1, U_2, \dots, U_K$ correspond to the K smallest eigenvalues, each eigenvector having dimension $n_1 + n_2$; these eigenvectors are closely related to the structure of the images.
(2.3) Each of the K eigenvectors is split into two vectors of dimensions $n_1$ and $n_2$ respectively.
(2.4) The $n_1$-dimensional vector extracted from each eigenvector is reconstructed to the size of the visible image by assigning its component values to the sampling positions (the positions where the feature points were extracted) and linearly interpolating the values between the sampling positions; the reconstructed feature function is called the feature spectrum and is denoted $F_v^{(k)}$. The $n_2$-dimensional vector extracted from each eigenvector is likewise reconstructed to the size of the infrared image, giving the feature spectrum $F_r^{(k)}$.
(2.5) Following the above method, from the eigenvectors $U_1, U_2, \dots, U_K$ corresponding to the K smallest eigenvalues we obtain K feature-spectrum pairs $(F_v^{(k)}, F_r^{(k)})$, $k = 1, \dots, K$.
Further, in the step (3), a specific implementation manner of detecting the extremum regions of each group of feature function pairs by using the SUSAN-MSER-SURF algorithm is provided as follows:
(3.1) checking all pixel points in the image, if the gray value of the pixel point is larger than or equal to the pixel of the threshold value, storing the pixel point, otherwise, ignoring the pixel point. In the stored points, the adjacent points form an extremum area;
(3.2) these are checked along with the stability of the components: starting from 0 to 255, the threshold value is gradually increased to the size of delta, the above steps are repeated again, and for the areas which have passed through several threshold changes and have no change in size, the areas are reserved as the maximum stable extremum areas.
(3.3) Corner points in the feature function are extracted with the SUSAN operator, and an MSER is retained only when the number of corner points inside it is at least m;
(3.4) Each detected MSER ellipse is affine-normalized; the centre of the normalized MSER ellipse is taken as an interest point, a circular region is taken around it, and a SURF descriptor is computed for that region by partitioning it into sub-regions and computing gradients on each;
(3.5) This method is used to detect all MSER regions of each feature-function pair $(F_v^{(k)}, F_r^{(k)})$, normalize them to interest points, and compute SURF descriptors for them.
Further, in the step (3), the region features are matched by using the proposed Euclidean-Hamming algorithm. The specific implementation mode is as follows:
(4.1) for one feature point in the feature point set of the infrared image, calculating the improved Euclidean distance from the feature point to all the feature points on the visible light image so as to obtain a set of distances, easily obtaining the minimum distance through comparison, wherein the visible light feature point with the minimum distance from the infrared feature point is the matching point of the infrared feature point, and if the minimum distance exceeds the threshold value MatchThreshold, excluding the matching point pair. Finding out all matching point pairs of the infrared image and the visible light image according to the rule;
(4.2) A comparison parameter MaxRatio is set; if the ratio of the minimum distance to the second-smallest distance between a feature point of the visible image and the feature points of the infrared image is smaller than this parameter, the matching pair is retained. The improved Euclidean distance adds one extra addition-and-subtraction operation to the ordinary Euclidean distance $d(a, b) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$ between the vectors $a = (x_1, x_2, \dots, x_n)$ and $b = (y_1, y_2, \dots, y_n)$.
(4.3) The coarse matching points produced by the nearest-neighbour rule are screened. The minimum Euclidean distance over all matching pairs is denoted $d_{omin}$, and the Hamming distance of each matching pair is computed. Comparing the Hamming distances of all matching pairs, a pair whose Hamming distance is less than $2\,d_{omin}$ is considered a mismatch; otherwise, if it is greater than this value, the pair is considered a correct match.
To validate the algorithm proposed by the present invention, we performed experimental analysis.
First, to verify the matching effect of this method on under- or over-exposed visible-light and infrared images, comparison experiments were run against the SURF+PIID+RPM algorithm (Gang Wang et al., "Robust Point Matching Method for Multimodal Retinal Image Registration", Biomedical Signal Processing and Control, 2015, Vol. 19, pp. 68-76), the classical fuzzy shape context algorithm, the density-ORB operator algorithm (an image feature-point matching algorithm based on density-ORB features [J], 2019, 035(001): 13-20), and the heterogeneous image matching method based on joint-graph spectral feature analysis proposed by the invention. Figs. 2-3 compare the results of several groups of experiments.
Fig. 2 shows the matching of visible-light and infrared images of a road in dim light. Fig. 2(a) uses the SURF+PIID+RPM algorithm: 10 matching pairs in total, of which 3 are correct, a matching rate of 0.3, taking 2.5484 seconds. Fig. 2(b) uses the fuzzy shape context algorithm: 13 matching pairs, of which 2 are correct, a matching rate of 0.153, taking 3.5108 seconds. Fig. 2(c) uses the density-ORB matching algorithm: 7 matching pairs, of which 1 is correct, a matching rate of 0.142, taking 1.5612 seconds. Fig. 2(d) uses the heterogeneous image matching method based on joint-graph spectral feature analysis proposed by the invention: 8 matching pairs, of which 6 are correct, a matching rate of 0.75, taking 2.7061 seconds.
In terms of matching rate, the method based on joint-graph spectral feature analysis has a clear advantage. Because this scene was shot under low light, the image lacks local intensity and grey-level patterns; the algorithm of the invention matches by analysing the feature spectrum of the joint graph formed by the structural relations of the corner points of the two images, rather than relying on descriptions of local feature neighbourhoods, and therefore matches better. Density-ORB has a clear advantage in matching speed, because ORB is a fast keypoint detector and descriptor that avoids the heavy computation and low speed of the SIFT and SURF algorithms.
Fig. 3 shows the matching of visible-light and infrared images of a billboard under bright light. Fig. 3(a) uses the SURF+PIID+RPM algorithm: 10 matching pairs, of which 4 are correct, a matching rate of 0.4, taking 2.7302 seconds. Fig. 3(b) uses the fuzzy shape context algorithm: 8 matching pairs, of which 2 are correct, a matching rate of 0.25, taking 4.09 seconds. Fig. 3(c) uses the density-ORB algorithm: 8 matching pairs, of which 1 is correct, a matching rate of 0.125, taking 2.1835 seconds. Fig. 3(d) uses the heterogeneous image matching method based on joint-graph spectral feature analysis proposed by the invention: 12 matching pairs, of which 11 are correct, a matching rate of 0.916, taking 2.8669 seconds.
This scene was shot under high light intensity; here too, the method based on joint-graph spectral feature analysis achieves a better matching rate at a relatively short time cost. The fuzzy shape context extracts histograms of local shape statistics in an unequal coordinate space, which is time-consuming, and its single matching feature leads to a poor matching effect. The SURF+PIID+RPM algorithm was designed for complex, poor-quality multimodal retinal images, is unsuited to this scene, and therefore has a low matching rate. Density-ORB matches by the similarity of pixel densities in the neighbourhood space of feature points of the same region in the two images, and cannot cope with images lacking local intensity and grey-level patterns.
The above comparison experiments show that the heterogeneous image matching method based on joint-graph spectral feature analysis is effective for matching visible-light images of low definition caused by under- or over-exposure with infrared images.
Second, to verify the reliability of the algorithm, 10 runs were carried out on 9 groups of images with SURF+PIID+RPM, fuzzy shape context, density-ORB, and the algorithm of the invention, and line graphs of average accuracy were drawn. Nine groups of visible-light and infrared images taken under too-high or too-low light form the experimental data set. The matching-comparison line graph is shown in fig. 4, where the horizontal axis gives the group number of the nine image groups and the vertical axis the average matching rate over the ten tests of each group. The line graph shows clearly that in most experiments the proposed algorithm outperforms the other methods in this scenario, and its matching performance is stable across repeated experiments.
Third, the RMSE of the 9 groups of images was computed for the 4 methods, as shown in fig. 5; the RMSE of the method of the invention is relatively smooth and low, indicating more accurate matching. The fuzzy shape context exploits the rough shape similarity between the two images and has some matching effect, but its single feature makes it unstable. Density-ORB matches by the similarity of pixel densities in the neighbourhood space of feature points of the same region in the two images and cannot cope with images lacking local intensity and grey-level patterns, while SURF+PIID+RPM is mostly applied to complex, poor-quality multimodal retinal images and matches this scene poorly.
Finally, the times consumed by the four algorithms were compared. As shown in fig. 6, the matching time of the algorithm of the invention shows no great advantage: it must detect and match, over the joint-graph spectra of the two images and the three-dimensionally reconstructed feature functions, the maximally stable extremal regions of each feature-function pair, which costs more time but raises the matching rate. The fuzzy shape context extracts histograms of local shape statistics in an unequal coordinate space, which takes long. The density-ORB algorithm has the shortest matching time and is suited to image tracking and matching.

Claims (5)

1. A heterogeneous image matching method based on joint-graph spectral feature analysis, characterized by comprising the following steps:
(1) extracting corner points of the visible-light image and the infrared image taken under too-high or too-low light, and constructing an adjacency matrix from the corner-point relations of the two images;
(2) obtaining the eigenvectors of the joint graph by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and constructing feature-function pairs by three-dimensional reconstruction;
(3) detecting the extremal region of each feature-function pair, and matching the region features with the proposed Euclidean-Hamming algorithm;
(4) matching the extremal regions of the feature-spectrum pairs reconstructed from the K smallest eigenvectors according to the above steps to obtain the final matching result.
2. The heterogeneous image matching method based on joint-graph spectral feature analysis according to claim 1, wherein in step (1) the corner points of the visible-light image and the infrared image are extracted and the adjacency matrix is constructed from the corner-point relations of the two images, as follows:
(1.1) Given a visible-light image and an infrared image to be matched, their corner points are extracted with the SIFT algorithm and collected into corner-point sets $V_1$ and $V_2$, where $V_1$ is the set of all corner points of the visible-light image, called the node set $\{v^{(1)}_1, v^{(1)}_2, \dots, v^{(1)}_{n_1}\}$ with $n_1$ the number of its nodes, and $V_2$ is the set of all corner points of the infrared image, called the node set $\{v^{(2)}_1, v^{(2)}_2, \dots, v^{(2)}_{n_2}\}$ with $n_2$ the number of its nodes;
(1.2) For the corner-point sets $V_1$ and $V_2$, the associated graphs are $G_1(V_1, E_1, W_1)$ and $G_2(V_2, E_2, W_2)$, where $E_1$ is the set of edges between any two nodes of $V_1$, $E_2$ is the set of edges between any two nodes of $V_2$, and $W_1$ is the $n_1 \times n_1$ adjacency matrix of weights between all corner points of the visible-light image, defined as

$$W_1 = \begin{pmatrix} w_{11} & \cdots & w_{1 n_1} \\ \vdots & \ddots & \vdots \\ w_{n_1 1} & \cdots & w_{n_1 n_1} \end{pmatrix}$$

where $n_1$ is the number of nodes in the node set $V_1$, $W_1$ is formed by the weights of the edges connecting any two nodes of $V_1$, and $w_{ij}$ is a weight expressing the correlation between the pixels represented by the corner points $v_i$ and $v_j$;
$W_2$ is the $n_2 \times n_2$ adjacency matrix of weights between all corner points of the infrared image, defined analogously as

$$W_2 = \begin{pmatrix} w_{11} & \cdots & w_{1 n_2} \\ \vdots & \ddots & \vdots \\ w_{n_2 1} & \cdots & w_{n_2 n_2} \end{pmatrix}$$

where $n_2$ is the number of nodes in the node set $V_2$ and $W_2$ is formed by the weights of the edges connecting any two nodes of $V_2$;
The joint graph $G(V, E, W)$ is defined by $V = V_1 \cup V_2$ and $E = E_1 \cup E_2 \cup E_{12}$, where $E_{12}$ denotes the set of edges connecting any node of $V_1$ with any node of $V_2$; the adjacency matrix W of the joint graph G is defined as

$$W = \begin{pmatrix} W_1 & C \\ C^{T} & W_2 \end{pmatrix}$$

where C is the $n_1 \times n_2$ adjacency matrix formed by the weights of the edges connecting any node of $V_1$ with any node of $V_2$, and $C^{T}$ is the transpose of C;
(1.3) All points of the corner-point set are traversed with the KNN algorithm; for each sample point, the k nearest points are taken as its neighbours, and only the weights $w_{ij} > 0$ between the sample point and those k nearest points are kept, the weights between the point and all other points being 0; to ensure symmetry, $s_{ij}$ is retained as long as one point lies in the K-neighbourhood of the other; the formula measuring the weight of the edge connecting two corner points is

$$w_{ij} = \begin{cases} s_{ij} = \exp\left(-\dfrac{\|x_i - x_j\|^{2}}{2\sigma^{2}}\right), & x_i \in \mathrm{KNN}(x_j)\ \text{or}\ x_j \in \mathrm{KNN}(x_i) \\ 0, & \text{otherwise} \end{cases}$$

where $x_i, x_j$ are the descriptors of any two corner points $v_i, v_j$; $x_i \in \mathrm{KNN}(x_j)$ means the corner point $v_i$ belongs to the k points nearest to the corner point $v_j$, and $x_j \in \mathrm{KNN}(x_i)$ means the corner point $v_j$ belongs to the k points nearest to the corner point $v_i$.
3. The heterogeneous image matching method based on joint-graph spectral feature analysis according to claim 2, wherein in step (2) the eigenvectors of the joint graph are obtained by defining the normalized Laplacian and decomposing the adjacency matrix into eigenvalues, and feature-function pairs are constructed by three-dimensional reconstruction, as follows:
(2.1) Using the adjacency matrix W and its degree matrix D, the Laplacian matrix is computed as $L = D - W$ and normalized;
(2.2) Using the eigenvalue decomposition

$$L\,U_i = \lambda_i\,U_i$$

the eigenvectors are computed, where the eigenvectors $U_1, U_2, \dots, U_K$ correspond to the K smallest eigenvalues, each of dimension $n_1 + n_2$;
(2.3) Each of the K eigenvectors is split into two vectors of dimensions $n_1$ and $n_2$;
(2.4) The $n_1$-dimensional vector extracted from each eigenvector is reconstructed to the size of the visible image by assigning it to the sampling positions, i.e. the positions where the feature points were extracted, and linearly interpolating values between these sampling positions; the reconstructed feature function is called the feature spectrum $F_v^{(k)}$; the $n_2$-dimensional vector extracted from each eigenvector is reconstructed to the size of the infrared image, giving the feature spectrum $F_r^{(k)}$;
(2.5) Following the above method, from the eigenvectors $U_1, U_2, \dots, U_K$ corresponding to the K smallest eigenvalues, K feature-spectrum pairs $(F_v^{(k)}, F_r^{(k)})$, $k = 1, \dots, K$, are obtained.
4. The heterogeneous image matching method based on joint-graph spectral feature analysis according to claim 3, wherein in step (3) a SUSAN-MSER-SURF algorithm is provided for detecting the extremal regions of each feature-spectrum pair, as follows:
(3.1) All pixel points of the first feature-spectrum pair $(F_v^{(1)}, F_r^{(1)})$ are examined; if a pixel's grey value is greater than or equal to the threshold, the pixel is stored, otherwise it is ignored; among the stored points, adjacent points form an extremal region;
(3.2) The stability of the extremal regions is detected; the rate at which the area of an extremal region changes can be expressed as

$$v(i) = \frac{\left| Q_{i+\Delta} \setminus Q_{i-\Delta} \right|}{\left| Q_i \right|}$$

where Δ denotes the change of the threshold and $Q_i$ denotes the i-th extremal region; the threshold is increased step by step from 0 to 255 in increments of Δ, and regions whose size does not change as the threshold changes are retained as maximally stable extremal regions; that is, when $v(i)$, which indicates how fast the area of the extremal region changes, is smaller than a given threshold, the region $Q_i$ is taken to be a maximally stable extremal region;
(3.3) Corner points in the feature function are extracted with the SUSAN operator, and a maximally stable extremal region is retained only when the number of corner points inside it is at least m;
(3.4) Each detected MSER ellipse is affine-normalized; the centre of the normalized MSER ellipse is taken as an interest point, a circular region is taken around it, and a SURF descriptor is computed for that region by partitioning it into sub-regions and computing gradients on each;
(3.5) Steps (3.1)-(3.4) are used to detect all MSER regions of each feature-function pair $(F_v^{(k)}, F_r^{(k)})$, normalize them to interest points, and compute SURF descriptors for them.
5. The heterogeneous image matching method based on joint-graph spectral feature analysis according to claim 3, wherein in step (3) the region features are matched with the proposed Euclidean-Hamming algorithm, implemented as follows:
(4.1) For a feature point of the feature function $F_r^{(k)}$, the improved Euclidean distances from it to all feature points of the feature function $F_v^{(k)}$ are computed, giving a set of distances whose minimum is obtained by comparison; the feature point of $F_v^{(k)}$ at minimum distance is the matching point of that feature point, yielding a matching pair; if the minimum distance exceeds the threshold, the matching pair is excluded; all matching pairs of the feature-function pair $(F_v^{(k)}, F_r^{(k)})$ are found by this rule; the improved Euclidean distance adds one extra addition-and-subtraction operation to the ordinary Euclidean distance $d(a, b) = \sqrt{\sum_{i=1}^{n} (x_i - y_i)^2}$ between the vectors $a = (x_1, x_2, \dots, x_n)$ and $b = (y_1, y_2, \dots, y_n)$;
(4.2) A comparison parameter MaxRatio is set; if the ratio of the minimum distance to the second-smallest distance from a feature point of $F_v^{(k)}$ to the feature points of $F_r^{(k)}$ is smaller than the comparison parameter, the matching pair is retained;
(4.3) The minimum Euclidean distance over all matching pairs is denoted $d_{omin}$, and the Hamming distance of each matching pair is computed; comparing the Hamming distances of all matching pairs, a pair whose Hamming distance is less than $2\,d_{omin}$ is considered a mismatch, while a pair whose Hamming distance is greater than $2\,d_{omin}$ is considered a correct matching pair.
CN202010493597.3A 2020-06-03 2020-06-03 Heterogeneous image matching method based on joint graph spectrum feature analysis Active CN111783834B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010493597.3A CN111783834B (en) 2020-06-03 2020-06-03 Heterogeneous image matching method based on joint graph spectrum feature analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010493597.3A CN111783834B (en) 2020-06-03 2020-06-03 Heterogeneous image matching method based on joint graph spectrum feature analysis

Publications (2)

Publication Number Publication Date
CN111783834A CN111783834A (en) 2020-10-16
CN111783834B true CN111783834B (en) 2022-08-19

Family

ID=72753662

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010493597.3A Active CN111783834B (en) 2020-06-03 2020-06-03 Heterogeneous image matching method based on joint graph spectrum feature analysis

Country Status (1)

Country Link
CN (1) CN111783834B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112396112B (en) * 2020-11-20 2024-05-14 北京百度网讯科技有限公司 Clustering method, clustering device, electronic equipment and storage medium
CN113516184B (en) * 2021-07-09 2022-04-12 北京航空航天大学 Mismatching elimination method and system for image feature point matching
CN117596487B (en) * 2024-01-18 2024-04-26 深圳市城市公共安全技术研究院有限公司 Camera disturbance self-correction method, device, equipment and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140004942A (en) * 2012-07-03 2014-01-14 삼성테크윈 주식회사 Image matching method
CN110097093A (en) * 2019-04-15 2019-08-06 河海大学 A kind of heterologous accurate matching of image method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140004942A (en) * 2012-07-03 2014-01-14 삼성테크윈 주식회사 Image matching method
CN110097093A (en) * 2019-04-15 2019-08-06 河海大学 A kind of heterologous accurate matching of image method

Also Published As

Publication number Publication date
CN111783834A (en) 2020-10-16

Similar Documents

Publication Publication Date Title
CN111783834B (en) Heterogeneous image matching method based on joint graph spectrum feature analysis
CN104574347B (en) Satellite in orbit image geometry positioning accuracy evaluation method based on multi- source Remote Sensing Data data
CN113065558A (en) Lightweight small target detection method combined with attention mechanism
Zou et al. Harf: Hierarchy-associated rich features for salient object detection
CN103426182A (en) Electronic image stabilization method based on visual attention mechanism
CN111369495B (en) Panoramic image change detection method based on video
CN108109163A (en) A kind of moving target detecting method for video of taking photo by plane
CN107622239B (en) Detection method for remote sensing image specified building area constrained by hierarchical local structure
CN108399627B (en) Video inter-frame target motion estimation method and device and implementation device
CN110135438B (en) Improved SURF algorithm based on gradient amplitude precomputation
CN109215053A (en) Moving vehicle detection method containing halted state in a kind of unmanned plane video
CN101630407B (en) Method for positioning forged region based on two view geometry and image division
CN110598613B (en) Expressway agglomerate fog monitoring method
CN113689331B (en) Panoramic image stitching method under complex background
CN112287906B (en) Template matching tracking method and system based on depth feature fusion
CN112734822A (en) Stereo matching algorithm based on infrared and visible light images
CN115601574A (en) Unmanned aerial vehicle image matching method for improving AKAZE characteristics
CN116664892A (en) Multi-temporal remote sensing image registration method based on cross attention and deformable convolution
CN112614167A (en) Rock slice image alignment method combining single-polarization and orthogonal-polarization images
CN112818905A (en) Finite pixel vehicle target detection method based on attention and spatio-temporal information
CN112946679A (en) Unmanned aerial vehicle surveying and mapping jelly effect detection method and system based on artificial intelligence
CN117036404A (en) Monocular thermal imaging simultaneous positioning and mapping method and system
CN112967305B (en) Image cloud background detection method under complex sky scene
CN117036235A (en) Relay protection cabinet terminal wire arrangement sequence detection method
CN117078726A (en) Different spectrum image registration method based on edge extraction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant