CN109800787B - Image template matching method based on relative feature distance error measurement


Info

Publication number: CN109800787B
Application number: CN201811534705.6A
Authority: CN (China)
Original language: Chinese (zh)
Other versions: CN109800787A
Legal status: Active (granted)
Inventors: 杨旸, 陈卓, 李孝丽
Original and current assignee: Xian Jiaotong University
Application filed by Xian Jiaotong University

Classifications

  • Image Analysis (AREA)

Abstract

The image template matching method based on relative feature distance error measurement comprises the following steps. First, mutual nearest-neighbor feature pairs are extracted from the template image and the target image under the current window, using pixel-block features as the unit. Second, the relative feature distance difference of every two feature pairs is calculated, together with the mean of these differences. Third, a similarity metric is computed from the relative feature distance differences and their mean. Fourth, the sliding window is moved by unit distance and the first three steps are repeated until the window reaches the lower-right corner of the target image. Fifth, the maximum similarity metric is selected, and the corresponding candidate window position is output as the matching result. The invention provides a similarity measure based on relative feature distance errors which, applied to image template matching, effectively improves matching accuracy under different noise interferences.

Description

Image template matching method based on relative feature distance error measurement
Technical Field
The invention relates to the field of image matching, in particular to an image template matching method based on relative feature distance error measurement.
Background
Template matching is a basic research problem in computer vision, with practical applications in target tracking, scene reconstruction, image stitching, and other problems. Many template matching algorithms have been proposed, but overcoming noise interference, background occlusion, and non-rigid deformation of the target remains a challenging task.
The similarity measure between pixels or image features is the core of template matching. Early work mainly applied global similarity measures such as the Sum of Squared Differences (SSD), the Sum of Absolute Differences (SAD), and Normalized Cross-Correlation (NCC); because these measures average over all pixels, they are susceptible to background noise and small deformations. Robust local feature extraction methods add the constraint of locally matched features to template matching: the Scale-Invariant Feature Transform (SIFT) [1] and Speeded-Up Robust Features (SURF) [2] algorithms are invariant to rotation, scale, and translation of local key points on the target. In addition, extraction methods based on local feature lines [3] preserve line structure during matching. Most of these algorithms rely on the prior detection of a set of key points or lines and are not suitable for smooth or small images. In recent years, search strategies using sliding windows and methods based on local block matching have made breakthroughs on the template matching problem. The Best-Buddies Similarity (BBS) [4] algorithm selects the matching region by counting the pixel blocks of the template image and the target image that are mutual nearest neighbors. The Deformable Diversity Similarity (DDIS) [5] method estimates similarity from the deformation diversity of the matched features. Although both methods improve robustness, they still score the local blocks independently and ignore the relationships between features. The Co-occurrence based Template Matching (CoTM) [6] method further uses the probability that two features appear together as prior information, improving accuracy, but it still ignores the original structural relations among features and is easily disturbed by locally similar noise.
[1] Lowe D. Distinctive image features from scale-invariant keypoints. International Journal of Computer Vision, 2004, 60(2): 91-110.
[2] Bay H, Tuytelaars T, et al. SURF: speeded up robust features. In Proceedings of the European Conference on Computer Vision, 2006: 404-417.
[3] Zhang Y, Qu H. Rotation invariant feature lines transform for image matching. Journal of Electronic Imaging, 2014, 23(5): 053002.
[4] Dekel T, Oron S, et al. Best-buddies similarity for robust template matching. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2015: 2021-2029.
[5] Talmi I, Mechrez R, et al. Template matching with deformable diversity similarity. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2017: 1311-1319.
[6] Kat R, Jevnisek R, et al. Matching pixels using co-occurrence statistics. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2018: 1751-1759.
Disclosure of Invention
To solve the problems of the prior art, the invention provides an image template matching method based on a relative feature distance error measure: the relative feature distance errors are calculated, a similarity metric is computed from them, the maximum of the similarity metric is selected, and the position of the candidate window corresponding to that maximum is the template matching result. This effectively improves the accuracy of template matching.
In order to achieve the above purpose, the invention adopts the following technical scheme:
the image template matching method based on relative feature distance error measurement comprises the steps that firstly, in a template image and a target image of a current window, feature pairs which are nearest neighbors are respectively extracted by taking pixel block features as units; secondly, respectively calculating the relative characteristic distance difference of every two characteristic pairs, and calculating the average value of the relative characteristic distance difference; thirdly, calculating a similarity metric value based on the relative feature distance difference and the mean value of the relative feature distance difference; fourthly, moving the sliding window according to the unit distance, and repeating the first step to the third step until the window slides to the lower right corner of the target image; and fifthly, selecting the maximum similarity metric value result, and taking the corresponding candidate window position as a matching output result.
The relative feature distance difference of every two feature pairs and its mean are calculated as follows. Let the current set of feature pairs be {(x_i, y_i)}, i = 1, ..., N, where N is the total number of feature pairs under the current window, x_i denotes the coordinates of the i-th feature extracted from the template image, and y_i denotes the coordinates of the corresponding feature extracted from the current window of the target image. The relative feature distance difference between two feature pairs is defined as

Δd_ij = ||Δx_ij − Δy_ij||₂  (1)

where i, j ∈ {1, 2, ..., N}, and Δx_ij = x_i − x_j and Δy_ij = y_i − y_j are the relative feature distances of the i-th and the j-th feature pairs on the template image and the target image respectively.
In addition, the pairwise relative distance differences between all feature pairs are collected and their mean σ is calculated:

σ = (2 / (N(N − 1))) Σ_{i<j} Δd_ij  (2)

The similarity metric is calculated from the relative feature distance differences and their mean as the sum of the pairwise Gaussian-weighted relative distance differences over all current feature pairs:

S = Σ_{i<j} exp(−Δd_ij² / (2σ²))  (3)

where exp(·) is the exponential function and S is the similarity between the template and the current window image.
Compared with the prior art, the invention has the following innovation points:
Conventional image matching methods evaluate a similarity measure by calculating the distances between corresponding pixels of the template image and the target image. Because the image contains background variation, aggregating the distance measurements over all pixels is strongly affected by background interference; and when the target itself deforms, the positions of the corresponding pixels change, which hinders similarity calculation. Partial matching methods based on feature extraction consider only the correspondences of the more salient local features in the image; they can effectively overcome the interference of irrelevant points, allow flexible feature selection, and are little affected by the background. However, local-feature methods ignore the structural relationships among features and are sensitive to locally similar features. The invention defines a similarity measure based on relative feature distance errors: on top of the extracted local features, the similarity of the pairwise relative spatial distances between features is measured for matching. Compared with traditional measures that score local features independently, this captures structural change between targets and therefore yields more accurate matching results.
Drawings
FIG. 1 is a flowchart of an image template matching method based on relative feature distance error metric according to the present invention.
FIG. 2 shows an experimental comparison of template matching: the marked region in Fig. 2(a) is the template image, Fig. 2(b) is the ground truth of the matching result, and Figs. 2(c) and 2(d) show the matching results of the BBS method and of the method of the present invention, respectively.
FIG. 3 shows another experimental comparison of template matching: the marked region in Fig. 3(a) is the template image, Fig. 3(b) is the ground truth of the matching result, and Figs. 3(c) and 3(d) show the matching results of the BBS method and of the method of the present invention, respectively.
Detailed Description
The invention is described in further detail below with reference to the following figures and embodiments:
As shown in Fig. 1, in the image template matching method based on relative feature distance error measurement of the present invention, first, mutual nearest-neighbor feature pairs are extracted from the template image and the target image under the current window, using pixel-block features as the unit; second, the relative feature distance difference of every two feature pairs is calculated, together with the mean of these differences; third, a similarity metric is computed from the relative feature distance differences and their mean; fourth, the sliding window is moved by unit distance and the first three steps are repeated until the window reaches the lower-right corner of the target image; fifth, the maximum similarity metric is selected, and the corresponding candidate window position is output as the matching result.
Step 1: Feature pair extraction
The template image and the target image under the current window are uniformly divided into a grid of pixel blocks of size 3 × 3. The feature vector of a pixel block consists of two parts: the mean color vector (r, g, b) over the block, and the coordinates (x, y) of the block center. The pixel-block feature set of the template image is denoted P = {p_i}, and the pixel-block feature set under the current window of the target image is denoted Q = {q_j}. The feature distance d(p_i, q_j) between every pair of pixel blocks across the two sets is calculated, and the feature pairs that are mutual nearest neighbors between the two sets are extracted, i.e. the pairs (p_i, q_j) satisfying NN(p_i, Q) = q_j and NN(q_j, P) = p_i, where NN(p_i, Q) denotes the nearest-neighbor feature of p_i in the set Q.
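The block-feature extraction and mutual nearest-neighbor pairing of Step 1 can be sketched as follows. This is a minimal NumPy sketch, not the patent's implementation: the function names are illustrative, and the equal weighting of the color and spatial components in the feature distance is an assumption (the patent does not specify a weighting).

```python
import numpy as np

def block_features(img):
    """Split an H x W x 3 image into non-overlapping 3x3 blocks and return
    one feature per block: mean color (r, g, b) plus block center (x, y)."""
    h, w = img.shape[0] // 3 * 3, img.shape[1] // 3 * 3
    feats = []
    for i in range(0, h, 3):
        for j in range(0, w, 3):
            block = img[i:i + 3, j:j + 3]
            color = block.reshape(-1, 3).mean(axis=0)   # mean (r, g, b)
            center = np.array([i + 1.0, j + 1.0])       # block center (x, y)
            feats.append(np.concatenate([color, center]))
    return np.array(feats)

def mutual_nearest_pairs(P, Q):
    """Return index pairs (i, j) with NN(p_i, Q) = q_j and NN(q_j, P) = p_i."""
    d = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=2)  # pairwise distances
    nn_q = d.argmin(axis=1)   # nearest neighbor in Q for each p_i
    nn_p = d.argmin(axis=0)   # nearest neighbor in P for each q_j
    return [(i, nn_q[i]) for i in range(len(P)) if nn_p[nn_q[i]] == i]
```

The mutual (best-buddies style) condition keeps only pairs whose nearest-neighbor relation holds in both directions, which discards features with no reliable counterpart.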
Step 2: calculation of distance difference between two features relative to each other
In step 1The coordinate set of the mutually adjacent feature pairs is obtained as
Figure GDA0002738600600000068
Wherein N is the total number of the feature pairs under the current window,
Figure GDA0002738600600000069
coordinate values representing the features extracted in the template image,
Figure GDA00027386006000000610
and (3) coordinate values representing the features extracted from the current window of the target image, and then the relative feature distance difference between the ith pair and the jth pair of features is defined as:
Δdij=||Δxij-Δyij||2 (4)
wherein i, j is in the {1, 2.. N },
Figure GDA00027386006000000611
and
Figure GDA00027386006000000612
representing the relative feature distances corresponding to the two pairs of feature pairs on the template image and the target image.
In addition, two-to-two relative distance differences between all the feature pairs are counted, and the mean value sigma of the relative feature distance differences is calculated, wherein the specific mathematical formula is as follows:
Figure GDA00027386006000000613
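The relative feature distance difference Δd_ij of formula (4) and its mean σ can be computed directly from the matched coordinates. In this sketch the mean is taken over the N(N − 1)/2 unordered pairs i < j, which is an assumed normalization where the published formula image is illegible.

```python
import numpy as np

def relative_distance_differences(x, y):
    """x, y: (N, 2) arrays of matched template / target coordinates.
    Returns the matrix dd with dd[i, j] = ||(x_i - x_j) - (y_i - y_j)||_2
    (formula (4)) and the mean sigma over all unordered pairs i < j."""
    dx = x[:, None, :] - x[None, :, :]      # relative distances on the template
    dy = y[:, None, :] - y[None, :, :]      # relative distances on the target
    dd = np.linalg.norm(dx - dy, axis=2)    # formula (4), all pairs at once
    iu = np.triu_indices(len(x), k=1)       # unordered pairs i < j
    sigma = dd[iu].mean()                   # mean relative distance difference
    return dd, sigma
```

When template and target features undergo the same rigid translation, every Δx_ij equals Δy_ij and all entries of dd are zero, so σ measures only structural deformation, not global displacement.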
Step 3: Calculating the similarity metric
From the relative feature distance differences and their mean σ obtained in Step 2, the Gaussian value of each relative distance difference between the template image and the target image under the current window is calculated, and the sum over all pairs of feature pairs is taken:

S = Σ_{i<j} exp(−Δd_ij² / (2σ²))  (6)

where exp(·) is the exponential function and S is the similarity between the template and the current window image.
Step 4: The sliding window is moved by unit distance and Steps 1 to 3 are repeated until the window slides to the lower-right corner of the target image, yielding the similarities {S_1, S_2, ..., S_M} under all windows, where M is the number of sliding windows.
Step 5: The maximum value in {S_1, S_2, ..., S_M} is selected; the position of the sliding window corresponding to this maximum is the position of the template in the target image, and the final template matching result is output.
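Steps 4 and 5 amount to a sliding-window search that keeps the highest-scoring window. The self-contained sketch below shows only that search skeleton; the per-window score function is passed in as a parameter, so any similarity (including the one defined in Steps 1 to 3) can be plugged in. All names here are illustrative, not from the patent.

```python
import numpy as np

def match_template(target, template, score_fn, step=1):
    """Slide a template-sized window over the target at unit `step`,
    score each window with `score_fn(template, window)`, and return
    the top-left corner of the best-scoring window plus its score."""
    th, tw = template.shape[:2]
    H, W = target.shape[:2]
    best, best_pos = -np.inf, (0, 0)
    for r in range(0, H - th + 1, step):        # step 4: slide down to the
        for c in range(0, W - tw + 1, step):    # lower-right corner
            s = score_fn(template, target[r:r + th, c:c + tw])
            if s > best:                        # step 5: keep the maximum
                best, best_pos = s, (r, c)
    return best_pos, best
```

Usage with a placeholder score (negative SSD, not the patent's measure) locates an exact copy of the template in the target.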
Figs. 2 and 3 compare experimental template matching results. The marked regions in Figs. 2(a) and 3(a) are the template images, Figs. 2(b) and 3(b) are the ground truth of the matching results, Figs. 2(c) and 3(c) show the matching results of the BBS method, and Figs. 2(d) and 3(d) show the matching results of the method of the present invention. Compared with the BBS algorithm, the proposed method achieves higher accuracy.

Claims (1)

1. The image template matching method based on relative feature distance error measurement is characterized by comprising the following steps: first, mutual nearest-neighbor feature pairs are extracted from the template image and the target image under the current window, using pixel-block features as the unit; second, the relative feature distance difference of every two feature pairs is calculated, together with the mean of these differences; third, a similarity metric is computed from the relative feature distance differences and their mean; fourth, the sliding window is moved by unit distance and the first three steps are repeated until the window reaches the lower-right corner of the target image; fifth, the maximum similarity metric is selected, and the corresponding candidate window position is output as the matching result;
the relative feature distance difference of every two feature pairs and its mean are calculated as follows: let the current set of feature pairs be {(x_i, y_i)}, i = 1, ..., N, where N is the total number of feature pairs under the current window, x_i denotes the coordinates of the i-th feature extracted from the template image, and y_i denotes the coordinates of the corresponding feature extracted from the current window of the target image; the relative feature distance difference between two feature pairs is defined as

Δd_ij = ||Δx_ij − Δy_ij||₂  (1)

where i, j ∈ {1, 2, ..., N}, and Δx_ij = x_i − x_j and Δy_ij = y_i − y_j are the relative feature distances of the i-th and the j-th feature pairs on the template image and the target image respectively;
in addition, the pairwise relative distance differences between all feature pairs are collected and their mean σ is calculated:

σ = (2 / (N(N − 1))) Σ_{i<j} Δd_ij  (2)

the similarity metric is calculated from the relative feature distance differences and their mean as the sum of the pairwise Gaussian-weighted relative distance differences over all current feature pairs:

S = Σ_{i<j} exp(−Δd_ij² / (2σ²))  (3)

where exp(·) is the exponential function and S is the similarity between the template and the current window image.
CN201811534705.6A, filed 2018-12-14 (priority date 2018-12-14): Image template matching method based on relative feature distance error measurement. Status: Active. Granted as CN109800787B.


Publications (2)

CN109800787A (application) published 2019-05-24
CN109800787B (grant) published 2020-12-29






Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant