CN111709434B - Robust multi-scale template matching method based on nearest neighbor feature point matching - Google Patents
- Publication number
- CN111709434B (application CN202010597667.XA)
- Authority
- CN
- China
- Prior art keywords
- matching
- template
- image
- feature
- correlation coefficient
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/56—Extraction of image or video features relating to colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
A robust multi-scale template matching method based on nearest neighbor feature point matching belongs to the field of computer vision and image processing, and addresses the insufficient detection accuracy of existing nearest-neighbor-point-matching methods when the detection target is scaled by a large factor. The method extracts features from the pixel points of a template image and a searched image; performs nearest-neighbor feature matching on the pixel points; preliminarily screens search positions according to the initial matching correlation coefficients; and calculates the template matching correlation of all preliminarily screened search positions at different scales to obtain the optimal matching result. The invention is suitable for image matching.
Description
Technical Field
The invention belongs to the field of computer vision and image processing.
Background
Template matching is a pattern recognition method that recognizes an object by translating a template over a searched image and calculating the similarity of the template image to the overlaid searched image area. The commonly used template matching method usually presupposes a transformation relationship between a template and a target, such as translation, rotation, scaling, affine, and the like, and then searches for an optimal transformation relationship. However, in an image captured in a real scene, occlusion, blurring, complex deformation, and the like may occur, and the transformation relationship between the template and the object cannot be expressed by a simple model.
In recent years, researchers have proposed methods based on nearest neighbor point matching, which improve robustness to complex deformation; however, these methods generally assume that the search window is the same size as the template, so detection accuracy is insufficient when the detection target is scaled by a large factor.
Disclosure of Invention
The invention aims to solve the problem that existing methods based on nearest neighbor point matching have insufficient detection accuracy when the detection target undergoes large-scale scaling, and provides a robust multi-scale template matching method based on nearest neighbor feature point matching.
The invention relates to a robust multi-scale template matching method based on nearest neighbor feature point matching, which specifically comprises the following steps:
Step one, extract pixel features of the template image T and the searched image S, respectively, to obtain a template feature set F_T and a target feature set F_S;
Step two, for each target feature in the target feature set F_S, search the template feature set F_T to obtain its nearest-neighbor feature in F_T;
Step three, using the nearest-neighbor feature of each target feature in F_S, calculate an initial matching correlation coefficient between each pixel point in the searched image S and the template image T;
Step four, screen the pixel points in the searched image S using the initial matching correlation coefficients from step three to obtain a matched-pixel coordinate set;
Step five, scale the searched image S to different sizes, calculate the correlation coefficient of each pixel point in the matched-pixel coordinate set at each size, obtain the pixel coordinate with the maximum correlation coefficient and its corresponding scale, and take them as the optimal template matching detection result, completing one round of image matching.
Further, in the present invention, the features extracted from the template image T and the searched image S are color features or depth features.
Further, in the invention, the extraction of the depth features of the pixel points is realized by adopting a VGG19 neural network model.
Further, in the invention, in step two, the nearest neighbor of each target feature in the template feature set F_T is obtained using the FLANN fast nearest-neighbor search library.
Further, in the present invention, the specific method for calculating the initial matching correlation coefficient between each pixel point in the searched image S and the template image T in step three is as follows:
by the formula:
the initial matching correlation coefficient C_0(i_s, j_s) of the search position coordinate point (i_s, j_s) is calculated, where |{ }| denotes the number of points in a set, (nn_x(i″_s, j″_s), nn_y(i″_s, j″_s)) is the pixel coordinate of the nearest-neighbor feature in F_T of the feature f_s(i″_s, j″_s), and (i″_t, j″_t) are points in the pixel point set P of the template image;
P = {(i″_t, j″_t), i″_t = 0, 1, ..., w-1, j″_t = 0, 1, ..., h-1},
Q is the set of pixel points of the overlapping region of the searched image,
Q = {(i″_s, j″_s), i″_s = i″_sl, i″_sl+1, ..., i″_sr, j″_s = j″_sl, j″_sl+1, ..., j″_sr},
i″_sl = max(i_s - w_2, 0),
i″_sr = min(i_s + w - 1 - w_2, W - 1),
j″_sl = max(j_s - h_2, 0),
j″_sr = min(j_s + h - 1 - h_2, H - 1),
(i_s, j_s) is the search position coordinate point; w and h are the template image width and height; w_2 = w/2 when w is even and w_2 = (w-1)/2 when w is odd; h_2 = h/2 when h is even and h_2 = (h-1)/2 when h is odd; W is the width of the searched image S, and H is its height.
Further, in the present invention, the specific method in step four for screening the pixel points in the searched image S using the initial matching correlation coefficients from step three is as follows:
S′ = {(i′_s, j′_s), C_0(i′_s, j′_s) ≥ 0.7 × C_0m}
where C_0m is the maximum of the correlation coefficients over all initial search positions, C_0(i′_s, j′_s) is the initial matching correlation coefficient of the coordinate point (i′_s, j′_s), and (i′_s, j′_s) is a search position coordinate point.
Further, in the present invention, the specific method for calculating the correlation coefficient of each pixel point in the matching pixel point coordinate set under different sizes in the fifth step is:
by the formula:
the template matching correlation coefficient C(α, i′_s, j′_s) at scale α and search position coordinate point (i′_s, j′_s) is calculated, where the scale α ranges from 0.5 to 2 with step 0.1, and Q′ is the pixel point set of the overlapping region of the searched image;
Q′ = {(i″′_s, j″′_s), i″′_s = i″′_sl, i″′_sl+1, ..., i″′_sr, j″′_s = j″′_sl, j″′_sl+1, ..., j″′_sr},
i″′_sl = max(i′_s - w_1, 0),
j″′_sl = max(j′_s - h_1, 0),
I[·] is an indicator function.
The method is mainly used for searching the target in the searched image by using the template when the object has complex deformation and large-scale scaling. The method comprises extracting features from pixel points of a template image and a searched image; carrying out nearest neighbor feature matching on pixel points; preliminarily screening search positions according to the preliminarily matched correlation coefficient; and calculating the template matching correlation of all the preliminary screening search positions with different scales, and obtaining the optimal matching result. The method can improve the accuracy of template matching when the target in the real scene image has complex deformation and large-scale zooming.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is an image taken in a real scene, with a green rectangular frame region as a template image;
FIG. 3 is an image taken in a real scene, where a green rectangular frame region is a real region of a target, and a red rectangular frame region is a template matching result of the method;
FIG. 4 is a heat map of the correlation scores between the template image and the sub-images covered by the search.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the embodiments and features of the embodiments of the present invention may be combined with each other without conflict.
Embodiment 1: This embodiment is described below with reference to FIG. 1. The robust multi-scale template matching method based on nearest neighbor feature point matching specifically includes:
Step one, extract pixel features of the template image T and the searched image S, respectively, to obtain a template feature set F_T and a target feature set F_S;
Step two, for each target feature in the target feature set F_S, search the template feature set F_T to obtain its nearest-neighbor feature in F_T;
Step three, using the nearest-neighbor feature of each target feature in F_S, calculate an initial matching correlation coefficient between each pixel point in the searched image S and the template image T;
Step four, screen the pixel points in the searched image S using the initial matching correlation coefficients from step three to obtain a matched-pixel coordinate set;
Step five, scale the searched image S to different sizes, calculate the correlation coefficient of each pixel point in the matched-pixel coordinate set at each size, obtain the pixel coordinate with the maximum correlation coefficient and its corresponding scale, and take them as the optimal template matching detection result, completing one round of image matching.
Further, in this embodiment, the pixel feature in the first step is a color feature or a depth feature.
Further, in the present embodiment, the extraction of the depth features is implemented by using a VGG19 neural network model.
The VGG19 network adopted in this embodiment is a classical neural network model; its parameters are obtained by pre-training on the ImageNet data set, after which it is used to extract image features. Compared with traditional feature extraction methods, it can capture high-level semantics of the image.
Further, in this embodiment, in step two, the nearest neighbor of each target feature in the template feature set F_T is obtained using the FLANN fast nearest-neighbor search library.
Further, in the present embodiment, the specific method for calculating the initial matching correlation coefficient between each pixel point in the searched image S and the template image T in the third step is as follows:
by the formula:
the initial matching correlation coefficient C_0(i_s, j_s) of the search position coordinate point (i_s, j_s) is calculated, where |{ }| denotes the number of points in a set, (nn_x(i″_s, j″_s), nn_y(i″_s, j″_s)) is the pixel coordinate of the nearest-neighbor feature in F_T of the feature f_s(i″_s, j″_s), and (i″_t, j″_t) are points in the pixel point set P of the template image;
P = {(i″_t, j″_t), i″_t = 0, 1, ..., w-1, j″_t = 0, 1, ..., h-1},
Q is the set of pixel points of the overlapping region of the searched image,
Q = {(i″_s, j″_s), i″_s = i″_sl, i″_sl+1, ..., i″_sr, j″_s = j″_sl, j″_sl+1, ..., j″_sr},
i″_sl = max(i_s - w_2, 0),
i″_sr = min(i_s + w - 1 - w_2, W - 1),
j″_sl = max(j_s - h_2, 0),
j″_sr = min(j_s + h - 1 - h_2, H - 1),
(i_s, j_s) is the search position coordinate point; w and h are the template image width and height; w_2 = w/2 when w is even and w_2 = (w-1)/2 when w is odd; h_2 = h/2 when h is even and h_2 = (h-1)/2 when h is odd; W is the width of the searched image S, and H is its height.
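The half-width rules and window clipping above can be sketched in Python as follows (a minimal illustration; the function name and signature are ours, not from the patent):

```python
def overlap_window(i_s, j_s, w, h, W, H):
    """Bounds of the overlap region Q between a template of width w and
    height h placed at search position (i_s, j_s) and the searched image
    of width W and height H, following the patent's definitions."""
    w2 = w // 2 if w % 2 == 0 else (w - 1) // 2  # both cases equal floor(w/2)
    h2 = h // 2 if h % 2 == 0 else (h - 1) // 2
    i_sl = max(i_s - w2, 0)
    i_sr = min(i_s + w - 1 - w2, W - 1)
    j_sl = max(j_s - h2, 0)
    j_sr = min(j_s + h - 1 - h2, H - 1)
    return i_sl, i_sr, j_sl, j_sr
```

For a 5×5 template well inside a 100×80 image this returns the full window; near the border the bounds are clipped to the image, e.g. `overlap_window(0, 0, 5, 5, 100, 80)` gives `(0, 2, 0, 2)`.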
Further, in this embodiment, the specific method in step four for screening the pixel points in the searched image S using the initial matching correlation coefficients from step three is as follows:
S′ = {(i′_s, j′_s), C_0(i′_s, j′_s) ≥ 0.7 × C_0m}
where C_0m is the maximum of the correlation coefficients over all initial search positions, C_0(i′_s, j′_s) is the initial matching correlation coefficient of the coordinate point (i′_s, j′_s), and (i′_s, j′_s) is a search position coordinate point.
Further, in this embodiment, the specific method for calculating the correlation coefficient of each pixel in the matching pixel coordinate set under different sizes in step five is as follows:
by the formula:
the template matching correlation coefficient C(α, i′_s, j′_s) at scale α and search position coordinate point (i′_s, j′_s) is calculated, where the scale α ranges from 0.5 to 2 with step 0.1, and Q′ is the pixel point set of the overlapping region of the searched image;
Q′ = {(i″′_s, j″′_s), i″′_s = i″′_sl, i″′_sl+1, ..., i″′_sr, j″′_s = j″′_sl, j″′_sl+1, ..., j″′_sr},
i″′_sl = max(i′_s - w_1, 0),
j″′_sl = max(j′_s - h_1, 0),
I[·] is an indicator function.
In the present invention, as shown in fig. 2 to 4, the features of each pixel point in the template image T and the searched image S are extracted,
obtaining a feature set:
F_T = {f_t(i_t, j_t), i_t = 0, 1, ..., w-1, j_t = 0, 1, ..., h-1}
F_S = {f_s(i_s, j_s), i_s = 0, 1, ..., W-1, j_s = 0, 1, ..., H-1}
where the template image T is a three-channel image of width w and height h, the searched image S is a three-channel image of width W and height H, f_t(i_t, j_t) is the feature extracted at template image coordinate point (i_t, j_t), and f_s(i_s, j_s) is the feature extracted at searched image coordinate point (i_s, j_s). The features can be color features or depth features: the color feature of a pixel point is a 27-dimensional vector formed by arranging the RGB color values of the 3×3 image block centered on that pixel point, while the depth feature is obtained by concatenating the outputs of network layers conv1_2 and conv3_4 of a VGG19 neural network model, whose parameters are obtained by pre-training on the ImageNet data set.
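The 27-dimensional color feature (the RGB values of the 3×3 neighborhood, flattened) might be computed as in the sketch below. The helper name and the channel ordering of the flattened vector are our own choices, and border pixels are handled here by edge replication, which the patent does not specify:

```python
import numpy as np

def color_features(img):
    """img: H x W x 3 array. Returns an H x W x 27 array in which each
    pixel's feature is the flattened RGB content of its 3x3 neighborhood
    (borders padded by edge replication)."""
    H, W, _ = img.shape
    padded = np.pad(img, ((1, 1), (1, 1), (0, 0)), mode="edge")
    feats = np.empty((H, W, 27), dtype=np.float32)
    for di in range(3):          # neighborhood row offset
        for dj in range(3):      # neighborhood column offset
            k = (di * 3 + dj) * 3
            feats[:, :, k:k + 3] = padded[di:di + H, dj:dj + W, :]
    return feats
```

The middle slot of each feature vector (offsets 12..14) is the pixel's own RGB value, which makes the layout easy to sanity-check.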
Then, for each feature in the feature set F_S, the nearest-neighbor feature is searched in the feature set F_T, where the pixel coordinate of the nearest neighbor of feature f_s(i_s, j_s) in F_T is (nn_x(i_s, j_s), nn_y(i_s, j_s)).
The nearest neighbor feature search method adopts a FLANN fast nearest neighbor search library.
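A minimal stand-in for this step is sketched below with a brute-force numpy search instead of the FLANN library the patent names, and with an assumed row-major layout in which template index k maps to pixel (k % w, k // w):

```python
import numpy as np

def nearest_neighbor_coords(F_S, F_T, w):
    """F_S: N_s x d searched-image features; F_T: N_t x d template
    features stored row-major (index k -> pixel (k % w, k // w)).
    Returns (nn_x, nn_y): for each searched feature, the template pixel
    coordinate of its Euclidean nearest neighbor in F_T."""
    # Squared distances via ||a - b||^2 = ||a||^2 - 2 a.b + ||b||^2
    d2 = (np.sum(F_S ** 2, axis=1)[:, None]
          - 2.0 * F_S @ F_T.T
          + np.sum(F_T ** 2, axis=1)[None, :])
    idx = np.argmin(d2, axis=1)
    return idx % w, idx // w
```

In practice an approximate index (FLANN, as the patent states) replaces the O(N_s × N_t) distance matrix, but the returned coordinates play the same role.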
Then, each coordinate point of the searched image S is taken as an initial search position, and an initial matching correlation coefficient is calculated for each: the coefficient at search position (i_s, j_s) is C_0(i_s, j_s), and the maximum C_0m over all initial search positions is computed. The template matching search positions are then screened; the coordinate set of the screened search positions is S′ = {(i′_s, j′_s), C_0(i′_s, j′_s) ≥ 0.7 × C_0m}.
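Given a map of initial correlation coefficients, the screening step is a simple threshold at 70% of the maximum; a sketch with our own variable names:

```python
import numpy as np

def screen_positions(C0):
    """C0: H x W array of initial matching correlation coefficients.
    Returns the coordinate set S' of search positions whose coefficient
    is at least 0.7 * C0m, as (i, j) = (column, row) pairs."""
    C0m = C0.max()
    rows, cols = np.nonzero(C0 >= 0.7 * C0m)
    return list(zip(cols.tolist(), rows.tolist()))
```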
Finally, the template matching correlation coefficient is calculated for all screened search positions at different scales: when the scale is α, the coefficient at search position (i′_s, j′_s) is C(α, i′_s, j′_s), with α ranging from 0.5 to 2 in steps of 0.1. The scale α_m and coordinates (i_sm, j_sm) corresponding to the maximum template matching correlation coefficient constitute the best template matching detection result.
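The final multi-scale search can be sketched as a loop over scales and screened positions. The exact formula for C(α, i′_s, j′_s) appears only as an image in the original patent, so `match_score` is treated here as a supplied callable rather than implemented:

```python
def best_match(screened_positions, match_score):
    """Sweep alpha from 0.5 to 2.0 in steps of 0.1 over the screened
    search positions; return (alpha_m, (i_sm, j_sm)) maximizing the
    template matching correlation coefficient match_score(alpha, i, j)."""
    scales = [round(0.5 + 0.1 * k, 1) for k in range(16)]  # 0.5, 0.6, ..., 2.0
    best_score, best_alpha, best_pos = -float("inf"), None, None
    for alpha in scales:
        for (i, j) in screened_positions:
            c = match_score(alpha, i, j)
            if c > best_score:
                best_score, best_alpha, best_pos = c, alpha, (i, j)
    return best_alpha, best_pos
```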
Although the invention herein has been described with reference to particular embodiments, it is to be understood that these embodiments are merely illustrative of the principles and applications of the present invention. It is therefore to be understood that numerous modifications may be made to the illustrative embodiments and that other arrangements may be devised without departing from the spirit and scope of the present invention as defined by the appended claims. It should be understood that various dependent claims and the features described herein may be combined in ways different from those described in the original claims. It is also to be understood that features described in connection with individual embodiments may be used in other described embodiments.
Claims (4)
1. A robust multi-scale template matching method based on nearest neighbor feature point matching is characterized by comprising the following steps:
step one, extract pixel features of the template image T and the searched image S, respectively, to obtain a template feature set F_T and a target feature set F_S;
step two, for each target feature in the target feature set F_S, search the template feature set F_T to obtain its nearest-neighbor feature in F_T;
step three, using the nearest-neighbor feature of each target feature in F_S, calculate an initial matching correlation coefficient between each pixel point in the searched image S and the template image T;
step four, screen the pixel points in the searched image S using the initial matching correlation coefficients from step three to obtain a matched-pixel coordinate set;
step five, scale the searched image S to different sizes, calculate the correlation coefficient of each pixel point in the matched-pixel coordinate set at each size, obtain the pixel coordinate with the maximum correlation coefficient and its corresponding scale, and take them as the optimal template matching detection result, completing one round of image matching;
the specific method for calculating the initial matching correlation coefficient of each pixel point in the searched image S and the template image T in the third step is as follows:
by the formula:
the initial matching correlation coefficient C_0(i_s, j_s) of the search position coordinate point (i_s, j_s) is calculated, where |{ }| denotes the number of points in a set, (nn_x(i″_s, j″_s), nn_y(i″_s, j″_s)) is the pixel coordinate of the nearest-neighbor feature in F_T of the feature f_s(i″_s, j″_s), and (i″_t, j″_t) are points in the pixel point set P of the template image;
P = {(i″_t, j″_t), i″_t = 0, 1, ..., w-1, j″_t = 0, 1, ..., h-1},
Q is the set of pixel points of the overlapping region of the searched image,
Q = {(i″_s, j″_s), i″_s = i″_sl, i″_sl+1, ..., i″_sr, j″_s = j″_sl, j″_sl+1, ..., j″_sr},
i″_sl = max(i_s - w_2, 0),
i″_sr = min(i_s + w - 1 - w_2, W - 1),
j″_sl = max(j_s - h_2, 0),
j″_sr = min(j_s + h - 1 - h_2, H - 1),
(i_s, j_s) is the search position coordinate point; w and h are the template image width and height; w_2 = w/2 when w is even and w_2 = (w-1)/2 when w is odd; h_2 = h/2 when h is even and h_2 = (h-1)/2 when h is odd; W is the width of the searched image S, and H is its height;
in the fourth step, the specific method for screening the pixel points in the searched image S by using the initial matching correlation coefficient in the third step is as follows:
S′ = {(i′_s, j′_s), C_0(i′_s, j′_s) ≥ 0.7 × C_0m}
where S′ is the matched-pixel coordinate set, C_0m is the maximum of the correlation coefficients over all initial search positions, C_0(i′_s, j′_s) is the initial matching correlation coefficient of the coordinate point (i′_s, j′_s), and (i′_s, j′_s) is a search position coordinate point;
the concrete method for calculating the correlation coefficient of each pixel point in the matching pixel point coordinate set under different sizes in the fifth step is as follows:
by the formula:
the template matching correlation coefficient C(α, i′_s, j′_s) at scale α and search position coordinate point (i′_s, j′_s) is calculated, where the scale α ranges from 0.5 to 2 with step 0.1, and Q′ is the pixel point set of the overlapping region of the searched image;
Q′ = {(i″′_s, j″′_s), i″′_s = i″′_sl, i″′_sl+1, ..., i″′_sr, j″′_s = j″′_sl, j″′_sl+1, ..., j″′_sr},
i″′_sl = max(i′_s - w_1, 0),
j″′_sl = max(j′_s - h_1, 0),
and I[·] is an indicator function.
2. The robust nearest neighbor feature point matching-based multi-scale template matching method according to claim 1, wherein the pixel feature of step one is a color feature or a depth feature.
3. The robust nearest neighbor feature point matching-based multi-scale template matching method according to claim 2, wherein the extraction of the depth features is implemented by using a VGG19 neural network model.
4. The robust multi-scale template matching method based on nearest neighbor feature point matching as claimed in claim 1, wherein in step two, the nearest neighbor of each target feature in the template feature set F_T is obtained using the FLANN fast nearest-neighbor search library method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010597667.XA CN111709434B (en) | 2020-06-28 | 2020-06-28 | Robust multi-scale template matching method based on nearest neighbor feature point matching |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010597667.XA CN111709434B (en) | 2020-06-28 | 2020-06-28 | Robust multi-scale template matching method based on nearest neighbor feature point matching |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111709434A CN111709434A (en) | 2020-09-25 |
CN111709434B true CN111709434B (en) | 2022-10-04 |
Family
ID=72543704
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010597667.XA Active CN111709434B (en) | 2020-06-28 | 2020-06-28 | Robust multi-scale template matching method based on nearest neighbor feature point matching |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111709434B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112966725B (en) * | 2021-02-08 | 2024-01-16 | 深圳市优必选科技股份有限公司 | Method and device for matching template images and terminal equipment |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104680550A (en) * | 2015-03-24 | 2015-06-03 | 江南大学 | Method for detecting defect on surface of bearing by image feature points |
CN104866806A (en) * | 2014-02-21 | 2015-08-26 | 深圳富泰宏精密工业有限公司 | Self-timer system and method with face positioning auxiliary function |
CN104966047A (en) * | 2015-05-22 | 2015-10-07 | 浪潮电子信息产业股份有限公司 | Method and device for identifying vehicle license |
CN106407987A (en) * | 2016-08-31 | 2017-02-15 | 中国工程物理研究院激光聚变研究中心 | Image cross correlation matching-based electric light crystal optical axis extend-out point extraction method |
CN106886748A (en) * | 2016-12-28 | 2017-06-23 | 中国航天电子技术研究院 | A kind of mutative scale method for tracking target suitable for unmanned plane based on TLD |
CN107239792A (en) * | 2017-05-12 | 2017-10-10 | 大连理工大学 | A kind of workpiece identification method and device based on binary descriptor |
CN108629769A (en) * | 2018-05-02 | 2018-10-09 | 山东师范大学 | Eye fundus image optic disk localization method and system based on best fraternal similarity |
CN109271996A (en) * | 2018-08-21 | 2019-01-25 | 南京理工大学 | FPC automatic image registration method based on SURF feature and Hash perception algorithm |
CN109389038A (en) * | 2018-09-04 | 2019-02-26 | 阿里巴巴集团控股有限公司 | A kind of detection method of information, device and equipment |
CN109766943A (en) * | 2019-01-10 | 2019-05-17 | 哈尔滨工业大学(深圳) | A kind of template matching method and system based on global perception diversity measurement |
CN109800787A (en) * | 2018-12-14 | 2019-05-24 | 西安交通大学 | Image template matching method based on relative characteristic range error measurement |
-
2020
- 2020-06-28 CN CN202010597667.XA patent/CN111709434B/en active Active
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104866806A (en) * | 2014-02-21 | 2015-08-26 | 深圳富泰宏精密工业有限公司 | Self-timer system and method with face positioning auxiliary function |
CN104680550A (en) * | 2015-03-24 | 2015-06-03 | 江南大学 | Method for detecting defect on surface of bearing by image feature points |
CN104966047A (en) * | 2015-05-22 | 2015-10-07 | 浪潮电子信息产业股份有限公司 | Method and device for identifying vehicle license |
CN106407987A (en) * | 2016-08-31 | 2017-02-15 | 中国工程物理研究院激光聚变研究中心 | Image cross correlation matching-based electric light crystal optical axis extend-out point extraction method |
CN106886748A (en) * | 2016-12-28 | 2017-06-23 | 中国航天电子技术研究院 | A kind of mutative scale method for tracking target suitable for unmanned plane based on TLD |
CN107239792A (en) * | 2017-05-12 | 2017-10-10 | 大连理工大学 | A kind of workpiece identification method and device based on binary descriptor |
CN108629769A (en) * | 2018-05-02 | 2018-10-09 | 山东师范大学 | Eye fundus image optic disk localization method and system based on best fraternal similarity |
CN109271996A (en) * | 2018-08-21 | 2019-01-25 | 南京理工大学 | FPC automatic image registration method based on SURF feature and Hash perception algorithm |
CN109389038A (en) * | 2018-09-04 | 2019-02-26 | 阿里巴巴集团控股有限公司 | A kind of detection method of information, device and equipment |
CN109800787A (en) * | 2018-12-14 | 2019-05-24 | 西安交通大学 | Image template matching method based on relative characteristic range error measurement |
CN109766943A (en) * | 2019-01-10 | 2019-05-17 | 哈尔滨工业大学(深圳) | A kind of template matching method and system based on global perception diversity measurement |
Non-Patent Citations (2)
Title |
---|
Template Matching with Deformable Diversity Similarity; I. Talmi, R. Mechrez and L. Zelnik-Manor; 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR); 2017-11-09; pp. 1311-1319 *
Best-Buddies Similarity—Robust Template Matching Using Mutual Nearest Neighbors; S. Oron, T. Dekel, T. Xue, W. T. Freeman and S. Avidan; IEEE Transactions on Pattern Analysis and Machine Intelligence; 2018-08-01; pp. 1799-1813 *
Also Published As
Publication number | Publication date |
---|---|
CN111709434A (en) | 2020-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11423701B2 (en) | Gesture recognition method and terminal device and computer readable storage medium using the same | |
CN110472625B (en) | Chinese chess piece visual identification method based on Fourier descriptor | |
CN108154151B (en) | Rapid multi-direction text line detection method | |
CN114187665A (en) | Multi-person gait recognition method based on human body skeleton heat map | |
CN112308128B (en) | Image matching method based on attention mechanism neural network | |
CN110751154A (en) | Complex environment multi-shape text detection method based on pixel-level segmentation | |
CN108710883B (en) | Complete salient object detection method adopting contour detection | |
CN112785626A (en) | Twin network small target tracking method based on multi-scale feature fusion | |
CN114429649B (en) | Target image identification method and device | |
Zhao et al. | Automatic registration of images with inconsistent content through line-support region segmentation and geometrical outlier removal | |
CN111709434B (en) | Robust multi-scale template matching method based on nearest neighbor feature point matching | |
CN105678720A (en) | Image matching judging method and image matching judging device for panoramic stitching | |
CN114333023A (en) | Face gait multi-mode weighting fusion identity recognition method and system based on angle estimation | |
CN109711420B (en) | Multi-affine target detection and identification method based on human visual attention mechanism | |
CN111915645B (en) | Image matching method and device, computer equipment and computer readable storage medium | |
CN111199199B (en) | Action recognition method based on self-adaptive context area selection | |
Ihmeida et al. | Image registration techniques and applications: Comparative study on remote sensing imagery | |
CN111079585B (en) | Pedestrian re-identification method combining image enhancement with pseudo-twin convolutional neural network | |
CN107016414A (en) | A kind of recognition methods of footprint | |
CN111260555A (en) | Improved image splicing method based on SURF | |
CN115909401A (en) | Cattle face identification method and device integrating deep learning, electronic equipment and medium | |
CN115619637A (en) | Automatic splicing and fusing method of panoramic image capable of being transplanted to ARM end | |
CN112380966B (en) | Monocular iris matching method based on feature point re-projection | |
CN111008955B (en) | Rapid copying, pasting and tampering detection method for multi-scale image block matching | |
CN108304863B (en) | Terra-cotta warriors image matching method using learning invariant feature transformation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||