CN111340134B - Rapid template matching method based on local dynamic warping - Google Patents
Rapid template matching method based on local dynamic warping
- Publication number: CN111340134B
- Application number: CN202010164515.0A
- Authority
- CN
- China
- Legal status: Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/22—Matching criteria, e.g. proximity measures
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/30—Computing systems specially adapted for manufacturing
Abstract
The invention discloses a rapid template matching method based on local dynamic warping, applicable to workpiece positioning, industrial sorting, target tracking and similar fields. The method comprises the following steps: extract the feature vectors of the template image and each test subgraph with an improved ring projection transform (IRPT); roughly estimate the similarity and screen out candidate test subgraphs; calculate the similarity and scaling coefficient of each candidate test subgraph with the proposed local dynamic time warping (LDTW) method; take the test subgraph with the highest similarity value, cut out the minimum region containing the target object at the corresponding position using the associated scaling coefficient, and finally calculate the rotation angle of that region with an orientation code (OC) method. Compared with the prior art, the method needs only one template image to calculate the scaling factor and the rotation angle, removing the conventional requirement for a large number of template images covering different combinations of scaling factors and rotation angles, and greatly simplifying the algorithm.
Description
Technical Field
The invention relates to the technical field of machine vision positioning, in particular to a rapid template matching method based on local dynamic warping.
Background
The template matching algorithm is a key technology in machine vision: given a template image, it identifies and locates similar test subgraphs (i.e. target workpieces) in a test image, and is therefore widely used in workpiece positioning systems and product quality inspection systems. As working conditions grow increasingly complex, higher demands are placed on the real-time performance and robustness of template matching algorithms.
Conventional template matching methods must cope with scaling, rotation, noise and illumination variation, and a fully satisfactory solution is still lacking. For example, patent document 1 (CN105046271A) discloses a MELF component positioning and inspection method based on template matching: it generates a large number of template images by rotating and scaling the original template image and matches the test image against each of them, thereby identifying the rotation angle and scaling coefficient, but at high algorithmic complexity. Patent document 2 (CN108805220A) discloses a fast template matching algorithm based on gradient integrals, which combines an image pyramid, contour point extraction and gradient integration to reduce redundant computation, but it still relies on a large set of template images covering different scaling and rotation combinations to identify the workpiece's scaling factor and rotation angle. Patent document 3 (CN102254181A) discloses a multi-order differential annular template matching and tracking method that computes the target's rotation angle from an annular matching criterion, but it cannot compute the scaling coefficient and is not robust to illumination changes. In summary, a template matching method that computes both the rotation angle and the scaling coefficient, with low algorithmic complexity and good robustness, is still lacking.
Disclosure of Invention
In order to improve the real-time performance and robustness of a template matching algorithm and simplify the algorithm, the invention provides a rapid template matching method based on local dynamic warping.
The technical scheme adopted by the invention is as follows:
a fast template matching method based on local dynamic warping comprises the following steps:
step 1, traversing the test graph, extracting a test subgraph with the same size as the template image, and extracting ring projection characteristic vectors of the test subgraph and the template image by using a ring projection algorithm;
step 2, taking the ring projection feature vector obtained in the step 1 as input, calculating the rough estimation similarity of the test subgraph and the template image, screening out the test subgraph with the similarity larger than a first set threshold value, and listing as a candidate test subgraph;
step 3, for each candidate test subgraph obtained in step 2, calculating the similarity and the image scaling coefficient with a local dynamic warping algorithm, by locally aligning the curve profiles of the ring projection feature vectors;
step 4, after traversing the test image, taking the maximum similarity value; if it is greater than or equal to a third set threshold, the coordinates of the corresponding test subgraph give the target position, and the minimum region containing the target object is cut out of the test image according to the corresponding scaling coefficient;
and step 5, using a direction code algorithm, extracting the direction code feature vectors of the minimum region obtained in step 4 and of the template image, then calculating the image rotation angle from these feature vectors, finally obtaining the target position, the scaling coefficient and the rotation angle.
Preferably, the ring projection algorithm in step 1 is as follows: denote the size of the template image as M × N, establish a polar coordinate system with the template image's centre point (x0, y0) as the origin, and denote any pixel as T(r, θ); the ring projection feature vector IRPT is then
IRPT(r) = (1/S(r)) · Σθ [T(r, θ) − T_min(r, θ)], 0 ≤ r ≤ R_max,
where R_max = min(M/2, N/2), S(r) is the number of pixels on the circle of radius r, and T_min(r, θ) is the minimum of all pixel intensities on that circle.
Preferably, the rough-estimate similarity between a test subgraph and the template image in step 2 is calculated as follows: denote the ring projection feature vector extracted from the test subgraph as S and the one extracted from the template image as T, and denote the rough-estimate similarity as K_c; the larger K_c, the more similar the images, where n is the dimension of a vector X and S[0:m/2] denotes the first m/2 dimensions of the feature vector S.
Preferably, the first set threshold in step 2 is denoted β1, with 0.35 ≤ β1 ≤ 0.5.
Preferably, in step 3 both method 1 and method 2 calculate the similarity and the image scaling coefficient by finding the optimal local matching relationship between the ring projection feature vectors of the candidate test subgraph and the template image.
Preferably, one specific method in the step 3 is as follows:
step 3.1a, denote the ring projection feature vector extracted from the candidate test subgraph as S and the one extracted from the template image as T, with dimensions m_s and m_t respectively; create a distance matrix D and a cumulative distance matrix D_acc, both of dimension m_t × m_s, and initialize the distance function DIS(x, y) = |x − y|;
step 3.2a, using the distance function DIS, calculate the distance between every element of the feature vector T and every element of the feature vector S to obtain the distance matrix D, then assign D to the cumulative distance matrix D_acc;
step 3.3a, update every element of the cumulative distance matrix with the recurrence D_acc(i, j) = D(i, j) + min{D_acc(i−1, j), D_acc(i, j−1), D_acc(i−1, j−1)}, which yields the final cumulative distance matrix D_acc;
step 3.4a, in the last column of D_acc, search from bottom to top for the element with the smallest value; denote that value temp1 and its position (i1, m_s);
step 3.5a, in the last row of D_acc, search from right to left for the element with the smallest value; denote that value temp2 and its position (m_t, i2);
step 3.6a, denote the similarity of the feature vectors S and T as K_s and the scaling coefficient as K.
Preferably, another specific method in step 3 is:
step 3.1b, smooth and denoise the feature curve with Gaussian filtering: denote the RPT feature vector as f(x) and the Gaussian function as g(x, σ); the filtered RPT feature vector is F(x) = f(x) ⊗ g(x, σ);
step 3.2b, denote the differentiation convolution kernel as T; the discrete slope curve sequence F′(x) is obtained by convolving F(x) with this kernel;
step 3.3b, denote the slope curve sequences obtained from the template image and the test subgraph as T′ and S′ respectively, denote the scaling coefficient as k, initialize the calculation range of the scaling coefficient as [k1, k2] and its calculation precision (step length) as K′; compute the similarity K_s corresponding to each candidate scaling coefficient k, and the k corresponding to the maximum similarity K_s is the desired scaling coefficient K.
Preferably, the third set threshold in step 4 is denoted β3, with 0.55 ≤ β3 ≤ 0.7.
Preferably, the direction code algorithm in step 5 is a sector sampling method: the image is divided into n sector regions, and all pixel intensity values within each sector are averaged to form one element of the direction code feature vector, yielding a direction code feature vector associated with the rotation angle.
Preferably, the step 5 specifically comprises:
step 5.1, denote the size of the input image I as M × N and establish a polar coordinate system with the image centre (x0, y0) as the origin, so that any pixel can be written I(r, θ); initialize the angle calculation precision θ′ of the direction code method and compute the direction code feature vector OC;
step 5.2, with the template image T as input, each angle θ corresponds to one direction code feature vector; using the calculation of step 5.1, the 360°/θ′ direction code feature vectors of the template image are obtained, the one corresponding to angle θ being denoted OC_T(θ), where n_max = 360°/θ′ − 1;
step 5.3, with the minimum region S as input, the direction code feature vector of the minimum region is obtained by the calculation of step 5.1 and denoted OC_S;
step 5.4, for each θ, calculate the similarity K(θ, 0°) between the feature vector OC_T(θ) and the feature vector OC_S;
step 5.5, find the θ with the maximum similarity K(θ, 0°); this θ is the counterclockwise rotation angle of the minimum region relative to the template image.
the invention has the beneficial effects that:
(1) Conventional template matching methods need a large number of template images covering different scaling and rotation-angle combinations to build a template library, and must match each of them against the test image to identify the workpiece's scaling coefficient and rotation angle. The invention solves this with the proposed Local Dynamic Time Warping (LDTW) algorithm: a single template image suffices to calculate the scaling factor and the target position simultaneously, after which the rotation angle is calculated with the Orientation Codes (OC) algorithm. Compared with conventional template matching methods, the algorithmic complexity is very low, which effectively improves the real-time performance of identification and positioning; the LDTW method is also robust to illumination changes and more stable.
(2) Conventional template matching methods are generally sensitive to noise and illumination variation, which makes the positioning accuracy unstable and can cause accidents on industrial production sites. The invention extracts features with the Improved Ring Projection Transform (IRPT) algorithm, and these features have good noise robustness.
(3) While traversing the test image, the similarity between each test subgraph and the template image, as well as the rotation angle from the OC feature vectors, can be computed in parallel, meeting the real-time requirements of industrial applications.
Drawings
FIG. 1 is a flowchart of a template matching method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a visual image processing process according to an embodiment of the present invention (method 1).
Fig. 3 is a schematic view of a process of processing a visual image according to an embodiment of the present invention (method 2).
FIG. 4 is a diagram illustrating the result of processing various scaling factors and rotation angles by using a template picture according to an embodiment of the present invention (method 1).
FIG. 5 is a diagram illustrating the result of processing various zoom factors and rotation angles by using a template picture according to an embodiment of the present invention (method 2).
Detailed Description
The invention is further illustrated by the following examples in conjunction with the accompanying drawings. The embodiment provides a fast template matching method based on local dynamic warping, as shown in fig. 1 to 3, the steps of which are as follows.
S1, traverse the test image, extract test subgraphs of the same size as the template image, and extract the ring projection feature vectors of the test subgraphs and the template image with the ring projection algorithm.
S1.1, traversing the test graph from top to bottom and from left to right, cutting out the test subgraph with the same size as the template image, and taking the pixel coordinate of the central point of the test subgraph as the position coordinate of the test subgraph.
S1.2, extract the IRPT (Improved Ring Projection Transformation) feature vectors of the test subgraph and the template image with the proposed improved ring projection algorithm. Specifically, denote the size of the template image as M × N, establish a polar coordinate system with the template image's centre point (x0, y0) as the origin, and denote any pixel as T(r, θ); the ring projection feature vector IRPT is then
IRPT(r) = (1/S(r)) · Σθ [T(r, θ) − T_min(r, θ)], 0 ≤ r ≤ R_max,
where R_max = min(M/2, N/2), S(r) is the number of pixels on the circle of radius r, and T_min(r, θ) is the minimum of all pixel intensities on that circle.
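As a rough illustration of S1.2, the IRPT extraction can be sketched in Python; the discrete binning of pixels into circles and the exact normalisation are assumptions of this sketch, not taken from the patent:

```python
import numpy as np

def irpt_features(img):
    """IRPT feature vector of a grayscale image (sketch).

    For each integer radius r around the image centre, collect the pixels on
    the discrete circle of radius r, subtract the minimum intensity on that
    circle (the illumination-robust 'improvement' described in the text) and
    average over the S(r) pixels on the circle.
    """
    M, N = img.shape
    cy, cx = M / 2.0, N / 2.0
    r_max = int(min(M / 2, N / 2))                        # R_max = min(M/2, N/2)
    ys, xs = np.indices(img.shape)
    r = np.sqrt((ys - cy) ** 2 + (xs - cx) ** 2).astype(int)
    feat = np.zeros(r_max)
    for radius in range(r_max):
        ring = img[r == radius]                           # pixels T(r, theta)
        if ring.size:                                     # S(r) = ring.size
            feat[radius] = (ring - ring.min()).sum() / ring.size
    return feat
```

Because the per-circle minimum is subtracted before averaging, a uniform intensity shift on a ring leaves the feature unchanged, which is the noise and illumination robustness claimed for IRPT.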
Step S2, taking the ring projection feature vectors obtained in step S1 as input, calculate the rough-estimate similarity between each test subgraph and the template image, and screen out the test subgraphs whose similarity exceeds the first set threshold, denoted β1, listing them as candidate test subgraphs.
The rough-estimate similarity is calculated as follows: denote the ring projection feature vector extracted from the test subgraph as S and the one extracted from the template image as T, and denote the rough-estimate similarity between them as K_c; the larger K_c, the more similar the images, where n is the dimension of a vector X and S[0:m/2] denotes the first m/2 dimensions of the feature vector S.
In general, too large a β1 may wrongly filter out correct candidate test subgraphs, while too small a β1 loses the preliminary screening effect; typically 0.35 ≤ β1 ≤ 0.5. If K_c is greater than or equal to the threshold β1, the test subgraph is listed as a candidate test subgraph; otherwise its similarity is set to 0. It is worth noting that the similarity is calculated using only the first m/2 dimensions of the feature vector S, which helps filter out background noise introduced by image scaling.
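A minimal sketch of this coarse screening, assuming a normalised L1 similarity in [0, 1]; the patent's exact K_c formula did not survive extraction, so both the measure and the helper names below are illustrative only:

```python
import numpy as np

def rough_similarity(S, T):
    """Coarse similarity K_c over the first m/2 dimensions only, which
    (as the text notes) suppresses background introduced by scaling.
    The normalised-distance form is an assumption of this sketch."""
    m = len(S)
    s = np.asarray(S[: m // 2], dtype=float)
    t = np.asarray(T[: m // 2], dtype=float)
    denom = np.abs(s).sum() + np.abs(t).sum()
    return 1.0 if denom == 0 else 1.0 - np.abs(s - t).sum() / denom

def screen_candidates(subgraph_feats, T, beta1=0.4):
    """Keep indices of sub-images with K_c >= beta1 (0.35 <= beta1 <= 0.5)."""
    return [i for i, S in enumerate(subgraph_feats)
            if rough_similarity(S, T) >= beta1]
```

Identical feature vectors score 1.0 and clearly dissimilar ones score near 0, so β1 in the 0.35 to 0.5 range acts as the preliminary filter described above.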
S3, for each candidate test subgraph obtained in S2, exploit the rotation invariance and global profile invariance of the IRPT feature vector and calculate the similarity and the image scaling coefficient with the proposed Local Dynamic Time Warping (LDTW) algorithm, by locally aligning the curve profiles of the ring projection feature vectors. This embodiment provides two implementations, method 1 and method 2; in essence, both find the optimal local matching relationship between the ring projection feature vectors of the candidate test subgraph and the template image.
The method 1 specifically comprises the following steps:
s3.1a, recording the ring projection characteristic vector extracted from the candidate test subgraph as S, recording the ring projection characteristic vector extracted from the template image as T, and inputting S and T, wherein the dimensionalities of the S and T are m respectively s And m t Creating a distance matrix D and a cumulative distance matrix D acc All dimensions of m t ×m s The distance function DIS is initialized to | x-y |.
S3.2a, calculating the distance between each element of the characteristic vectors S and T by using a distance function DIS to obtain a distance matrix D, and then assigning the distance matrix D to an accumulated distance matrix D acc 。
S3.3a. update the cumulative distance matrix D using the following formula acc After the updating of each element value, the accumulative distance matrix D is obtained acc ,
S3.4a. for the cumulative distance matrix D acc The last column of (1) searching the element with the smallest value from bottom to top, and recording the value as temp 1 The position of the element is (i1, m) s )。
S3.5a. for the cumulative distance matrix D acc The last row of (1), searching the element with the smallest value from right to left, and recording the value as temp 2 The position of the element is (i2, m) s )。
S3.6a. the similarity of the characteristic vectors S and T is recorded as K s The scaling factor, denoted as K,
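The steps of method 1 can be sketched as follows. The D_acc recurrence is the standard dynamic-warping one, and since the patent's K_s and K formulas did not survive extraction, the mapping from the best path endpoint to a (similarity, scale) pair below is only one plausible, explicitly assumed reading:

```python
import numpy as np

def ldtw(S, T):
    """Local dynamic time warping of two IRPT vectors (method 1 sketch)."""
    S, T = np.asarray(S, dtype=float), np.asarray(T, dtype=float)
    m_t, m_s = len(T), len(S)
    D = np.abs(T[:, None] - S[None, :])           # DIS(x, y) = |x - y|
    D_acc = D.copy()
    for i in range(m_t):                          # steps 3.2a-3.3a
        for j in range(m_s):
            if i == 0 and j == 0:
                continue
            prev = []
            if i > 0: prev.append(D_acc[i - 1, j])
            if j > 0: prev.append(D_acc[i, j - 1])
            if i > 0 and j > 0: prev.append(D_acc[i - 1, j - 1])
            D_acc[i, j] = D[i, j] + min(prev)
    i1 = int(np.argmin(D_acc[:, -1]))             # step 3.4a: last column
    temp1 = D_acc[i1, -1]
    i2 = int(np.argmin(D_acc[-1, :]))             # step 3.5a: last row
    temp2 = D_acc[-1, i2]
    # Assumed mapping: a path ending in the last row at column i2 means the
    # whole template aligned with S[0:i2+1] (object scaled down); a path
    # ending in the last column at row i1 means the reverse.
    if temp2 <= temp1:
        cost, scale = temp2, (i2 + 1) / m_t
    else:
        cost, scale = temp1, m_s / (i1 + 1)
    similarity = 1.0 / (1.0 + cost / m_t)         # assumed normalisation
    return similarity, scale
```

For identical feature vectors the zero-cost diagonal path ends at the bottom-right corner, giving similarity 1 and scale 1, which matches the intent of steps 3.4a and 3.5a.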
The method 2 specifically comprises the following steps:
S3.1b, smooth and denoise the feature curve with Gaussian filtering: denote the RPT feature vector as f(x) and the Gaussian function as g(x, σ); the filtered RPT feature vector is F(x) = f(x) ⊗ g(x, σ).
S3.2b, denote the differentiation convolution kernel as T; the discrete slope curve sequence F′(x) is obtained by convolving F(x) with this kernel.
S3.3b, denote the slope curve sequences obtained from the template image and the test subgraph as T′ and S′ respectively, denote the scaling coefficient as k, initialize the calculation range of the scaling coefficient as [k1, k2] and its calculation precision (step length) as K′; compute the similarity K_s corresponding to each candidate scaling coefficient k, and the k corresponding to the maximum similarity K_s is the desired scaling coefficient K.
S4, after traversing the test image, take the maximum similarity value. If it is greater than or equal to the third set threshold, the coordinates of the corresponding test subgraph give the target position, and the minimum region of interest (ROI) containing the target object is cut out of the test image according to the corresponding scaling coefficient; otherwise there is no target workpiece in the test image. The third set threshold is denoted β3; both too large and too small a β3 cause mismatches, and usually 0.55 ≤ β3 ≤ 0.7.
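Cutting out the minimal region in S4 amounts to cropping a k-scaled template footprint around the matched centre; a small sketch (the rounding and the clamping to the image border are added assumptions):

```python
import numpy as np

def crop_min_region(test_img, center, template_shape, k):
    """Crop the minimal region containing the target: a window of size
    round(k*M) x round(k*N) centred on the matched position (cy, cx),
    shifted inward if it would cross the image border."""
    M, N = template_shape
    h, w = int(round(k * M)), int(round(k * N))
    cy, cx = center
    y0 = min(max(0, cy - h // 2), max(0, test_img.shape[0] - h))
    x0 = min(max(0, cx - w // 2), max(0, test_img.shape[1] - w))
    return test_img[y0:y0 + h, x0:x0 + w]
```
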
S5, extract the direction code feature vectors of the minimum region (ROI) obtained in S4 and of the template image with the orientation code (OC) algorithm, then calculate the image rotation angle from these feature vectors, finally obtaining the target position, the scaling coefficient and the rotation angle.
The direction code algorithm is a sector sampling method: the image is divided into n sector regions, and all pixel intensity values within each sector are averaged to form one element of the direction code feature vector, yielding a direction code feature vector associated with the rotation angle. The method specifically comprises the following steps:
s5.1, recording the size of the input image as M multiplied by N, and taking the central point (x) of the input image as 0 ,y 0 ) Establishing a polar coordinate system for the origin, any one pixel can be represented as T (r, θ); and (3) initializing the angle calculation precision theta' of the direction code method, and taking theta as 1 DEG, wherein the calculation formula of the direction code feature vector OC is as follows:
S5.2, the input image is a template image and is marked as T, each theta corresponds to a direction code feature vector, 360 degrees/theta' direction code feature vectors corresponding to the template image can be obtained by utilizing the calculation method in the step 5.1, and the feature vector corresponding to the angle theta is expressed asThe calculation formula is as follows:
wherein n is max =360°/θ′-1。
S5.3, the input image is a minimum area and is marked as S, and a direction code characteristic vector corresponding to the minimum area is obtained by the calculation method in the step 5.1 and is expressed asThe calculation formula is as follows:
s5.4, calculating the characteristic vector corresponding to each thetaAnd feature vectorsSimilarity K of (θ,0°) The calculation formula is as follows:
wherein,β 4 setting a threshold value for No. four; beta is a beta 4 The function of calculating the rotation angle is easily lost when the rotation angle is too large, the noise robustness is not good when the rotation angle is too small, and beta is obtained according to the actual parameter adjustment experience 4 Typically set at 15-25.
S5.5, finding out the maximum similarity K (θ,0°) And the corresponding theta is the counterclockwise rotation angle of the minimum area relative to the template picture, and the calculation formula is as follows:
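Steps S5.1 to S5.5 can be sketched as follows. Since the patent's OC and K formulas did not survive extraction, the sector binning, the L1 matching cost and the shift direction are assumptions of this sketch:

```python
import numpy as np

def oc_features(img, theta_step=1.0):
    """Direction code feature: mean pixel intensity in each angular sector
    around the image centre (the sector sampling described in the text)."""
    M, N = img.shape
    cy, cx = M / 2.0, N / 2.0
    ys, xs = np.indices(img.shape)
    ang = (np.degrees(np.arctan2(ys - cy, xs - cx)) + 360.0) % 360.0
    n = int(360.0 / theta_step)
    sector = np.minimum((ang / theta_step).astype(int), n - 1)
    return np.array([img[sector == i].mean() if (sector == i).any() else 0.0
                     for i in range(n)])

def rotation_angle(oc_region, oc_template, theta_step=1.0):
    """Rotation that best aligns the two OC vectors: rotating the image by
    theta circularly shifts the sector means, so the best circular shift
    gives theta. The L1 cost and the shift sign are assumed here."""
    n = len(oc_template)
    costs = [np.abs(np.roll(oc_template, s) - oc_region).sum()
             for s in range(n)]
    return int(np.argmin(costs)) * theta_step
```

Because each candidate shift is scored independently, this search parallelises naturally, consistent with beneficial effect (3) above.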
S6, use the target position, scaling coefficient and rotation angle obtained in S5 for tasks such as workpiece positioning, industrial sorting or target tracking.
It should be understood that the above-described embodiments are merely examples illustrating the present invention and are not intended to limit its scope. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. It is neither necessary nor possible to exhaustively list all embodiments, and obvious changes and modifications within the spirit of the invention are deemed to be covered by it.
Claims (5)
1. A fast template matching method based on local dynamic warping is characterized by comprising the following steps:
step 1, traversing the test graph, extracting a test subgraph with the same size as the template image, and extracting ring projection characteristic vectors of the test subgraph and the template image by using a ring projection algorithm;
the ring projection algorithm is as follows: denote the size of the template image as M × N, establish a polar coordinate system with the template image's centre point (x0, y0) as the origin, and denote any pixel as T(r, θ); the ring projection feature vector IRPT is then
IRPT(r) = (1/S(r)) · Σθ [T(r, θ) − T_min(r, θ)], 0 ≤ r ≤ R_max,
where R_max = min(M/2, N/2), S(r) is the number of pixels on the circle of radius r, and T_min(r, θ) is the minimum of all pixel intensities on that circle;
step 2, taking the ring projection feature vector obtained in the step 1 as input, calculating the roughly estimated similarity between the test subgraph and the template image, screening out the test subgraphs with the similarity larger than a first set threshold value, and listing the test subgraphs as candidate test subgraphs;
step 3, aiming at the candidate test subgraph obtained in the step 2, calculating the similarity and the image scaling coefficient by using a local dynamic warping algorithm and locally aligning the curve outline of the ring projection feature vector, namely, searching the optimal local matching relation of the candidate test subgraph and the ring projection feature vector of the template image and calculating the similarity and the image scaling coefficient;
a specific method of the step 3 is as follows:
step 3.1a, denote the ring projection feature vector extracted from the candidate test subgraph as S and the one extracted from the template image as T, with dimensions m_s and m_t respectively; create a distance matrix D and a cumulative distance matrix D_acc, both of dimension m_t × m_s, and initialize the distance function DIS(x, y) = |x − y|;
step 3.2a, using the distance function DIS, calculate the distance between every element of the feature vector T and every element of the feature vector S to obtain the distance matrix D, then assign D to the cumulative distance matrix D_acc;
step 3.3a, update every element of the cumulative distance matrix with the recurrence D_acc(i, j) = D(i, j) + min{D_acc(i−1, j), D_acc(i, j−1), D_acc(i−1, j−1)}, which yields the final cumulative distance matrix D_acc;
step 3.4a, in the last column of D_acc, search from bottom to top for the element with the smallest value; denote that value temp1 and its position (i1, m_s);
step 3.5a, in the last row of D_acc, search from right to left for the element with the smallest value; denote that value temp2 and its position (m_t, i2);
step 3.6a, denote the similarity of the feature vectors S and T as K_s and the scaling coefficient as K;
The other specific method of the step 3 is as follows:
step 3.1b, smooth and denoise the feature curve with Gaussian filtering: denote the ring projection feature vector as f(x) and the Gaussian function as g(x, σ); the filtered ring projection feature vector is F(x) = f(x) ⊗ g(x, σ);
step 3.2b, denote the differentiation convolution kernel as T; the discrete slope curve sequence F′(x) is obtained by convolving F(x) with this kernel;
step 3.3b, denote the slope curve sequences obtained from the template image and the test subgraph as T′ and S′ respectively, denote the scaling coefficient as k, initialize the calculation range of the scaling coefficient as [k1, k2] and its calculation precision as K′; compute the similarity K_s corresponding to each candidate scaling coefficient k, and the k corresponding to the maximum similarity K_s is the desired scaling coefficient K;
Step 4, after traversing the test image, take the maximum value of the similarity; if this maximum is greater than or equal to a third set threshold, the coordinate of the corresponding test subgraph is determined as the target position, and the minimum region containing the target object is cut out of the test image according to the corresponding scaling coefficient;
Step 5, extract the direction code feature vectors of the minimum region obtained in step 4 and of the template image using a direction code algorithm, then calculate the rotation angle of the image based on the direction code feature vectors, finally obtaining the target position, the scaling coefficient and the rotation angle; the direction code algorithm is a sector sampling method, i.e. the image is divided into n sector regions, and all pixel intensity values within each sector region are averaged to form one element of the direction code feature vector, thereby obtaining a direction code feature vector related to the rotation angle.
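The sector sampling described above can be sketched as follows. The sector count (36 sectors, i.e. an angle precision of 10°) is an illustrative choice; the patent leaves n as a parameter.

```python
import numpy as np

def direction_code(img, n_sectors=36):
    """Sector-sampling direction code feature: partition the image into
    n angular sectors around its centre and average the pixel
    intensities inside each sector, one vector element per sector."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ang = np.arctan2(yy - h / 2.0, xx - w / 2.0)    # angle in [-pi, pi]
    sector = ((ang + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    return np.array([img[sector == s].mean() for s in range(n_sectors)])
```

Because rotating the image by one sector width simply shifts which pixels fall into which sector, the resulting vector varies with rotation in a predictable way, which is what makes it usable for angle estimation in step 5.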
2. The fast template matching method based on local dynamic warping as claimed in claim 1, wherein the algorithm for the rough estimated similarity between the test subgraph and the template image in step 2 is as follows: record the ring projection feature vector extracted from the test subgraph as S and the ring projection feature vector extracted from the template image as T, and record the rough estimated similarity between the test subgraph and the template image as K_c; the larger K_c is, the more similar the images are,
where n is the dimension of vector X, and S[0:m/2] is the first m/2 dimensions of the feature vector S.
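A sketch of this pre-screening step is below. The patent's exact K_c formula appears only in its figures; normalized correlation over the first half of each vector is used here as an assumed stand-in, keeping the stated properties that only S[0:m/2] (the inner rings) is used and that larger K_c means more similar.

```python
import numpy as np

def rough_similarity(S, T):
    """Claim 2 sketch: cheap rough similarity K_c computed from only
    the first half of each ring projection vector, so clearly wrong
    candidates are rejected before the costlier warping comparison."""
    m = min(len(S), len(T))
    s = np.asarray(S[: m // 2], dtype=float)
    t = np.asarray(T[: m // 2], dtype=float)
    denom = np.linalg.norm(s) * np.linalg.norm(t)
    return float(np.dot(s, t) / denom) if denom else 0.0
```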
3. The fast template matching method based on local dynamic warping as claimed in claim 1 or 2, wherein: the threshold set in step 2 is β_1, taken such that 0.35 ≤ β_1 ≤ 0.5.
4. The fast template matching method based on local dynamic warping as claimed in claim 1, wherein: the third set threshold in step 4 is β_3, taken such that 0.55 ≤ β_3 ≤ 0.7.
5. The fast template matching method based on local dynamic warping as claimed in claim 1, wherein step 5 specifically comprises:
Step 5.1, record the size of the input image I as M × N; establish a polar coordinate system with the image center point (x_0, y_0) as the origin, so that any pixel can be represented as I(r, θ); initialize the angle calculation precision θ′ of the direction code method; the direction code feature vector OC is calculated as follows:
Step 5.2, the input image is the template image, denoted T; each θ corresponds to one direction code feature vector, so the calculation method of step 5.1 yields the 360°/θ′ direction code feature vectors corresponding to the template image; the feature vector corresponding to the angle θ is calculated as follows:
where n_max = 360°/θ′ − 1;
Step 5.3, the input image is the minimum region, denoted S; using the calculation method of step 5.1, obtain the direction code feature vector corresponding to the minimum region, calculated as follows:
Step 5.4, calculate the similarity K_(θ,0°) between the feature vector corresponding to each θ and the feature vector of the minimum region; the calculation formula is as follows:
Step 5.5, find the maximum similarity K_(θ,0°); the corresponding θ is the counterclockwise rotation angle of the minimum region relative to the template image; the calculation formula is as follows:
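Steps 5.2-5.5 can be sketched as below. Modelling the template's feature vector at angle θ as a cyclic shift of its 0° vector, and using normalized correlation as the similarity K_(θ,0°), are assumptions standing in for the patent's figure-only formulas; the arg-max over shifts gives the counterclockwise rotation angle.

```python
import numpy as np

def rotation_angle(oc_template, oc_region):
    """Steps 5.2-5.5 sketch: a rotation by one sector width cyclically
    shifts the direction code vector, so try every shift of the
    template vector against the minimum-region vector and return the
    angle of the best-matching shift."""
    oc_template = np.asarray(oc_template, dtype=float)
    oc_region = np.asarray(oc_region, dtype=float)
    n = len(oc_template)
    theta_prec = 360.0 / n                 # angle precision theta'
    best_shift, best_sim = 0, -np.inf
    for shift in range(n):                 # shift = theta / theta'
        cand = np.roll(oc_template, shift)
        denom = np.linalg.norm(cand) * np.linalg.norm(oc_region)
        sim = float(np.dot(cand, oc_region) / denom) if denom else 0.0
        if sim > best_sim:
            best_shift, best_sim = shift, sim
    return best_shift * theta_prec, best_sim
```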
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010164515.0A CN111340134B (en) | 2020-03-11 | 2020-03-11 | Rapid template matching method based on local dynamic warping |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111340134A CN111340134A (en) | 2020-06-26 |
CN111340134B true CN111340134B (en) | 2022-09-06 |
Family
ID=71182292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010164515.0A Active CN111340134B (en) | 2020-03-11 | 2020-03-11 | Rapid template matching method based on local dynamic warping |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111340134B (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112950623B (en) * | 2021-03-29 | 2024-08-02 | 云印技术(深圳)有限公司 | Mark identification method and system |
CN115166721B (en) * | 2022-09-05 | 2023-04-07 | 湖南众天云科技有限公司 | Radar and GNSS information calibration fusion method and device in roadside sensing equipment |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109559749A (en) * | 2018-12-24 | 2019-04-02 | 苏州思必驰信息科技有限公司 | Combined decoding method and system for speech recognition system |
CN110136160A (en) * | 2019-05-13 | 2019-08-16 | 南京大学 | A kind of rapid image matching method based on circular projection |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4404095B2 (en) * | 2005-01-11 | 2010-01-27 | 日本電気株式会社 | Method for template matching, apparatus therefor, and recording medium recording program therefor |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
GR01 | Patent grant ||