CN110728326A - Edge template matching method with rotation - Google Patents
Edge template matching method with rotation
- Publication number
- CN110728326A (application CN201910977014.1A)
- Authority
- CN
- China
- Prior art keywords
- layer
- image
- matching
- pyramid
- information
- Prior art date: 2019-10-15
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/75—Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
- G06V10/751—Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Computing Systems (AREA)
- Databases & Information Systems (AREA)
- Evolutionary Computation (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Image Analysis (AREA)
Abstract
The invention discloses an edge template matching method with rotation, which comprises the following steps: 1) pyramid layering is performed on the template image and the target image respectively; 2) the rotation step of each layer is set, and each feature layer of the template image pyramid is rotated by its corresponding rotation step to obtain a rotated image set M_if; 3) the amplitude information and approximate direction information of each image in M_if are extracted; 4) the amplitude information and approximate direction information of each pyramid feature layer image of the target image are extracted; 5) starting from the topmost pyramid feature layer of the target image, the similarity over that layer is calculated to obtain coarse matching information; the search range of the second pyramid level is set according to the top-level coarse matching information to obtain new matching information; this is repeated until the matching information is passed down to the bottom pyramid level, where it is recorded as the final matching point coordinates and angle value. The method effectively improves the real-time performance of matching and positioning.
Description
Technical Field
The invention relates to the technical field of image positioning and identification, in particular to a method for matching an edge template with rotation.
Background
Template matching is the process of searching another image for a sub-image similar to a given template image. In practical applications, objects in the target image can appear rotated by different angles, so the template image and the target image cannot be matched exactly. To handle template matching with rotation, the common approach at present is to generate a series of template images angle by angle and then match each image of that set against the target image. This requires matching with many templates at a fixed angular step and comparing them one by one; the process is tedious and time-consuming, which greatly limits the application of template matching.
Disclosure of Invention
In view of these problems, the invention provides an edge template matching method with rotation that achieves fast and stable matching and positioning under displacement, rotation, partial occlusion and nonlinear illumination changes of the target image, outputs the position coordinates and rotation angle of the template image within the target image, and effectively improves the real-time performance of matching and positioning.
An edge template matching method with rotation comprises the following steps:
1) setting the image pyramid layer number according to the complexity of the image characteristics, and performing image pyramid layering on the template image and the target image respectively;
2) setting the rotation step length angleStep_i of each layer, where i denotes the i-th pyramid level; the rotation step of the bottom pyramid feature layer is the smallest, and the steps angleStep_i of the other layers are set through a proportionality coefficient h, with h = 5-20;
rotating each feature layer of the template image pyramid by its corresponding rotation step; the rotated template images of the i-th pyramid feature layer are recorded as that layer's rotated image set M_if, where f indexes the rotated images in the i-th layer, f = 0, 1, 2, ..., ceil(360°/angleStep_i) - 1, and ceil denotes the rounding-up (ceiling) function;
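The size of each layer's rotated image set follows directly from its step: a layer with rotation step angleStep_i holds ceil(360°/angleStep_i) rotated templates, indexed f = 0 ... count - 1. A minimal sketch in Python (function names are ours, not the patent's):

```python
import math

def rotation_count(angle_step_deg):
    """Number of rotated template images in a layer whose rotation
    step is angle_step_deg: f runs over 0 .. count - 1."""
    return math.ceil(360.0 / angle_step_deg)

def rotation_angles(angle_step_deg):
    """Rotation angle of each image f in the layer's set M_if."""
    return [f * angle_step_deg for f in range(rotation_count(angle_step_deg))]
```

With the steps used later in embodiment 1 (1°, 10°, 20°, 30°) this gives 360, 36, 18 and 12 images per layer; a 26° step, as in embodiment 2, gives 14 because of the rounding up.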
3) extracting the edge gradient information of each image in the rotated image set M_if, the edge gradient information comprising amplitude information and direction information; except for the bottom pyramid feature layer, the direction information at each edge point of the other pyramid feature layers is rounded as follows to obtain the approximate direction information:
wherein (x_j, y_j) denotes the coordinates of the j-th edge point in the template image, j = 1, 2, ..., k, k being the total number of edge points of the i-th pyramid feature layer, and round denotes the rounding (nearest-integer) function;
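The quantization formula itself is not reproduced here; one plausible reading (an assumption on our part, not necessarily the patent's exact formula) is that each gradient direction is rounded to the nearest multiple of the layer's rotation step, so that rotating a template by one step permutes the discrete direction labels rather than producing new ones:

```python
def approx_direction(theta_deg, angle_step_deg):
    """Quantize a gradient direction (degrees) to the nearest multiple
    of the layer's rotation step, wrapped into [0, 360).
    Assumed quantization rule; the patent's formula is not shown here."""
    return (round(theta_deg / angle_step_deg) * angle_step_deg) % 360.0
```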
4) extracting the edge gradient information of each pyramid feature layer image of the target image by the same method as in step 3), i.e., the amplitude information and the approximate direction information G_S^i(x, y)' of each edge point, where (x, y) denotes edge point coordinates in the target image;
5) starting from the topmost pyramid feature layer of the target image, the search range is set to the whole image and the similarity is computed over it to obtain coarse matching information; the search range of the second pyramid level of the target image is then set according to the top-level coarse matching information to obtain new matching information; by analogy, the search range of each subsequent pyramid feature layer is set according to the matching information of the layer above it, until the bottom pyramid feature layer is reached; the bottom-layer matching information is recorded as the final matching point coordinates and angle value.
Further, when the image resolution meets the requirement, the rotation step of the bottom pyramid feature layer can be set to 2° or less.
Further, in order to improve matching precision and reduce the requirement on image resolution, the rotation step of the bottom pyramid feature layer is set to 1° or less, and the proportionality coefficient h is set to the same value for all layers other than the bottom layer.
Further, the search range in step 5) is set as (u' ± Δu, v' ± Δv, θ' ± angleStep_{i+1}), where (u', v') are the matching point coordinates obtained in the layer above the layer to be matched, θ' is the corresponding angle, and angleStep_{i+1} is the rotation step of the layer above the layer to be matched.
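In both embodiments below, the candidate angles of a layer are exactly the multiples of that layer's own step that fall inside the window θ' ± angleStep_{i+1} inherited from the layer above. A sketch of that enumeration (helper name is ours):

```python
import math

def candidate_angles(theta_prev, step_prev, step_cur):
    """Candidate angles at the current layer: the multiples of step_cur
    lying inside [theta_prev - step_prev, theta_prev + step_prev]."""
    k_lo = math.ceil((theta_prev - step_prev) / step_cur)
    k_hi = math.floor((theta_prev + step_prev) / step_cur)
    return [k * step_cur for k in range(k_lo, k_hi + 1)]
```

With embodiment 2's numbers (θ' = 78°, step above 26°, current step 9°) this yields 54°, 63°, 72°, 81°, 90°, 99°, matching the worked example.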
Further, at each target image point (u, v), the similarity of the layers is calculated with candidate angles θ_f chosen as follows:
when i = n (the top layer), θ_f = 0°, angleStep_n, 2·angleStep_n, 3·angleStep_n, ...;
when i ≠ n, θ_f ∈ [θ' - angleStep_{i+1}, θ' + angleStep_{i+1}];
Namely: when matching the top layer, similarity is computed between the rotated template image set of the top layer and the top layer of the target image; each rotated template image yields a matching result, the result (u', v', θ') with the highest similarity is selected, and the search range (u' ± Δu, v' ± Δv, θ' ± angleStep_{i+1}) is passed to the next layer;
when the next layer is matched, similarity is computed between the layer's rotated template images within the search range and the corresponding feature layer of the target image; each template image yields a matching result, the one with the highest similarity is selected, the search range is set again, and it is passed to the next layer; this is repeated until the matching information of the bottom pyramid feature layer has been selected, and the bottom-layer matching information is recorded as the final matching point coordinates and angle value.
wherein the similarity is computed from the x-direction and y-direction components of the amplitude information in the template image and the corresponding x-direction and y-direction components of the amplitude information in the target image.
further, according to the lowest layer matching information (u)final,vfinal,θfinal) Separately calculating thetafinal+angleStep0And thetafinal-angleStep0Matching similarity Score at two anglesfinal' and Scorefinal", wherein, angleStep0Representing the rotation step of the lowest pyramid feature layer; fitting the three matching angles and the corresponding matching scores by using a quadratic curve, solving to obtain a sub-angle value corresponding to the maximum value of the similarity scores on the curve, and recording the sub-angle value as a final matching angle thetafinal'。
Further, before step 1), the template image and the target image are preprocessed. The preprocessing comprises: smoothing the image edges with Gaussian filtering, suppressing slowly varying noise in the image background, and removing edges whose connected length is smaller than a threshold t to obtain reliable edge points, with t = 10-100 pixels.
Further, the Sobel operator is used to extract the edge gradient in step 3).
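A minimal pure-Python sketch of a Sobel gradient at one pixel (in practice a library routine such as OpenCV's cv2.Sobel would be used; this only illustrates the magnitude and direction the method consumes):

```python
import math

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_gradient(img, x, y):
    """Gradient magnitude and direction in degrees at interior pixel
    (x, y) of a grayscale image given as a list of rows."""
    gx = sum(SOBEL_X[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    gy = sum(SOBEL_Y[j][i] * img[y - 1 + j][x - 1 + i]
             for j in range(3) for i in range(3))
    return math.hypot(gx, gy), math.degrees(math.atan2(gy, gx)) % 360.0
```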
The method sets different angular rotation steps on different feature layers of the template image pyramid and approximates the gradient direction at the image edge points, which avoids the heavy computation and long matching time caused by enumerating template images angle by angle in existing matching methods. During the coarse matching of each pyramid feature layer, fast search is achieved by pairing the approximate direction information of the gradient with the rotation step. On the basis of guaranteed matching precision, the method more than doubles matching efficiency, reaches a matching precision of 0.1°, achieves fast and stable matching and positioning under displacement, rotation, partial occlusion and nonlinear illumination changes of the target image, and greatly improves the real-time performance of the matching process.
Drawings
FIG. 1 is a schematic diagram of sub-angle values obtained by quadratic curve fitting.
Detailed Description
The technical solution of the present invention is described in detail below with reference to the accompanying drawings and examples.
Example 1
An edge template matching method with rotation comprises the following steps:
1) setting the image pyramid layer number according to the complexity of the image characteristics, and performing image pyramid layering on the template image and the target image respectively; in the embodiment, a 4-layer pyramid is taken as an example;
In order to obtain images more favorable for processing, in this embodiment the template image and the target image are preprocessed before step 1). The preprocessing comprises: Gaussian filtering to smooth the image edges, removal of slowly varying noise in the image background, and removal of edges whose connected length is smaller than a threshold t to obtain reliable edge points, with t = 10-100 pixels.
2) Setting the rotation step length angleStep_i of each layer, i = 0, 1, 2, 3, where the rotation step angleStep_0 of the bottom pyramid feature layer is the smallest; for the other layers (i > 0), angleStep_i is set through the proportionality coefficient h, with h = 5-20;
when the image quality is high and the image is sharp, angleStep_0 ≤ 2° can be set;
in order to improve matching precision and reduce the requirement on image resolution, angleStep_0 ≤ 1° can be used in practice; for ease of calculation, the proportionality coefficient h is set to the same value for all layers above the bottom layer;
in this example, h is 10, angleStep0=1°、angleStep1=10°、angleStep2=20°、angleStep3=30°;
Each feature layer of the template image pyramid is rotated by its corresponding rotation step; the rotated images at each angle in the i-th pyramid feature layer are recorded as the rotated template image set M_if, where f indexes the rotated images in the i-th layer, f = 0, 1, 2, ..., ceil(360°/angleStep_i) - 1, and ceil denotes the rounding-up (ceiling) function;
namely: when i = 0, f = 0, 1, 2, ..., 359; when i = 1, f = 0, 1, 2, ..., 35; when i = 2, f = 0, 1, 2, ..., 17; when i = 3, f = 0, 1, 2, ..., 11;
3) The edge gradient information of each image in the rotated image set M_if is extracted, comprising amplitude information and direction information; except for the bottom pyramid feature layer, the direction information at each edge point of the other pyramid feature layers is rounded to obtain the approximate direction information;
wherein (x_j, y_j) denotes the coordinates of the j-th edge point in the template image, j = 1, 2, ..., k, k being the total number of edge points of the i-th pyramid feature layer, and round denotes the rounding (nearest-integer) function;
4) The edge gradient information of each pyramid feature layer image of the target image is extracted by the same method as in step 3), i.e., the amplitude information and the approximate direction information G_S^i(x, y)' of each edge point, where (x, y) denotes edge point coordinates in the target image;
5) Starting from the topmost pyramid feature layer of the target image, the search range is set to the whole image and the similarity is computed over it to obtain coarse matching information; the search range of the second pyramid level of the target image is then set according to the top-level coarse matching information to obtain new matching information; by analogy, the search range of each subsequent pyramid feature layer is set according to the matching information of the layer above it, until the bottom pyramid feature layer is reached; the bottom-layer matching information is recorded as the final matching point coordinates and angle value.
The search range is set as (u' ± Δu, v' ± Δv, θ' ± angleStep_{i+1}), where (u', v') are the matching point coordinates obtained in the layer above the layer to be matched, θ' is the corresponding angle, and angleStep_{i+1} is the rotation step of the layer above the layer to be matched; Δu and Δv are set according to the image, and in this embodiment Δu = Δv = 5 pixels.
In step 5), at each target image point (u, v), the similarity of each layer is calculated with candidate angles θ_f chosen as follows:
when i = n (the top layer), θ_f = 0°, angleStep_n, 2·angleStep_n, 3·angleStep_n, ...;
when i ≠ n, θ_f ∈ [θ' - angleStep_{i+1}, θ' + angleStep_{i+1}];
Specifically, in this embodiment:
i = 3: θ_f = 0°, 30°, 60°, 90°, ..., 330°; with θ' = 60°, the angle search range becomes 30°-90°;
i = 2: θ_f takes the values 40°, 60°, 80°; with θ' = 60°, the angle search range becomes 40°-80°;
i = 1: θ_f takes the values 40°, 50°, 60°, 70°, 80°; with θ' = 50°, the angle search range becomes 40°-60°;
i = 0: θ_f takes the values 40°, 41°, 42°, 43°, ..., 60°;
It can be seen that at the top layer (i = 3) there are 12 values of θ_f, so 12 similarity values are computed, and the (u, v, θ_f) corresponding to the maximum is recorded as (u', v', θ'); at the second layer from the top (i = 2), θ_f takes 3 values between 30° and 90°, so 3 similarity values are computed, and the (u, v, θ_f) corresponding to the maximum is recorded as (u', v', θ'); the next layer is processed in the same way, until the lowest pyramid feature layer is reached and the lowest-layer matching information is recorded as the final matching point coordinates and angle value.
wherein the similarity is computed from the x-direction and y-direction components of the amplitude information in the template image and the corresponding x-direction and y-direction components of the amplitude information in the target image.
To obtain a more accurate matching angle, according to the lowest-layer matching information (u_final, v_final, θ_final), the matching similarities Score_final' and Score_final'' at the two angles θ_final + angleStep_0 and θ_final - angleStep_0 are calculated separately; as shown in FIG. 1, a quadratic curve is fitted to the three matching angles and their matching scores, and the sub-step angle at which the curve attains its maximum similarity score is solved for and recorded as the final matching angle θ_final'.
Example 2
An edge template matching method with rotation comprises the following steps:
1) setting the image pyramid layer number according to the complexity of the image characteristics, and performing image pyramid layering on the template image and the target image respectively; the present embodiment takes 3 layers as an example;
2) Setting the rotation step length angleStep_i of each layer, i = 0, 1, 2, where the rotation step angleStep_0 of the bottom pyramid feature layer is the smallest; for the other layers (i > 0), angleStep_i is set through the proportionality coefficient h, with h = 5-20; angleStep_0 ≤ 2°;
In this embodiment, angleStep_0 = 2° (the image resolution in this embodiment is 1024 × 1024), angleStep_1 = 9°, angleStep_2 = 26°;
Each feature layer of the template image pyramid is rotated by its corresponding rotation step; the rotated images at each angle in the i-th pyramid feature layer are recorded as the rotated template image set M_if, where f indexes the rotated images in the i-th layer, f = 0, 1, 2, ..., ceil(360°/angleStep_i) - 1, and ceil denotes the rounding-up (ceiling) function;
namely: when i = 0, f = 0, 1, 2, ..., 179; when i = 1, f = 0, 1, 2, ..., 39; when i = 2, f = 0, 1, 2, ..., 13;
3) The edge gradient information of each image in the rotated image set M_if is extracted, comprising amplitude information and direction information; except for the bottom pyramid feature layer, the direction information at each edge point of the other pyramid feature layers is rounded to obtain the approximate direction information;
wherein (x_j, y_j) denotes the coordinates of the j-th edge point in the template image, j = 1, 2, ..., k, k being the total number of edge points of the i-th pyramid feature layer, and round denotes the rounding (nearest-integer) function;
4) The edge gradient information of each pyramid feature layer image of the target image is extracted by the same method as in step 3), i.e., the amplitude information and the approximate direction information G_S^i(x, y)' of each edge point, where (x, y) denotes edge point coordinates in the target image;
5) Starting from the topmost pyramid feature layer of the target image, the search range is set to the whole image and the similarity is computed over it to obtain coarse matching information; the search range of the second pyramid level of the target image is then set according to the top-level coarse matching information to obtain new matching information; by analogy, the search range of each subsequent pyramid feature layer is set according to the matching information of the layer above it, until the bottom pyramid feature layer is reached; the bottom-layer matching information is recorded as the final matching point coordinates and angle value.
The search range is set as (u' ± Δu, v' ± Δv, θ' ± angleStep_{i+1}), where (u', v') are the matching point coordinates obtained in the layer above the layer to be matched, θ' is the corresponding angle, and angleStep_{i+1} is the rotation step of the layer above the layer to be matched; Δu and Δv are set according to the image, and in this embodiment Δu = Δv = 5 pixels.
In step 5), at each target image point (u, v), the similarity of each layer is calculated with candidate angles θ_f chosen as follows:
when i = n (the top layer), θ_f = 0°, angleStep_n, 2·angleStep_n, 3·angleStep_n, ...;
when i ≠ n, θ_f ∈ [θ' - angleStep_{i+1}, θ' + angleStep_{i+1}];
Specifically, in this embodiment:
i = 2: θ_f = 0°, 26°, 52°, 78°, ..., 312°, 338°; with θ' = 78°, the angle search range becomes 52°-104°;
i = 1: θ_f takes the values 54°, 63°, 72°, 81°, 90°, 99°; with θ' = 81°, the angle search range becomes 72°-90°;
i = 0: θ_f takes the values 72°, 74°, 76°, ..., 90°;
It can be seen that at the top layer (i = 2) there are 14 values of θ_f, so 14 similarity values are computed, and the (u, v, θ_f) corresponding to the maximum is recorded as (u', v', θ'); at the second layer from the top (i = 1), θ_f takes 6 values between 52° and 104°, so 6 similarity values are computed, and the (u, v, θ_f) corresponding to the maximum is recorded as (u', v', θ'); at the lowest layer (i = 0), θ_f takes 10 values between 72° and 90°, so 10 similarity values are computed, and the (u_final, v_final, θ_final) corresponding to the maximum is recorded as the final matching point coordinates and angle value;
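The candidate counts in this embodiment (14, 6 and 10) can be reproduced with a short check, reading the enumeration as multiples of each layer's step inside the inherited angle window (our reading of the worked numbers; helper name is ours):

```python
import math

def count_multiples(lo, hi, step):
    """How many multiples of step lie inside [lo, hi]."""
    return math.floor(hi / step) - math.ceil(lo / step) + 1

top = math.ceil(360 / 26)            # i = 2: full circle at 26 deg steps
mid = count_multiples(52, 104, 9)    # i = 1: window 78 +/- 26 at 9 deg steps
bottom = count_multiples(72, 90, 2)  # i = 0: window 81 +/- 9 at 2 deg steps
```

which reproduces the 14, 6 and 10 similarity evaluations above.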
wherein the similarity is computed from the x-direction and y-direction components of the amplitude information in the template image and the corresponding x-direction and y-direction components of the amplitude information in the target image.
To obtain a more accurate matching angle, according to the lowest-layer matching information (u_final, v_final, θ_final), the matching similarities Score_final' and Score_final'' at the two angles θ_final + angleStep_0 and θ_final - angleStep_0 are calculated separately; a quadratic curve is fitted to the three matching angles and their matching scores, and the sub-step angle at which the curve attains its maximum similarity score is solved for and recorded as the final matching angle θ_final'.
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. The foregoing description is not intended to be exhaustive or to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable others skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications thereof. It is intended that the scope of the invention be defined by the following claims and their equivalents.
Claims (8)
1. An edge template matching method with rotation, characterized by comprising the following steps:
1) setting the image pyramid layer number according to the complexity of the image characteristics, and performing image pyramid layering on the template image and the target image respectively;
2) setting the rotation step length angleStep_i of each layer, where i denotes the i-th pyramid level; the rotation step of the bottom pyramid feature layer is the smallest, and the steps angleStep_i of the other layers are set through a proportionality coefficient h, with h = 5-20;
rotating each feature layer of the template image pyramid by its corresponding rotation step; the rotated template images of the i-th pyramid feature layer are recorded as that layer's rotated image set M_if, where f indexes the rotated images in the i-th layer, f = 0, 1, 2, ..., ceil(360°/angleStep_i) - 1, and ceil denotes the rounding-up (ceiling) function;
3) extracting the edge gradient information of each image in the rotated image set M_if, the edge gradient information comprising amplitude information and direction information; except for the bottom pyramid feature layer, the direction information at each edge point of the other pyramid feature layers is rounded to obtain the approximate direction information;
wherein (x_j, y_j) denotes the coordinates of the j-th edge point in the template image, j = 1, 2, ..., k, k being the total number of edge points of the i-th pyramid feature layer, and round denotes the rounding (nearest-integer) function;
4) extracting the edge gradient information of each pyramid feature layer image of the target image by the same method as in step 3), i.e., the amplitude information and the approximate direction information G_S^i(x, y)' of each edge point, where (x, y) denotes edge point coordinates in the target image;
5) starting from the topmost pyramid feature layer of the target image, setting the search range to the whole image and computing the similarity over it to obtain coarse matching information; setting the search range of the second pyramid level of the target image according to the top-level coarse matching information to obtain new matching information; by analogy, setting the search range of each subsequent pyramid feature layer according to the matching information of the layer above it, until the bottom pyramid feature layer is reached; the bottom-layer matching information is recorded as the final matching point coordinates and angle value.
2. The edge template matching method with rotation according to claim 1, characterized in that: the rotation step length of the pyramid feature layer at the bottom layer is less than or equal to 2 degrees.
3. The edge template matching method with rotation according to claim 1, characterized in that: the rotation step length of the pyramid characteristic layer at the bottommost layer is less than or equal to 1 degree; the setting values of the scaling factors h of other layers except the bottommost layer are the same.
4. The edge template matching method with rotation according to claim 1, characterized in that: the search range in step 5) is set as (u' ± Δu, v' ± Δv, θ' ± angleStep_{i+1}), where (u', v') are the matching point coordinates obtained in the layer above the layer to be matched, θ' is the corresponding angle, and angleStep_{i+1} is the rotation step of the layer above the layer to be matched.
5. The edge template matching method with rotation according to claim 4, characterized in that: at each target image point (u, v), the similarity of the layers is calculated with candidate angles θ_f chosen as follows:
when i = n (the top layer), θ_f = 0°, angleStep_n, 2·angleStep_n, 3·angleStep_n, ...;
when i ≠ n, θ_f ∈ [θ' - angleStep_{i+1}, θ' + angleStep_{i+1}];
wherein the similarity is computed from the x-direction and y-direction components of the amplitude information in the template image and the corresponding x-direction and y-direction components of the amplitude information in the target image.
6. The edge template matching method with rotation according to claim 1, characterized in that: according to the lowest-layer matching information (u_final, v_final, θ_final), the matching similarities Score_final' and Score_final'' at the two angles θ_final + angleStep_0 and θ_final - angleStep_0 are calculated separately, where angleStep_0 is the rotation step of the lowest pyramid feature layer; a quadratic curve is fitted to the three matching angles and their matching scores, and the sub-step angle at which the curve attains its maximum similarity score is solved for and recorded as the final matching angle θ_final'.
7. The edge template matching method with rotation according to claim 1, characterized in that: before step 1), the template image and the target image are preprocessed, the preprocessing comprising: smoothing the image edges with Gaussian filtering, suppressing slowly varying noise in the image background, and removing edges whose connected length is smaller than a threshold t to obtain reliable edge points, with t = 10-100 pixels.
8. The edge template matching method with rotation according to claim 1, characterized in that: the Sobel operator is used in step 3) to extract the edge gradient.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910977014.1A CN110728326A (en) | 2019-10-15 | 2019-10-15 | Edge template matching method with rotation |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110728326A true CN110728326A (en) | 2020-01-24 |
Family
ID=69221292
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910977014.1A Pending CN110728326A (en) | 2019-10-15 | 2019-10-15 | Edge template matching method with rotation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110728326A (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102654902A (en) * | 2012-01-16 | 2012-09-05 | 江南大学 | Contour vector feature-based embedded real-time image matching method |
CN105261012A (en) * | 2015-09-25 | 2016-01-20 | 上海瑞伯德智能系统科技有限公司 | Template matching method based on Sobel vectors |
JP2016018359A (en) * | 2014-07-08 | 2016-02-01 | キヤノン株式会社 | Image processing method, image processing device, program, and recording medium |
WO2017206099A1 (en) * | 2016-06-01 | 2017-12-07 | 深圳配天智能技术研究院有限公司 | Method and device for image pattern matching |
CN108010082A (en) * | 2017-12-28 | 2018-05-08 | 上海觉感视觉科技有限公司 | A kind of method of geometric match |
CN110197232A (en) * | 2019-06-05 | 2019-09-03 | 中科新松有限公司 | Image matching method based on edge direction and Gradient Features |
- 2019-10-15 CN CN201910977014.1A patent/CN110728326A/en active Pending
Non-Patent Citations (1)
Title |
---|
YU XIANG et al.: "Target image matching algorithm based on pyramid model and higher", Journal of Computational Science * |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111931786A (en) * | 2020-06-23 | 2020-11-13 | 联宝(合肥)电子科技有限公司 | Image processing method and device and computer readable storage medium |
CN111931786B (en) * | 2020-06-23 | 2022-02-01 | 联宝(合肥)电子科技有限公司 | Image processing method and device and computer readable storage medium |
CN111860501A (en) * | 2020-07-14 | 2020-10-30 | 哈尔滨市科佳通用机电股份有限公司 | High-speed rail height adjusting rod falling-out fault image identification method based on shape matching |
CN111860501B (en) * | 2020-07-14 | 2021-02-05 | 哈尔滨市科佳通用机电股份有限公司 | High-speed rail height adjusting rod falling-out fault image identification method based on shape matching |
CN112308121A (en) * | 2020-10-16 | 2021-02-02 | 易思维(杭州)科技有限公司 | Template image edge point optimization method |
CN112308121B (en) * | 2020-10-16 | 2022-06-14 | 易思维(杭州)科技有限公司 | Template image edge point optimization method |
CN113378886A (en) * | 2021-05-14 | 2021-09-10 | 珞石(山东)智能科技有限公司 | Method for automatically training shape matching model |
CN113205145A (en) * | 2021-05-18 | 2021-08-03 | 广州大学 | Template matching method, system, device and medium based on normalized cross correlation |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN110728326A (en) | Edge template matching method with rotation | |
CN108921865B (en) | Anti-interference sub-pixel straight line fitting method | |
WO2021017361A1 (en) | Template matching algorithm based on edge and gradient feature | |
CN108416342B (en) | Fingerprint identification method combining thin node and thin line structure | |
CN111080529A (en) | Unmanned aerial vehicle aerial image splicing method for enhancing robustness | |
CN102865859B (en) | Aviation sequence image position estimating method based on SURF (Speeded Up Robust Features) | |
CN111079803B (en) | Template matching method based on gradient information | |
CN108830888B (en) | Coarse matching method based on improved multi-scale covariance matrix characteristic descriptor | |
CN103426186A (en) | Improved SURF fast matching method | |
CN107292869B (en) | Image speckle detection method based on anisotropic Gaussian kernel and gradient search | |
CN111640154B (en) | Vertical needle micro-plane sub-pixel level positioning method based on micro-vision | |
CN108010082B (en) | Geometric matching method | |
CN110310310A (en) | A kind of improved method for aviation image registration | |
CN102800099A (en) | Multi-feature multi-level visible light and high-spectrum image high-precision registering method | |
CN109472770B (en) | Method for quickly matching image characteristic points in printed circuit board detection | |
CN113392856A (en) | Image forgery detection device and method | |
CN112614167A (en) | Rock slice image alignment method combining single-polarization and orthogonal-polarization images | |
CN110246165B (en) | Method and system for improving registration speed of visible light image and SAR image | |
CN116704557A (en) | Low-quality fingerprint matching method based on texture information | |
CN112801141B (en) | Heterogeneous image matching method based on template matching and twin neural network optimization | |
CN113673515A (en) | Computer vision target detection algorithm | |
CN111079802B (en) | Matching method based on gradient information | |
CN116612165A (en) | Registration method for large-view-angle difference SAR image | |
CN116630662A (en) | Feature point mismatching eliminating method applied to visual SLAM | |
CN108985294B (en) | Method, device and equipment for positioning tire mold picture and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
WD01 | Invention patent application deemed withdrawn after publication | ||
Application publication date: 20200124 |