CN114549400A - Image identification method and device - Google Patents

Image identification method and device

Info

Publication number
CN114549400A
Authority
CN
China
Prior art keywords
image
target
feature points
gradient
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111676569.6A
Other languages
Chinese (zh)
Inventor
陈鲁
肖遥
佟异
张嵩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongke Feice Technology Co Ltd
Original Assignee
Shenzhen Zhongke Feice Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongke Feice Technology Co Ltd filed Critical Shenzhen Zhongke Feice Technology Co Ltd
Priority to CN202111676569.6A priority Critical patent/CN114549400A/en
Publication of CN114549400A publication Critical patent/CN114549400A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image recognition method, comprising: acquiring a template image and obtaining feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold; traversing the feature points for screening, taking each traversed feature point as a first target feature point, and searching, within a search area of a first size centered on the first target feature point, along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, the first target feature point being excluded if such a point is found; and taking the feature points remaining after the traversal and screening as the shape contour of the template image. The application correspondingly discloses an image recognition device. By performing template matching on this fine contour, the image recognition method and device achieve high accuracy and high efficiency.

Description

Image identification method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular to an image recognition method and apparatus.
Background
In existing image recognition techniques for defect detection, a pattern in an image usually needs to be matched against a template pattern to check whether the shape of an object or a defect in the image meets requirements. This requires obtaining accurate contour information for the pattern before matching the pattern to be detected against the template pattern, so the accuracy of the extracted contour affects the accuracy of the matching result. In the prior art, contour extraction is generally coarse, which degrades the accuracy of image recognition.
Disclosure of Invention
To overcome the problems in the prior art, the present application provides an image recognition method, comprising:
acquiring a template image and obtaining feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold;
traversing the feature points for screening and taking each traversed feature point as a first target feature point: within a search area of a first size centered on the first target feature point, searching along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, and excluding the first target feature point if such a feature point is found;
and taking the feature points remaining after the traversal and screening as the shape contour of the template image.
In one embodiment, the method further comprises:
acquiring a target image, traversing the feature points of the shape contour of the template image, and taking each traversed feature point as a second target feature point:
acquiring the mapping point in the target image corresponding to the position of the second target feature point;
searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point;
and determining whether the target image and the template image are similar according to the angle differences.
In one embodiment, determining whether the target image and the template image are similar according to the angle differences includes:
calculating the mean of the angle differences and obtaining the number of feature points whose angle differences deviate from the mean by less than a second threshold;
and determining that the target image is similar to the template image when that number is greater than a third threshold.
In one embodiment, acquiring the target image includes:
acquiring an image to be recognized and sequentially extracting from it target images with the same size and shape as the template image.
In one embodiment, obtaining the feature points in the template image includes:
traversing the pixels of the template image and, for each traversed pixel P_{i,j}:
calculating the gradient value Gx in the X direction according to a formula given in the published text only as an image;
calculating the gradient value Gy in the Y direction according to a formula likewise given only as an image;
calculating the gradient direction D_{i,j} of the pixel P_{i,j} according to D = arctan(Gy/Gx);
and calculating the gradient magnitude G_{i,j} of the pixel P_{i,j} according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
In addition, in order to overcome the problems in the prior art, the present application further provides an image recognition apparatus based on the foregoing image recognition method, including:
a template coarse contour extraction module, configured to acquire a template image and obtain feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold;
and a template fine contour extraction module, configured to traverse the feature points for screening, taking each traversed feature point as a first target feature point: within a search area of a first size centered on the first target feature point, searching along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, excluding the first target feature point if such a feature point is found, and taking the feature points remaining after the traversal and screening as the shape contour of the template image.
In one embodiment, the apparatus further comprises:
a target image acquisition module, configured to acquire a target image;
and a target fine contour extraction module, configured to traverse the feature points of the shape contour of the template image, taking each traversed feature point as a second target feature point: acquiring the mapping point in the target image corresponding to the position of the second target feature point; searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point;
and a similarity determination module, configured to determine whether the target image and the template image are similar according to the angle differences.
In one embodiment, the similarity determination module is configured to calculate the mean of the angle differences, obtain the number of feature points whose angle differences deviate from the mean by less than a second threshold, and determine that the target image is similar to the template image when that number is greater than a third threshold.
In one embodiment, the target image acquisition module is configured to acquire an image to be recognized and sequentially extract from it target images with the same size and shape as the template image.
In one embodiment, the template coarse contour extraction module is configured to traverse the pixels of the template image and, for each traversed pixel P_{i,j}:
calculate the gradient value Gx in the X direction according to a formula given in the published text only as an image;
calculate the gradient value Gy in the Y direction according to a formula likewise given only as an image;
calculate the gradient direction D_{i,j} of the pixel P_{i,j} according to D = arctan(Gy/Gx);
and calculate the gradient magnitude G_{i,j} of the pixel P_{i,j} according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
With the image recognition method and device described above, the coarse contour of the template image is first computed and the feature points are then traversed: for each feature point, the feature point with the largest gradient magnitude along its gradient direction within the search area of the first size is kept as the representative of that search area and direction, and the other feature points, whose contour features are less pronounced, are excluded, yielding the fine contour of the image. Performing template matching on this fine contour during image recognition improves accuracy, and excluding the feature points with weak features reduces the amount of computation, thereby improving execution efficiency.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application, and a person skilled in the art can obtain other drawings from them without creative effort. In the drawings:
FIG. 1 is a flow chart of an image recognition method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating matching of a template image and a target image according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a process of extracting a thin outline of a template image according to an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating a process of extracting contour features of a target image according to a fine contour search of a template image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of an image recognition apparatus according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely below with reference to the drawings in the embodiments of the present application. The described embodiments are obviously only some, rather than all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments herein without creative effort fall within the protection scope of the present application.
It should be noted that if directional indications (such as up, down, left, right, front, and back … …) are referred to in the embodiments of the present application, the directional indications are only used to explain the relative positional relationship between the components, the movement situation, and the like in a specific posture (as shown in the drawings), and if the specific posture is changed, the directional indications are changed accordingly.
In addition, descriptions such as "first" and "second" in the embodiments of the present application are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated; a feature defined as "first" or "second" may thus explicitly or implicitly include at least one such feature. The technical solutions of the various embodiments may be combined with one another, provided that the combination can be realized by a person skilled in the art; when technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and does not fall within the protection scope of the present application.
In order to solve the problem in the conventional technique that coarse contour extraction leads to low template-matching accuracy, the present invention proposes an image recognition method implemented as a computer program that can run on a computer system based on a Von Neumann or Harvard architecture.
Specifically, as shown in fig. 1, the method includes:
step S102: and acquiring a template image, and acquiring feature points in the template image, wherein the feature points are pixel points of which the gradient amplitude is greater than a first threshold value.
The template image is a preset reference image, and the target image is an image to be judged for similarity to the template image. For example, in the field of defect inspection, the shape image of a defect can be set as the template image; when a product is inspected, a target image is obtained by photographing the product, and if the target image is judged to be similar to the template image, the product is determined to have a defect matching the template.
The template image first needs to be preprocessed, i.e. its coarse contour is extracted. Extracting the coarse contour requires computing the gradient of every pixel of the template image; the gradient is a vector with two components, a gradient magnitude and a gradient direction. For a two-dimensional image, the gradient direction can be expressed in the X-Y image coordinate system.
Specifically, the pixels of the template image can be traversed and, for each traversed pixel P_{i,j}:
the gradient value Gx in the X direction is calculated according to a formula given in the published text only as an image;
the gradient value Gy in the Y direction is calculated according to a formula likewise given only as an image;
the gradient direction D_{i,j} of the pixel P_{i,j} is calculated according to D = arctan(Gy/Gx);
and the gradient magnitude G_{i,j} of the pixel P_{i,j} is calculated according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
In this way, the gradient magnitude G_{i,j} of any pixel P_{i,j} is obtained. In this embodiment, the pixels whose gradient magnitude G_{i,j} is greater than the first threshold may be taken as the feature points expressing the coarse contour; alternatively, the pixels may be sorted by gradient magnitude G_{i,j} and the N pixels with the largest magnitudes extracted as the feature points expressing the coarse contour. Through this step, a coarse contour of the shape contour in the template image is obtained.
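For illustration only, the gradient computation of step S102 can be sketched in Python roughly as follows. The exact Gx/Gy formulas appear in the published text only as images, so simple central differences are assumed here, and the function name, array layout, and dictionary output are illustrative choices rather than the patented implementation.

import numpy as np

def coarse_contour_points(template, first_threshold):
    """Collect pixels whose gradient magnitude exceeds the first threshold.

    Sketch only: the exact X/Y gradient kernels are shown as images in the
    published text, so central differences are assumed here.
    """
    img = template.astype(np.float64)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    # Central differences on the interior; border pixels keep zero gradient.
    gx[1:-1, 1:-1] = (img[1:-1, 2:] - img[1:-1, :-2]) / 2.0
    gy[1:-1, 1:-1] = (img[2:, 1:-1] - img[:-2, 1:-1]) / 2.0

    magnitude = np.sqrt(gx ** 2 + gy ** 2)   # G_{i,j}
    direction = np.arctan2(gy, gx)           # D_{i,j}; the text uses arctan(Gy/Gx)

    rows, cols = np.nonzero(magnitude > first_threshold)
    # Feature points of the coarse contour: {(i, j): (magnitude, direction)}
    return {(int(i), int(j)): (float(magnitude[i, j]), float(direction[i, j]))
            for i, j in zip(rows, cols)}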
Step S104: traversing the feature points for screening and taking each traversed feature point as a first target feature point:
within a search area of a first size centered on the first target feature point, a feature point with a gradient magnitude larger than that of the first target feature point is searched for along the forward and reverse extensions of its gradient direction; if such a feature point is found, the first target feature point is excluded.
Step S106: taking the feature points remaining after the traversal and screening as the shape contour of the template image.
Referring to fig. 3, take the traversed feature point P_{i,j} as an example. The search area of the first size is chosen as the 7 × 7 region centered on P_{i,j}, i.e. the region with X coordinates from i-3 to i+3 and Y coordinates from j-3 to j+3, and the gradient direction of P_{i,j} extends along the X axis. The feature point P_{i+3,j} lies within this search area and on the forward or reverse extension of the gradient direction of P_{i,j}, but its gradient magnitude is smaller than that of P_{i,j}. Although the feature points P_{i+1,j-1} and P_{i+2,j-2} also lie within the search area of the first size, they are not along the gradient direction of P_{i,j} and are therefore not considered.
Thus the feature point P_{i,j}, being the feature point with the largest gradient magnitude along its gradient direction (the X axis) within the search area, is retained, and the traversal continues to the next feature point on the coarse contour.
When the traversal reaches the feature point P_{i+1,j-1}, it can be seen that, within the 7 × 7 search area of the first size centered on it, only the feature point P_{i+2,j-2} lies along its gradient direction (at 45 degrees), and the gradient magnitude of P_{i+2,j-2} is greater than that of P_{i+1,j-1}. P_{i+1,j-1} is therefore not the feature point that best expresses the contour information in that direction of the area, and it is excluded.
Traversing all the feature points of the coarse contour in this manner excludes the feature points of the coarse contour that express the contour information only weakly and keeps only the feature points that best express the contour information in the particular direction of each area, yielding the fine contour of the shape contour in the template image.
Preferably, when searching along the forward and reverse extensions of the gradient direction of the first target feature point, the search range may be widened somewhat, so that feature points on either side of the gradient direction are also taken as candidates for comparison. In fig. 3, for example, the feature point P_{i+3,j-1} does not lie exactly on the gradient direction of P_{i,j}, but it lies alongside the line through P_{i,j} in that direction, so it can also serve as a candidate; if the gradient magnitude of P_{i+3,j-1} were greater than that of P_{i,j}, P_{i,j} would be excluded.
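A minimal sketch of this screening step, assuming the feature points are stored as the dictionary produced by the coarse-contour sketch above; the 7 × 7 window (half_window = 3) and the one-pixel side tolerance are illustrative assumptions.

import math

def refine_contour(points, half_window=3, side_tolerance=1):
    """Keep only feature points whose gradient magnitude is not exceeded by
    another feature point lying along their own gradient direction inside a
    (2*half_window+1)-sized search area. `points`: {(i, j): (mag, dir)}.
    """
    kept = {}
    for (i, j), (mag, ang) in points.items():
        dy, dx = math.sin(ang), math.cos(ang)
        suppressed = False
        # Walk forward (+1) and backward (-1) along the gradient direction.
        for sign in (1, -1):
            for step in range(1, half_window + 1):
                ci = i + int(round(sign * step * dy))
                cj = j + int(round(sign * step * dx))
                # Also look a little to either side of the exact line,
                # matching the widened search mentioned in the text.
                for oi in range(-side_tolerance, side_tolerance + 1):
                    for oj in range(-side_tolerance, side_tolerance + 1):
                        cand = points.get((ci + oi, cj + oj))
                        if cand is not None and cand[0] > mag:
                            suppressed = True
        if not suppressed:
            kept[(i, j)] = (mag, ang)
    return kept

The effect is comparable to the non-maximum suppression step of classical edge detectors, applied only to the already-thresholded feature points.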
After the fine contour of the shape contour of the template image is obtained, it can be stored in memory for subsequent template matching. For example, in defect detection, an image of the outer surface of a product on a production line can be captured and compared with the stored fine contour of the shape contour of a defect feature's template image, and the similarity determines whether a defect exists on the outer surface of the product.
As shown in fig. 2, when the image to be recognized is large, it may be cut up according to the size of the template image, and target images with the same size and shape as the template image are extracted from it in sequence. Each target image is then compared with the template image in turn.
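A minimal sliding-window sketch of this extraction, assuming a NumPy image array and a stride of one pixel (both illustrative choices):

def sliding_targets(image, tmpl_h, tmpl_w, step=1):
    """Yield (top, left, crop) windows with the template's height and width,
    scanned over the image to be recognized."""
    h, w = image.shape[:2]
    for top in range(0, h - tmpl_h + 1, step):
        for left in range(0, w - tmpl_w + 1, step):
            yield top, left, image[top:top + tmpl_h, left:left + tmpl_w]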
For matching the target image and the template image, specifically, the following steps are performed:
traversing the feature points of the shape contour of the template image and taking each traversed feature point as a second target feature point:
acquiring the mapping point in the target image corresponding to the position of the second target feature point; searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point;
and determining whether the target image and the template image are similar according to the angle differences.
In this embodiment, since the template image and the target image have the same size and shape, the feature point P_{i,j} in the template image is mapped by relative position to the pixel P_{i,j} in the target image. The search area of the second size may differ from the first size; for example, as shown in fig. 4, it can be chosen as the 3 × 3 region centered on the traversed pixel P_{i,j}. When the traversal reaches P_{i,j}, the gradient magnitude and gradient direction of each pixel in the 3 × 3 search area are computed in the manner of step S102, the pixel with the largest gradient magnitude is found, and the angle difference between its gradient direction and the gradient direction of the feature point P_{i,j} in the template image is calculated. In this way, the angle difference in gradient direction between each feature point in the template image and its mapping point in the target image is obtained.
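The angle-difference computation described above might be sketched as follows, assuming the template's fine contour is the dictionary from the earlier sketches and that per-pixel gradient magnitude and direction arrays have already been computed for the target crop; the 3 × 3 window corresponds to half_window = 1.

import numpy as np

def angle_differences(template_points, target_mag, target_dir, half_window=1):
    """Return the angle difference (radians) per template contour point.

    template_points: {(i, j): (magnitude, direction)} from the fine contour;
    target_mag, target_dir: per-pixel gradient magnitude/direction arrays of
    the target crop (same size as the template).
    """
    h, w = target_mag.shape
    diffs = []
    for (i, j), (_, tmpl_dir) in template_points.items():
        i0, i1 = max(0, i - half_window), min(h, i + half_window + 1)
        j0, j1 = max(0, j - half_window), min(w, j + half_window + 1)
        window = target_mag[i0:i1, j0:j1]
        # Pixel with the largest gradient magnitude inside the window.
        di, dj = np.unravel_index(np.argmax(window), window.shape)
        tgt_dir = target_dir[i0 + di, j0 + dj]
        # Wrap the direction difference into (-pi, pi].
        d = tmpl_dir - tgt_dir
        diffs.append(np.arctan2(np.sin(d), np.cos(d)))
    return np.array(diffs)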
In this embodiment, the determination is performed based on the number of angular differences that do not deviate from each other.
Specifically, the mean of the angle differences at all the feature point positions may be calculated, and the number of feature points whose angle difference deviates from the mean by less than a second threshold obtained; when that number is greater than a third threshold, the target image is determined to be similar to the template image.
As in fig. 4, although the angle difference at the feature point P_{i+4,j+6} deviates, the deviations of the angle differences of the other feature points are still within the second threshold and the number of non-deviating feature points is greater than the third threshold, so the template image and the target image are determined to be similar. The similar positions can also be determined from the position distribution of the feature points whose angle differences do not deviate.
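The decision rule of this embodiment can be written directly; the second and third thresholds are tuning parameters that the text leaves open.

import numpy as np

def is_similar(diffs, second_threshold, third_threshold):
    """Similar if more than `third_threshold` points have an angle difference
    deviating from the mean by less than `second_threshold`."""
    if diffs.size == 0:
        return False
    consistent = np.abs(diffs - diffs.mean()) < second_threshold
    return int(consistent.sum()) > third_threshold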
In order to solve the problem in the conventional technique that coarse contour extraction leads to low template-matching accuracy, the present invention further provides an image recognition apparatus, as shown in fig. 5, comprising:
a template coarse contour extraction module 102, configured to acquire a template image and obtain feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold;
a template fine contour extraction module 104, configured to traverse the feature points for screening, taking each traversed feature point as a first target feature point: within a search area of a first size centered on the first target feature point, searching along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, excluding the first target feature point if such a feature point is found, and taking the feature points remaining after the traversal and screening as the shape contour of the template image.
In one embodiment, as shown in fig. 5, the image recognition apparatus further includes:
and a target image obtaining module 106, configured to obtain a target image.
A target fine contour extraction module 108, configured to traverse the feature points of the shape contour of the template image, taking each traversed feature point as a second target feature point: acquiring the mapping point in the target image corresponding to the position of the second target feature point; and searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point.
A similarity determination module 110, configured to determine whether the target image and the template image are similar according to the angle differences.
In one embodiment, the similarity determination module 110 is configured to calculate the mean of the angle differences, obtain the number of feature points whose angle differences deviate from the mean by less than a second threshold, and determine that the target image is similar to the template image when that number is greater than a third threshold.
In one embodiment, the target image obtaining module 106 is configured to obtain an image to be recognized and sequentially extract from it target images with the same size and shape as the template image.
In one embodiment, the template coarse contour extraction module 102 is configured to traverse the pixels of the template image and, for each traversed pixel P_{i,j}:
calculate the gradient value Gx in the X direction according to a formula given in the published text only as an image;
calculate the gradient value Gy in the Y direction according to a formula likewise given only as an image;
calculate the gradient direction D_{i,j} of the pixel P_{i,j} according to D = arctan(Gy/Gx);
and calculate the gradient magnitude G_{i,j} of the pixel P_{i,j} according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
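For orientation only, the modules described above can be composed roughly as follows, reusing the earlier sketches; the class name, the wiring, and the assumption that the caller supplies the target crop's gradient arrays are illustrative and not taken from the patent.

class ImageRecognizer:
    """Rough composition of the coarse-contour, fine-contour, matching and
    similarity modules described above (illustrative only)."""

    def __init__(self, first_threshold, second_threshold, third_threshold):
        self.first_threshold = first_threshold
        self.second_threshold = second_threshold
        self.third_threshold = third_threshold
        self.template_contour = None  # fine contour of the template image

    def set_template(self, template):
        # Coarse contour extraction followed by the directional screening.
        coarse = coarse_contour_points(template, self.first_threshold)
        self.template_contour = refine_contour(coarse)

    def match(self, target_mag, target_dir):
        # target_mag / target_dir: gradient magnitude and direction arrays of
        # one target crop (computed the same way as for the template).
        diffs = angle_differences(self.template_contour, target_mag, target_dir)
        return is_similar(diffs, self.second_threshold, self.third_threshold)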
With the image recognition method and device described above, the coarse contour of the template image is first computed and the feature points are then traversed: for each feature point, the feature point with the largest gradient magnitude along its gradient direction within the search area of the first size is kept as the representative of that search area and direction, and the other feature points, whose contour features are less pronounced, are excluded, yielding the fine contour of the image. Performing template matching on this fine contour during image recognition improves accuracy, and excluding the feature points with weak features reduces the amount of computation, thereby improving execution efficiency.
The above embodiments are merely examples and are not intended to limit the scope of the present application. All equivalent structural or process modifications made using the contents of the specification and drawings of the present application, whether applied directly or indirectly in other related technical fields, likewise fall within the scope of the present application.

Claims (10)

1. An image recognition method, comprising:
acquiring a template image and obtaining feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold;
traversing the feature points for screening and taking each traversed feature point as a first target feature point: within a search area of a first size centered on the first target feature point, searching along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, and excluding the first target feature point if such a feature point is found;
and taking the feature points remaining after the traversal and screening as the shape contour of the template image.
2. The image recognition method according to claim 1, comprising:
acquiring a target image, traversing the feature points of the shape contour of the template image, and taking each traversed feature point as a second target feature point:
acquiring the mapping point in the target image corresponding to the position of the second target feature point;
searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point;
and determining whether the target image and the template image are similar according to the angle differences.
3. The image recognition method of claim 2, wherein the determining whether the target image and the template image are similar according to the angular difference comprises:
calculating the average of the angle differences and obtaining the number of feature points whose angle differences deviate from the average by less than a second threshold;
and determining that the target image is similar to the template image when that number is greater than a third threshold.
4. The image recognition method according to any one of claims 1 to 3, wherein the acquiring the target image includes:
and acquiring an image to be recognized, and sequentially extracting target images with the same size and shape as the template image from the image to be recognized.
5. The image recognition method according to any one of claims 1 to 3, wherein the acquiring the feature points in the template image includes:
traversing pixels of the template image and, for each traversed pixel P_{i,j}:
calculating the gradient value Gx in the X direction according to a formula given in the published text only as an image;
calculating the gradient value Gy in the Y direction according to a formula likewise given only as an image;
calculating the gradient direction D_{i,j} of the pixel P_{i,j} according to D = arctan(Gy/Gx);
and calculating the gradient magnitude G_{i,j} of the pixel P_{i,j} according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
6. An image recognition apparatus, comprising:
a template coarse contour extraction module, configured to acquire a template image and obtain feature points in the template image, the feature points being pixels whose gradient magnitude is greater than a first threshold;
and a template fine contour extraction module, configured to traverse the feature points for screening, taking each traversed feature point as a first target feature point: within a search area of a first size centered on the first target feature point, searching along the forward and reverse extensions of its gradient direction for a feature point with a larger gradient magnitude, excluding the first target feature point if such a feature point is found, and taking the feature points remaining after the traversal and screening as the shape contour of the template image.
7. The image recognition apparatus according to claim 6, comprising:
a target image acquisition module, configured to acquire a target image;
and a target fine contour extraction module, configured to traverse the feature points of the shape contour of the template image, taking each traversed feature point as a second target feature point: acquiring the mapping point in the target image corresponding to the position of the second target feature point; and searching, within a search area of a second size centered on the mapping point in the target image, for the feature point with the largest gradient magnitude, obtaining its gradient direction, and calculating the angle difference between the gradient direction of the second target feature point corresponding to the mapping point and the gradient direction of that feature point;
and a similarity determination module, configured to determine whether the target image and the template image are similar according to the angle differences.
8. The image recognition device according to claim 7, wherein the similarity determination module is configured to calculate the mean of the angle differences, obtain the number of feature points whose angle differences deviate from the mean by less than a second threshold, and determine that the target image is similar to the template image when that number is greater than a third threshold.
9. The image recognition device according to any one of claims 6 to 8, wherein the target image obtaining module is configured to obtain an image to be recognized, and sequentially extract a target image having the same size and shape as the template image from the image to be recognized.
10. The image recognition device according to any one of claims 6 to 8, wherein the template coarse contour extraction module is configured to traverse pixels of the template image and, for each traversed pixel P_{i,j}:
calculate the gradient value Gx in the X direction according to a formula given in the published text only as an image;
calculate the gradient value Gy in the Y direction according to a formula likewise given only as an image;
calculate the gradient direction D_{i,j} of the pixel P_{i,j} according to D = arctan(Gy/Gx);
and calculate the gradient magnitude G_{i,j} of the pixel P_{i,j} according to G_{i,j} = sqrt(Gx^2 + Gy^2) (the standard magnitude formula; the published text gives it only as an image).
CN202111676569.6A 2021-12-31 2021-12-31 Image identification method and device Pending CN114549400A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111676569.6A CN114549400A (en) 2021-12-31 2021-12-31 Image identification method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111676569.6A CN114549400A (en) 2021-12-31 2021-12-31 Image identification method and device

Publications (1)

Publication Number Publication Date
CN114549400A (en) 2022-05-27

Family

ID=81669023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111676569.6A Pending CN114549400A (en) 2021-12-31 2021-12-31 Image identification method and device

Country Status (1)

Country Link
CN (1) CN114549400A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114862855A (en) * 2022-07-07 2022-08-05 南通盈赛纺织品有限公司 Textile defect detection method and system based on template matching
CN114862855B (en) * 2022-07-07 2022-09-23 南通盈赛纺织品有限公司 Textile defect detection method and system based on template matching
CN115351713A (en) * 2022-10-19 2022-11-18 武汉艾极涂科技有限公司 Sand blasting method, device, equipment and storage medium based on image recognition

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination