CN112507910B - Image recognition method and system based on pixel deformation, electronic device and storage medium - Google Patents


Info

Publication number
CN112507910B
CN112507910B (application CN202011478867.XA)
Authority
CN
China
Prior art keywords
area, point, same, target, point number
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011478867.XA
Other languages
Chinese (zh)
Other versions
CN112507910A (en)
Inventor
孔晶晶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Bank Co Ltd
Original Assignee
Ping An Bank Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Bank Co Ltd filed Critical Ping An Bank Co Ltd
Priority to CN202011478867.XA priority Critical patent/CN112507910B/en
Publication of CN112507910A publication Critical patent/CN112507910A/en
Application granted granted Critical
Publication of CN112507910B publication Critical patent/CN112507910B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00: Pattern recognition
    • G06F18/20: Analysing
    • G06F18/22: Matching criteria, e.g. proximity measures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/20: Image preprocessing
    • G06V10/30: Noise filtering

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to artificial intelligence technology and discloses a pattern recognition method based on pixel deformation, comprising the following steps: acquiring a target region from a picture to be recognized, wherein the target region comprises an internal connected region and a peripheral connected region; traversing the picture according to the target region to obtain a current identification area; calculating a first identical-point count between the current identification area and the internal connected region and a second identical-point count between the current identification area and the peripheral connected region, and obtaining a compensated identical-point count between the current identification area and the peripheral connected region through a graph edge compensation algorithm; calculating the similarity between the current identification area and the target region according to the first, second and compensated identical-point counts; and identifying the graphs in the picture corresponding to the target region according to the similarity calculation results and a predetermined threshold value. The invention also provides a pattern recognition system, an electronic device and a computer-readable storage medium. The invention achieves good recognition accuracy when graph pixels are deformed.

Description

Image recognition method and system based on pixel deformation, electronic device and storage medium
Technical Field
The present invention relates to the field of artificial intelligence technologies, and in particular, to a method, a system, an electronic device, and a computer-readable storage medium for image recognition based on pixel deformation.
Background
Computer vision is an important direction in the field of artificial intelligence. For human beings, recognizing graphics and images is a natural ability; for computers it is a complex and challenging task that represents a new level of development. Graph search is an important branch of computer vision recognition, and identifying a specific graph within a given picture is the basis of graph search.
At present, mainstream graph search algorithms generally identify a specific graph by extracting features. However, during the generation, transmission and parsing of a picture, compression and imaging operations are inevitably performed on the information it carries, and the graphs in the picture are affected in the process. For example, the interpolation algorithms used in picture scaling and the rasterization of vector graphics both deform the graphs in a picture to some extent. Moreover, the deformation is cumulative: the more such operations occur, the larger the difference between the new picture and the original, and the lower the recognition rate of graph search, making a satisfactory recognition result difficult to achieve.
Disclosure of Invention
In view of the above, the present invention provides a method, a system, an electronic device and a computer-readable storage medium for pattern recognition based on pixel deformation, so as to solve the problem of accurately performing pattern recognition when pixel deformation occurs in a picture.
First, to achieve the above object, the present invention provides a method for recognizing a pattern based on pixel deformation, the method comprising:
acquiring a target area from a picture to be recognized, wherein the target area comprises an internal connected region and a peripheral connected region;
traversing the picture according to the target area to obtain a current identification area;
calculating a first identical-point count between the current identification area and the internal connected region;
calculating a second identical-point count between the current identification area and the peripheral connected region, and obtaining a compensated identical-point count between the current identification area and the peripheral connected region through a graph edge compensation algorithm;
calculating the similarity between the current identification area and the target area according to the first identical-point count, the second identical-point count and the compensated identical-point count; and
identifying the graphs corresponding to the target area in the picture according to the similarity calculation results and a predetermined threshold value.
Optionally, the obtaining the target area from the picture to be recognized includes:
receiving a target point selected by a user in the picture, wherein the target point is any point in the internal connected region of the target area;
acquiring a target-area point set from the target point through a four-connectivity recursive algorithm to obtain the target area, wherein the target-area point set comprises a peripheral connected-region point set and an internal connected-region point set.
Optionally, the traversing the picture according to the target area to obtain the current identification area includes:
offsetting the target-area point set, according to its point count and positions, within the binary two-dimensional matrix corresponding to the picture, to obtain the current identification-area point set.
Optionally, the calculating a first identical-point count between the current identification area and the internal connected region includes:
comparing the value of each point in the current identification-area point set with the value of the point at the corresponding position in the internal connected-region point set, counting the identical points of the two sets, and recording the result as the first identical-point count.
Optionally, the calculating a second identical-point count between the current identification area and the peripheral connected region includes:
comparing the value of each point in the current identification-area point set with the value of the point at the corresponding position in the peripheral connected-region point set, counting the identical points of the two sets, and recording the result as the second identical-point count.
Optionally, the obtaining a compensated identical-point count between the current identification area and the peripheral connected region through a graph edge compensation algorithm includes:
when a first comparison point in the peripheral connected-region point set cannot be matched with the second comparison point at the corresponding position in the current identification area, further querying whether a third comparison point matching the first comparison point exists in the eight-connected neighbourhood of the second comparison point;
when such a third comparison point exists, marking it as a compensated identical point of the first comparison point;
and counting the compensated identical points to obtain the compensated identical-point count between the current identification-area point set and the peripheral connected-region point set.
Optionally, the calculating the similarity between the current identification area and the target area according to the first identical-point count, the second identical-point count and the compensated identical-point count includes:
setting a plurality of similarity formulas in which the first, second and compensated identical-point counts carry different weights;
selecting a current similarity formula from the plurality of similarity formulas according to the current recognition scenario;
and calculating the similarity between the current identification area and the target area from the first, second and compensated identical-point counts using the current similarity formula.
In addition, to achieve the above object, the present invention also provides a pattern recognition system, including:
an acquisition module for acquiring a target area from a picture to be recognized, wherein the target area comprises an internal connected region and a peripheral connected region, and for traversing the picture according to the target area to obtain a current identification area;
a comparison module for calculating a first identical-point count between the current identification area and the internal connected region, calculating a second identical-point count between the current identification area and the peripheral connected region, and obtaining a compensated identical-point count between the two through a graph edge compensation algorithm;
a calculation module for calculating the similarity between the current identification area and the target area according to the first identical-point count, the second identical-point count and the compensated identical-point count;
and a judging module for identifying the graphs corresponding to the target area in the picture according to the similarity calculation results and a predetermined threshold value.
Further, in order to achieve the above object, the present invention also provides an electronic device, which includes a memory and a processor, wherein the memory stores a pattern recognition program that can be executed on the processor, and the pattern recognition program, when executed by the processor, implements the steps of the pattern recognition method as described above.
Further, to achieve the above object, the present invention also provides a computer-readable storage medium storing a pattern recognition program, which is executable by at least one processor to cause the at least one processor to perform the steps of the pattern recognition method as described above.
Compared with the prior art, the image recognition method, system, electronic device and computer-readable storage medium based on pixel deformation provided by the invention can measure similarity even when graph pixels are deformed: recognition is performed according to the local features of the target region, and the added graph edge compensation algorithm gives good recognition of graphs deformed by pixel adjustment. In addition, because the algorithm compensates along the graph edge based on the local features of the target region, it filters out noise points that do not conform to those local features while improving the recognition rate.
Drawings
FIG. 1 is a diagram of an alternative hardware architecture of the electronic device of the present invention;
FIG. 2 is a block diagram of a preferred embodiment of a pattern recognition system according to the present invention;
FIG. 3A is a schematic illustration of a target area in accordance with the present invention;
FIG. 3B is a schematic diagram of a deformed image according to the present invention;
FIG. 3C is a schematic illustration of a pattern recognition effect according to the present invention;
FIG. 4 is a flowchart illustrating a preferred embodiment of a pattern recognition method according to the present invention;
FIG. 5 is a detailed flowchart of part of step S406 in FIG. 4;
FIG. 6 is a detailed flowchart of step S408 in FIG. 4;
the implementation, functional features and advantages of the present invention will be further described with reference to the accompanying drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the description relating to "first", "second", etc. in the present invention is for descriptive purposes only and is not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one of the feature. In addition, technical solutions between the embodiments may be combined with each other, but must be based on the realization of the technical solutions by a person skilled in the art, and when the technical solutions are contradictory to each other or cannot be realized, such a combination should not be considered to exist, and is not within the protection scope of the present invention.
Fig. 1 is a schematic diagram of an alternative hardware architecture of the electronic device 2 according to the present invention.
In this embodiment, the electronic device 2 may include, but is not limited to, a memory 11, a processor 12, and a network interface 13, which are communicatively connected to each other through a system bus. It is noted that fig. 1 only shows the electronic device 2 with components 11-13, but it is to be understood that not all of the shown components are required to be implemented, and that more or less components may be implemented instead.
The electronic device 2 may be a server, a PC (Personal Computer), a smart phone, a tablet Computer, a palm Computer, a portable Computer, or other terminal equipment. The server may be a rack server, a blade server, a tower server, a cabinet server, or other computing devices, may be an independent server, or may be a server cluster composed of a plurality of servers.
The memory 11 includes at least one type of readable storage medium, such as flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, or an optical disk. In some embodiments, the memory 11 may be an internal storage unit of the electronic device 2, such as a hard disk or memory of the electronic device 2. In other embodiments, the memory 11 may be an external storage device of the electronic device 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the electronic device 2. Of course, the memory 11 may also comprise both an internal storage unit of the electronic device 2 and an external storage device. In this embodiment, the memory 11 is generally used for storing the operating system installed on the electronic device 2 and various application software, such as the program code of the pattern recognition system 200. Furthermore, the memory 11 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 12 may be a Central Processing Unit (CPU), controller, microcontroller, microprocessor, or other data Processing chip in some embodiments. The processor 12 is typically used to control the overall operation of the electronic device 2. In this embodiment, the processor 12 is configured to run the program code stored in the memory 11 or process data, for example, run the pattern recognition system 200.
The network interface 13 may comprise a wireless network interface or a wired network interface, and the network interface 13 is generally used for establishing a communication connection between the electronic apparatus 2 and other electronic devices.
The hardware structure and functions of the apparatus according to the present invention have been described in detail. Various embodiments of the present invention will be presented based on the above description.
First, the present invention provides a pattern recognition system 200.
Referring to FIG. 2, a block diagram of a preferred embodiment of a pattern recognition system 200 according to the present invention is shown.
In this embodiment, the pattern recognition system 200 includes a series of computer program instructions stored on the memory 11 that, when executed by the processor 12, implement the pattern recognition operations of the various embodiments of the present invention. In some embodiments, the pattern recognition system 200 may be divided into one or more modules based on the particular operations implemented by the portions of the computer program instructions. For example, in fig. 2, the pattern recognition system 200 may be divided into an acquisition module 201, a comparison module 202, a calculation module 203, and a determination module 204. Wherein:
the obtaining module 201 is configured to obtain a target region from a picture to be identified, where the target region includes an internal connected region and a peripheral connected region.
Specifically, in this embodiment, the graphic regions in the picture that are identical to a given target region are identified: the part of the picture outside the target region is marked as the search region, and traversal and comparison are performed in the search region according to the target region to obtain the recognition result. The target region is a closed figure, such as a square, composed of a peripheral connected region (the peripheral outline) and an internal connected region (generally a blank area).
For example, the picture to be recognized may be a workstation plan, the target region a particular workstation graphic, the peripheral connected region the outline of that graphic, and the internal connected region the blank area inside the outline.
When the graphs identical to the target region need to be identified in the picture, the target region is set by the user; in this embodiment, it is set by selecting it directly in the picture. For example, refer to FIG. 3A, which is a schematic diagram of a target region: the small square on the right side of the picture is the target region. The user selects the target region by clicking any point in its internal blank area; this point is called the target point, and it may be selected by dragging a mouse, tapping with a finger, or the like.
After the target point selected by the user in the picture is received, the target-region point set is acquired from the target point through a four-connectivity recursive algorithm. The target-region point set is divided into a peripheral contour point set (B) and an internal blank-area point set (W).
Before the picture is searched based on the target region, it is preprocessed to improve recognition accuracy and efficiency. The preprocessing may employ grayscale conversion, noise-reduction filtering, binarization, and the like. Binarization converts each pixel to 0 or 1 according to its RGB color value, so that the picture becomes a binary two-dimensional matrix.
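The binarization step can be sketched roughly as follows (a minimal illustration; the grayscale weights are the standard luma coefficients, and the threshold value is an assumption, since the text does not fix one):

```python
def binarize(rgb_pixels, threshold=128):
    """Convert an RGB pixel grid to a binary 0/1 matrix.

    A pixel maps to 1 (foreground, e.g. a contour line) when its grayscale
    value falls below the threshold, otherwise to 0 (blank). The threshold
    of 128 is illustrative, not taken from the patent.
    """
    binary = []
    for row in rgb_pixels:
        binary_row = []
        for (r, g, b) in row:
            gray = 0.299 * r + 0.587 * g + 0.114 * b  # standard luma weights
            binary_row.append(1 if gray < threshold else 0)
        binary.append(binary_row)
    return binary
```

With this convention a dark contour pixel becomes 1 and a white blank pixel becomes 0, matching the 0/1 values the text assigns to the blank interior and the peripheral outline.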
The four-connected region, i.e. the four-neighbourhood, comprises the positions directly above, below, left of and right of a pixel; starting from any point of a region, any other pixel of the region can be reached by combining moves in these four directions. Two pixels are connected when two requirements are met: their positions are adjacent, and their gray values satisfy a given similarity criterion (or are equal). For each pixel, if its value equals that of one of its four-connected neighbours, the two points are classified into the same object.
Starting from the target point, the full point set of the target region containing it can be obtained through the four-connectivity recursive algorithm, which yields the local features of the target region; similarity is subsequently evaluated in the picture according to these local features.
Specifically, all points with the same value as the target point (value 0) are collected through the four-connected positions of the target point; the resulting set is the internal blank-area point set (W) of the target region. All points of value 1 adjacent to its periphery are then obtained from W, forming the peripheral contour point set (B) of the target region.
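This region-growing step can be sketched as follows (an iterative flood fill in place of the literal recursion, to avoid deep call stacks; the names are illustrative):

```python
def extract_target_region(matrix, seed):
    """From a seed point inside the blank interior (value 0), collect the
    internal blank point set W by 4-connected flood fill, then collect the
    peripheral contour point set B: every 1-valued point 4-adjacent to W."""
    rows, cols = len(matrix), len(matrix[0])
    w_set = set()
    stack = [seed]
    while stack:
        y, x = stack.pop()
        if (y, x) in w_set:
            continue
        if 0 <= y < rows and 0 <= x < cols and matrix[y][x] == 0:
            w_set.add((y, x))
            # grow through the four-neighbourhood: up, down, left, right
            stack.extend([(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)])
    b_set = set()
    for (y, x) in w_set:
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < rows and 0 <= nx < cols and matrix[ny][nx] == 1:
                b_set.add((ny, nx))
    return w_set, b_set
```

On a 3x3 matrix holding a one-pixel blank centre surrounded by 1s, seeding at the centre yields W = {centre} and B = its four edge-adjacent contour points.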
The obtaining module 201 is further configured to traverse the picture according to the target region to obtain the current identification area.
Specifically, the target-region point set is traversed over the whole two-dimensional matrix corresponding to the picture: the point set (with its point count and relative positions preserved) is shifted from left to right and from top to bottom across the matrix, each offset yielding a current identification-area point set (for example, the upper-left square region in FIG. 3A).
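The left-to-right, top-to-bottom shifting can be pictured as enumerating every offset at which the target region's bounding box still fits inside the matrix (an illustrative reading of the traversal; the function and variable names are assumptions):

```python
def iter_candidate_offsets(matrix, region_points):
    """Yield every (dy, dx) offset at which the target region's bounding
    box still fits inside the picture matrix, scanning left-to-right,
    top-to-bottom. Each offset defines one current identification area."""
    min_y = min(y for y, _ in region_points)
    max_y = max(y for y, _ in region_points)
    min_x = min(x for _, x in region_points)
    max_x = max(x for _, x in region_points)
    height, width = len(matrix), len(matrix[0])
    for dy in range(-min_y, height - max_y):
        for dx in range(-min_x, width - max_x):
            yield dy, dx
```

Applying an offset (dy, dx) to every point of the target-region point set produces the current identification-area point set for that position.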
The comparison module 202 is configured to calculate a first identical-point count between the current identification area and the internal connected region.
Specifically, the value of each point in the current identification-area point set is compared with the value of the point at the corresponding position in the internal blank-area point set; the number of identical points between the current identification-area point set and the internal blank-area point set W is counted and recorded as the first identical-point count.
The comparison module 202 is further configured to calculate a second identical-point count between the current identification area and the peripheral connected region, and to obtain a compensated identical-point count through a graph edge compensation algorithm.
Specifically, the value of each point in the current identification-area point set is compared with the value of the point at the corresponding position in the peripheral contour point set; the number of identical points between the current identification-area point set and the peripheral contour point set B is counted and recorded as the second identical-point count.
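Both the first and the second identical-point counts reduce to the same comparison, differing only in the point set and the expected value (0 for the blank interior W, 1 for the contour B). A sketch, with names chosen for illustration:

```python
def count_same_points(matrix, point_set, expected_value, offset):
    """Count points of the (shifted) target-region point set whose value
    in the picture matrix equals the value they had in the target region:
    0 for the internal blank set W, 1 for the peripheral contour set B."""
    dy, dx = offset
    rows, cols = len(matrix), len(matrix[0])
    count = 0
    for (y, x) in point_set:
        ny, nx = y + dy, x + dx
        if 0 <= ny < rows and 0 <= nx < cols and matrix[ny][nx] == expected_value:
            count += 1
    return count
```

Calling it with W and expected value 0 gives the first identical-point count; with B and expected value 1, the second.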
It should be noted that the picture may carry a certain amount of deformation that lowers the similarity of pattern recognition, so the case of pixel deformation requires additional handling. For example, a workstation plan is generally produced by converting a vector diagram into a PDF or an ordinary picture; during conversion, the information in the picture is often deformed to some extent, so the workstation graphics in the plan are the same on the whole but differ locally.
Fig. 3B is a schematic diagram of a deformed pattern.
The target-region point set consists of the peripheral contour point set and the internal blank-area point set. The main reason the similarity drops is that, after the graph is deformed, the match against the peripheral contour point set no longer reaches the threshold. Because the internal blank area is a solid region, the matching rate of its point set is relatively stable and rarely drives the similarity down. Therefore, if only the second identical-point count is used when comparing the current identification-area point set with the peripheral contour point set, the resulting similarity is often too low for the graph to be recognized, and the count must be corrected.
The correction adds a graph edge compensation algorithm to the offset comparison between the current identification-area point set and the peripheral contour point set. If a point in the peripheral contour point set (the first comparison point) cannot be matched with the point at the corresponding position in the current identification area (the second comparison point), the algorithm further queries whether a matching point (a third comparison point) exists in the eight-connected neighbourhood of the second comparison point, and marks it as a compensated identical point if so. The eight-connected region, i.e. the eight-neighbourhood, comprises the positions above, below, left, right, upper-left, upper-right, lower-left and lower-right of a pixel: the four adjacent and four diagonally adjacent positions, eight directions in total. "Matched" means that the two points have equal values in the two-dimensional matrix.
After this correction, the compensated identical-point count between the current identification-area point set and the peripheral contour point set B is obtained by counting the marked points.
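The compensation count described above can be sketched as follows (names illustrative; "matched" here means the shifted position, or one of its eight neighbours, holds a 1 in the picture matrix):

```python
# The eight-neighbourhood: four adjacent plus four diagonal offsets.
EIGHT_NEIGHBOURS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                    (0, 1), (1, -1), (1, 0), (1, 1)]

def count_compensated_points(matrix, contour_set, offset):
    """Graph edge compensation: for each contour point (first comparison
    point) whose shifted position (second comparison point) does not hold
    a 1 in the picture, look for a 1 in the eight-neighbourhood of the
    shifted position (a third comparison point) and count it if found."""
    dy, dx = offset
    rows, cols = len(matrix), len(matrix[0])
    compensated = 0
    for (y, x) in contour_set:
        ny, nx = y + dy, x + dx
        if 0 <= ny < rows and 0 <= nx < cols and matrix[ny][nx] == 1:
            continue  # already counted in the second identical-point count
        for oy, ox in EIGHT_NEIGHBOURS:
            qy, qx = ny + oy, nx + ox
            if 0 <= qy < rows and 0 <= qx < cols and matrix[qy][qx] == 1:
                compensated += 1
                break
    return compensated
```

A contour point that already matches directly contributes to the second count, not to the compensated one, so the two counts never double-count the same contour point.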
Because the graph edge compensation algorithm is based on the local features of the target region, it compensates the peripheral contour features by extending the traversal to the points around each pixel, recovering similar points that an ordinary graph recognition algorithm would miss, while still rejecting regions that do not conform to the local features of the target region.
The calculation module 203 is configured to calculate the similarity between the current identification area and the target region according to the first identical-point count, the second identical-point count, and the compensated identical-point count.
Specifically, different similarity formulas may be configured for different application scenarios, for example:
(1) similarity s = (w/(k - b) + (b + ab)/(k - w)) / 2; or
(2) similarity s = (w/(k - b - ab) + (b + ab)/(k - w)) / 2,
where w is the first identical-point count, b is the second identical-point count, ab is the compensated identical-point count, and k is the total number of points in the target-region point set.
Formula (1) strengthens the weight of the peripheral contour point set, while formula (2) is balanced; the formula can be chosen per scenario.
In addition, a weight can be attached to the compensated identical-point count, for example:
(3) similarity s = (w/(k - b - ab·λ) + (b + ab·λ)/(k - w)) / 2.
The weight λ may be user-defined or derived dynamically from a parameter, for example λ = b/tb, where tb is the total number of points in the peripheral contour point set.
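Formula (3) with the dynamic weight λ = b/tb can be written out directly (a sketch; the variable names w, b, ab, k, tb mirror the text):

```python
def similarity(w, b, ab, k, tb):
    """Formula (3) from the text: the compensated identical-point count ab
    is weighted by lambda = b / tb before entering the balanced similarity
    of formula (2). w: first count, b: second count, k: total points in
    the target-region point set, tb: total points in the contour set."""
    lam = b / tb if tb else 0.0
    weighted_ab = ab * lam
    return (w / (k - b - weighted_ab) + (b + weighted_ab) / (k - w)) / 2
```

Setting λ = 1 (i.e. b = tb) recovers formula (2); setting ab = 0 reduces both to the uncompensated comparison.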
In other embodiments, other feasible similarity formulas may be configured according to the needs and specific features of the actual application scenario; they are not enumerated here.
The judging module 204 is configured to identify the graphs corresponding to the target region in the picture according to the similarity calculation results and a predetermined threshold.
Specifically, the similarity between every identified region and the target region is obtained by traversing the two-dimensional matrix and computing the similarity for each current identification area; the desired graphic regions, i.e. those identical to the target region, are then selected with a predetermined threshold. In this embodiment, a calculated similarity greater than or equal to the threshold indicates that the current identification area holds the same graph as the target region, i.e. the area satisfies the condition.
For example, with the threshold set to 0.8, any identification area with similarity s >= 0.8 satisfies the condition and is retained; traversal then moves on to the next identification area until all picture regions (the whole two-dimensional matrix) have been covered, and the retained identification areas are exactly the regions whose graph matches the target region. If the target region is a square, as in FIG. 3A, the number and positions of all identical squares in the picture can be found in this way.
The image recognition system provided by this embodiment can measure similarity even when image pixels are deformed: recognition is performed according to the local features of the target region, and the added graph edge compensation algorithm gives good recognition of graphs deformed by pixel adjustment. Because the algorithm compensates along the graph edge based on the local features of the target region, it filters out noise points that do not conform to those local features while improving the recognition rate. The method is also extensible: similarity formulas suited to different scenarios can be adopted, optimizing further on top of the improved pattern recognition algorithm, so it has broad application scenarios and a marked optimization effect.
For example, refer to fig. 3C, which illustrates the pattern recognition effect. The upper row of squares (station patterns) in fig. 3C is the target region, and the two patterns in the middle row are deformed patterns in the picture. Without the graph edge compensation algorithm, only the solid-line parts can be identified as similar to the target region, while the dotted-line parts cannot. After the graph edge compensation algorithm is added, as shown in the bottom row, both patterns can be recognized as similar to the target region, improving the graphic recognition degree. According to station-system identification data, the recognition degree of station patterns is about 70 percent without the graph edge compensation algorithm and reaches over 90 percent with it.
In addition, the invention also provides a pattern recognition method based on pixel deformation.
Fig. 4 is a schematic flow chart of a preferred embodiment of the pattern recognition method of the present invention. In this embodiment, the execution order of the steps in the flowchart shown in fig. 4 may be changed and some steps may be omitted according to different requirements. The method comprises the following steps:
Step S400, acquiring a target area from the picture to be identified, wherein the target area comprises an internal connected area and a peripheral connected area.
Specifically, in this embodiment, graphic regions in the picture that are the same as a given target region are identified: the region outside the target region in the picture is marked as the search region, and traversal and comparison are performed in the search region according to the target region to obtain the identification result. The target region is a closed figure, such as a square, composed of a peripheral connected area (the peripheral outline) and an internal connected area (generally a blank area).
For example, the picture to be recognized may be a station plan view, the target area is a certain station graphic, the peripheral connected area is the peripheral contour line of the station graphic, and the internal connected area is the blank area inside the peripheral contour line.
The target area is set by the user when graphics identical to the target area need to be identified in the picture. In this embodiment, the target area may be set by selecting it directly in the picture. For example, as shown in fig. 3A, the small square on the right side of the picture is the target region. To select the target area in the picture, the user clicks any point in the internal blank area of the target area; this point is called the target point. The user may select any point in the internal blank area as the target point by dragging a mouse, tapping with a finger, or the like.
After the target point selected by the user in the picture is received, a target region point set is acquired through a four-connectivity recursion algorithm according to the target point. The target region point set is divided into a peripheral contour point set (B) and an internal blank area point set (W).
Before the picture is identified based on the target area, in order to improve the identification accuracy and efficiency, the picture is preprocessed. Specifically, the image preprocessing may employ a grayscale processing, a noise reduction filtering, a binarization processing technique, or the like. The binarization processing is to convert the pixel point into a numerical value of 0 or 1 according to the color RGB value of the pixel point in the picture, so that the picture is converted into a binary two-dimensional digital matrix.
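The binarization step can be sketched as follows; the grayscale weights and the threshold of 128 are assumptions for illustration, since the embodiment does not fix them (dark pixels become 1 for contour lines, light pixels become 0 for blank areas, matching the 0/1 convention of the point sets):

```python
import numpy as np

def binarize(rgb, threshold=128):
    """Convert an H x W x 3 RGB picture into a binary two-dimensional matrix."""
    rgb = np.asarray(rgb, dtype=float)
    # Standard luminance weights (an assumption; the embodiment only says
    # the conversion uses the pixel's RGB value).
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    # Dark pixels (contour lines) -> 1, light pixels (blank areas) -> 0.
    return (gray < threshold).astype(np.uint8)
```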
Four-connectivity, i.e. the four-neighborhood, refers to the positions above, below, to the left of and to the right of a pixel; starting from any point in a region, any pixel in the region can be reached through combinations of movements in these four directions. Two pixels are connected if their positions are adjacent and their gray values satisfy a certain similarity criterion (or are equal). For each pixel point, if its value is the same as that of one of its four-connected points, the two points are classified into one object.
On the basis of the target point, all point sets of the target area where the target point is located can be obtained through the four-connectivity recursion algorithm, so that the local features of the target area are obtained; similar regions are subsequently identified in the picture according to these local features.
Specifically, all points with the same value as the target point (value 0) are found through the four-connected positions; the resulting point set is the internal blank area point set (W) of the target area. All value-1 points on its periphery are then obtained according to W, namely the peripheral contour point set (B) of the target area.
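A minimal sketch of this step, assuming a 0/1 two-dimensional matrix and using an explicit stack instead of literal recursion (the function and variable names are illustrative, not from the embodiment):

```python
def extract_target_region(matrix, target_point):
    """Flood-fill from the user-selected target point to obtain the internal
    blank area point set W (value-0 points four-connected to the target point)
    and the peripheral contour point set B (value-1 points four-adjacent to W)."""
    h, w = len(matrix), len(matrix[0])
    W, stack = set(), [tuple(target_point)]
    while stack:
        y, x = stack.pop()
        if (y, x) in W or not (0 <= y < h and 0 <= x < w) or matrix[y][x] != 0:
            continue
        W.add((y, x))
        # Four-connectivity: up, down, left, right.
        stack += [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
    B = set()
    for y, x in W:
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and matrix[ny][nx] == 1:
                B.add((ny, nx))
    return W, B
```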
And S402, traversing the picture according to the target area to obtain the current identification area.
Specifically, the target region point set is traversed over the whole two-dimensional matrix corresponding to the picture, that is, the point set (with its point number and positions) is shifted from left to right and from top to bottom in the two-dimensional matrix to obtain the current recognition region point set (for example, the upper-left square region in fig. 3A).
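The offset traversal can be expressed as a simple shift of the point set; this helper is illustrative, since the embodiment only specifies shifting left to right and top to bottom:

```python
def shift_point_set(points, dy, dx):
    """Offset every (row, col) point of the target region point set by
    (dy, dx) to obtain the current recognition region point set."""
    return {(y + dy, x + dx) for (y, x) in points}

# Traversing the whole picture then means iterating dy, dx over all offsets
# at which the shifted point set still fits inside the two-dimensional matrix.
```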
Step S404, calculating the first same point number of the current identification area and the internal connected area.
Specifically, the values of each point in the current recognition area point set and each point in the corresponding position in the internal blank area point set are compared, the same point number of the current recognition area point set and the internal blank area point set W is calculated, and the same point number is recorded as a first same point number.
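A sketch of this comparison, assuming the current recognition region is given by the shifted positions of a target point set; points falling outside the matrix simply do not match (the helper name is illustrative):

```python
def count_same_points(matrix, region_points, expected_value):
    """Count how many points of the (shifted) point set carry the expected
    binary value in the picture matrix. With expected_value=0 over the shifted
    internal blank area point set W this yields the first same point number;
    with expected_value=1 over the shifted contour set B, the second."""
    h, w = len(matrix), len(matrix[0])
    return sum(
        1
        for (y, x) in region_points
        if 0 <= y < h and 0 <= x < w and matrix[y][x] == expected_value
    )
```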
Step S406, calculating the second same point number of the current identification area and the peripheral connected area, and obtaining the compensation same point number through a graph edge compensation algorithm.
Specifically, the values of each point in the current recognition area point set and each point at the corresponding position in the peripheral contour point set are compared, the same point number of the current recognition area point set and the peripheral contour point set B is calculated and recorded as a second same point number.
It should be noted that the picture may have a certain degree of deformation, which lowers the similarity of pattern recognition, so additional processing is required for the case of pixel deformation. For example, a station plan is generally formed by converting a vector diagram into a PDF or an ordinary picture; during the conversion, the graphics in the picture often deform to some extent, so that the station graphics in the plan are substantially the same overall but differ locally (as shown in fig. 3B).
The target area point set consists of the peripheral contour point set and the internal blank area point set, and the main reason the similarity decreases is that, after the graphic deforms, the similarity with the peripheral contour point set cannot reach the threshold. Since the internal blank area is a whole region, the matching degree of its point set is relatively stable and rarely becomes a factor in reducing the similarity. Therefore, if only the second same point number is calculated when comparing the current recognition region point set with the peripheral contour point set, the final similarity is often too low for recognition, so the second same point number needs to be corrected.
The specific mode of correction is to add a graph edge compensation algorithm in the process of offset comparison of the current identification region point set and the peripheral outline point set.
Further referring to fig. 5, a schematic diagram of a partial refinement flow (correction process) of the step S406 is shown. In this embodiment, the obtaining of the number of the same compensation points through the graph edge compensation algorithm in step S406 specifically includes:
S4060, when the first comparison point in the peripheral connected region point set cannot be matched with the second comparison point at the corresponding position in the current identification region, further querying whether a third comparison point that can be matched with the first comparison point exists in an eight-connected region of the second comparison point.
S4062, when the third contrast point exists, marking the third contrast point as a compensation identical point of the first contrast point.
Eight-connectivity, namely the eight-neighborhood, refers to the eight positions above, below, to the left, to the right, upper-left, upper-right, lower-left and lower-right of a pixel, covering both the adjacent and the diagonally adjacent positions. Matching means that the values of the two points (in the two-dimensional matrix) are equal.
S4064, counting the compensation same point number of the current identification area point set and the peripheral connected area point set.
The graph edge compensation algorithm is based on the local features of the target area: during graphic recognition, it compensates the peripheral contour features by extending the traversal to the points around each pixel point, obtaining similar points that a common graphic recognition algorithm would miss, while avoiding recognizing regions that do not conform to the local features of the target area.
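The compensation steps S4060-S4064 can be sketched as follows, assuming a 0/1 matrix, a peripheral contour point set, and the current offset (dy, dx); the function name and the returned pair are illustrative:

```python
def match_contour_with_compensation(matrix, contour_points, dy, dx):
    """For each first comparison point in the contour set, check the second
    comparison point at the offset position; on a mismatch, search that
    pixel's eight-neighborhood for a matching third comparison point and
    count it as a compensation same point. Returns (second_same, compensated)."""
    h, w = len(matrix), len(matrix[0])

    def val(y, x):
        return matrix[y][x] if 0 <= y < h and 0 <= x < w else None

    second_same = compensated = 0
    for (y, x) in contour_points:
        cy, cx = y + dy, x + dx          # corresponding position in the picture
        if val(cy, cx) == 1:             # direct match: a second same point
            second_same += 1
            continue
        # Eight-neighborhood: adjacent and diagonally adjacent positions.
        if any(
            val(ny, nx) == 1
            for ny in (cy - 1, cy, cy + 1)
            for nx in (cx - 1, cx, cx + 1)
            if (ny, nx) != (cy, cx)
        ):
            compensated += 1
    return second_same, compensated
```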
Returning to fig. 4, in step S408, the similarity between the current identification area and the target area is calculated according to the first same point number, the second same point number, and the compensation same point number.
Specifically, different similarity calculation formulas may be configured according to different application scenarios. For example:
(1) Similarity s = (w/(k-b) + (b+ab)/(k-w))/2; or
(2) Similarity s = (w/(k-b-ab) + (b+ab)/(k-w))/2
Wherein w is the first same point number, b is the second same point number, ab is the compensation same point number, and k is the total point number of the target region point set.
The formula (1) strengthens the proportion of the peripheral outline point set, and the formula (2) is balanced and can be selected according to different scenes.
In addition, a weight can be added to the compensation same point number, such as:
(3) Similarity s = (w/(k-b-ab·λ) + (b+ab·λ)/(k-w))/2.
The weight λ may be user-defined or obtained dynamically from a parameter, for example λ = b/tb, where tb is the total number of points in the peripheral contour point set.
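The three candidate formulas can be gathered into one hedged helper; the parenthesization follows a balanced reading of the formulas above (an average of two ratios), which is an interpretation of the text rather than a definitive implementation:

```python
def similarity(w, b, ab, k, formula=1, lam=None, tb=None):
    """w: first same point number, b: second same point number,
    ab: compensation same point number, k: total point number of the
    target region point set; lam/tb parameterize the weighted variant (3)."""
    if formula == 1:      # strengthens the peripheral contour point set
        return (w / (k - b) + (b + ab) / (k - w)) / 2
    if formula == 2:      # balanced variant
        return (w / (k - b - ab) + (b + ab) / (k - w)) / 2
    if lam is None:       # default dynamic weight: lambda = b / tb
        lam = b / tb
    return (w / (k - b - ab * lam) + (b + ab * lam) / (k - w)) / 2
```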
In other embodiments, other feasible similarity calculation formulas may also be configured according to the needs and specific features of the actual application scenarios, and are not described herein again.
Further referring to fig. 6, a detailed flow chart of the step S408 is shown. In this embodiment, the step S408 specifically includes:
S4080, setting a plurality of similarity calculation formulas according to the first same point number, the second same point number and different proportions of the compensation same point number in similarity calculation.
Such as the similarity calculation formulas (1), (2), (3) or other possible formulas.
S4082, selecting a current similarity calculation formula from the similarity calculation formulas according to the current identified scene.
For example, when a higher weight for the peripheral contour point set is required, the above similarity calculation formula (1) may be selected as the current similarity calculation formula.
S4084, calculating the similarity between the current identification area and the target area according to the first same point number, the second same point number, the compensation same point number and the current similarity calculation formula.
The similarity between the current recognition area and the target area can be calculated by substituting the first same point number, the second same point number, and the compensation same point number calculated in steps S404 and S406 into the selected current similarity calculation formula, for example, the similarity calculation formula (1).
Returning to fig. 4, in step S410, a graph corresponding to the target area in the picture is identified according to the similarity calculation result and a predetermined threshold.
Specifically, traversing the two-dimensional matrix and calculating the similarity between each current identification region and the target region yields the similarity of all identification regions; the required graphic regions, that is, the recognition result of the graphics in the picture that are the same as the target region, are then obtained according to the predetermined threshold value. In this embodiment, when the calculated similarity is greater than or equal to the threshold, the current identification region and the target region contain the same graphic, that is, the current identification region satisfies the condition.
For example, with the threshold set to 0.8, if the calculated similarity s >= 0.8, the condition is satisfied and the identification region is retained; the next identification region is then traversed until all picture regions (the whole two-dimensional matrix) have been traversed, and finally all retained identification regions are the regions whose graphics are the same as the target region. For example, if the target region in the figure above is a square, the number and positions of all identical squares in the picture can be found according to this embodiment.
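The overall traversal and thresholding can be sketched end to end; to keep the sketch self-contained, the score here is a plain match ratio without edge compensation, and the names are illustrative:

```python
def find_matching_regions(matrix, W, B, threshold=0.8):
    """Slide the target region's point sets over every offset of the binary
    matrix (W: internal blank points, expected value 0; B: contour points,
    expected value 1) and keep the offsets whose score reaches the threshold."""
    h, w = len(matrix), len(matrix[0])
    k = len(W) + len(B)
    max_y = max(y for y, _ in W | B)
    max_x = max(x for _, x in W | B)
    matches = []
    for dy in range(h - max_y):
        for dx in range(w - max_x):
            same = sum(1 for (y, x) in W if matrix[y + dy][x + dx] == 0)
            same += sum(1 for (y, x) in B if matrix[y + dy][x + dx] == 1)
            if same / k >= threshold:
                matches.append((dy, dx))   # top-left offset of a match
    return matches
```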
The image recognition method provided by this embodiment can perform similarity recognition in view of image pixel deformation. While recognizing graphics according to the local features of the target region, the method adds a graph edge compensation algorithm, and therefore recognizes well graphics that have been deformed by pixel adjustment. In addition, because the algorithm compensates based on the graph edge and on the local features of the target region, it can filter out noise points that do not conform to those local features while improving the recognition degree. The method is also extensible: similarity calculation formulas suited to different scenarios can be adopted, further optimizing on top of the optimized pattern recognition algorithm, so it has wide application scenarios and an obvious optimization effect.
The present invention also provides another embodiment, which is to provide a computer-readable storage medium storing a pattern recognition program, the pattern recognition program being executable by at least one processor to cause the at least one processor to perform the steps of the pattern recognition method as described above.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention or portions thereof contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (9)

1. A method for pattern recognition based on pixel deformation, the method comprising:
acquiring a target area from a picture to be identified, wherein the target area comprises an internal connected area and a peripheral connected area;
traversing in the picture according to the target area to obtain a current identification area;
calculating a first same point number of the current identification area and the internal connected area;
calculating a second same point number of the current identification area and the peripheral connected area, and obtaining a compensation same point number of the current identification area and the peripheral connected area through a graph edge compensation algorithm;
calculating the similarity between the current identification area and the target area according to the first same point number, the second same point number and the compensation same point number, wherein the calculating comprises: setting a plurality of similarity calculation formulas according to different proportions of the first same point number, the second same point number and the compensation same point number in similarity calculation; selecting a current similarity calculation formula from the plurality of similarity calculation formulas according to the current recognition scene; and calculating the similarity between the current identification area and the target area according to the first same point number, the second same point number, the compensation same point number and the current similarity calculation formula; and
and identifying a graph corresponding to the target area in the picture according to the similarity calculation result and a preset threshold value.
2. The method of claim 1, wherein the obtaining the target region from the picture to be identified comprises:
receiving a target point selected by a user in the picture, wherein the target point is any point in the internal connected area of the target area;
and acquiring a target region point set through a four-connectivity recursion algorithm according to the target point to obtain the target region, wherein the target region point set comprises a peripheral connected region point set and an internal connected region point set.
3. The method of claim 2, wherein said traversing in the picture according to the target region, obtaining a current identified region comprises:
and offsetting in a binary two-dimensional matrix corresponding to the picture according to the point number and the position of the target area point set to obtain the current identification area point set.
4. The method of claim 3, wherein said calculating a first same number of points for said current identified region and said internal connected region comprises:
and comparing the values of each point in the current identification region point set with the values of each point at the corresponding position in the internal connected region point set, calculating the same point number of the current identification region point set and the internal connected region point set, and recording the same point number as the first same point number.
5. The method of claim 3 or 4, wherein said calculating a second same number of points for the current identified region and the peripheral connected region comprises:
and comparing the values of each point in the current identification area point set with the values of each point at the corresponding position in the peripheral connected area point set, calculating the same point number of the current identification area point set and the peripheral connected area point set, and recording the same point number as the second same point number.
6. The method of claim 5, wherein said deriving the compensated same number of points for the current identified region and the peripheral connected region by a graph edge compensation algorithm comprises:
when a first comparison point in the peripheral connected region point set cannot be matched with a second comparison point at a corresponding position in the current identification region, further inquiring whether a third comparison point which can be matched with the first comparison point exists in an eight-connected region of the second comparison point;
when the third contrast point exists, marking the third contrast point as a compensated identical point of the first contrast point;
and counting to obtain the compensation same point number of the current identification area point set and the peripheral connected area point set.
7. A pattern recognition system, the system comprising:
an acquisition module, used for acquiring a target area from a picture to be identified, wherein the target area comprises an internal connected area and a peripheral connected area, and traversing the picture according to the target area to obtain a current identification area;
a comparison module, used for calculating a first same point number of the current identification area and the internal connected area, calculating a second same point number of the current identification area and the peripheral connected area, and obtaining a compensation same point number of the current identification area and the peripheral connected area through a graph edge compensation algorithm;
a calculating module, configured to calculate a similarity between the current identification area and the target area according to the first same point number, the second same point number, and the compensation same point number, including: setting a plurality of similarity calculation formulas according to different proportions of the first same point number, the second same point number and the compensation same point number in similarity calculation; selecting a current similarity calculation formula from the multiple similarity calculation formulas according to the current recognition scene; calculating the similarity between the current identification area and the target area according to the first same point number, the second same point number, the compensation same point number and the current similarity calculation formula;
and the judging module is used for identifying the graph corresponding to the target area in the picture according to the similarity calculation result and a preset threshold value.
8. An electronic device, comprising a memory, a processor, and a pattern recognition program stored on the memory and operable on the processor, wherein the pattern recognition program, when executed by the processor, implements the steps of the pattern recognition method according to any one of claims 1-6.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a pattern recognition program, which is executable by at least one processor to cause the at least one processor to perform the steps of the pattern recognition method according to any one of claims 1-6.
CN202011478867.XA 2020-12-15 2020-12-15 Image recognition method and system based on pixel deformation, electronic device and storage medium Active CN112507910B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011478867.XA CN112507910B (en) 2020-12-15 2020-12-15 Image recognition method and system based on pixel deformation, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN112507910A CN112507910A (en) 2021-03-16
CN112507910B true CN112507910B (en) 2023-01-17

Family

ID=74973706

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011478867.XA Active CN112507910B (en) 2020-12-15 2020-12-15 Image recognition method and system based on pixel deformation, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN112507910B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107886495A (en) * 2017-09-30 2018-04-06 北京得华机器人技术研究院有限公司 A kind of auto-parts defect identification method based on similarity mode
CN111401326A (en) * 2020-04-21 2020-07-10 招商局金融科技有限公司 Target identity recognition method based on picture recognition, server and storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008234203A (en) * 2007-03-19 2008-10-02 Ricoh Co Ltd Image processing apparatus
CN105631449B (en) * 2015-12-21 2019-06-28 华为技术有限公司 A kind of picture segmentation method, device and equipment
CN105760842A (en) * 2016-02-26 2016-07-13 北京大学 Station caption identification method based on combination of edge and texture features




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant