CN114821134A - Method for identifying print style number of publication based on template matching - Google Patents

Publication number
CN114821134A
Authority
CN
China
Prior art keywords
layer
template
numbers
weighting
pixel point
Prior art date
Legal status
Granted
Application number
CN202210753828.9A
Other languages
Chinese (zh)
Other versions
CN114821134B (en)
Inventor
王艳彬 (Wang Yanbin)
孙宪景 (Sun Xianjing)
郑博 (Zheng Bo)
Current Assignee
Shandong Blue Color World Education Technology Co ltd
Original Assignee
Shandong Blue Color World Education Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shandong Blue Color World Education Technology Co., Ltd.
Priority to CN202210753828.9A
Publication of CN114821134A
Application granted
Publication of CN114821134B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00 Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/40 Document-oriented image-based pattern recognition
    • G06V30/41 Analysis of document content
    • G06V30/418 Document matching, e.g. of document images
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The invention discloses a method for identifying printed numbers in publications based on template matching, belonging to the technical field of image processing. The method comprises the following steps: presetting a plurality of template layers containing different numbers; acquiring a digital image to be identified; acquiring a plurality of candidate template layers that match the number on a weighted single layer; acquiring a position feature set and a stroke-thickness feature set of the number on each weighted single layer; determining the candidate template layer that matches the number on the weighted single layer, and reading the digit on the weighted single layer from the matched candidate template layer; and identifying each digit in the digital image in the same way. By weighting the pixel values in the digit region of the template image, the method improves the template-matching digit recognition algorithm and increases both the efficiency and the accuracy of recognizing printed numbers in publications.

Description

Method for identifying print style number of publication based on template matching
Technical Field
The invention relates to the technical field of image processing, in particular to a method for recognizing printed numbers of publications based on template matching.
Background
Printed-number recognition has long been a research hotspot in the field of pattern recognition. With the rapid spread of information in modern society, we are surrounded by digits: phone numbers, driving licence numbers, identification card numbers and medical examination forms all express identity, capability, objects and health status as combinations of the computer-readable Arabic numerals 0 to 9.
Therefore, a key point in designing a processing system for such problems is a digit recognition method with high reliability and a high recognition rate. However, no existing recognition method achieves perfect recognition. Traditional matching algorithms involve a large and complex amount of computation and match inefficiently; variation in the mean grey level of the scene in the image affects the correctness of the matching result, and the algorithms cannot adapt to rotation or scaling of the scene.
Disclosure of Invention
The invention provides a method for identifying printed numbers in publications based on template matching. By weighting the pixel values in the digit region of the template image, it improves the template-matching digit recognition algorithm and increases both the efficiency and the accuracy of recognizing printed numbers in publications.
The invention aims to provide a method for identifying printed numbers in publications based on template matching, comprising the following steps:
presetting a plurality of template layers containing different numbers, where each template layer is a binary image and all are of equal size;
acquiring a digital image to be identified, and dividing each digit in the digital image into a first single layer according to its connected domain; normalizing each first single layer to obtain second single layers of the same size as the template layers; in each template layer and second single layer, the grey value of pixels in the digit region is 1 and that of pixels in the background region is 0;
traversing each pixel of each template layer along the horizontal direction to obtain the sequences of consecutively arranged pixels with grey value 1; acquiring the step length of each pixel sequence in each template layer and the weight of each pixel in the sequence; weighting the template layer according to these weights to obtain, in turn, the weighted template layer for each number; processing each second single layer in the same way to obtain the weighted single layers;
arranging several horizontal lines at equal intervals from top to bottom on each weighted template layer, and counting the first intersection number of each line with the number on the corresponding weighted template layer; likewise obtaining the second intersection number of each line with the number on each weighted single layer;
acquiring the candidate template layers that match the number on a weighted single layer from the second intersection numbers of its horizontal lines and the first intersection numbers on each weighted template layer;
acquiring a position feature set of the number on each candidate template layer from the weights of all pixels on each of its horizontal lines, and a stroke-thickness feature set from the step lengths of the pixel sequences corresponding to each horizontal line; likewise obtaining the position and stroke-thickness feature sets of the number on each weighted single layer;
determining, from the position and stroke-thickness feature sets of the number on the weighted single layer and those of the number on each candidate template layer, the candidate template layer that matches the number on the weighted single layer, and reading the digit on the weighted single layer from the matched candidate; and
identifying each digit in the digital image in the same way.
In an embodiment, the step length of each pixel sequence and the weights of its pixels are obtained as follows:
counting the pixels in each sequence to obtain its length;
setting the weights to vary as an arithmetic progression from the pixels at the two ends of each sequence to the central pixel, setting the sum of the weights of the pixels in each sequence to 1, and setting the initial weight of the pixels at both ends of each sequence to a_1 = 1/N, where N is the transverse length of the template layer;
acquiring the step length of each sequence from its length and the initial weight of the pixels at its two ends;
and then acquiring the weight of each pixel in the sequence from the step length and the initial weight of the pixels at the two ends.
In an embodiment, if only one candidate template layer is obtained in the process of acquiring the candidate template layers matching the number on the weighted single layer, the digit on the weighted single layer is read directly from that candidate template layer.
In an embodiment, the candidate template layers matching the number on the weighted single layer are obtained as follows:
acquiring the position of each horizontal line on a weighted single layer and its second intersection number with the number on that layer; acquiring the first intersection number of the horizontal line at the same position on each weighted template layer with the number on that template layer;
and judging, from the differences between the second intersection number of each horizontal line on the weighted single layer and the first intersection number of the line at the same position on each weighted template layer, which weighted template layers match the number on the weighted single layer, namely the candidate template layers.
In an embodiment, the calculation formula for judging the weighted template layers that match the number on the weighted single layer is as follows:

D = Σ_j | c_j − x_j |  (summed over the horizontal lines)

where c_j denotes the second intersection number of the j-th horizontal line with the number on the weighted single layer; x_j denotes the first intersection number of the j-th horizontal line with the number on the weighted template layer; and D denotes the sum over all lines of the absolute differences between the first and second intersection numbers at the same positions.
When D ≠ 0, the number on the weighted single layer is judged not to match the number on the weighted template layer;
when D = 0, the number on the weighted single layer is judged to match the number on the weighted template layer, and the weighted template layers matching the number on the weighted single layer, namely the candidate template layers, are collected in turn.
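This coarse screening can be sketched in Python. The per-line intersection counts in the example are illustrative placeholders, not values taken from the patent:

```python
def coarse_match(c, x):
    """True when every horizontal line crosses the digit the same number
    of times on the weighted single layer (c) and on the weighted
    template layer (x), i.e. D = sum |c_j - x_j| equals 0."""
    return sum(abs(cj - xj) for cj, xj in zip(c, x)) == 0


def candidate_templates(c, templates):
    """Keep only the template layers whose intersection counts all agree
    with those of the unknown digit."""
    return [digit for digit, x in templates.items() if coarse_match(c, x)]
```

For example, with hypothetical counts {'0': [2, 2, 2], '1': [1, 1, 1], '8': [2, 2, 2]}, an unknown digit counted as [2, 2, 2] keeps '0' and '8' for the fine match and discards '1' immediately, which is the source of the speed-up the patent claims.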
In an embodiment, the position feature set of the number on each candidate template layer is obtained as follows:
taking the weights of all pixels on each horizontal line of the candidate template layer and, in the top-to-bottom order of the lines in that layer, arranging these weights in sequence to obtain the position feature set, namely

W = {w_1, w_2, …, w_{3N}}

where N denotes the transverse length of the candidate template layer (three lines of N weights each).
In an embodiment, the stroke-thickness feature set of the number on each candidate template layer is obtained as follows:
taking the step lengths of the pixel sequences corresponding to each horizontal line in the candidate template layer and, in the top-to-bottom order of the lines in that layer, arranging them in sequence to obtain the stroke-thickness feature set, namely

T = {t_1, t_2, …, t_m}

where m denotes the total number of intersection points of all the horizontal lines on the candidate template layer.
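The two feature sets can be assembled as follows. The three line positions H/4, H/2, 3H/4 follow the embodiment described later, and `step_of_run`, which maps a run's (row, start column) to its step length, is a hypothetical helper structure, not a name from the patent:

```python
def feature_sets(weighted_layer, step_of_run, h):
    """Build the position feature set W (the weights of all pixels on the
    three horizontal lines, collected top to bottom) and the
    stroke-thickness feature set T (the step lengths of the pixel runs
    each line crosses)."""
    W, T = [], []
    for frac in (0.25, 0.5, 0.75):          # lines at H/4, H/2, 3H/4
        y = int(h * frac)
        row = weighted_layer[y]
        W.extend(row)                        # N weights per line
        x = 0
        while x < len(row):                  # one entry of T per run crossed
            if row[x] > 0:
                start = x
                while x < len(row) and row[x] > 0:
                    x += 1
                T.append(step_of_run[(y, start)])
            else:
                x += 1
    return W, T
```

W always has 3N entries, while the length of T depends on how many strokes the three lines cross, matching the definition of m above.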
In an embodiment, in the process of determining the candidate template layer that matches the number on the weighted single layer, the match is judged from the element-wise numerical differences between the position feature set of the number on the weighted single layer and that of each candidate template layer, and between the stroke-thickness feature set of the number on the weighted single layer and that of each candidate template layer.
In an embodiment, the calculation formula for determining the candidate template layer that matches the number on the weighted single layer is as follows:

E = Σ_{i=1}^{3N} | W_i − W'_i | + Σ_{k=1}^{m} | T_k − T'_k |

where W denotes the position feature set of the weighted single layer; W' = {w_1, w_2, …, w_{3N}} denotes the position feature set of the candidate template layer, so that | W_i − W'_i | is the absolute value of the difference of the i-th values of the two position feature sets; T denotes the stroke-thickness feature set of the weighted single layer; T' = {t_1, t_2, …, t_m} denotes the stroke-thickness feature set of the candidate template layer, so that | T_k − T'_k | is the absolute value of the difference of the k-th values of the two stroke-thickness feature sets; and E denotes the matching error between the number on the weighted single layer and the number on the candidate template layer.
A tolerable error ε is obtained. When E ≤ ε, the number on the weighted single layer is judged to match the number on the candidate template layer; when E > ε, it is judged not to match. The tolerable error ε is derived from the E values calculated between numbers on weighted single layers and candidate template layers that are known to be correctly matched.
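A sketch of this fine match in Python, under the reconstructed error formula above; the feature values and the tolerance `eps` in the example are illustrative only:

```python
def match_error(w, w_t, t, t_t):
    """E = sum_i |W_i - W'_i| + sum_k |T_k - T'_k|: element-wise absolute
    differences of the position sets (w vs. w_t) plus those of the
    stroke-thickness sets (t vs. t_t)."""
    return (sum(abs(a - b) for a, b in zip(w, w_t))
            + sum(abs(a - b) for a, b in zip(t, t_t)))


def fine_match(w, w_t, t, t_t, eps):
    """Accept the candidate when the matching error is within the
    tolerable error eps (calibrated on known correct matches)."""
    return match_error(w, w_t, t, t_t) <= eps
```

In practice eps would be set from E values measured on digit/template pairs known to match, as the text describes; here it is simply passed in.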
In one embodiment, dividing each digit in the digital image into first single layers according to its own connected domain further comprises:
acquiring the digital image to be identified and converting it, after greyscaling, into a binary image;
performing an opening operation on the binary image to remove isolated small points and burrs;
and dividing out the first single layers according to the connected domain of each number in the binary image.
The invention has the following beneficial effects:
The method performs a first, coarse match by counting the intersection points of horizontal lines with the numbers on a layer; because this greatly reduces the data volume, it markedly improves the matching speed. The second match, on digit position and stroke thickness, is a precise match carried out on the weight and variation characteristics of each pixel, which reduces the error. Fewer template comparisons shorten the computation time, and the weighted features reduce the matching error, improving both the efficiency and the accuracy of printed-number recognition; as the character template library grows, the rejection rate falls steadily and the recognition rate rises.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a flow chart of the general steps of an embodiment of the method for identifying printed numbers in publications based on template matching according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The invention addresses the problem that printed-number recognition in publications is strongly affected by the quality of the scanned digital image: occlusion, defects and contamination of the character image lower the recognition rate. By weighting the pixel values in the digit region of the template image, the template-matching digit recognition algorithm is improved, and the efficiency and accuracy of recognizing printed numbers in publications are increased.
According to the morphological thinning principle, the edge points and the centre points of a character stroke differ in importance: importance falls gradually from the "skeleton", that is, the centre point, towards the two edge points. Following this logic, the concept of weight can be added to the character template to generate a feature-weighted character template, which improves the template-matching digit recognition algorithm and increases the efficiency and accuracy of recognizing printed numbers in publications.
The invention provides a method for identifying printed numbers in publications based on template matching, which, as shown in Fig. 1, comprises the following steps:
s1, presetting a plurality of template layers containing different numbers, wherein each template layer is a binary image and has equal size;
in this embodiment, the standard digital template image size is set as
Figure 122173DEST_PATH_IMAGE025
Therefore, the divided digital images are normalized
Figure 165215DEST_PATH_IMAGE025
A pixel-sized template layer.
S2, acquiring the digital image to be identified and dividing each digit in it into a first single layer according to its connected domain; normalizing each first single layer to obtain second single layers of the same size as the template layers; in each template layer and second single layer, the grey value of pixels in the digit region is 1 and that of pixels in the background region is 0;
wherein dividing each digit in the digital image into first single layers according to its own connected domain further comprises:
acquiring the digital image to be identified and converting it, after greyscaling, into a binary image;
performing an opening operation on the binary image to remove isolated small points and burrs;
and dividing out the first single layers according to the connected domain of each number in the binary image.
In this embodiment, the printed numbers of the publication are converted into image information by optical scanning. Because noise introduced during acquisition and transmission would disturb subsequent image processing and analysis, an adaptive median filter is applied: it removes the most probable noise and smooths non-impulse noise while preserving detail. This embodiment uses an adaptive median filter with a 3 × 3 template.
The digital image is then greyscaled, a grey threshold T is obtained with the Otsu algorithm, and the image is binarized against T so that pixels in the digit region are 1 and pixels in the background region are 0.
An opening operation is applied to the binary image to remove isolated small points and burrs.
The characters are segmented according to the connected domains of the digit regions: each individual number forms a connected image domain, and the start and end positions of the rows and columns of each connected domain give the minimum bounding rectangle of a single character, completing the segmentation and yielding the first single layer of that number.
Finally, the first single layers, which differ in size, are geometrically transformed to the same size as the template layers; a bilinear interpolation algorithm normalizes the first single layer of every number, yielding the corresponding second single layer.
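The segmentation step can be sketched in Python. This is a minimal illustration using plain lists of 0/1 pixels and 4-connectivity (the patent does not fix the connectivity), and it stops at the bounding boxes; the bilinear normalization is omitted:

```python
def connected_components(img):
    """Find the 4-connected components of a binary image and return one
    bounding box (top, left, bottom, right) per digit, in left-to-right
    reading order."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    boxes = []
    for y in range(h):
        for x in range(w):
            if img[y][x] == 1 and not seen[y][x]:
                # flood-fill one connected domain, tracking its extent
                stack, t, l, b, r = [(y, x)], y, x, y, x
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    t, l = min(t, cy), min(l, cx)
                    b, r = max(b, cy), max(r, cx)
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                boxes.append((t, l, b, r))
    return sorted(boxes, key=lambda bx: bx[1])
```

Each returned box is the minimum bounding rectangle of one digit, i.e. one first single layer before normalization.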
S3, traversing each pixel of each template layer along the horizontal direction to obtain the sequences of consecutively arranged pixels with grey value 1; acquiring the step length of each pixel sequence in each template layer and the weight of each pixel in the sequence; weighting the template layer according to these weights to obtain, in turn, the weighted template layer for each number; processing each second single layer in the same way to obtain the weighted single layers;
it should be noted that the centre point of a digit stroke in the template layer contributes more, while a stroke edge point has less influence on recognizing the whole character; the pixels of the digit region are therefore weighted according to their positions in the template image to improve the accuracy of template matching.
The step length of each pixel sequence and the weights of its pixels are obtained as follows:
counting the pixels in each sequence to obtain its length;
setting the weights to vary as an arithmetic progression from the pixels at the two ends of each sequence to the central pixel, setting the sum of the weights of the pixels in each sequence to 1, and setting the initial weight of the pixels at both ends of each sequence to a_1 = 1/N, where N is the transverse length of the template layer;
acquiring the step length of each sequence from its length and the initial weight of the pixels at its two ends;
and then acquiring the weight of each pixel in the sequence from the step length and the initial weight of the pixels at the two ends.
In this embodiment, the pixel sequences of the whole image are obtained as follows:
(1) starting from the pixel in the upper-left corner, traverse the template layer from left to right; when a pixel with value 1 is met, mark it A and continue until a pixel with value 0 is met, marking the previous pixel B. This yields one consecutively arranged sequence of pixels with grey value 1, with front-end pixel A and rear-end pixel B; counting the pixels in the sequence gives the length C from A to B;
(2) continue traversing the row; whenever another pixel with value 1 is met, mark it as in step (1) and record the sequence and its length, until the row has been traversed;
(3) repeat steps (1) and (2) on the next row until the whole image has been traversed.
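Steps (1) to (3) amount to a run-length scan of each row; a minimal sketch with plain lists of 0/1 pixels, returning each run as (row, start column, length C):

```python
def horizontal_runs(layer):
    """Scan each row left to right; a run is a maximal sequence of
    consecutive 1-pixels, with endpoints A and B and length C in the
    notation of the text above."""
    runs = []
    for y, row in enumerate(layer):
        x = 0
        while x < len(row):
            if row[x] == 1:
                start = x                      # pixel A
                while x < len(row) and row[x] == 1:
                    x += 1                     # advance to just past B
                runs.append((y, start, x - start))
            else:
                x += 1
    return runs
```

The number of returned runs inside the digit region is the n of the length set described next.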
This yields the set of sequence lengths {C_1, C_2, …, C_n}, where n is the number of pixel sequences in the digit region. Take the z-th pixel sequence, whose length is C_z.
In this embodiment, the weights on a pixel sequence increase gradually from the two edge pixels towards the central pixel: the weights from the two edge points A and B to the centre point Q form two identical, gradually increasing arithmetic progressions AQ and BQ, with initial term a_1 = 1/N, where N is the transverse length of the template layer. Since the sum of the weights on the pixel sequence is 1, the properties of the arithmetic progression give the following.
if it is
Figure 685802DEST_PATH_IMAGE027
Even number, the step length of the two same arithmetic progression
Figure 137643DEST_PATH_IMAGE029
Comprises the following steps:
Figure 736115DEST_PATH_IMAGE030
wherein the content of the first and second substances,
Figure 816066DEST_PATH_IMAGE027
representing the total length of two identical arithmetic progression,
Figure 45053DEST_PATH_IMAGE031
for the initial term, two parameters are known, and the step length is obtained
Figure 843245DEST_PATH_IMAGE029
The value of (c).
If C_z is odd, the two identical arithmetic progressions share the centre point Q, and the step length d is:

d = 4(1 − C_z · a_1) / (C_z − 1)²

where C_z is the total length of the two identical arithmetic progressions and a_1 is the initial term; with both parameters known, the value of the step length d is obtained. The data sets of the two arithmetic progressions are thus obtained and assigned to the pixel sequence, giving the weight of each of its pixels.
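The derivation can be checked numerically. The sketch below assumes the reconstructed initial term a_1 = 1/N (the original formula survives only as an image) and solves the progression so that the weights of a run of length C_z sum to 1; runs of length 1 or 2, which the closed forms do not cover, are handled as special cases:

```python
def run_weights(c_z, n_cols):
    """Weights of one run of c_z consecutive foreground pixels.

    Weights rise in an arithmetic progression from the ends A and B
    toward the centre Q and sum to 1. The edge weight a_1 = 1/n_cols is
    an assumed reconstruction, with n_cols the transverse length N."""
    a1 = 1.0 / n_cols
    if c_z == 1:
        return [1.0]            # a single pixel carries the whole weight
    if c_z == 2:
        return [0.5, 0.5]       # closed forms below would divide by zero
    if c_z % 2 == 0:
        # two equal progressions of length c_z/2
        d = 4.0 * (1.0 - c_z * a1) / (c_z * (c_z - 2))
        half = [a1 + i * d for i in range(c_z // 2)]
        return half + half[::-1]
    # odd: the two progressions share the centre pixel Q
    d = 4.0 * (1.0 - c_z * a1) / ((c_z - 1) ** 2)
    half = [a1 + i * d for i in range(c_z // 2 + 1)]  # includes centre
    return half + half[:-1][::-1]
```

A useful consistency check: for a full-width run (C_z = N) the step length is 0 and every weight is 1/N, which still sums to 1.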
In the same way, the weights of the pixels in all n pixel sequences are obtained, that is, the weight w(i, j) of every pixel in the digit region, where (i, j) are the coordinates of the pixel in the digit region.
Finally, the template layers of all the numbers, covering every digit from 0 to 9, are weighted in this way to produce the weighted template layer corresponding to each number.
S4, arranging a plurality of transverse lines on each weighting template layer at equal intervals from top to bottom, and respectively counting the number of first intersection points of each transverse line and the number on the corresponding weighting template layer; sequentially analogizing to obtain the number of second intersection points of each transverse line and the number on the weighting list layer;
acquiring a plurality of template layers to be selected which are matched with the numbers on the weighting single layers according to the number of second intersection points of each transverse line on one weighting single layer and the number of first intersection points on each weighting template layer;
in this embodiment, three horizontal lines l1, l2 and l3 are arranged on each weighted template layer from top to bottom at equal intervals; the three horizontal lines divide the weighted template layer into four parts and are respectively located at the heights H/4, H/2 and 3H/4 of the weighted template layer, wherein H is the longitudinal length of the weighted template layer. The sums of the weights of the pixel points that l1, l2 and l3 respectively cross on the weighted template layer are counted and recorded as B_i, wherein i ranges from 1 to 3; since the sum of the weights of the pixel points in a single pixel point sequence is 1, B_i equals the first number of intersection points of the i-th horizontal line and the number.

Similarly, the second number of intersection points A_i of the horizontal lines and the numbers on the weighted single layer is obtained.
In addition, in the process of acquiring a plurality of template layers to be selected, which are matched with the numbers on the weighting single layer, if one template layer to be selected is acquired, the number information on the weighting single layer is acquired according to the template layer to be selected.
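Because each weighted run of pixels sums to 1, the intersection counts can be read off directly as weight sums along the three lines. A short sketch under the embodiment's H/4, H/2, 3H/4 placement (the function name is mine):

```python
import numpy as np

def intersection_counts(weighted, H):
    # Sum the weights along the three horizontal lines at heights H/4, H/2
    # and 3H/4; every stroke the line crosses contributes exactly 1 to the
    # sum, so rounding the sum recovers the intersection count per line.
    rows = [H // 4, H // 2, 3 * H // 4]
    return tuple(int(round(weighted[r].sum())) for r in rows)
```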
The intersection feature table of the horizontal lines and the numbers on the weighted template layers is shown in Table 1 below:

Table 1: intersection points of the horizontal lines and the numbers on the weighted template layers

Digit                          0  1  2  3  4  5  6  7  8  9
Intersections at height H/4    2  1  2  2  2  1  2  1  2  2
Intersections at height H/2    2  1  1  1  2  1  2  1  1  1
Intersections at height 3H/4   2  1  2  2  1  2  2  1  2  2
It should be noted that the number of intersection points obtained differs when the fonts of the numbers in the template layer and the single layer differ; for this reason, this embodiment specifies that the number font in both the template layer and the single layer is Arial.
The plurality of template layers to be selected which are matched with the numbers on the weighting single layer are obtained according to the following steps:
acquiring the position of each transverse line on a weighting list layer and the number of second intersection points between the position and the numbers on the weighting list layer; acquiring the number of first intersection points of transverse lines at the same position on the weighting single layer and numbers on the weighting template layer;
and judging a plurality of weighting template layers matched with the numbers on the weighting list layers according to the difference between the position of each transverse line on one weighting list layer and the number of first intersection points of the position and the numbers on the weighting list layer, and the number of second intersection points of the transverse lines at the same position on the weighting list layer and the numbers on the weighting template layers, namely the plurality of weighting template layers to be selected.
Specifically, the calculation formula for determining the plurality of weighted template layers matched with the numbers on the weighted single layer is as follows:

D_i = |A_i − B_i|

in the formula, A_i represents the number of second intersection points of the i-th horizontal line on the weighted single layer and the numbers on the weighted single layer; B_i represents the number of first intersection points of the i-th horizontal line on the weighted template layer and the numbers on the weighted template layer; D_i represents the difference value between the two numbers of intersection points for the horizontal lines at the same position.

When D_i ≠ 0 for any i, it is judged that the numbers on the weighted single layer are not matched with the numbers on the weighted template layer; when D_i = 0 for all i, it is judged that the numbers on the weighted single layer are matched with the numbers on the weighted template layer, and the plurality of weighted template layers matched with the numbers on the weighted single layer, namely the plurality of template layers to be selected, are sequentially obtained.
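The coarse match (all D_i = 0) amounts to a lookup against Table 1. A sketch assuming the Arial counts from the embodiment:

```python
# Intersection counts (B_1, B_2, B_3) of the three horizontal lines with
# each Arial digit, taken from Table 1 of the embodiment.
TEMPLATE_INTERSECTIONS = {
    0: (2, 2, 2), 1: (1, 1, 1), 2: (2, 1, 2), 3: (2, 1, 2), 4: (2, 2, 1),
    5: (1, 1, 2), 6: (2, 2, 2), 7: (1, 1, 1), 8: (2, 1, 2), 9: (2, 1, 2),
}

def coarse_candidates(a):
    # Keep the digits whose template counts B_i all equal the observed
    # counts A_i on the weighted single layer, i.e. D_i = |A_i - B_i| = 0.
    return [digit for digit, b in TEMPLATE_INTERSECTIONS.items()
            if all(ai == bi for ai, bi in zip(a, b))]
```

For example, observed counts (2, 2, 2) leave only the digits 0 and 6 for the fine matching stage, which is why the second stage works on a much smaller candidate set.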
S5, acquiring a position feature set of numbers on each template layer to be selected according to the weight of all pixel points on each transverse line corresponding to the template layer to be selected in each template layer to be selected; acquiring a stroke thickness characteristic set of numbers on each template layer to be selected according to the step length of the pixel point sequence corresponding to each horizontal line in each template layer to be selected; sequentially analogizing to obtain a position feature set and a stroke thickness feature set of the numbers on each weighted list layer;
it should be noted that the longer a pixel point sequence is, the smaller its step length is. The length of the pixel point sequence at each intersection point of the three horizontal lines l1, l2 and l3 with a character is the thickness of the character stroke at that intersection point, so the thickness of the character stroke at the intersection point can be represented by the step length.
The position feature set of the numbers on each template layer to be selected is obtained according to the following steps:
according to the weights of all the pixel points on each horizontal line corresponding to each template layer to be selected, the weights of all the pixel points on each horizontal line are arranged in sequence from top to bottom according to the order of the horizontal lines in the template layer to be selected, so as to obtain the position feature set, namely

G = {g_1, g_2, …, g_{3N}}

wherein N represents the transverse length of the template layer to be selected, each horizontal line contributing N weights.
The stroke weight characteristic set of the numbers on each template layer to be selected is obtained according to the following steps:
according to the step length of the pixel point sequence corresponding to each horizontal line in each template layer to be selected, the step lengths of the pixel point sequences corresponding to the horizontal lines are arranged in sequence from top to bottom according to the order of the horizontal lines in the template layer to be selected, so as to obtain the stroke thickness feature set, namely

T = {t_1, t_2, …, t_m}

wherein m represents the total number of intersection points of all the horizontal lines on the template layer to be selected.
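Both feature sets can be assembled in one pass over the three lines. A sketch under assumed inputs (`layer` is the binary layer, `weighted` its weighted counterpart); the step length per crossed run is recomputed from the run length with initial term 1/N, matching the weighting step, and very short runs get step 0:

```python
import numpy as np

def feature_sets(layer, weighted, H):
    # G: position feature set = the N weights on each of the three lines,
    # concatenated top to bottom (3N values in total).
    # T: stroke-thickness feature set = one step length per intersection;
    # a longer run has a smaller step, so the step encodes stroke thickness.
    N = layer.shape[1]
    rows = [H // 4, H // 2, 3 * H // 4]
    G, T = [], []
    for r in rows:
        G.extend(weighted[r].tolist())
        c = 0
        while c < N:
            if layer[r, c]:
                s = c
                while c < N and layer[r, c]:
                    c += 1
                n = c - s
                if n > 2:  # arithmetic-progression step of this run
                    t = (4 * (1 - n / N) / (n * (n - 2)) if n % 2 == 0
                         else 4 * (1 - n / N) / (n - 1) ** 2)
                else:
                    t = 0.0
                T.append(t)
            else:
                c += 1
    return G, T
```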
S6, judging and acquiring a template layer to be selected which is matched with the numbers on the weighting list layer according to the position feature set and the stroke thickness feature set of the numbers on the weighting list layer and the position feature set and the stroke thickness feature set of the numbers on each template layer to be selected, and acquiring the digital information on the weighting list layer according to the matched template layer to be selected;
the analogy in turn identifies the information for each digit in the digital image.
In the process of judging and acquiring the template layer to be selected matched with the numbers on the weighting single layer, judging the template layer to be selected matched with the numbers on the weighting single layer according to the numerical difference value of the corresponding position in the position feature set of the numbers on the weighting single layer and the position feature set of the numbers on each template layer to be selected, and the numerical difference value of the corresponding position in the stroke thickness feature set of the numbers on the weighting single layer and the stroke thickness feature set of the numbers on each template layer to be selected.
Specifically, the calculation formula for judging the template layer to be selected that is matched with the numbers on the weighted single layer is as follows:

S = Σ_{k=1}^{3N} |G_k − G′_k| + Σ_{j=1}^{m} |T_j − T′_j|

in the formula, G represents the position feature set corresponding to the weighted single layer; G′ represents the position feature set corresponding to the template layer to be selected; |G_k − G′_k| is the absolute value of the difference of the k-th values of the two position feature sets; T represents the stroke thickness feature set corresponding to the weighted single layer; T′ represents the stroke thickness feature set corresponding to the template layer to be selected; |T_j − T′_j| is the absolute value of the difference of the j-th values of the two stroke thickness feature sets; S represents the matching error value of the numbers on the weighted single layer and the numbers on the template layer to be selected.

A tolerable error component ε is obtained. When S ≤ ε, it is judged that the numbers on the weighted single layer are matched with the template layer to be selected; when S > ε, it is judged that the numbers on the weighted single layer are not matched with the template layer to be selected. If the two characters are judged not to be matched, the method returns to the matching of the intersection point feature D_i and compares with the next template character; if no matched character exists in the template library, rejection information is sent to request manual identification.

The tolerable error component ε is obtained as follows: weighted single layers and template layers to be selected that are known to be correctly matched are obtained according to manual identification, and their S values are calculated, so as to obtain the set of S values of correct matches; the maximum value of this set is taken as the tolerable error component ε.
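The fine match reduces to a sum of absolute differences against each surviving candidate. A minimal sketch (the names and data layout are mine, not the patent's):

```python
def match_error(G_s, G_t, T_s, T_t):
    # S = sum_k |G_k - G'_k| + sum_j |T_j - T'_j|: position-feature and
    # stroke-thickness differences between the single layer and a candidate.
    return (sum(abs(a - b) for a, b in zip(G_s, G_t)) +
            sum(abs(a - b) for a, b in zip(T_s, T_t)))

def fine_match(single_feats, candidates, eps):
    # single_feats: (G, T) of the weighted single layer.
    # candidates: list of (digit, (G', T')) surviving the coarse stage.
    # Accept the candidate with the smallest error S <= eps; return None
    # (rejection -> manual identification) if no candidate qualifies.
    best = None
    for digit, tmpl_feats in candidates:
        S = match_error(single_feats[0], tmpl_feats[0],
                        single_feats[1], tmpl_feats[1])
        if S <= eps and (best is None or S < best[1]):
            best = (digit, S)
    return best[0] if best is not None else None
```

Here `eps` plays the role of the tolerable error component: the maximum S observed over manually verified correct matches.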
According to the above steps, the number B_i of first intersection points of the i-th horizontal line and the numbers on the template layer, together with the position feature set and the stroke thickness feature set of the numbers on each template layer, are sequentially obtained for the template layers corresponding to the digits 0 to 9, and a template library is constructed.

According to the acquired digital image to be recognized, the number A_i of second intersection points of the i-th horizontal line and the numbers on each first single layer in the digital image, together with the position feature set and the stroke thickness feature set of the numbers on the single layer, are obtained through the above operations; the template layer matched with the numbers on the single layer to be recognized is then acquired from the template library through the number of intersection points, the position feature set and the stroke thickness feature set, so that the number information on the unknown digital image to be recognized is identified according to the known template layer.
In this embodiment, the method further includes updating the template library to improve the recognition rate of the characters: the method comprises the following specific steps:
in this embodiment, the feature table of a recognized character is compared with the feature tables of the standard characters in the template library one by one, and when the feature table of a certain character in the template library matches it, the recognized character is determined to be that standard character. However, due to the diversity of number fonts, a recognized character may have no matching standard character in the template library; the system then sends rejection information to prompt the user that the recognition result is not accurate enough and to suggest manual identification, after which the feature table of the rejected character is added to the template library in the form of a template file through learning, so that characters with these features can be accurately recognized later.
In this embodiment, the number of intersection points between the horizontal lines and the numbers on the layers is calculated to perform the first, coarse matching; since the data amount is greatly reduced, the matching speed is significantly increased. The second matching, on the digital position and the stroke thickness, is an accurate matching performed according to the weight and variation characteristics of each pixel point, which reduces the error. The number of template matching operations is reduced and the calculation time is short, while the feature weighting reduces the matching error, improving the efficiency and accuracy of print number recognition; as the character template library grows, the character rejection rate continuously decreases and the character recognition rate improves.
The present invention is not limited to the above preferred embodiments, and any modifications, equivalent substitutions, improvements, etc. within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A method for recognizing print numbers of publications based on template matching is characterized by comprising the following steps:
presetting a plurality of template layers containing different numbers, wherein each template layer is a binary image and has equal size;
the method comprises the steps of obtaining a digital image to be identified, and dividing each digital in the digital image into a plurality of first single image layers according to a communication domain of each digital; normalizing each first single image layer to obtain a plurality of second single image layers with the same size as the template image layers; in each template layer and the second single layer, the gray value of the pixel point in the digital area is 1, and the gray value of the pixel point in the background area is 0;
traversing each pixel point of each template layer along the horizontal direction to obtain a plurality of pixel point sequences which are continuously arranged and have a gray value of 1; acquiring the step length of each pixel point sequence in each template layer and the weight of a pixel point in each pixel point sequence; weighting the template image layer according to the weight of the pixel points in each pixel point sequence to sequentially obtain a weighted template image layer corresponding to each number; sequentially comparing each second single image layer in a similar manner, and performing weighting processing to obtain a plurality of weighted single image layers;
arranging a plurality of transverse lines on each weighting template layer at equal intervals from top to bottom, and respectively counting the number of first intersection points of each transverse line and the number on the corresponding weighting template layer; sequentially analogizing to obtain the number of second intersection points of each transverse line and the number on the weighting list layer;
acquiring a plurality of template layers to be selected which are matched with the numbers on the weighting single layers according to the number of second intersection points of each transverse line on one weighting single layer and the number of first intersection points on each weighting template layer;
acquiring a position feature set of numbers on each template layer to be selected according to the weight of all pixel points on each horizontal line corresponding to the template layer to be selected in each template layer to be selected; acquiring a stroke thickness characteristic set of numbers on each template layer to be selected according to the step length of the pixel point sequence corresponding to each horizontal line in each template layer to be selected; sequentially analogizing to obtain a position feature set and a stroke thickness feature set of the numbers on each weighted list layer;
judging and acquiring a template layer to be selected matched with the numbers on the weighting list layer according to a position feature set and a stroke thickness feature set of the numbers on the weighting list layer and a position feature set and a stroke thickness feature set of the numbers on each template layer to be selected, and acquiring digital information on the weighting list layer according to the matched template layer to be selected;
the analogy in turn identifies the information for each digit in the digital image.
2. The method for identifying printed numbers of publications based on template matching according to claim 1, wherein the step length of each pixel point sequence and the weight of the pixel points in each pixel point sequence are obtained by the following steps:
counting the number of pixels in each pixel sequence to obtain the length of each pixel sequence;
setting the weight changes from the pixel points at the two ends of each pixel point sequence to the central pixel point to be arithmetic progressions, setting the sum of the weights of the pixel points in each pixel point sequence to be 1, and setting the initial weights of the pixel points at the two ends of each pixel point sequence to be 1/N, wherein N is the transverse length of the template layer;
acquiring the step length of each pixel point sequence according to the length of each pixel point sequence and the initial weights of the pixel points at the two corresponding ends;
and then acquiring the weight of the pixel points in each pixel point sequence according to the step length of each pixel point sequence and the initial weights of the pixel points at the two corresponding ends.
3. The method according to claim 1, wherein in the process of obtaining a plurality of template layers to be selected that match the numbers on the weighted list layer, if one template layer to be selected is obtained, the digital information on the weighted list layer is obtained according to the template layer to be selected.
4. The method for identifying numbers of printed publication based on template matching according to claim 1, wherein the template layers to be selected that match the numbers on the weighted list layer are obtained according to the following steps:
acquiring the position of each transverse line on a weighting list layer and the number of second intersection points between the position and the numbers on the weighting list layer; acquiring the number of first intersection points of transverse lines at the same position on the weighting single layer and numbers on the weighting template layer;
and judging a plurality of weighting template layers matched with the numbers on the weighting list layers according to the difference between the position of each transverse line on one weighting list layer and the number of first intersection points of the position and the numbers on the weighting list layer, and the number of second intersection points of the transverse lines at the same position on the weighting list layer and the numbers on the weighting template layers, namely the plurality of weighting template layers to be selected.
5. The method according to claim 4, wherein the formula for determining the weighted template layers matching the numbers on the weighted list layers is as follows:
D_i = |A_i − B_i|

in the formula, A_i represents the number of second intersection points of the i-th horizontal line on the weighted single layer and the numbers on the weighted single layer; B_i represents the number of first intersection points of the i-th horizontal line on the weighted template layer and the numbers on the weighted template layer; D_i represents the difference value between the two numbers of intersection points for the horizontal lines at the same position;
when D_i ≠ 0 for any i, it is judged that the numbers on the weighted single layer are not matched with the numbers on the weighted template layer;
when D_i = 0 for all i, it is judged that the numbers on the weighted single layer are matched with the numbers on the weighted template layer, and the plurality of weighted template layers matched with the numbers on the weighted single layer, namely the plurality of template layers to be selected, are sequentially acquired.
6. The method for identifying numbers of printed publications based on template matching according to claim 1, wherein the position feature set of the numbers on each template layer to be selected is obtained according to the following steps:
according to the weights of all pixel points on each horizontal line corresponding to the template layer to be selected in each template layer to be selected, sequentially arranging the weights of all the pixel points on each horizontal line corresponding to the template layer to be selected in the template layer to be selected from top to bottom according to the sequence of the horizontal lines in the template layer to be selected to obtain a position feature set, namely the position feature set is
G = {g_1, g_2, …, g_{3N}}, wherein N represents the transverse length of the template layer to be selected.
7. The method for identifying numbers in a print form of a publication according to claim 6, wherein the stroke weight feature set of the numbers on each template layer to be selected is obtained according to the following steps:
according to the step length of the pixel point sequence corresponding to each horizontal line in each template layer to be selected, arranging the step length of the pixel point sequence corresponding to each horizontal line in sequence from top to bottom in the template layer to be selected to obtain a stroke thickness feature set, namely the stroke thickness feature set
T = {t_1, t_2, …, t_m}, wherein m represents the total number of intersection points of all the horizontal lines on the template layer to be selected.
8. The method for recognizing the numbers of the printed publications based on the template matching as claimed in claim 7, wherein in the process of judging and acquiring the template layer to be selected matching with the numbers on the weighted list layer, the template layer to be selected matching with the numbers on the weighted list layer is judged according to the numerical difference between the position feature set of the numbers on a weighted list layer and the corresponding position in the position feature set of the numbers on each template layer to be selected, and the numerical difference between the stroke thickness feature set of the numbers on the weighted list layer and the corresponding position in the stroke thickness feature set of the numbers on each template layer to be selected.
9. The method according to claim 8, wherein the calculation formula for determining the template layer to be selected that matches the number on the weighted list layer is as follows:
S = Σ_{k=1}^{3N} |G_k − G′_k| + Σ_{j=1}^{m} |T_j − T′_j|

in the formula, G represents the position feature set corresponding to the weighted single layer; G′ represents the position feature set corresponding to the template layer to be selected; |G_k − G′_k| is the absolute value of the difference of the k-th values of the two position feature sets; T represents the stroke thickness feature set corresponding to the weighted single layer; T′ represents the stroke thickness feature set corresponding to the template layer to be selected; |T_j − T′_j| is the absolute value of the difference of the j-th values of the two stroke thickness feature sets; S represents the matching error value of the numbers on the weighted single layer and the numbers on the template layer to be selected;
a tolerable error component ε is obtained; when S ≤ ε, it is judged that the numbers on the weighted single layer are matched with the template layer to be selected; when S > ε, it is judged that the numbers on the weighted single layer are not matched with the template layer to be selected,
wherein the tolerable error component ε is obtained by calculating the S values of numbers on weighted single layers and template layers to be selected that are known to be correctly matched, and taking the maximum of those values as ε.
10. The method for identifying numbers on a printed matter based on template matching according to claim 1, wherein in the process of dividing each number in the digital image into a plurality of first single image layers according to its own connected domain, the method further comprises:
acquiring a digital image to be identified; converting the digital image into a binary image after graying;
opening operation processing is carried out on the binary image, and isolated small points and burrs on the digital image are removed;
and dividing a plurality of first single image layers according to the connected domain of each number in the binary image.
CN202210753828.9A 2022-06-30 2022-06-30 Method for identifying print style number of publication based on template matching Active CN114821134B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210753828.9A CN114821134B (en) 2022-06-30 2022-06-30 Method for identifying print style number of publication based on template matching

Publications (2)

Publication Number Publication Date
CN114821134A true CN114821134A (en) 2022-07-29
CN114821134B CN114821134B (en) 2022-09-02

Family

ID=82522229

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210753828.9A Active CN114821134B (en) 2022-06-30 2022-06-30 Method for identifying print style number of publication based on template matching

Country Status (1)

Country Link
CN (1) CN114821134B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010198305A (en) * 2009-02-25 2010-09-09 Amano Corp Vehicle number information reading system
CN102024144A (en) * 2010-11-23 2011-04-20 上海海事大学 Container number identification method
CN102509383A (en) * 2011-11-28 2012-06-20 哈尔滨工业大学深圳研究生院 Feature detection and template matching-based mixed number identification method
CN104463195A (en) * 2014-11-08 2015-03-25 沈阳工业大学 Printing style digital recognition method based on template matching
CN105574531A (en) * 2015-12-11 2016-05-11 中国电力科学研究院 Intersection point feature extraction based digital identification method

Also Published As

Publication number Publication date
CN114821134B (en) 2022-09-02

Similar Documents

Publication Publication Date Title
CN109829453B (en) Method and device for recognizing characters in card and computing equipment
US5901239A (en) Skin pattern and fingerprint classification system
CN108596197B (en) Seal matching method and device
CN110032938B (en) Tibetan recognition method and device and electronic equipment
CN109919160B (en) Verification code identification method, device, terminal and storage medium
CN109740606B (en) Image identification method and device
CN109241861B (en) Mathematical formula identification method, device, equipment and storage medium
US20170308768A1 (en) Character information recognition method based on image processing
CN108197644A (en) A kind of image-recognizing method and device
CN108830275B (en) Method and device for identifying dot matrix characters and dot matrix numbers
WO2017161636A1 (en) Fingerprint-based terminal payment method and device
CN110647795A (en) Form recognition method
CN111523622B (en) Method for simulating handwriting by mechanical arm based on characteristic image self-learning
CN114038004A (en) Certificate information extraction method, device, equipment and storage medium
CN110738030A (en) Table reconstruction method and device, electronic equipment and storage medium
CN115457565A (en) OCR character recognition method, electronic equipment and storage medium
CN111339932B (en) Palm print image preprocessing method and system
CN111950559A (en) Pointer instrument automatic reading method based on radial gray scale
CN114417904A (en) Bar code identification method based on deep learning and book retrieval system
CN114821134B (en) Method for identifying print style number of publication based on template matching
CN112101343A (en) License plate character segmentation and recognition method
CN109726722B (en) Character segmentation method and device
CN114387592B (en) Character positioning and identifying method under complex background
CN111488870A (en) Character recognition method and character recognition device
CN112906690A (en) License plate segmentation model training method, license plate segmentation method and related device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant