CN111553423A - Handwriting recognition method based on deep convolutional neural network image processing technology - Google Patents

Handwriting recognition method based on deep convolutional neural network image processing technology Download PDF

Info

Publication number
CN111553423A
CN111553423A (application CN202010354929.XA)
Authority
CN
China
Prior art keywords
neural network
convolutional neural
network
image
handwriting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010354929.XA
Other languages
Chinese (zh)
Inventor
李明亮
刘海丰
周永旭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei GEO University
Original Assignee
Hebei GEO University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei GEO University filed Critical Hebei GEO University
Priority to CN202010354929.XA priority Critical patent/CN111553423A/en
Publication of CN111553423A publication Critical patent/CN111553423A/en
Pending legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/267Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Character Discrimination (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of image classification, and particularly discloses a handwriting recognition method based on a deep convolutional neural network image processing technology. The deep convolutional neural network provided by the invention can accurately recognize directly extracted handwriting pictures, with high recognition accuracy and fast operation and convergence. The method is suitable for assisting in recognizing the handwriting on a test paper or answer sheet and outputting it as a standard font.

Description

Handwriting recognition method based on deep convolutional neural network image processing technology
Technical Field
The invention belongs to the technical field of image classification, relates to a handwriting recognition method, and particularly relates to a handwriting recognition method based on a deep convolutional neural network image processing technology.
Background
In recent years, large-scale examinations such as the college entrance examination and the national English level examinations have adopted online paper grading, which saves a great deal of material, labor and time. In current online grading, however, examinee answer sheets are only photographed and stored; the answer areas are divided sequentially according to the spacing of the handwriting in each piece of answer handwriting information, forming a question set and an answer-area set that determine the number of questions and of valid answer areas, and each valid answer area is then identified and counted to determine the answers. Because every examinee's handwriting is preserved exactly as written, graders are sometimes forced into guessing at illegible words, and a grader who has been marking a run of neatly written papers may suffer aesthetic fatigue, so that subjective impressions come to dominate the evaluation.
With the advent of TensorFlow, the second-generation machine learning system developed by Google on the basis of DistBelief (Google's distributed deep learning platform), such systems have been widely applied to image recognition, speech recognition and many other machine learning and deep learning fields. TensorFlow passes complex data structures to artificial neural networks for analysis and processing, and can perform image recognition on a PC; in addition, Deep Convolutional Neural Networks (DCNN) have been applied to handwritten Chinese character recognition. At present, however, only printed Chinese characters can be directly extracted and efficiently recognized; directly extracted handwriting cannot be recognized accurately.
Disclosure of Invention
The invention aims to provide a handwriting recognition method based on a deep convolutional neural network image processing technology, so as to solve the problem that the directly extracted handwriting cannot be accurately recognized in the prior art.
To achieve this aim, the invention adopts the following technical scheme:
a handwriting recognition method based on a deep convolutional neural network image processing technology comprises the following steps:
s1, extracting handwriting picture materials from the homework submitted by the student and storing the handwriting picture materials in a PKL format;
s2, reading data in a PKL format, carrying out binarization and normalization processing on the picture material, and then randomly distributing the processed picture material into a first test set, a first verification set and a first training set according to the proportion;
s3, obtaining a distilled data set by the first training set through a data distillation technology, inputting the distilled data set into a convolutional neural network with configured parameters and structures, and training the convolutional neural network into an image optimization network;
s4, inputting data of the first verification set into an image optimization network, classifying the first verification set by the image optimization network, adjusting parameters of the image optimization network according to classification results, training the image optimization network by the first training set again until the effect of classifying the first verification set reaches an expected effect, obtaining the image optimization network, and testing the classification accuracy of the image optimization network by the first test set;
s5, subdividing the picture material into a second test set, a second training set and a second verification set, building a deep convolutional neural network, training the convolutional neural network by using the second training set, verifying by using the second verification set and the second test set, and continuously adjusting network parameters until the expected accuracy is reached;
and S6, scanning the student test paper or answer sheet to read the student writing area, inputting the scanned picture into the image optimization network obtained in the step S4 after binarization and normalization processing, performing image optimization, performing character segmentation, inputting the image into the deep convolutional neural network, converting the handwriting of the student and outputting the handwriting as a standard font.
As a limitation: the handwritten picture materials in step S1 include the HWDB1.1 data set, the MNIST data set, and a self-made data set whose contents are Roman symbols, chemical symbols and mathematical formulas; the self-made data set is produced as follows: character segmentation is performed on pictures of homework containing Roman symbols, chemical symbols and mathematical formulas submitted by students, so that each character is stored as a separate image; the images are then binarized, their pixel size is adjusted, and the standard font of each character is labeled manually.
As a further limitation: the method adopted by the binarization processing in steps S2 and S4 is the Otsu method.
As a further limitation: the normalization processing in steps S2, S4 and S5 uses min-max normalization.
As another limitation: the image optimization network in step S3 is a reverse synthesis space transformation network, and the specific network structure sequentially includes, from top to bottom: two convolutional layers, one pooling layer, two fully-connected layers.
As a further limitation: the deep convolutional neural network in step S5 has, from top to bottom, two convolutional layers, one pooling layer, and two fully-connected layers; the output of the convolutional layers is transformed with a ReLU activation function, and the one-dimensional vector output by the fully-connected layers is classified and output using a softmax cross-entropy loss method.
As a last definition: the character cutting method in step S6 specifically includes: horizontally projecting the picture, finding an upper limit and a lower limit of each line, and then cutting; and then carrying out vertical projection on each cut line, finding the left and right boundaries of each character, and then cutting a single character.
Due to the adoption of the scheme, compared with the prior art, the invention has the beneficial effects that:
(1) according to the handwriting recognition method based on the deep convolutional neural network image processing technology, a structural framework combining an image optimization network and a convolutional neural network is built and trained to form a complete and trained network framework, so that a directly extracted handwriting picture can be accurately recognized;
(2) according to the handwriting recognition method based on the deep convolutional neural network image processing technology, a large amount of handwriting picture materials are used for training an image optimization network, the optimization speed of the image optimization network on the handwriting picture materials is improved, the convolutional neural network is trained by using high-quality data, the recognition accuracy is improved, and the neural network can efficiently and accurately recognize and output pixel values as standard fonts;
(3) according to the handwriting recognition method based on the deep convolutional neural network image processing technology provided by the invention, the handwriting picture materials are stored in a readable PKL format and binarized, which increases the operation speed; normalization is applied, and the output of the convolutional layers is transformed with a ReLU activation function, which accelerates the convergence of the network.
The invention is suitable for assisting in identifying the handwriting on the test paper or the answer sheet and outputting the handwriting as a standard font.
Drawings
The invention is described in further detail below with reference to the figures and the embodiments.
FIG. 1 is a flow chart of network training according to an embodiment of the present invention;
FIG. 2 is a flow chart of handwriting recognition according to an embodiment of the present invention;
FIG. 3 is a handwriting before recognition according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating standard fonts output after recognition according to an embodiment of the present invention.
Detailed Description
The present invention is further described with reference to the following examples, but it should be understood by those skilled in the art that the present invention is not limited to the following examples, and any modifications and variations based on the specific examples of the present invention are within the scope of the claims of the present invention.
Handwriting recognition method based on deep convolutional neural network image processing technology
A handwriting recognition method based on a deep convolutional neural network image processing technology comprises the following steps:
s1, extracting handwriting picture materials from the homework submitted by the student and storing the handwriting picture materials in a PKL format;
In this step, the handwritten picture materials include the HWDB1.1 data set, the MNIST data set, and a self-made data set whose contents are Roman symbols, chemical symbols and mathematical formulas; the self-made data set is produced as follows: character segmentation is performed on pictures of homework containing Roman symbols, chemical symbols and mathematical formulas submitted by students, so that each character is stored as a separate image; the images are then binarized, their pixel size is adjusted, and the standard font of each character is labeled manually.
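As a minimal illustration of the PKL storage of step S1, the following Python sketch serializes image arrays and labels with the standard pickle module. The dictionary keys `images` and `labels` are illustrative assumptions; the patent does not specify the internal layout of the PKL file.

```python
import pickle
import numpy as np

def save_dataset(path, images, labels):
    """Serialize image arrays and their labels into a single .pkl file (step S1)."""
    with open(path, "wb") as f:
        pickle.dump({"images": images, "labels": labels}, f)

def load_dataset(path):
    """Read the .pkl file back for the preprocessing of step S2."""
    with open(path, "rb") as f:
        data = pickle.load(f)
    return data["images"], data["labels"]
```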
And S2, reading data in a PKL format, carrying out binarization and normalization processing on the picture material, and randomly distributing the data into a first test set, a first verification set and a first training set according to the ratio of 1:2: 7.
S3, obtaining a distilled data set by the first training set through a data distillation technology, inputting the distilled data set into a convolutional neural network with configured parameters and structures, and training the convolutional neural network into an image optimization network;
the image optimization network is a reverse synthesis space transformation network, and the network structure is as follows:
Conv(7×7,4)+Conv(7×7,8)+P+FC(48)+FC(8)
wherein Conv represents a convolution layer, P represents a pooling layer, FC represents a full-connection layer, the size and the number of convolution kernels are respectively shown in brackets, and the number of neurons is shown in brackets of the full-connection layer;
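The feature-map sizes implied by this structure can be traced with a short sketch. The 32×32 single-channel input is an assumption (the patent does not state the input size), as are "valid" convolutions and 2×2 pooling; note also that FC(8) yields 8 outputs, and how these map onto the 9-entry warp vector p is not specified in the source.

```python
def stack_shapes(height, width, channels, spec):
    """Trace a (C, H, W) shape through 'valid' convolutions and 2x2 pools."""
    c, h, w = channels, height, width
    shapes = [(c, h, w)]
    for layer in spec:
        if layer[0] == "conv":      # ("conv", kernel_size, out_channels)
            _, k, n = layer
            c, h, w = n, h - k + 1, w - k + 1
        elif layer[0] == "pool":    # ("pool",): 2x2 down-sampling
            h, w = h // 2, w // 2
        shapes.append((c, h, w))
    return shapes
```

Under these assumptions the pooled feature map is 8×10×10, i.e. 800 features flattened into FC(48).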
The specific process of the data distillation technique is as follows: a new, much smaller data set is generated by back-propagating iterative updates, starting from the first training set and the initialization weights of the convolutional network. With the initialization weights θ0 held fixed as a parameter, one training iteration on the regenerated (distilled) data set x̃ with the new learning rate η̃ updates the weights as
θ1 = θ0 − η̃·∇θ0 ℓ(x̃, θ0)
where ℓ is the training loss. The distilled data x̃ and the learning rate η̃ are in turn obtained in advance by iterative gradient learning: with θ0 initialized and held fixed, x̃ and η̃ are updated so as to minimize the loss of the updated weights θ1 on the real training data x, i.e.
x̃*, η̃* = arg min ℓ(x, θ1(x̃, η̃; θ0))
and the distilled data x̃ is obtained.
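The inner/outer loop of data distillation can be sketched on a toy problem. The sketch below substitutes a 2-D logistic-regression model for the convolutional network and a finite-difference gradient for backpropagation, so it illustrates the update scheme of the cited Wang et al. dataset-distillation paper rather than the patent's actual implementation; the blob data, step sizes and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "first training set": two linearly separable 2-D blobs, labels in {-1, +1}.
X_train = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y_train = np.array([-1.0] * 50 + [1.0] * 50)

def loss(theta, X, y):
    """Mean logistic loss of the linear classifier theta on (X, y)."""
    return np.mean(np.log1p(np.exp(-y * (X @ theta))))

def grad_theta(theta, X, y):
    """Gradient of the logistic loss with respect to theta."""
    s = -y / (1.0 + np.exp(y * (X @ theta)))
    return X.T @ s / len(y)

theta0 = np.zeros(2)              # fixed initialization theta_0
y_dist = np.array([-1.0, 1.0])    # one distilled point per class (labels fixed)

def outer_loss(x_dist, lr):
    """One inner step on the distilled set, evaluated on the real training set."""
    theta1 = theta0 - lr * grad_theta(theta0, x_dist, y_dist)  # theta_1 update
    return loss(theta1, X_train, y_train)

x_dist = rng.normal(size=(2, 2))  # distilled data, learned by the outer loop
lr_dist = 0.1                     # distilled learning rate, also learned
eps, step = 1e-5, 0.2
history = []
for _ in range(50):               # outer loop: update distilled data and rate
    base = outer_loss(x_dist, lr_dist)
    history.append(base)
    g_x = np.zeros_like(x_dist)
    for idx in np.ndindex(*x_dist.shape):  # finite-difference gradient
        pert = x_dist.copy()
        pert[idx] += eps
        g_x[idx] = (outer_loss(pert, lr_dist) - base) / eps
    g_lr = (outer_loss(x_dist, lr_dist + eps) - base) / eps
    x_dist = x_dist - step * g_x
    lr_dist = lr_dist - step * g_lr
```

After the outer loop, training one inner step on the two distilled points should fit the full 100-point training set better than the initial random distilled set did.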
The specific training method is as follows: the data x̃ (an input image, denoted I(x)) is input to the reverse synthesis spatial transformation network to obtain new deformation parameters. The warp parameters p = [p1, p2, …, p9] define an affine-type transformation whose homogeneous coordinate transformation matrix is
W(p) = | p1 p2 p3 |
       | p4 p5 p6 |
       | p7 p8 p9 |
the training set is multiplied by the initialized warping parameter p, i.e. l (x) ═ g (p) i (x), because the coordinates after the warping transformation image calculation are non-integer, which results in image discontinuity, the invention performs bilinear interpolation on the transformed image l (x):
Figure BDA0002473129610000072
wherein, Vi C(ii) an ith pixel point representing the c sample, (x)i,yi) Denotes the i-th pixel coordinate of the image L (x), W denotesWidth of the image, H represents height of the image, (m, n) represents other points on the image;
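A direct transcription of this interpolation sum, assuming a single-channel image and zero contribution from out-of-range pixels:

```python
import numpy as np

def bilinear_sample(img, xs, ys):
    """Evaluate the bilinear interpolation sum at float coordinates (xs, ys).

    Only the (up to) four pixels with non-zero weight
    max(0, 1-|x-m|) * max(0, 1-|y-n|) contribute, so the double sum over the
    whole image reduces to four terms; out-of-range pixels contribute zero.
    """
    h, w = img.shape
    x0 = np.floor(xs).astype(int)
    y0 = np.floor(ys).astype(int)
    out = np.zeros(np.shape(xs), dtype=float)
    for dx in (0, 1):
        for dy in (0, 1):
            xi, yi = x0 + dx, y0 + dy
            wgt = np.maximum(0.0, 1.0 - np.abs(xs - xi)) * \
                  np.maximum(0.0, 1.0 - np.abs(ys - yi))
            valid = (xi >= 0) & (xi < w) & (yi >= 0) & (yi < h)
            out += np.where(
                valid,
                img[np.clip(yi, 0, h - 1), np.clip(xi, 0, w - 1)] * wgt,
                0.0,
            )
    return out
```

Sampling at integer coordinates reproduces the original pixels, and sampling midway between two pixels averages them, which is what makes the warped image continuous.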
inputting the image V (x) into a geometric predictor, wherein the geometric predictor is composed of a neural network or a small convolutional neural network, updating a parameter delta p of the geometric predictor by utilizing a back propagation BP algorithm, the delta p is a forward propagation prediction distortion increment of a backward synthesis space transformation network, and after updating the parameter delta p, further iteratively updating the distortion parameter p in a synthesis mode, namely the distortion parameter p is updated
Figure BDA0002473129610000073
The corresponding transformation matrix is, where w (pin) is the initial matrix, w (pout) is the transformation matrix composed of the final updated warping parameters p:
W(pout)=W(ΔP)·W(pin)
multiplying the final matrix W (pout) consisting of p with the training set, i.e.
Im=I(x)·W(Pout)
And outputting the data Im.
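The compositional update can be sketched as plain matrix algebra. Reshaping the 9-entry vector p into a 3×3 homogeneous matrix row by row is an assumption made for illustration:

```python
import numpy as np

def warp_matrix(p):
    """Build the 3x3 homogeneous matrix W(p) from the 9-entry vector p (row by row)."""
    return np.asarray(p, dtype=float).reshape(3, 3)

def compose(p_in, delta_p):
    """Compositional update of the warp: W(p_out) = W(delta_p) . W(p_in)."""
    return warp_matrix(delta_p) @ warp_matrix(p_in)
```

With this convention, composing with an identity increment leaves the warp unchanged, and composing two pure translations sums their offsets.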
S4, inputting the data of the first verification set into the image optimization network, classifying the first verification set by the image optimization network, adjusting parameters of the image optimization network according to classification results, training the image optimization network to achieve an expected effect of classifying the first verification set by means of the first training set again, and testing the classification accuracy of the image optimization network by means of the first test set.
S5, subdividing the picture material into a second test set, a second training set and a second verification set, building a deep convolutional neural network, training the convolutional neural network by using the second training set, verifying by using the second verification set and the second test set, and continuously adjusting network parameters until the expected accuracy is reached;
the structure of the deep convolutional neural network is as follows:
Conv(3×3,8)+Conv(3×3,16)+P+Conv(3×3,32)+Conv(3×3,64)+P+FC(100)+FC(200)
The second training set is input into the convolutional neural network with fixed initialization. The convolutional layers extract features from the input data, and the output of each convolutional layer is transformed with the ReLU activation function; the specific operation is
U = Im ∗ W + b
Y = f(U)
where Im is the output of the image optimization network, W is the convolution kernel, b is the convolutional-layer bias, U is the convolutional-layer output, f is the ReLU activation function f(u) = max(0, u), and Y is the output of U after the ReLU activation;
The pooling layer samples the input data over two-dimensional spatial windows; e.g. for max pooling over the window R_ij, the specific calculation is
Y_ij = max over (m, n) ∈ R_ij of U_mn
the full connection layer reduces the dimension of the input two-dimensional feature matrix to a one-dimensional feature vector;
and classifying and outputting the one-dimensional vectors output by the full-connection layer by adopting a softmax cross entropy loss method.
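The whole forward pass of the stated structure can be sketched in NumPy. The 28×28 single-channel input (MNIST-sized), "valid" convolutions, 2×2 max pooling, and random bias-free weights are assumptions made purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, kernels):
    """'Valid' convolution: x (C,H,W), kernels (K,C,kh,kw) -> (K,H',W')."""
    k, c, kh, kw = kernels.shape
    _, h, w = x.shape
    out = np.zeros((k, h - kh + 1, w - kw + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            out[:, i, j] = np.tensordot(kernels, x[:, i:i + kh, j:j + kw], axes=3)
    return out

def relu(u):
    return np.maximum(u, 0.0)           # f(u) = max(0, u)

def maxpool2(x):
    c, h, w = x.shape
    return x[:, : h // 2 * 2, : w // 2 * 2].reshape(c, h // 2, 2, w // 2, 2).max(axis=(2, 4))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# random weights for Conv(3x3,8)+Conv(3x3,16)+P+Conv(3x3,32)+Conv(3x3,64)+P+FC(100)+FC(200)
W1 = rng.normal(scale=0.1, size=(8, 1, 3, 3))    # 28 -> 26
W2 = rng.normal(scale=0.1, size=(16, 8, 3, 3))   # 26 -> 24, pool -> 12
W3 = rng.normal(scale=0.1, size=(32, 16, 3, 3))  # 12 -> 10
W4 = rng.normal(scale=0.1, size=(64, 32, 3, 3))  # 10 -> 8,  pool -> 4
F1 = rng.normal(scale=0.1, size=(100, 64 * 4 * 4))
F2 = rng.normal(scale=0.1, size=(200, 100))

def forward(img):
    """Forward pass ending in a softmax over the FC(200) output."""
    x = relu(conv2d(img[None], W1))
    x = maxpool2(relu(conv2d(x, W2)))
    x = relu(conv2d(x, W3))
    x = maxpool2(relu(conv2d(x, W4)))
    x = relu(F1 @ x.ravel())             # flatten to a one-dimensional vector
    return softmax(F2 @ x)
```

The softmax output is a probability vector over the 200 output units; during training this would be paired with the cross-entropy loss mentioned above.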
And S6, scanning the student test paper or answer sheet to read the student writing area, inputting the scanned picture into the image optimization network obtained in the step S4 after binarization and normalization processing, performing image optimization, performing character segmentation, inputting the image into the deep convolutional neural network, converting the handwriting of the student and outputting the handwriting as a standard font.
The character segmentation method specifically comprises the following steps: horizontally projecting the picture, finding an upper limit and a lower limit of each line, and then cutting; and then carrying out vertical projection on each cut line, finding the left and right boundaries of each character, and then cutting a single character.
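A minimal sketch of this projection-based segmentation, assuming a binary image in which ink pixels are 1 and background pixels are 0:

```python
import numpy as np

def segment_lines(binary):
    """Split a binary image into line strips via its horizontal projection."""
    rows = binary.sum(axis=1) > 0       # rows that contain any ink
    lines, start = [], None
    for i, on in enumerate(rows):
        if on and start is None:
            start = i                   # upper limit of a line
        elif not on and start is not None:
            lines.append(binary[start:i])   # lower limit reached: cut
            start = None
    if start is not None:
        lines.append(binary[start:])
    return lines

def segment_chars(line):
    """Split one line strip into characters via its vertical projection."""
    cols = line.sum(axis=0) > 0         # columns that contain any ink
    chars, start = [], None
    for j, on in enumerate(cols):
        if on and start is None:
            start = j                   # left boundary of a character
        elif not on and start is not None:
            chars.append(line[:, start:j])  # right boundary reached: cut
            start = None
    if start is not None:
        chars.append(line[:, start:])
    return chars
```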
The method adopted by the binarization processing in steps S2 and S4 is the Otsu method, as follows: for an image I(x, y), the segmentation threshold between foreground (i.e. the object) and background is denoted T; the fraction of foreground pixels in the whole image is ω0, with average gray level μ0; the fraction of background pixels is ω1, with average gray level μ1; the total average gray level of the image is μ, the between-class variance is g, and M×N is the total number of pixels. The number of pixels whose gray value is less than the threshold T is denoted N0, and the number of pixels whose gray value is greater than T is denoted N1. Then
the foreground fraction is ω0 = N0 / (M×N)
the background fraction is ω1 = N1 / (M×N)
the foreground and background pixels sum to N0 + N1 = M×N
the fractions sum to ω0 + ω1 = 1
the total average gray level over the gray interval is μ = ω0·μ0 + ω1·μ1
and the between-class variance is g = ω0·(μ − μ0)² + ω1·(μ − μ1)² = ω0·ω1·(μ0 − μ1)²
The threshold T which maximizes the between-class variance is obtained by a traversal method.
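The traversal can be transcribed directly from the formulas above; the convention that foreground pixels have gray value below T follows the text, and 8-bit (0–255) gray levels are assumed:

```python
import numpy as np

def otsu_threshold(img):
    """Return the threshold T maximizing the between-class variance g."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    total = hist.sum()
    best_t, best_g = 0, -1.0
    for t in range(1, 256):             # traverse all candidate thresholds
        w0 = hist[:t].sum() / total     # foreground fraction (gray < T)
        w1 = 1.0 - w0                   # background fraction
        if w0 == 0.0 or w1 == 0.0:
            continue                    # one class empty: variance undefined
        mu0 = (np.arange(t) * hist[:t]).sum() / hist[:t].sum()
        mu1 = (np.arange(t, 256) * hist[t:]).sum() / hist[t:].sum()
        g = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if g > best_g:
            best_t, best_g = t, g
    return best_t
```

On a bimodal image the returned threshold falls between the two modes, so `img < T` isolates the darker (foreground) class.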
The normalization processing in steps S2, S4 and S5 uses min-max normalization, which maps the original data into the interval [0, 1] by the linear transformation
x′ = (x − min) / (max − min)
x″ = x′·(mx − mi) + mi
where max and min are the maximum and minimum of a column, x″ is the result, and mx and mi are the bounds of the mapped interval, i.e. mx = 1 and mi = 0.
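A direct transcription of the two formulas, applied column by column and assuming each column is non-constant (max > min):

```python
import numpy as np

def min_max_normalize(x, mi=0.0, mx=1.0):
    """Map each column of x linearly onto [mi, mx] (here [0, 1])."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)   # per-column min and max
    x_prime = (x - lo) / (hi - lo)          # x' = (x - min) / (max - min)
    return x_prime * (mx - mi) + mi         # x'' = x' * (mx - mi) + mi
```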
The network training flowchart of this embodiment is shown in FIG. 1 and the handwriting recognition flow in FIG. 2. The handwriting of the character "each" is extracted from a student's work (shown in FIG. 3), binarized and normalized, input to the image optimization network obtained in step S4 for image optimization and character segmentation, and then input to the deep convolutional neural network, which converts the student's handwriting into the standard font "each" shown in FIG. 4.

Claims (10)

1. A handwriting recognition method based on a deep convolutional neural network image processing technology is characterized by comprising the following steps:
s1, extracting handwriting picture materials from the homework submitted by the student and storing the handwriting picture materials in a PKL format;
s2, reading data in a PKL format, carrying out binarization and normalization processing on the picture material, and then randomly distributing the processed picture material into a first test set, a first verification set and a first training set according to the proportion;
s3, obtaining a distilled data set by the first training set through a data distillation technology, inputting the distilled data set into a convolutional neural network with configured parameters and structures, and training the convolutional neural network into an image optimization network;
s4, inputting data of the first verification set into an image optimization network, classifying the first verification set by the image optimization network, adjusting parameters of the image optimization network according to classification results, training the image optimization network by the first training set again until the effect of classifying the first verification set reaches an expected effect, and testing the classification accuracy of the image optimization network by the first test set;
s5, subdividing the picture material into a second test set, a second training set and a second verification set, building a deep convolutional neural network, training the convolutional neural network by using the second training set, verifying by using the second verification set and the second test set, and continuously adjusting network parameters until the expected accuracy is reached;
and S6, scanning the student test paper or answer sheet to read the student writing area, inputting the scanned picture into the image optimization network obtained in the step S4 after binarization and normalization processing, performing image optimization, performing character segmentation, inputting the image into the deep convolutional neural network, converting the handwriting of the student and outputting the handwriting as a standard font.
2. The handwriting recognition method based on deep convolutional neural network image processing technology of claim 1, wherein the handwriting picture materials in step S1 include HWDB1.1 dataset, MNIST dataset, and homemade dataset with roman symbol, chemical symbol, mathematical formula as content; the manufacturing process of the self-made data set comprises the following steps: the method comprises the steps of performing character segmentation on pictures containing roman symbols, chemical symbols and mathematical formula operation submitted by students to enable each character to be independently stored into an image, then performing binarization processing, adjusting the pixel size of the image, and manually marking out a standard font of each character.
3. The handwriting recognition method based on deep convolutional neural network image processing technique of claim 1 or 2, wherein the method adopted by the binarization processing in steps S2 and S4 is otsu method.
4. The handwriting recognition method based on deep convolutional neural network image processing technique of claim 1 or 2, wherein the normalization process in steps S2, S4 and S5 is a min-max normalization method.
5. The handwriting recognition method based on deep convolutional neural network image processing technique of claim 3, wherein the normalization processing in steps S2, S4 and S5 is a min-max normalization method.
6. The handwriting recognition method based on deep convolutional neural network image processing technology according to any one of claims 1, 2 and 5, wherein the image optimization network in step S3 is an inverse synthetic space transform network, and the specific network structures sequentially from top to bottom are: two convolutional layers, one pooling layer, two fully-connected layers.
7. The handwriting recognition method based on deep convolutional neural network image processing technology of claim 3, wherein the image optimization network in step S3 is an inverse synthetic spatial transform network, and the specific network structures are sequentially from top to bottom: two convolutional layers, one pooling layer, two fully-connected layers.
8. The handwriting recognition method based on deep convolutional neural network image processing technology of claim 4, wherein the image optimization network in step S3 is an inverse synthetic spatial transform network, and the specific network structures are sequentially from top to bottom: two convolutional layers, one pooling layer, two fully-connected layers.
9. The handwriting recognition method based on deep convolutional neural network image processing technology of any one of claims 1, 2, 5, 7 and 8, wherein the structure of the deep convolutional neural network in step S4 is, from top to bottom, two convolutional layers, one pooling layer, two fully-connected layers; the output of the convolutional layer is transformed with a ReLU activation function, and the one-dimensional vector output by the fully-connected layer is classified and output using a softmax cross-entropy loss method.
10. The handwriting recognition method based on the deep convolutional neural network image processing technology of claim 9, wherein the character segmentation method in step S6 is specifically: horizontally projecting the picture, finding an upper limit and a lower limit of each line, and then cutting; and then carrying out vertical projection on each cut line, finding the left and right boundaries of each character, and then cutting a single character.
CN202010354929.XA 2020-04-29 2020-04-29 Handwriting recognition method based on deep convolutional neural network image processing technology Pending CN111553423A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010354929.XA CN111553423A (en) 2020-04-29 2020-04-29 Handwriting recognition method based on deep convolutional neural network image processing technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010354929.XA CN111553423A (en) 2020-04-29 2020-04-29 Handwriting recognition method based on deep convolutional neural network image processing technology

Publications (1)

Publication Number Publication Date
CN111553423A true CN111553423A (en) 2020-08-18

Family

ID=72007836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010354929.XA Pending CN111553423A (en) 2020-04-29 2020-04-29 Handwriting recognition method based on deep convolutional neural network image processing technology

Country Status (1)

Country Link
CN (1) CN111553423A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107590497A (en) * 2017-09-20 2018-01-16 重庆邮电大学 Off-line Handwritten Chinese Recognition method based on depth convolutional neural networks
CN109800746A (en) * 2018-12-05 2019-05-24 天津大学 A kind of hand-written English document recognition methods based on CNN

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Tongzhou Wang et al.: "Dataset Distillation", arXiv:1811.10959v1 [cs.LG], 27 Nov 2018 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112102126A (en) * 2020-08-31 2020-12-18 湖北美和易思教育科技有限公司 Student manual ability culture method and device based on neural network
CN112308058A (en) * 2020-10-25 2021-02-02 北京信息科技大学 Method for recognizing handwritten characters
CN112308058B (en) * 2020-10-25 2023-10-24 北京信息科技大学 Method for recognizing handwritten characters
CN112507864A (en) * 2020-12-04 2021-03-16 河北地质大学 Credit archive identification method based on convolutional neural network
CN113076900A (en) * 2021-04-12 2021-07-06 华南理工大学 Test paper head student information automatic detection method based on deep learning
CN113642550A (en) * 2021-07-20 2021-11-12 南京红松信息技术有限公司 Entropy maximization card-smearing identification method based on pixel probability distribution statistics
CN113642550B (en) * 2021-07-20 2024-03-12 南京红松信息技术有限公司 Entropy maximization card-coating identification method based on pixel probability distribution statistics
CN114842486A (en) * 2022-07-04 2022-08-02 南昌大学 Handwritten chemical structural formula recognition method, system, storage medium and equipment

Similar Documents

Publication Publication Date Title
CN111553423A (en) Handwriting recognition method based on deep convolutional neural network image processing technology
CN109886121B (en) Human face key point positioning method for shielding robustness
CN111291629A (en) Method and device for recognizing text in image, computer equipment and computer storage medium
CN111414906A (en) Data synthesis and text recognition method for paper bill picture
CN113239954B (en) Attention mechanism-based image semantic segmentation feature fusion method
CN111401156B (en) Image identification method based on Gabor convolution neural network
CN110969171A (en) Image classification model, method and application based on improved convolutional neural network
CN110674777A (en) Optical character recognition method in patent text scene
CN113421318B (en) Font style migration method and system based on multitask generation countermeasure network
CN113223025A (en) Image processing method and device, and neural network training method and device
CN115147607A (en) Anti-noise zero-sample image classification method based on convex optimization theory
CN115393861B (en) Method for accurately segmenting handwritten text
CN111523622A (en) Method for simulating handwriting by mechanical arm based on characteristic image self-learning
CN114444565A (en) Image tampering detection method, terminal device and storage medium
CN114092938B (en) Image recognition processing method and device, electronic equipment and storage medium
CN113657377B (en) Structured recognition method for mechanical bill image
CN114972952A (en) Industrial part defect identification method based on model lightweight
CN113963232A (en) Network graph data extraction method based on attention learning
CN114220178A (en) Signature identification system and method based on channel attention mechanism
CN113408418A (en) Calligraphy font and character content synchronous identification method and system
Gayathri et al. Optical Character Recognition in Banking Sectors Using Convolutional Neural Network
CN116612478A (en) Off-line handwritten Chinese character scoring method, device and storage medium
CN109815889A (en) A kind of across resolution ratio face identification method based on character representation collection
CN114550179A (en) Method, system and equipment for guiding handwriting Chinese character blackboard writing
CN114241486A (en) Method for improving accuracy rate of identifying student information of test paper

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200818