CN113240018B - Hand-drawn graph classification method and system based on error back propagation algorithm - Google Patents


Info

Publication number
CN113240018B
Authority
CN
China
Prior art keywords: hand, data, layer, image data, output
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110544233.8A
Other languages
Chinese (zh)
Other versions
CN113240018A (en)
Inventor
张黎明
代亚美
赵辉
王勋
林静涵
王洋
孟姣
牛庆然
章国江
霍鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harbin Institute of Technology
Harbin Medical University
Original Assignee
Harbin Institute of Technology
Harbin Medical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harbin Institute of Technology, Harbin Medical University filed Critical Harbin Institute of Technology
Priority to CN202110544233.8A priority Critical patent/CN113240018B/en
Publication of CN113240018A publication Critical patent/CN113240018A/en
Application granted granted Critical
Publication of CN113240018B publication Critical patent/CN113240018B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133Distances to prototypes
    • G06F18/24137Distances to cluster centroïds
    • G06F18/2414Smoothing the distance, e.g. radial basis function networks [RBFN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • G06N3/084Backpropagation, e.g. using gradient descent

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computational Linguistics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

A hand-drawn graph classification method and system based on the error back propagation algorithm, relating to the technical field of neural networks and intended to solve the problem that the prior art cannot effectively classify hand-drawn image data. The technical points of the invention comprise: designing one or more regular graph drawing templates; acquiring hand-drawn image data by tracing the templates; preprocessing the hand-drawn image data; constructing and training a BP neural network model; and inputting the hand-drawn image data to be classified into the trained BP neural network model to obtain a classification result. The method can distinguish slight differences between writing tracks and judges more accurately whether the writer has tremor. The invention can be applied clinically to judge whether a patient's hand exhibits tremor.

Description

Hand-drawn graph classification method and system based on error back propagation algorithm
Technical Field
The invention relates to the technical field of neural networks, in particular to a hand-drawn graph classification method and system based on an error back propagation algorithm.
Background
Handwriting movement is an activity governed by the human higher nervous system; it depends on the participation of multiple brain regions and on coordination of the musculoskeletal system. In terms of muscle activity, the handwriting process is completed by multiple muscle groups through continuous, mutually overlapping coordinated actions. Tremor of a limb or hand sometimes affects writing, as occurs both in patients with Parkinson's disease and in patients with the neurological form of hepatolenticular degeneration (Wilson's disease). Hand tremor is seen in many nervous-system diseases, such as essential tremor, Parkinson's disease, hepatolenticular degeneration, dystonic tremor, cerebellar tremor and others. At present, clinical evaluation by writing and handwriting scales lacks objectivity and cannot reflect a patient's hand-tremor condition. It has been proposed to identify whether a tester has hand tremor from hand-drawn regular patterns, but the most critical point of such identification is how to classify the hand-drawn image data effectively, and the prior art has not studied this in detail.
Disclosure of Invention
In view of the above problems, the present invention provides a method and a system for classifying hand-drawn graphics based on an error back propagation algorithm, so as to solve the problem that the prior art cannot effectively classify hand-drawn image data.
According to one aspect of the invention, a hand-drawn graph classification method based on an error back propagation algorithm is provided, and the method comprises the following steps:
step one, designing one or more regular graph drawing templates;
step two, obtaining multiple groups of handwritten track data by tracing the regular graph drawing template, and digitizing the handwritten track data to obtain hand-drawn image data; the handwritten track data comprise abnormal track data corresponding to hands with tremor and normal track data corresponding to hands without tremor;
step three, preprocessing the hand-drawn image data;
step four, constructing a BP neural network model according to the detailed features of the handwriting tracks in the handwritten track data;
step five, dividing the hand-drawn image data into training data and test data, and training the BP neural network model on the training data;
step six, inputting the test data into the trained BP neural network model for classification and comparing the classification accuracy on the test data with a preset accuracy threshold; if the accuracy is below the threshold, modifying the training parameters and retraining until the accuracy reaches or exceeds the threshold, then saving the trained BP neural network model and its parameters;
step seven, digitizing the handwritten track data to be classified to obtain hand-drawn image data to be classified, and inputting these into the trained BP neural network model to obtain a classification result.
Further, in the first step, the regular pattern drawing template includes four templates, namely an Archimedean line, a rectangle, a regular pentagon and a regular hexagon, and the regular pattern drawing template has positioning points thereon.
Furthermore, in the second step, the drawing period of the Archimedean line template is set to 6π, and the rectangular, regular pentagonal and regular hexagonal templates are each traced for 5 laps (5 circuits of the outline).
Further, the specific process of the third step comprises:
step 3.1, correcting the hand-drawn image data according to the positioning points so that the handwriting track lies on the horizontal and vertical standard lines;
step 3.2, recording the length and width of each template, and cropping the hand-drawn image data according to the length and width of each template so that each piece of hand-drawn image data has a specific size;
step 3.3, rejecting obviously abnormal hand-drawn image data, setting labels on the retained hand-drawn image data, and performing normalization.
Further, the BP neural network model in step four includes an input layer, hidden layers and an output layer; the input layer comprises 300 neural units, there are 2 hidden layers, each containing 4 neural units, and the output layer comprises 2 neural units.
Further, the specific process of training in the fifth step includes:
step 5.1, carrying out forward propagation of the signal: transmitting the training data to the input layer, randomly assigning weight values in the interval [-1, 1] and offset values in the interval [0, 1], propagating the signal from the input layer through the hidden layers to the output layer, and obtaining an output value and an error function;
step 5.2, carrying out back propagation of the error: calculating, by the chain rule, the partial derivatives of the error function with respect to each neural unit of the output layer and each neural unit of the hidden layers;
step 5.3, carrying out weight correction: using the steepest descent method so that the error function decreases along the negative gradient direction, and correcting the weights using the partial derivative of each output-layer unit, the output and partial derivative of each hidden-layer unit, and the output of each input-layer unit;
step 5.4, carrying out multiple iterations, i.e., repeating the weight adjustment of signal forward propagation and error back propagation until the preset number of training iterations is reached.
According to another aspect of the present invention, a hand-drawn graph classification system based on an error back propagation algorithm is provided, the system comprising:
the data acquisition module is used for acquiring a plurality of groups of handwritten track data by drawing a designed regular graph drawing template, and performing digital processing on the handwritten track data to acquire hand-drawn image data; the regular graph depicting template comprises four templates of an Archimedes line, a rectangle, a regular pentagon and a regular hexagon, and positioning points are arranged on the regular graph depicting template; the handwriting track data comprises abnormal track data corresponding to tremor of hands and normal track data corresponding to no tremor of hands;
the data preprocessing module is used for preprocessing the hand-drawn image data, and comprises the steps of firstly correcting the hand-drawn image data according to positioning points to enable a handwriting track to be positioned on a horizontal standard line and a vertical standard line; recording the length and width of each template, and cutting the hand-drawn image data according to the length and width of each template to ensure that each hand-drawn image data has a specific size; removing hand-drawn image data obviously having abnormity, setting labels for the reserved hand-drawn image data, and performing normalization processing;
the model training module is used for constructing a BP neural network model according to the detail characteristics of the handwriting track in the handwriting track data; dividing the hand-drawn image data into training data and testing data, and training a BP neural network model according to the training data; inputting test data into a BP neural network model obtained through training for classification, comparing the classification accuracy of the test data with a preset accuracy threshold, modifying training parameters for retraining until the classification accuracy of the test data reaches or exceeds the preset accuracy threshold if the classification accuracy of the test data is smaller than the preset accuracy threshold, and storing the trained BP neural network model and parameters;
and the classification module is used for performing digital processing on the handwriting track data to be classified to obtain hand-drawn image data to be classified, and inputting the hand-drawn image data to be classified into the trained BP neural network model to obtain a classification result.
Furthermore, the data acquisition module sets the drawing period of the Archimedean line template to 6π, and the rectangular, regular pentagonal and regular hexagonal templates are each traced for 5 laps.
Further, the specific process of training the BP neural network model according to the training data in the model training module includes:
the forward propagation of the signal is first performed: transmitting training data to an input layer, randomly distributing a weight value in a range of [ -1,1], randomly distributing an offset value in a range of [0,1], realizing the propagation of a signal from the input layer to an output layer through a hidden layer, and obtaining an output value and an error function;
then the back propagation of the error is performed: calculating partial derivatives of the error function to each nerve unit in the output layer and each nerve unit in the hidden layer by utilizing a chain rule;
then, weight correction is carried out: the error function can be reduced along the direction of the negative gradient by using a steepest descent method, and the weight is corrected by using the partial derivative of each nerve unit of the output layer, the output and partial derivative of each nerve unit of the hidden layer and the output of each nerve unit of the input layer;
and finally, carrying out multiple iterations, namely adjusting weights of each layer of signal forward propagation and error backward propagation until reaching the preset training times.
The beneficial technical effects of the invention are as follows:
at present, whether the hands of a patient have tremor is generally judged through visual observation in clinic, however, the tremor is difficult to find when the tremor degree is small, and the hand-drawing graph can better reflect the hand tremor condition of a tester. The method can distinguish slight difference between writing tracks, and is more accurate for judging whether the writer has tremor or not.
A regular graph is a graph with regular, characteristic structure, and the differences between graph types are obvious, which makes it convenient to extract detailed features for detection. The types of regular graphs are rich, so the corresponding templates can be designed with diversity; moreover, the designed templates guarantee a uniform background, which facilitates preprocessing of the image data. The error back propagation algorithm realizes a mapping function from input to output, and it has been proved theoretically that it can realize arbitrarily complex nonlinear mappings, making it suitable for problems whose internal mechanism is complex. By learning a training set with correct answers, the network automatically extracts reasonable decision rules; that is, it has self-learning ability and a certain generalization ability.
For the hand-drawn regular-pattern classification task addressed by the invention, the BP neural network model has the obvious advantages of fast training and high accuracy, and good fault tolerance: individual unreasonable data in a sample do not affect the classification performance of the model. Using a BP neural network also makes the classification of hand-drawn regular graphs more objective, avoiding the subjective influence of manual classification and making the result more convincing. Through multiple iterations the BP neural network spontaneously summarizes the internal relations within the hand-drawn image data and optimizes the model's internal parameters, thereby achieving classification of the hand-drawn image data. Finally, the handwriting-movement condition of the drawer is reflected through the hand-drawn regular graph.
Drawings
The invention may be better understood by referring to the following description in conjunction with the accompanying drawings, in which like reference numerals are used throughout the figures to indicate like or similar parts. The accompanying drawings, which are incorporated in and form a part of this specification, illustrate preferred embodiments of the present invention and, together with the detailed description, serve to further explain the principles and advantages of the invention.
FIG. 1 is a flow chart of a hand-drawn graph classification method based on an error back propagation algorithm according to the present invention;
FIG. 2 is an exemplary diagram of four sets of regular graphical rendering templates and captured template rendering image data in an embodiment of the invention;
FIG. 3 is a flow chart of image data preprocessing in an embodiment of the present invention;
FIG. 4 is a schematic diagram of a BP neural network structure used in the embodiment of the present invention;
FIG. 5 is a flow chart of training a BP neural network according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a classification accuracy curve when the number of iterations is increased according to an embodiment of the present invention;
FIG. 7 is a block diagram of a hand-drawn graph classification system based on an error back propagation algorithm according to the present invention.
Detailed Description
In order that those skilled in the art will better understand the disclosure, exemplary embodiments or examples of the disclosure are described below with reference to the accompanying drawings. It is to be understood that the disclosed embodiments or examples are only some, but not all embodiments or examples of the invention. All other embodiments or examples obtained by a person of ordinary skill in the art based on the embodiments or examples of the present invention without any creative effort shall fall within the protection scope of the present invention.
In recent years, the development of neural network technology has provided powerful support for handwriting anomaly detection. The error Back Propagation (BP) algorithm trains multilayer neural networks and consists of two processes, forward propagation of the signal and back propagation of the error; the model structure comprises an input layer, hidden layers and an output layer.
The invention provides a hand-drawn regular-graph classification method based on a BP network, which mainly comprises the following steps: first, a regular graph template is designed and image data of the testers' hand-drawn graphs are acquired; the data are preprocessed (correction, cropping of a designated area, feasibility screening, label setting, normalization and so on) and divided into a training set and a test set; then a suitable BP network model is constructed and trained on the training set, the weight parameters being corrected continuously through multiple iterations of signal forward propagation and error back propagation until the expected requirements are met; finally, the generalization ability of the model is tested on the test set, yielding the detection model for hand-drawn regular graphs. The method mainly uses low-level features of the handwritten image such as gray level, color, texture, shape and position, or features such as the gray-level histogram, to assign category labels to images, thereby realizing detection of handwriting abnormality.
The flow chart of the method of the invention is shown in figure 1, and the specific steps are as follows:
Step one, designing a regular graph drawing template;
In the design of the template, the most important point is to make the track information of the regular graphs distinguishable and interpretable. Multiple regular graph templates are designed, and several groups of image data are collected for each graph, enriching the data set so that bad data can be eliminated and robustness improved. In the present embodiment, four templates are designed, namely an Archimedean line, a rectangle, a regular pentagon and a regular hexagon, as shown in fig. 2; the Archimedean line period is 6π, and the rectangle, regular pentagon and regular hexagon are each traced 5 times, i.e., 5 circuits. Positioning points are placed on the template so that cropping during subsequent image preprocessing is more accurate. The left column of fig. 2 shows the template graphics; the middle and right columns show captured handwritten traces of the templates. Selecting a suitable template amplifies any tremor of the tester's hand so that it appears in the drawn handwriting track; this also facilitates training of the subsequent models and improves classification accuracy.
It should be noted that the template may be designed with other regular patterns or irregular patterns.
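As a rough illustration (not part of the patent's disclosure), an Archimedean-spiral trace with the drawing period of 6π used above can be sampled from the polar form r = aθ; the function name, a = 1 and the sample count are illustrative assumptions:

```python
import numpy as np

def archimedean_spiral(a=1.0, theta_max=6 * np.pi, n_points=600):
    """Sample an Archimedean spiral r = a * theta over [0, theta_max]."""
    theta = np.linspace(0.0, theta_max, n_points)
    r = a * theta
    x = r * np.cos(theta)   # convert polar samples to Cartesian coordinates
    y = r * np.sin(theta)
    return x, y

x, y = archimedean_spiral()
```

Plotting x against y yields the spiral template outline; the parameter a merely scales the spacing between successive turns.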
Step two, acquiring hand-drawing image data; training data and test data are included;
Testers are recruited in advance and comprise two groups: those with hand tremor and those with normal hands. The designed regular graph drawing template is laid on a handwriting board; the tester first marks the positioning points of the template with the corresponding stylus, then traces the graph on the template, and the movement of the pen tip on the paper, i.e., the tester's handwriting track, is recorded. The graphic track drawn by the tester is digitized and exported in a picture format, thereby acquiring the hand-drawn image data.
Specifically, 300 images are collected for each type, i.e., each template pattern: 150 normal and 150 abnormal. For each tester, three groups of data are collected for each template graph, and the data set is randomly divided into a training set and a test set in a certain proportion. The background of the acquired hand-drawn image data is white and the track is black, i.e., the target and background have high contrast, which facilitates subsequent processing.
Step three, preprocessing the hand-drawn image data;
As shown in fig. 3, the acquired image data are first corrected according to the positioning points so that the track image lies on the horizontal and vertical standard lines. The specific length and width of each type of track-image template are recorded, and the acquired track images are cropped according to graph type so that each track image has a specific size. The images of each graph are stored in the same folder, giving 4 folders. The images in each folder are inspected manually, and obviously abnormal data are removed. Labels are set on the retained image data, and the data set is normalized so that optimization toward the optimal solution becomes smoother and converges more easily. The preprocessed hand-drawn image data of each type have the same format and size, which facilitates training of the subsequent model.
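A minimal sketch of the crop-and-normalize portion of this preprocessing, using a synthetic grayscale array in place of a scanned trace (the function name and the sizes are illustrative assumptions, not taken from the patent):

```python
import numpy as np

def preprocess(img, crop_h, crop_w):
    """Center-crop a grayscale image to the template's size and scale to [0, 1]."""
    h, w = img.shape
    top = (h - crop_h) // 2
    left = (w - crop_w) // 2
    cropped = img[top:top + crop_h, left:left + crop_w]
    # Normalize pixel intensities so the optimization converges more smoothly.
    return cropped.astype(np.float64) / 255.0

img = np.full((120, 160), 255, dtype=np.uint8)   # white background
img[40:80, 60:100] = 0                            # black trace region
out = preprocess(img, 100, 140)
```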
Step four, constructing a BP neural network model;
In the network model, the input layer has m neural units; there are n hidden layers, each with p neural units; the output layer has 2 neural units. A suitable learning rate, activation function, optimization method and so on are selected. The specific parameters of the model are set according to the amount of data in the collected hand-drawn regular-graph data set and detailed features such as the large differences in track texture between hand-drawn graphs.
Each template graph has m data, where m = 300. Let the input layer include m neurons, with input signal x = (x_1, x_2, …, x_m). Let there be 2 hidden layers, each containing 4 neurons, i.e. p = 4; the input signal of the hidden layer is hi = (hi_1, hi_2, hi_3, hi_4) and its output signal is ho = (ho_1, ho_2, ho_3, ho_4). The output layer has 2 neurons, with input signal yi = (yi_1, yi_2) and output signal yo = (yo_1, yo_2). The desired output value is e = (e_1, e_2). A softmax function is selected as the activation function, Adam is adopted as the optimization method, and the learning rate is taken as 0.05. The BP neural network model structure adopted is shown in fig. 4.
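The 300-4-4-2 structure described here, with weights drawn from [-1, 1] and biases from [0, 1] as in the training step below, could be initialized as follows (a NumPy sketch under those assumptions; the patent itself gives no code, and the function name is illustrative):

```python
import numpy as np

def init_network(sizes=(300, 4, 4, 2), seed=0):
    """Initialize weights in [-1, 1] and biases in [0, 1] for each layer pair."""
    rng = np.random.default_rng(seed)
    weights = [rng.uniform(-1.0, 1.0, (m, n))
               for m, n in zip(sizes[:-1], sizes[1:])]
    biases = [rng.uniform(0.0, 1.0, n) for n in sizes[1:]]
    return weights, biases

weights, biases = init_network()
```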
Step five, training a BP neural network model;
as shown in fig. 5, first, forward propagation of a signal is performed; transmitting training data to an input layer, randomly distributing a weight w value in an interval of [ -1,1], and randomly distributing an offset b value in an interval of [0,1], so as to realize the propagation of signals from the input layer to an output layer through a hidden layer and obtain an output value and an error function;
Specifically, the number of data sets is m. The input signal of the input layer is x = (x_1, x_2, …, x_m); the input signal of the hidden layer is hi = (hi_1, hi_2, …, hi_p) and its output signal is ho = (ho_1, ho_2, …, ho_p); the input signal of the output layer is yi = (yi_1, yi_2) and its output signal is yo = (yo_1, yo_2); the desired output value is e = (e_1, e_2).
In the forward propagation process, when an input signal is input to the hidden layer, there are:
hi_h(k) = Σ_{i=1..m} w_ih · x_i(k) + b_h
ho h (k)=f(hi h (k))
wherein w_ih is the weight of the process and b_h the bias of the process; f(z) is the activation function, with the property that the function and its derivative are continuous, f(z) = 1/(1 + e^(−z)), (k = 1, 2, …, m), (h = 1, 2, …, p).
When the output signal of the hidden layer is transmitted to the output layer, there are:
yi_j(k) = Σ_{h=1..p} w_hj · ho_h(k) + b_j
yo j (k)=f(yi j (k))
wherein w_hj is the weight of the process and b_j the bias of the process, (j = 1, 2).
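The two pairs of propagation formulas above can be sketched in NumPy with the sigmoid f(z) defined earlier; the variable names follow the patent's hi/ho/yi/yo notation, while the zero-valued demonstration inputs are arbitrary:

```python
import numpy as np

def sigmoid(z):
    """f(z) = 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, w_ih, b_h, w_hj, b_j):
    """One forward pass: input -> hidden -> output, per the formulas above."""
    hi = x @ w_ih + b_h      # hidden-layer input
    ho = sigmoid(hi)         # hidden-layer output
    yi = ho @ w_hj + b_j     # output-layer input
    yo = sigmoid(yi)         # output-layer output
    return hi, ho, yi, yo

x = np.ones(3)
w_ih = np.zeros((3, 4)); b_h = np.zeros(4)
w_hj = np.zeros((4, 2)); b_j = np.zeros(2)
hi, ho, yi, yo = forward(x, w_ih, b_h, w_hj, b_j)
```

With all-zero weights and biases, every pre-activation is 0, so every hidden and output activation equals f(0) = 0.5.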
Then, the error is reversely propagated; calculating partial derivatives of the error function to each neuron in the output layer and each neuron in the hidden layer by utilizing a chain rule;
specifically, the error function between the output of the output layer and the desired output is:
E = (1/2) · Σ_{j=1..2} (e_j(k) − yo_j(k))²
calculating partial derivatives of the error function to each neuron in the output layer by using a chain rule:
∂E/∂w_hj = −δ_oj(k) · ho_h(k)
δ_oj(k) = (e_j(k) − yo_j(k)) · f′(yi_j(k))
calculating partial derivatives of the error function to each neuron of the hidden layer:
∂E/∂w_ih = −δ_h(k) · x_i(k)
δ_h(k) = (Σ_{j=1..2} δ_oj(k) · w_hj) · f′(hi_h(k))
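In code, the chain-rule error terms for the output and hidden layers might look like the following sketch, using f′(z) = f(z)(1 − f(z)) for the sigmoid so the derivative is computed from the stored activations; the demonstration values are arbitrary:

```python
import numpy as np

def deltas(e, yo, ho, w_hj):
    """Output- and hidden-layer error terms via the chain rule.

    Uses sigmoid derivative f'(z) = f(z) * (1 - f(z)), evaluated
    from the stored activations yo and ho."""
    delta_o = (e - yo) * yo * (1.0 - yo)             # output layer
    delta_h = (delta_o @ w_hj.T) * ho * (1.0 - ho)   # hidden layer
    return delta_o, delta_h

e = np.array([1.0, 0.0])             # desired output
yo = np.array([0.5, 0.5])            # actual output
ho = np.array([0.5, 0.5, 0.5, 0.5])  # hidden activations
w_hj = np.ones((4, 2))
d_o, d_h = deltas(e, yo, ho, w_hj)
```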
Then, parameter correction is carried out. The steepest descent method is used so that the error function decreases along the negative gradient direction; the weights are corrected using the error term δ_o(k) of each output-layer neuron together with the outputs of the hidden-layer neurons, and the error term δ_h(k) of each hidden-layer neuron together with the outputs of the input-layer neurons.
the concrete formula is as follows:
Δw_hj = η · δ_oj(k) · ho_h(k)
w_hj^(N+1) = w_hj^(N) + η · δ_oj(k) · ho_h(k)
Δw_ih = η · δ_h(k) · x_i(k)
w_ih^(N+1) = w_ih^(N) + η · δ_h(k) · x_i(k)
Here η is the learning rate: too large a learning rate makes the model prone to falling into a locally optimal solution, while too small a learning rate slows down model training.
Finally, multiple iterations are performed: the weight-adjustment cycle of signal forward propagation and error back propagation is repeated. Continuously adjusting the weights is the learning and training process of the network. This process continues until the preset number of learning iterations, here 100, is reached.
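Putting the four stages together, one possible sketch of the whole training cycle is below. The toy two-sample data set, the learning rate and the epoch count are illustrative assumptions, and the bias terms are updated analogously to the weights (also an assumption, since the update formulas above are given for the weights):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, E, hidden=4, eta=0.5, epochs=1000, seed=0):
    """Cycle signal forward propagation and error back propagation,
    updating the parameters after every sample (stochastic gradient descent)."""
    rng = np.random.default_rng(seed)
    w_ih = rng.uniform(-1.0, 1.0, (X.shape[1], hidden))  # input -> hidden
    b_h = rng.uniform(0.0, 1.0, hidden)
    w_hj = rng.uniform(-1.0, 1.0, (hidden, E.shape[1]))  # hidden -> output
    b_j = rng.uniform(0.0, 1.0, E.shape[1])
    for _ in range(epochs):
        for x, e in zip(X, E):
            ho = sigmoid(x @ w_ih + b_h)            # forward propagation
            yo = sigmoid(ho @ w_hj + b_j)
            d_o = (e - yo) * yo * (1.0 - yo)        # back propagation of error
            d_h = (d_o @ w_hj.T) * ho * (1.0 - ho)
            w_hj += eta * np.outer(ho, d_o)         # weight correction
            b_j += eta * d_o
            w_ih += eta * np.outer(x, d_h)
            b_h += eta * d_h
    return w_ih, b_h, w_hj, b_j

# Two toy 2-D samples standing in for hand-drawn feature vectors.
X = np.array([[0.0, 0.0], [1.0, 1.0]])
E = np.array([[1.0, 0.0], [0.0, 1.0]])   # desired outputs (one-hot classes)
w_ih, b_h, w_hj, b_j = train(X, E)
```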
Step six, testing the generalization ability of the model;
The model obtained by training is run on the test data, the outputs are compared with the preset labels, and the number of correctly classified images is accumulated, from which the model's accuracy on the test set is calculated. A qualification standard for accuracy is preset, for example above 89%; if this standard cannot be reached, the number of iterations is increased or the number of network layers, the number of neurons or the corresponding model parameters are adjusted, and the model is retrained until the expected standard is met; finally the corresponding model and parameters are saved.
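The accuracy check in step six amounts to comparing the index of the larger output unit with the preset label for each test image, for example (a sketch with made-up demonstration outputs):

```python
import numpy as np

def accuracy(outputs, labels):
    """Fraction of test images whose predicted class matches the label."""
    predictions = np.argmax(outputs, axis=1)   # index of the larger output unit
    return float(np.mean(predictions == labels))

outputs = np.array([[0.9, 0.1], [0.2, 0.8], [0.6, 0.4], [0.3, 0.7]])
labels = np.array([0, 1, 1, 1])
acc = accuracy(outputs, labels)   # 3 of 4 predictions match
```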
The accuracy curve for the test-data classification as the number of iterations increases is shown in fig. 6. As can be seen from fig. 6, the classification accuracy reaches 89% or more after 100 iterations, indicating that the method is effective.
Step seven, digitizing the handwritten track data to be classified to obtain the hand-drawn image data to be classified, and inputting these into the trained BP neural network model to obtain the classification result.
The present invention quantifies hand tremor and helps capture pathological tremor that cannot be observed with the naked eye. Even though the data set comes from multiple sources, the amount of track information in a regular graph is small, the differences between samples of the same type are small and the number of samples is limited, the method can still detect the slight differences between abnormal and normal hand-drawn graphs, thereby capturing changes that are difficult to observe with the naked eye and extracting their deeper associations.
Another embodiment of the present invention provides a hand-drawn graph classification system based on an error back propagation algorithm, as shown in fig. 7, the system includes:
the data acquisition module 110 is configured to obtain multiple groups of handwritten trajectory data by drawing a designed regular graph drawing template, and perform digital processing on the handwritten trajectory data to acquire hand-drawn image data; the regular graph depicting template comprises four templates of an Archimedes line, a rectangle, a regular pentagon and a regular hexagon, and positioning points are arranged on the regular graph depicting template; the handwriting track data comprises abnormal track data corresponding to the existence of tremor in the hand and normal track data corresponding to the absence of tremor in the hand;
the data preprocessing module 120 is used for preprocessing the hand-drawn image data, and comprises the steps of firstly correcting the hand-drawn image data according to positioning points to enable a handwriting track to be positioned on a horizontal standard line and a vertical standard line; recording the length and width of each template, and cutting the hand-drawn image data according to the length and width of each template to ensure that each hand-drawn image data has a specific size; rejecting hand-drawn image data obviously having abnormity, setting labels for the reserved hand-drawn image data, and carrying out normalization processing;
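As a rough illustration of the preprocessing module, the sketch below cuts an image to its template's size and normalizes pixel values; the crop origin and the 0–255 pixel range are assumptions not fixed by the patent.

```python
import numpy as np

def preprocess(image, template_height, template_width):
    """Cut the hand-drawn image according to the template's length and
    width so every image has a specific size, then normalize to [0, 1].
    Cropping from the top-left corner and dividing by 255 are
    illustrative choices, not specified in the patent."""
    cropped = image[:template_height, :template_width]
    return cropped.astype(np.float32) / 255.0
```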
the model training module 130 is used for constructing a BP neural network model according to the detail characteristics of the handwriting track in the handwriting track data; dividing the hand-drawn image data into training data and testing data, and training a BP neural network model according to the training data; inputting test data into a BP neural network model obtained through training for classification, comparing the classification accuracy of the test data with a preset accuracy threshold, modifying training parameters for retraining until the classification accuracy of the test data reaches or exceeds the preset accuracy threshold if the classification accuracy of the test data is less than the preset accuracy threshold, and storing the trained BP neural network model and parameters;
the classification module 140 is configured to obtain hand-drawn image data to be classified after performing digital processing on the hand-drawn trajectory data to be classified, and input the hand-drawn image data to be classified into the trained BP neural network model to obtain a classification result.
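The four modules chain together roughly as follows; the callables are hypothetical placeholders standing in for the acquisition, preprocessing and model components.

```python
def classify_trajectory(trajectory, rasterize, preprocess, model):
    """End-to-end flow: digitize a handwriting trajectory into an
    image (data acquisition), preprocess it, and classify it with the
    trained BP model. All three callables are placeholders."""
    image = rasterize(trajectory)   # digital processing of the trajectory
    image = preprocess(image)       # correction, cropping, normalization
    return model.predict(image)     # trained BP neural network model
```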
Further, the data acquisition module 110 sets the drawing period of the Archimedes line template to 6π, and the drawing period of the rectangular, regular pentagonal and regular hexagonal templates to 5 turns each.
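For illustration, an Archimedes line template with a drawing period of 6π (three turns) can be sampled as below; the spiral coefficients and sample count are assumptions, not values from the patent.

```python
import math

def archimedean_template(a=0.0, b=1.0, n=600):
    """Sample points of an Archimedean spiral r = a + b*theta over a
    drawing period of 6*pi, as set for the template; a, b and n are
    illustrative parameters."""
    points = []
    for i in range(n + 1):
        theta = 6 * math.pi * i / n
        r = a + b * theta
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```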
Further, the specific process of training the BP neural network model according to the training data in the model training module 130 includes:
first, the forward propagation of the signal is performed: the training data are transmitted to the input layer, weight values are randomly assigned within the interval [-1, 1] and offset values within the interval [0, 1], so that the signal propagates from the input layer through the hidden layer to the output layer, yielding an output value and an error function;
then the back propagation of the error is performed: the partial derivatives of the error function with respect to each neuron of the output layer and each neuron of the hidden layer are calculated using the chain rule;
then the weight correction is performed: the steepest descent method is used so that the error function decreases along the negative gradient direction, and the weights are corrected using the partial derivative and output of each neuron of the output layer, the partial derivative and output of each neuron of the hidden layer, and the output of each neuron of the input layer;
and finally, multiple iterations are performed, i.e., the forward propagation of the signal and the back propagation of the error are repeated while the weights of each layer are adjusted, until the preset number of training iterations is reached.
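The four substeps above (forward propagation with weights drawn from [-1, 1] and offsets from [0, 1], chain-rule back propagation, steepest-descent weight correction, and repeated iteration) can be sketched for a single hidden layer as follows; the layer sizes, learning rate and iteration count are illustrative, not the patent's 300-4-4-2 configuration.

```python
import numpy as np

def sigmoid(z):
    # activation whose function and derivative are both continuous
    return 1.0 / (1.0 + np.exp(-z))

def train_bp(X, E, p=4, eta=0.5, iters=1000, seed=0):
    """Minimal BP sketch: weights random in [-1, 1], offsets in [0, 1],
    forward propagation, chain-rule back propagation, steepest-descent
    updates, repeated for a preset number of iterations."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], E.shape[1]
    w_ih = rng.uniform(-1, 1, (n_in, p));  b_h = rng.uniform(0, 1, p)
    w_hj = rng.uniform(-1, 1, (p, n_out)); b_j = rng.uniform(0, 1, n_out)
    for _ in range(iters):
        # forward propagation of the signal
        ho = sigmoid(X @ w_ih + b_h)           # hidden-layer output
        yo = sigmoid(ho @ w_hj + b_j)          # output-layer output
        # back propagation of the error (chain rule)
        delta_j = (E - yo) * yo * (1 - yo)     # output-layer deltas
        delta_h = (delta_j @ w_hj.T) * ho * (1 - ho)
        # weight correction along the negative gradient
        w_hj += eta * ho.T @ delta_j;  b_j += eta * delta_j.sum(axis=0)
        w_ih += eta * X.T @ delta_h;   b_h += eta * delta_h.sum(axis=0)
    return w_ih, b_h, w_hj, b_j
```

With `iters=0` the untrained network is returned, which makes it easy to check that training lowers the squared error.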
The functions of the hand-drawn graph classification system based on the error back propagation algorithm in this embodiment correspond to those of the hand-drawn graph classification method described above; a detailed description is therefore omitted, and reference may be made to the above method embodiment.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Accordingly, the description is to be considered as illustrative and not restrictive, and the scope of the invention is defined by the appended claims.

Claims (6)

1. A hand-drawn graph classification method based on an error back propagation algorithm is characterized by comprising the following steps:
designing one or more regular graph drawing templates; the regular graph depicting template comprises four templates of an Archimedes line, a rectangle, a regular pentagon and a regular hexagon, and positioning points are arranged on the regular graph depicting template;
secondly, obtaining a plurality of groups of handwritten track data through drawing a regular graph drawing template, and carrying out digital processing on the handwritten track data to obtain hand-drawn image data; the handwriting track data comprises abnormal track data corresponding to tremor of hands and normal track data corresponding to no tremor of hands;
step three, preprocessing the hand-drawn image data;
fourthly, constructing a BP neural network model according to the detail characteristics of the handwriting track in the handwriting track data;
step five, dividing the hand-drawn image data into training data and testing data, and training a BP neural network model according to the training data; the specific process comprises the following steps:
step five, firstly, perform the forward propagation of the signal: the training data are transferred to the input layer, weight values are randomly assigned within the interval [-1, 1] and offset values within the interval [0, 1], so that the signal propagates from the input layer through the hidden layer to the output layer, yielding an output value and an error function; specifically, let the number of data sets be m; the input signal of the input layer is x = (x_1, x_2); the input signal of the hidden layer is hi = (hi_1, hi_2, …, hi_p) and its output signal is ho = (ho_1, ho_2, …, ho_p); the input signal of the output layer is yi = (yi_1, yi_2) and its output signal is yo = (yo_1, yo_2); the desired output value is e = (e_1, e_2);
In the forward propagation process, when an input signal is input to the hidden layer, there are:
$$hi_h(k) = \sum_{i=1}^{2} w_{ih}\,x_i(k) + b_h$$

$$ho_h(k) = f\bigl(hi_h(k)\bigr)$$

wherein w_{ih} is the weight of the process and b_h is the offset of the process; f(z) is the activation function, characterized in that the function and its derivative are continuous, with f(z) = 1/(1 + e^{-z}), (k = 1, 2, …, m), (h = 1, 2, …, p);
When the output signal of the hidden layer is transmitted to the output layer, there are:
$$yi_j(k) = \sum_{h=1}^{p} w_{hj}\,ho_h(k) + b_j$$

$$yo_j(k) = f\bigl(yi_j(k)\bigr)$$

wherein w_{hj} is the weight of the process and b_j is the offset of the process, (j = 1, 2);
step five, secondly, perform the back propagation of the error: the partial derivatives of the error function with respect to each neuron of the output layer and each neuron of the hidden layer are calculated using the chain rule; specifically, the error function between the output of the output layer and the desired output is:

$$E = \frac{1}{2}\sum_{k=1}^{m}\sum_{j=1}^{2}\bigl(e_j(k) - yo_j(k)\bigr)^2$$
calculating partial derivatives of the error function to each neuron in the output layer by using a chain rule:
$$\frac{\partial E}{\partial yi_j(k)} = -\bigl(e_j(k) - yo_j(k)\bigr)\,f'\bigl(yi_j(k)\bigr) \triangleq -\delta_j(k)$$

$$\frac{\partial E}{\partial w_{hj}} = -\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$
calculating partial derivatives of the error function to each neuron of the hidden layer:
$$\frac{\partial E}{\partial hi_h(k)} = -\Bigl(\sum_{j=1}^{2}\delta_j(k)\,w_{hj}\Bigr)f'\bigl(hi_h(k)\bigr) \triangleq -\delta_h(k)$$

$$\frac{\partial E}{\partial w_{ih}} = -\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$
step five, thirdly, perform the weight correction: the steepest descent method is used so that the error function decreases along the negative gradient direction, and the weights are corrected using the partial derivative and output of each neuron of the output layer, the partial derivative and output of each neuron of the hidden layer, and the output of each neuron of the input layer; the specific formulas are:

$$\Delta w_{hj} = -\eta\,\frac{\partial E}{\partial w_{hj}} = \eta\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$

$$w_{hj}^{N+1} = w_{hj}^{N} + \eta\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$

$$\Delta w_{ih} = -\eta\,\frac{\partial E}{\partial w_{ih}} = \eta\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$

$$w_{ih}^{N+1} = w_{ih}^{N} + \eta\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$
wherein η is the learning rate; an excessively large learning rate makes the model liable to fall into a locally optimal solution, while an excessively small learning rate slows down the training of the model;
step five, finally, perform multiple iterations, i.e., repeat the forward propagation of the signal and the back propagation of the error while adjusting the weights of each layer, until the preset number of training iterations is reached;
step six, inputting test data into a BP neural network model obtained through training for classification, comparing the classification accuracy of the test data with a preset accuracy threshold, if the classification accuracy of the test data is smaller than the preset accuracy threshold, modifying training parameters for retraining until the classification accuracy of the test data reaches or exceeds the preset accuracy threshold, and storing the trained BP neural network model and parameters;
and seventhly, performing digital processing on the handwriting track data to be classified to obtain hand-drawn image data to be classified, and inputting the hand-drawn image data to be classified into the trained BP neural network model to obtain a classification result.
2. The hand-drawn graph classification method based on the error back propagation algorithm according to claim 1, characterized in that in the second step, the drawing period of the Archimedes line template is set to 6π, and the drawing period of the rectangular, regular pentagonal and regular hexagonal templates is set to 5 turns each.
3. The hand-drawn graph classification method based on the error back propagation algorithm as claimed in claim 2, wherein the specific process of the third step comprises:
firstly, correcting hand-drawn image data according to positioning points to enable a hand-written track to be positioned on a horizontal and vertical standard line;
recording the length and the width of each template, and cutting the hand-drawn image data according to the length and the width of each template to ensure that each hand-drawn image data has a specific size;
and step three, rejecting hand-drawn image data obviously having abnormity, setting labels for the reserved hand-drawn image data, and performing normalization processing.
4. The method for classifying hand-drawn figures based on the error back propagation algorithm according to claim 3, wherein the BP neural network model in step four comprises an input layer, a hidden layer and an output layer; the input layer comprises 300 nerve units, the hidden layer comprises 2 layers, each layer comprises 4 nerve units, and the output layer comprises 2 nerve units.
5. A hand-drawn graph classification system based on an error back propagation algorithm, characterized by comprising:
the data acquisition module is used for acquiring a plurality of groups of handwritten track data by drawing a designed regular graph drawing template, and carrying out digital processing on the handwritten track data to acquire hand-drawn image data; the regular graph depicting template comprises four templates of an Archimedes line, a rectangle, a regular pentagon and a regular hexagon, and positioning points are arranged on the regular graph depicting template; the handwriting track data comprises abnormal track data corresponding to the existence of tremor in the hand and normal track data corresponding to the absence of tremor in the hand;
the data preprocessing module is used for preprocessing the hand-drawn image data, and comprises the steps of firstly correcting the hand-drawn image data according to positioning points to enable a handwriting track to be positioned on a horizontal standard line and a vertical standard line; recording the length and width of each template, and cutting the hand-drawn image data according to the length and width of each template to ensure that each hand-drawn image data has a specific size; rejecting hand-drawn image data obviously having abnormity, setting labels for the reserved hand-drawn image data, and carrying out normalization processing;
the model training module is used for constructing a BP neural network model according to the detail characteristics of the handwriting track in the handwriting track data; dividing the hand-drawn image data into training data and testing data, and training a BP neural network model according to the training data; inputting test data into a BP neural network model obtained through training for classification, comparing the classification accuracy of the test data with a preset accuracy threshold, modifying training parameters for retraining until the classification accuracy of the test data reaches or exceeds the preset accuracy threshold if the classification accuracy of the test data is smaller than the preset accuracy threshold, and storing the trained BP neural network model and parameters; the specific process for training the BP neural network model according to the training data comprises the following steps:
the forward propagation of the signal is firstly performed: the training data are transferred to the input layer, weight values are randomly assigned within the interval [-1, 1] and offset values within the interval [0, 1], so that the signal propagates from the input layer through the hidden layer to the output layer, yielding an output value and an error function; specifically, let the number of data sets be m; the input signal of the input layer is x = (x_1, x_2); the input signal of the hidden layer is hi = (hi_1, hi_2, …, hi_p) and its output signal is ho = (ho_1, ho_2, …, ho_p); the input signal of the output layer is yi = (yi_1, yi_2) and its output signal is yo = (yo_1, yo_2); the desired output value is e = (e_1, e_2);
In the forward propagation process, when an input signal is input to the hidden layer, there are:
$$hi_h(k) = \sum_{i=1}^{2} w_{ih}\,x_i(k) + b_h$$

$$ho_h(k) = f\bigl(hi_h(k)\bigr)$$

wherein w_{ih} is the weight of the process and b_h is the offset of the process; f(z) is the activation function, characterized in that the function and its derivative are continuous, with f(z) = 1/(1 + e^{-z}), (k = 1, 2, …, m), (h = 1, 2, …, p);
When the output signal of the hidden layer is transmitted to the output layer, there are:
$$yi_j(k) = \sum_{h=1}^{p} w_{hj}\,ho_h(k) + b_j$$

$$yo_j(k) = f\bigl(yi_j(k)\bigr)$$

wherein w_{hj} is the weight of the process and b_j is the offset of the process, (j = 1, 2);
the back propagation of the error is then performed: the partial derivatives of the error function with respect to each neuron of the output layer and each neuron of the hidden layer are calculated using the chain rule; specifically, the error function between the output of the output layer and the desired output is:

$$E = \frac{1}{2}\sum_{k=1}^{m}\sum_{j=1}^{2}\bigl(e_j(k) - yo_j(k)\bigr)^2$$
calculating partial derivatives of the error function to each neuron in the output layer by using a chain rule:
$$\frac{\partial E}{\partial yi_j(k)} = -\bigl(e_j(k) - yo_j(k)\bigr)\,f'\bigl(yi_j(k)\bigr) \triangleq -\delta_j(k)$$

$$\frac{\partial E}{\partial w_{hj}} = -\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$
calculating partial derivatives of the error function to each neuron of the hidden layer:
$$\frac{\partial E}{\partial hi_h(k)} = -\Bigl(\sum_{j=1}^{2}\delta_j(k)\,w_{hj}\Bigr)f'\bigl(hi_h(k)\bigr) \triangleq -\delta_h(k)$$

$$\frac{\partial E}{\partial w_{ih}} = -\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$
the weight correction is then performed: the steepest descent method is used so that the error function decreases along the negative gradient direction, and the weights are corrected using the partial derivative and output of each neuron of the output layer, the partial derivative and output of each neuron of the hidden layer, and the output of each neuron of the input layer; the specific formulas are:

$$\Delta w_{hj} = -\eta\,\frac{\partial E}{\partial w_{hj}} = \eta\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$

$$w_{hj}^{N+1} = w_{hj}^{N} + \eta\sum_{k=1}^{m}\delta_j(k)\,ho_h(k)$$

$$\Delta w_{ih} = -\eta\,\frac{\partial E}{\partial w_{ih}} = \eta\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$

$$w_{ih}^{N+1} = w_{ih}^{N} + \eta\sum_{k=1}^{m}\delta_h(k)\,x_i(k)$$
wherein η is the learning rate; an excessively large learning rate makes the model liable to fall into a locally optimal solution, while an excessively small learning rate slows down the training of the model;
finally, multiple iterations are performed, i.e., the forward propagation of the signal and the back propagation of the error are repeated while the weights of each layer are adjusted, until the preset number of training iterations is reached;
and the classification module is used for performing digital processing on the handwriting track data to be classified to obtain hand-drawn image data to be classified, and inputting the hand-drawn image data to be classified into the trained BP neural network model to obtain a classification result.
6. The hand-drawn graph classification system based on the error back propagation algorithm according to claim 5, characterized in that the data acquisition module sets the drawing period of the Archimedes line template to 6π, and the drawing period of the rectangular, regular pentagonal and regular hexagonal templates to 5 turns each.
CN202110544233.8A 2021-05-19 2021-05-19 Hand-drawn graph classification method and system based on error back propagation algorithm Active CN113240018B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110544233.8A CN113240018B (en) 2021-05-19 2021-05-19 Hand-drawn graph classification method and system based on error back propagation algorithm


Publications (2)

Publication Number Publication Date
CN113240018A CN113240018A (en) 2021-08-10
CN113240018B true CN113240018B (en) 2023-02-03

Family

ID=77137514

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110544233.8A Active CN113240018B (en) 2021-05-19 2021-05-19 Hand-drawn graph classification method and system based on error back propagation algorithm

Country Status (1)

Country Link
CN (1) CN113240018B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113888546A (en) * 2021-09-01 2022-01-04 浙江大华技术股份有限公司 Method for arranging hand-drawn graph, electronic equipment and storage medium
CN116781836B (en) * 2023-08-22 2023-12-01 云视图研智能数字技术(深圳)有限公司 Holographic remote teaching method and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012166860A1 (en) * 2011-06-03 2012-12-06 Great Lakes Neurotechnologies Inc. Method and system for tuning of movement disorder therapy devices

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7418128B2 (en) * 2003-07-31 2008-08-26 Microsoft Corporation Elastic distortions for automatic generation of labeled data
CN106709474A (en) * 2017-01-23 2017-05-24 无锡职业技术学院 Handwritten telephone number identification, verification and information sending system
CN109035488A (en) * 2018-08-07 2018-12-18 哈尔滨工业大学(威海) Aero-engine time series method for detecting abnormality based on CNN feature extraction
CN109276255B (en) * 2018-11-27 2023-02-28 平安科技(深圳)有限公司 Method and device for detecting tremor of limbs
WO2020185973A1 (en) * 2019-03-11 2020-09-17 doc.ai incorporated System and method with federated learning model for medical research applications
CN110956226A (en) * 2019-11-28 2020-04-03 中国人民解放军总医院 Handwriting track abnormity detection method based on deep learning
CN111131237B (en) * 2019-12-23 2020-12-29 深圳供电局有限公司 Microgrid attack identification method based on BP neural network and grid-connected interface device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012166860A1 (en) * 2011-06-03 2012-12-06 Great Lakes Neurotechnologies Inc. Method and system for tuning of movement disorder therapy devices

Also Published As

Publication number Publication date
CN113240018A (en) 2021-08-10


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant