CN109190666A - Flowers image classification method based on improved deep neural network - Google Patents

Flowers image classification method based on improved deep neural network

Info

Publication number
CN109190666A
CN109190666A (application CN201810854879.4A; granted publication CN109190666B)
Authority
CN
China
Prior art keywords: network, training, flowers, activation function, improved
Prior art date
Legal status
Granted
Application number
CN201810854879.4A
Other languages
Chinese (zh)
Other versions
CN109190666B (en)
Inventor
刘秀磊
吴迪
刘旭红
李红臣
刘婷
Current Assignee
Beijing Information Science and Technology University
Original Assignee
Beijing Information Science and Technology University
Priority date
Filing date
Publication date
Application filed by Beijing Information Science and Technology University filed Critical Beijing Information Science and Technology University
Priority to CN201810854879.4A priority Critical patent/CN109190666B/en
Publication of CN109190666A publication Critical patent/CN109190666A/en
Application granted granted Critical
Publication of CN109190666B publication Critical patent/CN109190666B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/048 Activation functions

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a flower image classification method based on an improved deep neural network. The method uses transfer learning: an InceptionV3 network trained on a large-scale dataset is applied to the classification of a flower image dataset, and its activation function is improved. Experiments on the public Oxford flower-102 dataset show that the model achieves higher classification accuracy on the flower image classification task than traditional methods and common convolutional neural networks, and higher accuracy than the unmodified convolutional neural network: accuracy reaches 81.32% in the transfer stage and 92.85% in the fine-tuning stage.

Description

Flowers image classification method based on improved deep neural network
Technical field
The present invention relates to a flower image classification method, and in particular to a flower image classification method based on an improved deep neural network.
Background art
Flower image classification is a relatively difficult task. The main difficulty lies in large intra-class variation combined with small inter-class variation. For example, some images from different classes differ less from each other than images within a single class do, and those small differences determine their different classes. In addition, as continuously growing, non-rigid objects, flowers can deform in various ways, so a single class also varies widely. Many traditional methods have been applied to flower recognition.
Traditional methods need to build a classifier for each flower class and a large number of samples to train these classifiers. In practice, the many different types of flowers make this work extremely difficult and tedious. Typical flower images vary widely in partial occlusion, illumination, multiple instances, scale, and viewpoint.
Because traditional flower classification methods require a large amount of manual annotation and suffer from insufficient feature information, their classification ability is limited. On the Oxford flower-102 dataset, the accuracy of traditional flower classification methods is below 81%.
In recent years, deep learning has made breakthroughs in many fields. Convolutional neural networks (CNNs) are widely applied to image classification tasks because they readily learn high-level image features.
However, traditional convolutional neural networks have the following disadvantages in flower image classification tasks: 1. The whole network requires a large number of nonlinear transformations and is prone to overfitting. 2. Traditional network structures have too few layers to extract image feature information comprehensively. 3. In a classical BP neural network, if there are too many layers, gradient vanishing appears during error back-propagation. 4. Flower images have small inter-class differences, and somewhat similar classes may not be distinguished well by the network.
Summary of the invention
In view of the above technical problems, the present invention provides a flower image classification method based on an improved deep neural network. The method can effectively improve the network's ability to extract feature information, reduce network overfitting, and increase the network's classification ability.
The technical solution adopted by the present invention is as follows:
An embodiment of the present invention provides a flower image classification method based on an improved deep neural network, comprising: training a basic InceptionV3 network on a large-scale dataset to obtain a pre-trained network; improving the pre-trained network to obtain an improved network suitable for a flower recognition dataset, the dataset comprising a training set and a test set; migrating the improved network to the training set and performing transfer training to obtain a transfer-trained network; replacing the activation function of the transfer-trained network with a Tanh-ReLU function obtained by modifying the Tanh and ReLU functions, to obtain a network with an improved activation function; fine-tuning the network with the improved activation function on the training set to obtain a fine-tuned network; and feeding the test set into the fine-tuned network to classify flower images.
Optionally, improving the pre-trained network to obtain the improved network suitable for the flower recognition dataset comprises: deleting the last fully connected layer of the pre-trained network, adding a global average pooling layer, adding a first fully connected layer after the global average pooling layer, and adding a second fully connected layer after the first fully connected layer, thereby obtaining the improved network. The first fully connected layer contains 1024 nodes, uses the ReLU activation function, and applies Dropout with probability 0.5; the activation function of the second fully connected layer is Softmax, and the output is 102 classes.
Optionally, migrating the improved network to the training set and performing transfer training to obtain the transfer-trained network comprises: keeping the network weights of the original InceptionV3 part unchanged and training the parameters of the last 4 layers of the network using the training set, thereby obtaining the transfer-trained network;
wherein the parameters are trained with the RMSprop optimizer; during training, each gradient-descent batch contains 32 samples, and the number of iteration epochs is set to 30.
Optionally, the expression of the Tanh-ReLU function is: Tanh-ReLU(x) = x for x > 0, and Tanh-ReLU(x) = tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)) for x <= 0.
Optionally, fine-tuning the network with the improved activation function on the training set to obtain the fine-tuned network comprises: freezing the parameters of the first two original blocks in the network with the improved activation function so that their values remain unchanged during training, and retraining the parameters of the remaining layers using the training set, thereby obtaining the fine-tuned network;
wherein the parameters are trained with the SGD optimizer, the learning rate is set to 0.001, the momentum parameter to 0.9, and the cross-entropy loss function is used; during training, each gradient-descent batch contains 32 samples, and the number of iteration epochs is set to 30.
Optionally, the dataset is processed by data augmentation.
Optionally, processing the dataset by data augmentation comprises:
tilting images at different angles and flipping them horizontally and vertically to increase the sample size;
randomly cropping images to 80% of their size and randomly scaling them between 80% and 120% to increase the sample size; and
adding a moderate amount of Gaussian noise to the images.
The flower image classification method based on an improved deep neural network provided by the embodiment of the present invention uses transfer learning: the InceptionV3 network trained on a large-scale dataset is applied to the classification of the flower image dataset, and its activation function is improved. Experiments on the public Oxford flower-102 dataset show that the model achieves higher classification accuracy on the flower image classification task than traditional methods and common convolutional neural networks, and higher accuracy than the unmodified convolutional neural network: accuracy reaches 81.32% in the transfer stage and 92.85% in the fine-tuning stage.
Brief description of the drawings
Fig. 1 is a flow diagram of the flower image classification method based on the improved deep neural network provided by an embodiment of the present invention;
Fig. 2 shows the Tanh and ReLU function curves;
Fig. 3 shows the Tanh-ReLU function and its derivative curve;
Fig. 4 shows the classification accuracy of different activation functions on the 102 flower classes as a function of the number of iteration epochs.
Detailed description of the embodiments
To make the technical problems to be solved by the present invention, the technical solution, and its advantages clearer, a detailed description is given below in conjunction with the accompanying drawings and specific embodiments.
Fig. 1 is a flow diagram of the flower image classification method based on the improved deep neural network provided by an embodiment of the present invention. As shown in Fig. 1, the method comprises the following steps:
S101, training the basic InceptionV3 network on a large-scale dataset to obtain a pre-trained network;
S102, improving the pre-trained network to obtain an improved network suitable for the flower recognition dataset, the dataset comprising a training set and a test set;
S103, migrating the improved network to the training set and performing transfer training to obtain a transfer-trained network;
S104, replacing the activation function of the transfer-trained network with the Tanh-ReLU function obtained by modifying the Tanh and ReLU functions, to obtain a network with an improved activation function;
S105, fine-tuning the network with the improved activation function on the training set to obtain a fine-tuned network;
S106, feeding the test set into the fine-tuned network to classify flower images.
Hereinafter, each of the above steps is described in detail.
S101: training the basic InceptionV3 network on a large-scale dataset to obtain a pre-trained network.
The embodiment of the present invention uses an InceptionV3 network trained on a large-scale dataset as the flower classification network, thereby obtaining the pre-trained network.
The Inception structures used in the embodiment improve on InceptionV2 with three kinds of Inception modules, as follows:
In the first Inception structure, each 5 × 5 convolution is replaced by two 3 × 3 convolutions.
In the second Inception structure, an n × n convolution is decomposed into two layers of n × 1 and 1 × n convolutions. For the 17 × 17 grid, n is finally chosen to be 7.
In the third Inception structure, the outputs of the convolution filter banks are expanded. This structure is used on the coarsest grid to promote high-dimensional representations.
On the basis of V2, the InceptionV3 network model used in the embodiment makes the following further improvements: the SGD optimizer is replaced with RMSProp, label-smoothing regularization (LSR) is added after the fully connected classification layer, and the 7 × 7 convolution kernel is replaced by three 3 × 3 kernels.
S102: improving the pre-trained network to obtain an improved network suitable for the flower recognition dataset.
The flower classification experiment of the embodiment needs to classify 102 flower classes. To make the network suitable for flower classification, the network is improved as follows: the last fully connected layer of the pre-trained network is deleted; a global average pooling layer is added to enlarge the receptive field; a first fully connected layer is added after the global average pooling layer; and a second fully connected layer is added after the first, thereby obtaining the improved network. The first fully connected layer contains 1024 nodes, uses the ReLU activation function, and applies Dropout with probability 0.5 to prevent overfitting; the activation function of the second fully connected layer is Softmax, and the output is 102 classes. The improved network structure is shown in Table 1.
Table 1: the improved network structure (315 layers)
As can be seen from Table 1, the network input is 299 × 299 × 3, i.e., the size of the input image. The network output is 1 × 1 × 102, the probability value for each flower class.
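As a concrete illustration of the replacement head, its forward computation (global average pooling over the final feature maps, a 1024-node ReLU layer, then a 102-way Softmax) can be sketched in NumPy as below. The 8 × 8 × 2048 feature-map shape is the conventional InceptionV3 output for a 299 × 299 input; the random weights and the omission of Dropout (inactive at inference) are illustrative assumptions, not the patent's trained model.

```python
import numpy as np

def classification_head(feature_maps, w1, b1, w2, b2):
    """Forward pass of the improved head: global average pooling over the
    last feature maps, a 1024-unit ReLU layer, and a 102-way Softmax.
    Weights are placeholders, not trained values."""
    pooled = feature_maps.mean(axis=(0, 1))      # (2048,) global average pool
    hidden = np.maximum(0.0, pooled @ w1 + b1)   # (1024,) first FC layer, ReLU
    logits = hidden @ w2 + b2                    # (102,)  second FC layer
    exp = np.exp(logits - logits.max())          # numerically stable Softmax
    return exp / exp.sum()                       # per-class probabilities

rng = np.random.default_rng(0)
fmap = rng.standard_normal((8, 8, 2048))         # stand-in last-layer feature maps
w1 = rng.standard_normal((2048, 1024)) * 0.01
w2 = rng.standard_normal((1024, 102)) * 0.01
probs = classification_head(fmap, w1, np.zeros(1024), w2, np.zeros(102))
print(probs.shape)  # (102,)
```

The output is a vector of 102 probabilities summing to 1, matching the 1 × 1 × 102 output described in Table 1.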
S103: migrating the improved network to the training set and performing transfer training to obtain the transfer-trained network.
This step comprises keeping the network weights of the original InceptionV3 part unchanged and training the parameters of the last 4 layers of the network using the training set, thereby obtaining the transfer-trained network. Since relatively few parameters are trained, the more stable RMSprop optimizer is selected. During training, each gradient-descent batch contains 32 samples, and the number of iteration epochs is set to 30.
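For reference, one parameter update of the RMSprop optimizer mentioned above can be sketched as follows. This is the textbook update rule with assumed default decay rate and epsilon; it is not code from the patent, and the toy minimization only sanity-checks the update.

```python
import numpy as np

def rmsprop_step(theta, grad, cache, lr=0.001, rho=0.9, eps=1e-8):
    """One RMSprop update: keep a running average of squared gradients and
    divide the step by its square root, per parameter."""
    cache = rho * cache + (1.0 - rho) * grad ** 2
    theta = theta - lr * grad / (np.sqrt(cache) + eps)
    return theta, cache

# toy example: take RMSprop steps on f(x) = x^2 starting from x = 3
x, cache = np.array([3.0]), np.zeros(1)
for _ in range(500):
    x, cache = rmsprop_step(x, 2.0 * x, cache)   # gradient of x^2 is 2x
print(float(x[0]))
```

In a real run, only the parameters of the unfrozen last 4 layers would receive such updates; frozen layers keep their pre-trained values.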
S104: replacing the activation function of the transfer-trained network with the Tanh-ReLU function obtained by modifying the Tanh and ReLU functions, to obtain the network with the improved activation function.
The improved InceptionV3 neural network structure model proposed by the embodiment is as listed in Table 1, except that the activation function is changed to the Tanh-ReLU function obtained by modifying the Tanh and ReLU functions. The last layer is a fully connected layer, which finally yields the flower class probabilities. The key feature of the invention is combining the unsaturated ReLU function with the soft-saturation characteristic of the Tanh function. The Tanh function may be expressed as: Tanh(x) = (e^x - e^(-x)) / (e^x + e^(-x)).
The ReLU function may be expressed as: ReLU(x) = max(0, x).
The corresponding function curves are shown in Fig. 2.
As can be seen from Fig. 2, the ReLU function sets every negative input to 0 and keeps positive inputs unchanged. This gives it a fast convergence rate for inputs greater than zero, but because all negative values are set to 0, node data may die irreversibly during training, destroying the data flow. In the flower image classification task, since flowers have small inter-class differences and ReLU outputs no negative values, the bias accumulated between activation layers affects the classification result.
Based on this, the present invention takes the left-hand part of the Tanh function for inputs less than zero and the right-hand part of the ReLU function for inputs greater than zero, denoting the result the Tanh-ReLU function. Its expression is: Tanh-ReLU(x) = x for x > 0, and Tanh-ReLU(x) = tanh(x) for x <= 0.
The Tanh-ReLU function curve and its derivative curve are shown in Fig. 3.
When the network back-propagates with the BP algorithm, the error propagates from the output layer, and at each layer the first derivative of the current activation function must be multiplied by the value of the current neuron, i.e., Grad = Error × (Tanh-ReLU)'(x) × x. The derivative of this function is almost identical to that of the ReLU function, so it inherits the advantages of ReLU and can effectively alleviate the vanishing-gradient problem; deep neural networks can thus be trained directly in a supervised manner, without unsupervised layer-by-layer pre-training.
For inputs greater than zero, Tanh-ReLU inherits the advantages of ReLU: because of its linear, unsaturated form, it converges faster during gradient descent. For inputs less than zero, it overcomes ReLU's drawback of possibly destroying the data manifold, and its soft-saturation characteristic (bounded between -1 and 0) improves robustness to noise. For the flower classification task, this allows better classification of flower images with small inter-class differences.
Compared with the ReLU function, Tanh-ReLU has the following two characteristics: 1. Its activation value for x < 0 is negative, and its derivative there is not 0. When the input of ReLU is negative, its derivative becomes 0, which leads to the problem of neuron death; the Tanh-ReLU function alleviates this problem, and the soft-saturation characteristic of its negative part makes it robust to noise. 2. Its output mean can be 0. All ReLU outputs are non-negative, so the output mean is necessarily non-negative; this causes mean shift in the network and may prevent convergence when training certain ultra-deep networks. The Tanh-ReLU function fixes this problem, so the output mean can be 0.
In summary, in view of the characteristics of flower images, the present invention changes the activation function in the network structure of Table 1 to the Tanh-ReLU function, producing the improved InceptionV3 network structure.
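The piecewise Tanh-ReLU function described above, and the derivative used in the back-propagated gradient, can be sketched directly in NumPy. This is an illustrative implementation of the definition in the text, not the patent's code.

```python
import numpy as np

def tanh_relu(x):
    """Tanh-ReLU: identity (ReLU's right half) for x > 0,
    tanh (soft-saturating, bounded in (-1, 0)) for x <= 0."""
    return np.where(x > 0, x, np.tanh(x))

def tanh_relu_grad(x):
    """First derivative, as used in Grad = Error * (Tanh-ReLU)'(x) * x:
    1 for x > 0, and 1 - tanh(x)^2 for x <= 0."""
    return np.where(x > 0, 1.0, 1.0 - np.tanh(x) ** 2)

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(tanh_relu(x))       # negative side saturates toward -1, positive side is linear
print(tanh_relu_grad(x))  # derivative approaches 1 from both sides at x = 0
```

Note that the derivative equals 1 at x = 0 from both branches, which is the continuity that lets Tanh-ReLU inherit ReLU's convergence behavior while keeping nonzero gradients for negative inputs.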
S105: fine-tuning the network with the improved activation function on the training set to obtain the fine-tuned network.
This step comprises freezing the parameters of the first two original blocks in the network obtained in S104 so that they remain unchanged during training, and retraining the parameters of the remaining layers with the training set. Since relatively many parameters are trained, the fast-converging SGD optimizer is selected; its learning rate is set to 0.001, the momentum parameter to 0.9, and the cross-entropy loss function is used. The number of iteration epochs is set to 30; during gradient descent, each batch contains 32 samples, the same sample count and epoch count as in the transfer training stage.
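One SGD-with-momentum update matching the stated fine-tuning settings, together with a per-sample cross-entropy loss, can be sketched as follows. The toy minimization is only a sanity check of the update rule, not the patent's training loop; frozen layers would simply skip the update so their parameters keep the transferred values.

```python
import numpy as np

def sgd_momentum_step(theta, grad, velocity, lr=0.001, momentum=0.9):
    """One SGD update with momentum (lr = 0.001, momentum = 0.9 as in the
    text): accumulate a velocity and move the parameters along it."""
    velocity = momentum * velocity - lr * grad
    return theta + velocity, velocity

def cross_entropy(probs, label):
    """Cross-entropy loss for one sample: -log of the true-class probability."""
    return -np.log(probs[label])

# a confident correct prediction gives a low loss
probs = np.array([0.05, 0.9, 0.05])
loss = float(cross_entropy(probs, 1))

# toy check: momentum SGD on f(x) = x^2 starting from x = 2
x, v = np.array([2.0]), np.zeros(1)
for _ in range(200):
    x, v = sgd_momentum_step(x, 2.0 * x, v)   # gradient of x^2 is 2x
print(loss, float(x[0]))
```
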
Further, the dataset used by the embodiment is processed by data augmentation. The augmentation may include:
tilting images at different angles and flipping them horizontally and vertically to increase the sample size;
randomly cropping images to 80% of their size and randomly scaling them between 80% and 120% to increase the sample size; and
adding a moderate amount of Gaussian noise to the images.
[embodiment]
The advantages of the flower image classification method provided by the embodiment of the present invention are described below through experiments.
[Experiments and analysis]
Experimental environment
The hardware and software environment used in this experiment is shown in Table 2.
Table 2: experimental hardware environment
Under a Linux system, this experiment uses the Keras deep learning framework based on TensorFlow to train and test on flower images.
This experiment selects the public Oxford flower-102 dataset, a flower image database created by the Visual Geometry Group at Oxford University. It contains 102 flower classes, with 40 to 258 images per class and 8189 images in total. The database also covers all the difficult points in image recognition, such as illumination changes, viewpoint changes, complex backgrounds, and flower classes with many forms and complex color changes. In addition, some different flowers are highly similar; research on flower image classification is therefore of great significance.
Data augmentation
Data augmentation can greatly increase the sample size of the training set and improve the generalization ability of the network model. Essentially, data augmentation artificially increases the sample size of a dataset through data transformations such as affine transformations.
The database has only 8189 flower images, so for the 102-class flower classification task each class averages only about 80 images, which is still too few per class. Applying data augmentation to the flower image data of each class can therefore fully meet the demands of training the network.
1. Considering that flowers are easily photographed from different directions, and to guarantee rotation and tilt invariance during image recognition, images are tilted at different angles and flipped horizontally and vertically, increasing the sample size.
2. Considering that in some flower images the flower appears against a complex background, images are randomly cropped to 80% of their size and randomly scaled between 80% and 120%, increasing the sample size.
3. Considering rain, fog, snow, and illumination changes (the tone, brightness, and saturation of images taken in different seasons or at different times of day vary), a moderate amount of Gaussian noise is added.
Through the above three data augmentation methods, the training dataset generator continuously generates training data from the original data until the target number of iteration epochs is reached. During training, the augmented data effectively reduce network overfitting and increase the convolutional network's ability to recognize flower images.
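The three augmentation operations above can be sketched as a single NumPy generator step. The flip probabilities and the noise standard deviation are illustrative assumptions not specified in the text (in practice this would typically be done with a Keras data generator).

```python
import numpy as np

rng = np.random.default_rng(42)

def augment(image):
    """Produce one augmented variant of an HxWx3 image using the three
    transformations described in the text: horizontal/vertical flips,
    a random crop to 80% of the original size, and additive Gaussian noise."""
    out = image
    if rng.random() < 0.5:
        out = out[:, ::-1, :]                    # horizontal flip
    if rng.random() < 0.5:
        out = out[::-1, :, :]                    # vertical flip
    h, w, _ = out.shape
    ch, cw = int(h * 0.8), int(w * 0.8)          # crop each side to 80%
    top = rng.integers(0, h - ch + 1)
    left = rng.integers(0, w - cw + 1)
    out = out[top:top + ch, left:left + cw, :]
    noise = rng.normal(0.0, 5.0, out.shape)      # mild Gaussian noise (assumed scale)
    return np.clip(out + noise, 0, 255)

image = rng.integers(0, 256, (100, 100, 3)).astype(float)
samples = [augment(image) for _ in range(5)]     # several variants of one image
print(samples[0].shape)  # (80, 80, 3)
```
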
The database has 8189 images, of which 7169 are used as the training set and 1020 as the test set. Data augmentation expands the dataset to 30 times its original size, effectively avoiding network overfitting.
Experiment on the InceptionV3 network structure with the Tanh-ReLU activation function
After data augmentation, the images are preprocessed. Since the resolutions of the 102 flower classes are uneven, all images are scaled to 299 × 299 pixels to satisfy the network's requirement of a uniform input size. Since pixel values range from 0 to 255, which makes the input computation more complex, the pixel values are compressed from 0-255 to the range -1 to 1 to simplify the network input.
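The pixel compression described above amounts to a linear rescaling, sketched below; resizing to 299 × 299 is assumed to have been done by the image loader. This is also the scaling InceptionV3 conventionally expects.

```python
import numpy as np

def preprocess(pixels):
    """Compress 8-bit pixel values from [0, 255] to [-1, 1]."""
    return pixels.astype(np.float64) / 127.5 - 1.0

img = np.array([[0, 127.5, 255]])
print(preprocess(img))  # maps 0 -> -1, 127.5 -> 0, 255 -> 1
```
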
Considering that the training samples are few and more neurons need to be suppressed, the Dropout ratio is set to 0.5 when training on the Oxford flower-102 database to prevent overfitting.
To replace the fully connected layer, the last pooling layer uses global average pooling: each feature map of the last layer is mean-pooled over the whole map to form one feature point, and these feature points form the final feature vector, which is convenient for the final Softmax computation.
The transfer process fixes the parameters of the original InceptionV3 network part and trains the parameters of the remaining 4 top layers. The optimizer is RMSProp, and the loss function is the multi-class logarithmic loss (categorical cross-entropy). The batch size is set to 32 and the number of iteration epochs to 30, giving 32 × 30 = 960 iterations.
The fine-tuning process fixes the parameters of the first two original InceptionV3 blocks in the network, i.e., the first 127 layers remain fixed, and trains the parameters of the remaining top layers. The optimizer is SGD with a learning rate of 0.0001 and a momentum parameter of 0.9; the loss function, batch size, and number of iteration epochs are the same as in the transfer process.
Analysis of results
To verify that the improved activation function can improve classification accuracy, the three activation functions Tanh, ReLU, and Tanh-ReLU are used here to classify the 102 flower classes. The accuracies of the transfer and fine-tuning stages are shown in Table 3, where the first 30 epochs are the transfer stage and the last 30 epochs are the fine-tuning stage. In the transfer stage, the accuracy oscillates because the RMSProp optimizer needs to search for an initial learning rate and then decrease it by orders of magnitude, but overall the Tanh-ReLU activation function achieves higher accuracy than the other two activation functions. In the fine-tuning stage, the more stable SGD optimizer, which computes only one sample at a time, is used, and the classification accuracy of the improved Tanh-ReLU activation function can be seen to increase, as shown in Table 3 below.
Table 3: classification accuracy of different activation functions on the 102 flower classes
In the 102-class flower classification experiment, the Tanh function, the ReLU function, and the improved Tanh-ReLU function are compared directly. In this comparison, the batch size for both the transfer and fine-tuning stages is 32, the number of iteration epochs for each stage is 30, and the total number of epochs is 60. From the above experiments it can be concluded that the improved activation function not only improves the convergence speed of the network but also increases the flower classification recognition rate. From Table 3 it can be seen that in the transfer stage the improved Tanh-ReLU function is 2.3% and 0.48% more accurate than the Tanh and ReLU activation functions respectively, and in the fine-tuning stage it is 2.08% and 1.32% more accurate respectively.
The results show an accuracy of 92.85% when classifying the 102 flower classes. The flower images with relatively high accuracy in the transfer and fine-tuning stages were analyzed: the classes with relatively high classification accuracy mostly differ notably from other flowers in color or form, as shown in Fig. 4.
In summary, to address the incomplete extraction of image feature information by traditional networks in flower classification tasks, the present invention adopts the idea of transfer learning and uses the InceptionV3 network architecture pre-trained on the ImageNet dataset. To address the overfitting caused by the excessive parameters of traditional networks, the dataset is moderately enlarged with data augmentation. On the basis of the InceptionV3 architecture, the activation function of the network is improved to the Tanh-ReLU activation function. Experiments show that the model classifies better than the unmodified network model, and better than traditional methods and other deep neural network architectures. Accuracy reaches 81.32% in the transfer stage and 92.85% in the fine-tuning stage, verifying the accuracy of the improved method on the flower classification task and its feasibility for flower recognition.
It should be noted that the method provided by the present invention can also be extended to similar research fields. It has universality for plant classification, though some further research drawing on botanical expert knowledge bases would be needed, and it can also serve as a reference for research on animal species.
The embodiments described above are only specific embodiments of the present invention, intended to illustrate the technical solution of the present invention rather than to limit it, and the protection scope of the present invention is not limited to them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that anyone familiar with the technical field may, within the technical scope disclosed by the present invention, still modify the technical solutions recorded in the foregoing embodiments, readily conceive of variations, or equivalently replace some of their technical features; such modifications, variations, or replacements do not remove the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention, and shall all be covered by the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (7)

1. A flower image classification method based on an improved deep neural network, characterized by comprising:
training a basic InceptionV3 network on a large-scale dataset to obtain a pre-trained network;
improving the pre-trained network to obtain an improved network suited to a dataset for flower recognition, the dataset comprising a training dataset and a test dataset;
migrating the improved network to the training dataset for transfer training, to obtain a network after transfer training;
modifying the activation function of the network after transfer training to a Tanh-ReLU function obtained by correction based on the Tanh and ReLU functions, to obtain a network with the improved activation function;
fine-tuning the network with the improved activation function on the training dataset, to obtain a network after fine-tuning training; and
inputting the test dataset into the network after fine-tuning training, so as to classify flower images.
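The six steps of claim 1 form a sequential pipeline. The sketch below summarizes that order as plain Python; every function name is an illustrative placeholder (the patent names none of them), and each stage body is a stub recording what the real step would do:

```python
# Pipeline skeleton mirroring claim 1. All stage functions are hypothetical
# stubs; they only record the order of operations described in the claim.
def pretrain(net):        return net + ["pre-trained on large-scale dataset"]
def improve(net):         return net + ["head rebuilt for flower dataset"]
def transfer_train(net):  return net + ["transfer-trained on training set"]
def swap_activation(net): return net + ["Tanh-ReLU activation installed"]
def fine_tune(net):       return net + ["fine-tuned on training set"]

def classify_flowers(test_set):
    net = ["InceptionV3"]
    for stage in (pretrain, improve, transfer_train, swap_activation, fine_tune):
        net = stage(net)
    # Final step of claim 1: run the test dataset through the finished network.
    return [f"class_of({img})" for img in test_set]
```

The ordering matters: the activation swap happens after transfer training but before fine-tuning, so the fine-tuning stage is what adapts the weights to the new activation.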
2. The method according to claim 1, wherein improving the pre-trained network to obtain the improved network suited to the dataset for flower recognition comprises:
deleting the last fully connected layer of the pre-trained network, adding a global average pooling layer, adding a first fully connected layer after the global average pooling layer, and adding a second fully connected layer after the first fully connected layer, to obtain the improved network;
wherein the first fully connected layer contains 1024 nodes, uses ReLU as its activation function, and applies Dropout with the probability set to 0.5; the second fully connected layer uses Softmax as its activation function and outputs 102 classes.
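The replacement head of claim 2 can be written out as a NumPy forward pass. The layer sizes (1024 ReLU nodes, Dropout 0.5, 102-way Softmax) come from the claim; the input feature-map shape of 8×8×2048 (InceptionV3's final feature map for a 299×299 input) and the random weights are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def global_average_pool(feature_map):
    # Average each channel over its spatial dimensions: (H, W, C) -> (C,)
    return feature_map.mean(axis=(0, 1))

def dense(x, w, b, activation):
    z = x @ w + b
    if activation == "relu":
        return np.maximum(z, 0.0)
    if activation == "softmax":
        e = np.exp(z - z.max())  # shift for numerical stability
        return e / e.sum()
    return z

# Assumed shape: InceptionV3's last feature map is 8x8x2048 at 299x299 input.
features = rng.standard_normal((8, 8, 2048))
w1, b1 = rng.standard_normal((2048, 1024)) * 0.01, np.zeros(1024)
w2, b2 = rng.standard_normal((1024, 102)) * 0.01, np.zeros(102)

pooled = global_average_pool(features)  # (2048,) global average pooling layer
hidden = dense(pooled, w1, b1, "relu")  # first FC layer: 1024 nodes, ReLU
hidden = hidden * (rng.random(1024) >= 0.5)  # Dropout, probability 0.5 (training only)
probs = dense(hidden, w2, b2, "softmax")  # second FC layer: 102-class Softmax
```

At inference time the Dropout mask would be disabled (or the activations rescaled); it is shown here only to match the training-time description in the claim.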
3. The method according to claim 1 or 2, wherein migrating the improved network to the training dataset for transfer training to obtain the network after transfer training comprises: keeping the original InceptionV3 network weights unchanged and training the parameters of the last four layers of the network with the training dataset, to obtain the network after transfer training;
wherein the parameters are trained with the RMSprop optimizer; during training, each gradient-descent batch contains 32 samples, and the number of iteration epochs is set to 30.
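Claim 3 updates only the last four layers with RMSprop while the pre-trained InceptionV3 weights stay frozen. A minimal NumPy sketch of one RMSprop step follows; the decay rate 0.9 and epsilon 1e-8 are common defaults assumed here, since the claim does not state them:

```python
import numpy as np

def rmsprop_step(param, grad, cache, lr=0.001, decay=0.9, eps=1e-8):
    # RMSprop: divide each gradient by a running RMS of its recent magnitudes.
    cache = decay * cache + (1.0 - decay) * grad ** 2
    param = param - lr * grad / (np.sqrt(cache) + eps)
    return param, cache

frozen = np.ones(4)     # pre-trained weight: never passed to the optimizer
trainable = np.ones(4)  # weight belonging to one of the last four layers
cache = np.zeros(4)

grad = np.full(4, 0.5)
trainable, cache = rmsprop_step(trainable, grad, cache)
# "Keeping the original weights unchanged" simply means frozen parameters
# are excluded from optimizer updates, as above.
```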
4. The method according to claim 1, wherein the expression of the Tanh-ReLU function is:
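The expression of claim 4 does not survive in this extracted text, so any concrete form is an assumption. One plausible reading of "Tanh-ReLU" — using Tanh to correct ReLU's dead negative region while keeping ReLU's identity for positive inputs — is sketched below purely for illustration and should not be taken as the patented formula:

```python
import numpy as np

def tanh_relu(x):
    # HYPOTHETICAL form: the patent's actual expression is not reproduced here.
    # Positive inputs behave like ReLU (identity); negative inputs pass through
    # tanh, yielding a small bounded response instead of ReLU's hard zero.
    return np.where(x >= 0, x, np.tanh(x))

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
y = tanh_relu(x)
```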
5. The method according to claim 1 or 2, wherein fine-tuning the network with the improved activation function on the training dataset to obtain the network after fine-tuning training comprises: freezing the parameters of the first two original blocks in the network with the improved activation function, keeping the values of those parameters unchanged during training, and retraining the parameters of the remaining layers with the training dataset, to obtain the network after fine-tuning training;
wherein the parameters are trained with the SGD optimizer, the learning rate is set to 0.001, the momentum parameter is set to 0.9, and the loss function is the cross-entropy loss; during training, each gradient-descent batch contains 32 samples, and the number of iteration epochs is set to 30.
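The fine-tuning stage of claim 5 specifies SGD with learning rate 0.001, momentum 0.9, and a cross-entropy loss. Both can be written out directly in NumPy; initializing the velocity at zero is the usual convention, assumed here:

```python
import numpy as np

def sgd_momentum_step(param, grad, velocity, lr=0.001, momentum=0.9):
    # Classical momentum: accumulate a velocity, then move the parameter along it.
    velocity = momentum * velocity - lr * grad
    return param + velocity, velocity

def cross_entropy(probs, one_hot, eps=1e-12):
    # Cross-entropy loss of claim 5; eps guards against log(0).
    return -np.sum(one_hot * np.log(probs + eps))

param, velocity = np.array([1.0]), np.array([0.0])
param, velocity = sgd_momentum_step(param, np.array([0.5]), velocity)

probs = np.array([0.7, 0.2, 0.1])       # softmax output for a 3-class example
loss = cross_entropy(probs, np.array([1.0, 0.0, 0.0]))
```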
6. The method according to claim 1, wherein the dataset is processed by data augmentation.
7. The method according to claim 6, wherein processing the dataset by data augmentation comprises:
tilting pictures by different angles and performing horizontal and vertical flips, to increase the number of samples;
randomly cropping pictures to 80% of their size and randomly scaling them between 80% and 120%, to increase the number of samples; and
appropriately adding Gaussian noise to the pictures.
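The augmentations of claims 6–7 (flips, an 80% random crop, 80–120% random scaling, and Gaussian noise) can be sketched on a raw image array. The noise standard deviation and the nearest-neighbour resize below are illustrative choices that the claims leave open:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_crop(img, frac=0.8):
    # Crop a frac-sized window at a random position (claim 7: 80% random cropping).
    h, w = img.shape[:2]
    ch, cw = int(h * frac), int(w * frac)
    top, left = rng.integers(0, h - ch + 1), rng.integers(0, w - cw + 1)
    return img[top:top + ch, left:left + cw]

def random_scale(img, low=0.8, high=1.2):
    # Nearest-neighbour rescale by a factor in [0.8, 1.2) (claim 7).
    s = rng.uniform(low, high)
    h, w = img.shape[:2]
    rows = (np.arange(int(h * s)) / s).astype(int).clip(0, h - 1)
    cols = (np.arange(int(w * s)) / s).astype(int).clip(0, w - 1)
    return img[np.ix_(rows, cols)]

def add_gaussian_noise(img, sigma=5.0):
    # "Appropriately adding Gaussian noise"; sigma is an assumed value.
    return np.clip(img + rng.normal(0.0, sigma, img.shape), 0, 255)

img = rng.integers(0, 256, (100, 100, 3)).astype(float)
# One augmented sample: horizontal flip, 80% crop, random scale, then noise.
augmented = add_gaussian_noise(random_scale(random_crop(np.fliplr(img))))
```

In practice each source image would be run through random combinations of these operations many times, multiplying the effective size of the training dataset.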
CN201810854879.4A 2018-07-30 2018-07-30 Flower image classification method based on improved deep neural network Active CN109190666B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810854879.4A CN109190666B (en) 2018-07-30 2018-07-30 Flower image classification method based on improved deep neural network


Publications (2)

Publication Number Publication Date
CN109190666A true CN109190666A (en) 2019-01-11
CN109190666B CN109190666B (en) 2022-04-29

Family

ID=64937434

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810854879.4A Active CN109190666B (en) 2018-07-30 2018-07-30 Flower image classification method based on improved deep neural network

Country Status (1)

Country Link
CN (1) CN109190666B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110347851A (en) * 2019-05-30 2019-10-18 中国地质大学(武汉) Image search method and system based on convolutional neural networks
CN110739051A (en) * 2019-10-08 2020-01-31 中山大学附属第三医院 Method for establishing eosinophilic granulocyte proportion model by using nasal polyp pathological picture
CN111008674A (en) * 2019-12-24 2020-04-14 哈尔滨工程大学 Underwater target detection method based on rapid cycle unit
CN111383357A (en) * 2019-05-31 2020-07-07 纵目科技(上海)股份有限公司 Network model fine-tuning method, system, terminal and storage medium adapting to target data set

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107404656A (en) * 2017-06-26 2017-11-28 武汉斗鱼网络科技有限公司 Live video recommends method, apparatus and server
CN107423815A (en) * 2017-08-07 2017-12-01 北京工业大学 A kind of computer based low quality classification chart is as data cleaning method
US20180204111A1 (en) * 2013-02-28 2018-07-19 Z Advanced Computing, Inc. System and Method for Extremely Efficient Image and Pattern Recognition and Artificial Intelligence Platform


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
杨国亮 et al.: "Fine-grained image classification with an improved deep convolutional neural network", Journal of Jiangxi Normal University (Natural Science Edition) *


Also Published As

Publication number Publication date
CN109190666B (en) 2022-04-29

Similar Documents

Publication Publication Date Title
CN109685115B (en) Fine-grained conceptual model with bilinear feature fusion and learning method
Yoon et al. Combined group and exclusive sparsity for deep neural networks
CN109190666A (en) Flowers image classification method based on improved deep neural network
CN107358257B (en) Under a kind of big data scene can incremental learning image classification training method
CN110929610B (en) Plant disease identification method and system based on CNN model and transfer learning
CN112446388A (en) Multi-category vegetable seedling identification method and system based on lightweight two-stage detection model
CN106407986B (en) A kind of identification method of image target of synthetic aperture radar based on depth model
CN110399821B (en) Customer satisfaction acquisition method based on facial expression recognition
CN106845401B (en) Pest image identification method based on multi-space convolution neural network
CN105894045B (en) A kind of model recognizing method of the depth network model based on spatial pyramid pond
CN109035260A (en) A kind of sky areas dividing method, device and convolutional neural networks
CN108961245A (en) Picture quality classification method based on binary channels depth parallel-convolution network
CN109829541A (en) Deep neural network incremental training method and system based on learning automaton
CN106504064A (en) Clothes classification based on depth convolutional neural networks recommends method and system with collocation
CN109325484A (en) Flowers image classification method based on background priori conspicuousness
CN105631415A (en) Video pedestrian recognition method based on convolution neural network
CN113627472B (en) Intelligent garden leaf feeding pest identification method based on layered deep learning model
CN108537777A (en) A kind of crop disease recognition methods based on neural network
CN103955702A (en) SAR image terrain classification method based on depth RBF network
CN110222215B (en) Crop pest detection method based on F-SSD-IV3
CN109344699A (en) Winter jujube disease recognition method based on depth of seam division convolutional neural networks
CN110363253A (en) A kind of Surfaces of Hot Rolled Strip defect classification method based on convolutional neural networks
CN110084285A (en) Fish fine grit classification method based on deep learning
CN108805167A (en) L aplace function constraint-based sparse depth confidence network image classification method
CN110032925A (en) A kind of images of gestures segmentation and recognition methods based on improvement capsule network and algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant