CN107038410A - A weed image recognition method based on a deep stacking network - Google Patents
A weed image recognition method based on a deep stacking network
- Publication number
- CN107038410A CN107038410A CN201710102806.5A CN201710102806A CN107038410A CN 107038410 A CN107038410 A CN 107038410A CN 201710102806 A CN201710102806 A CN 201710102806A CN 107038410 A CN107038410 A CN 107038410A
- Authority
- CN
- China
- Prior art keywords
- network
- depth
- layer
- module
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
- G06N3/084—Backpropagation, e.g. using gradient descent
Abstract
The present invention relates to a weed image recognition method based on a deep stacking network, which addresses the low recognition rate and poor efficiency of prior-art weed identification. The method comprises the following steps: collecting and pre-processing training images; constructing and training a deep stacking network model; collecting and pre-processing test images; and inputting the test samples into the trained deep stacking network model for automatic recognition of the weed images. The deep stacking network constructed by the invention not only has strong feature representation and classification capability, but also reduces training and recognition time and enhances the robustness of weed identification.
Description
Technical field
The present invention relates to the technical field of image recognition, and in particular to a weed image recognition method based on a deep stacking network.
Background art
Weeds are a formidable enemy of crop growth: they occur throughout the entire growing period and can cause severe yield losses. At present, weed classification and identification are carried out mainly by a small number of plant-protection experts and agricultural technicians. Weed species are numerous, however, and even an expert can identify only a fraction of them. The growing demand for weed identification increasingly conflicts with the relatively small number of plant-protection experts. In the field of pattern recognition, deep learning has become a focus of research and has been widely and successfully applied to face recognition and object recognition. It has not yet been widely applied in agriculture, however, and in particular not to weed identification under harsh field conditions with a huge number of species. Because weed samples are both numerous in species and huge in quantity, the convolutional neural networks commonly used in deep learning must continually adjust gradient parameters during training, which seriously degrades training and recognition efficiency. How to realize effective automatic identification of weeds has therefore become an urgent technical problem.
Summary of the invention
The object of the present invention is to solve the low recognition rate and poor efficiency of the prior art by providing a weed image recognition method based on a deep stacking network.
To achieve this object, the technical scheme of the invention is as follows:
A weed image recognition method based on a deep stacking network comprises the following steps:
Collecting and pre-processing training images: images of each weed class are collected, more than 100 images per class, as training images; all training images are normalized in size to 256 × 256 pixels, yielding a number of training samples;
Constructing and training the deep stacking network model: a deep stacking network structure module is constructed, the training-sample pixel values are taken as input, and the module is trained to obtain the trained deep stacking network model;
Collecting and pre-processing test images: weed images to be recognized are captured with an acquisition device, more than 30 images per class; the images are normalized to 256 × 256 pixels, yielding the test samples;
Inputting the test samples into the trained deep stacking network model for automatic recognition of the weed images.
Constructing and training the deep stacking network model comprises the following steps:
Constructing the deep stacking network architecture module: the number of network modules is set to 120 layers, each layer module being a special neural network assembled from three sublayers: a linear sublayer of linear input units, a non-linear sublayer of non-linear units, and an output sublayer of linear outputs;
Training the deep stacking network architecture module, the specific steps being:
computing the weight matrix W with the restricted Boltzmann machine weight-learning method, i.e. the weight matrix W between the linear sublayer and the non-linear sublayer of each layer module;
computing the weight matrix U, i.e. the weight matrix U between the non-linear sublayer and the output sublayer of each layer module;
connecting the module layers through the W and U of each layer to train the deep stacking network, according to the formula
y_i = U^T δ(W^T x_i) = G_i(U, W),
where x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, y_i the output of the i-th module, δ(·) the sigmoid function, and G(·) expresses the functional relation between U and W.
Computing the weight matrix W with the restricted Boltzmann machine weight-learning method comprises the following steps:
Randomly initializing the model parameters θ = {W, a, b} of the restricted Boltzmann machine, where W is the input-layer weight matrix, a the input-layer bias and b the first hidden-layer bias, and setting the learning rates of the three parameters to λ_W = λ_a = λ_b = 0.1;
Forward-propagating the input v^(0) of the input layer and computing the output h^(0) of the first hidden layer, where v_i^(0) is the input of the i-th node of the visible layer;
Back-propagating the hidden output h^(0) to obtain the reconstruction v^(1), where v_i^(1) is the input of the i-th reconstructed visible node;
Forward-propagating v^(1) to obtain h^(1), where h_j^(1) is the output of the j-th hidden node;
Updating the model parameters θ = {W, a, b} with the learning rate of each parameter; the parameter increments are ΔW = λ_W (E[v^(0) (h^(0))^T] − E[v^(1) (h^(1))^T]), Δa = λ_a (v^(0) − v^(1)), Δb = λ_b (h^(0) − h^(1)), where E[·] denotes the mathematical expectation;
Changing the input v^(0) of the input layer and repeating the propagation and update steps until convergence, which yields W.
Computing the weight matrix U comprises the following steps:
Converting the training samples into the training vector set X = [x_1, …, x_i, …, x_N], where N is the number of samples and x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, D being the dimension of the input vector;
Since the output of the i-th module is y_i = U^T h_i, where h_i = σ(W^T x_i) is the hidden-layer vector of the i-th sample, U is an L × C upper-layer weight matrix and W a D × L lower-layer weight matrix, σ(·) being the sigmoid function, L the number of hidden units and C the dimension of the output vector;
Computing the least-squares error E = ‖Y − T‖², where T = [t_1, …, t_i, …, t_N] are the labels of the N samples, E is the error to be minimized, Y is the output matrix of the whole network, y_i the output of the i-th module and t_i the training label of the i-th sample;
Computing the partial derivative of E with respect to U and setting it to zero, which gives
U = (H H^T)^{-1} H T^T,
where H = [h_1, …, h_i, …, h_N] is the matrix of all hidden-layer activations, U the weight matrix, and T the matrix of labels of all samples.
Beneficial effects
Compared with the prior art, the deep stacking network constructed by the weed image recognition method of the present invention not only has strong feature representation and classification capability, but also reduces training and recognition time and enhances the robustness of weed identification. The invention improves the accuracy of weed identification, raises the training efficiency, and reaches a practical application level.
Brief description of the drawings
Fig. 1 is the flow chart of the method of the invention.
Embodiments
To provide a better understanding of the structural features and effects of the invention, it is described in detail below in conjunction with a preferred embodiment and the accompanying drawing:
As shown in Fig. 1, a weed image recognition method based on a deep stacking network according to the present invention comprises the following steps:
First step: collect and pre-process the training images. Images of each weed class are collected, more than 100 per class, as training images; all training images are normalized in size to 256 × 256 pixels, yielding a number of training samples.
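As an illustration of this pre-processing step, a minimal sketch in Python with NumPy follows; the patent names no implementation, so the nearest-neighbour interpolation, greyscale input, and [0, 1] pixel scaling are assumptions:

```python
import numpy as np

def normalize_image(img, size=(256, 256)):
    """Resize a greyscale image to a fixed size (nearest neighbour)
    and scale its pixel values to [0, 1]."""
    h, w = img.shape
    rows = np.minimum(np.arange(size[0]) * h // size[0], h - 1)
    cols = np.minimum(np.arange(size[1]) * w // size[1], w - 1)
    resized = img[np.ix_(rows, cols)]          # outer-product indexing
    return resized.astype(np.float64) / 255.0

# A training sample is then the flattened pixel vector:
sample = normalize_image(np.random.randint(0, 256, (300, 400), dtype=np.uint8))
x = sample.ravel()        # D = 256 * 256 = 65536 input dimensions
```

Colour images would be handled channel by channel in the same way.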
Second step: construct and train the deep stacking network model. A deep stacking network structure module is constructed, the training-sample pixel values are taken as input, and the module is trained to obtain the trained deep stacking network model. The specific steps are as follows:
(1) Construct the deep stacking network architecture module. The number of network modules is set to 120 layers, each layer module being a special neural network assembled from three sublayers: a linear sublayer of linear input units, a non-linear sublayer of non-linear units, and an output sublayer of linear outputs.
(2) Train the deep stacking network architecture module. The specific steps are as follows:
A. Compute the weight matrix W with the restricted Boltzmann machine weight-learning method, i.e. the weight matrix W between the linear sublayer and the non-linear sublayer of each layer module.
a. Randomly initialize the model parameters θ = {W, a, b} of the restricted Boltzmann machine, where W is the input-layer weight matrix, a the input-layer bias and b the first hidden-layer bias, and set the learning rates of the three parameters to λ_W = λ_a = λ_b = 0.1;
b. Forward-propagate the input v^(0) of the input layer and compute the output h^(0) of the first hidden layer, where v_i^(0) is the input of the i-th node of the visible layer;
c. Back-propagate the hidden output h^(0) to obtain the reconstruction v^(1), where v_i^(1) is the input of the i-th reconstructed visible node;
d. Forward-propagate v^(1) to obtain h^(1), where h_j^(1) is the output of the j-th hidden node;
e. Update the model parameters θ = {W, a, b} with the learning rate of each parameter; the parameter increments are ΔW = λ_W (E[v^(0) (h^(0))^T] − E[v^(1) (h^(1))^T]), Δa = λ_a (v^(0) − v^(1)), Δb = λ_b (h^(0) − h^(1)), where E[·] denotes the mathematical expectation;
f. Change the input v^(0) of the input layer and repeat steps b to e until convergence, which yields W.
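Steps a to f amount to one-step contrastive divergence (CD-1). A sketch of a single update follows; the update expressions are the conventional CD-1 increments and are an assumption, since the patent text does not reproduce its formulas:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_step(v0, W, a, b, lr=0.1):
    """One CD-1 update of an RBM with visible bias a and hidden bias b.
    v0: (D,) input vector; W: (D, L); returns the updated parameters."""
    h0 = sigmoid(W.T @ v0 + b)     # step b: forward pass, hidden output
    v1 = sigmoid(W @ h0 + a)       # step c: backward pass, reconstruction
    h1 = sigmoid(W.T @ v1 + b)     # step d: forward pass on reconstruction
    W = W + lr * (np.outer(v0, h0) - np.outer(v1, h1))  # step e: increments
    a = a + lr * (v0 - v1)
    b = b + lr * (h0 - h1)
    return W, a, b

D, L = 6, 3
rng = np.random.default_rng(1)
W, a, b = rng.normal(size=(D, L)) * 0.1, np.zeros(D), np.zeros(L)
W, a, b = cd1_step(rng.random(D), W, a, b)   # step f: repeat over inputs
```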
B. Compute the weight matrix U, i.e. the weight matrix U between the non-linear sublayer and the output sublayer of each layer module.
a. Convert the training samples into the training vector set X = [x_1, …, x_i, …, x_N], where N is the number of samples and x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, D being the dimension of the input vector;
b. Since the output of the i-th module is y_i = U^T h_i, where h_i = σ(W^T x_i) is the hidden-layer vector of the i-th sample, U is an L × C upper-layer weight matrix and W a D × L lower-layer weight matrix, σ(·) being the sigmoid function, L the number of hidden units and C the dimension of the output vector;
c. Compute the least-squares error E = ‖Y − T‖², where T = [t_1, …, t_i, …, t_N] are the labels of the N samples, E is the error to be minimized, Y is the output matrix of the whole network, y_i the output of the i-th module and t_i the training label of the i-th sample;
d. Compute the partial derivative of E with respect to U and set it to zero, which gives
U = (H H^T)^{-1} H T^T,
where H = [h_1, …, h_i, …, h_N] is the matrix of all hidden-layer activations, U the weight matrix, and T the matrix of labels of all samples.
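The closed-form solution for U can be computed and sanity-checked directly; a sketch under the stated shapes (H is L × N, the label matrix T is C × N), with synthetic data standing in for real hidden activations:

```python
import numpy as np

def solve_upper_weights(H, T):
    """Least-squares solution of min ||U^T H - T||^2:
    setting the derivative to zero gives U = (H H^T)^{-1} H T^T."""
    return np.linalg.solve(H @ H.T, H @ T.T)

# Sanity check on synthetic data: recover a known U exactly.
rng = np.random.default_rng(2)
L, C, N = 5, 3, 50
U_true = rng.normal(size=(L, C))
H = rng.normal(size=(L, N))        # hidden activations of N samples
T = U_true.T @ H                   # labels generated by U_true
U_hat = solve_upper_weights(H, T)
```

Solving the normal equations rather than inverting H H^T explicitly is the standard numerically safer choice.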
C. Connect the module layers through the W and U of each layer to train the deep stacking network, according to the formula
y_i = U^T δ(W^T x_i) = G_i(U, W),
where x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, y_i the output of the i-th module, δ(·) the sigmoid function, and G(·) expresses the functional relation between U and W.
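The connected stack can then be run forward as below. The patent does not spell out how one module feeds the next, so the usual deep-stacking-network convention of concatenating each module's output onto the raw input is assumed:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stacked_forward(x, modules):
    """Forward pass through a list of (W, U) module pairs.
    Each module computes y = U^T * sigma(W^T * input); its output is
    concatenated onto the raw input x for the next module (assumed wiring)."""
    inp, y = x, None
    for W, U in modules:
        y = U.T @ sigmoid(W.T @ inp)
        inp = np.concatenate([x, y])
    return y

D, L, C = 10, 6, 3                 # illustrative dimensions, C weed classes
rng = np.random.default_rng(3)
modules = [(rng.normal(size=(D, L)), rng.normal(size=(L, C))),
           (rng.normal(size=(D + C, L)), rng.normal(size=(L, C)))]
y = stacked_forward(rng.normal(size=D), modules)
pred = int(np.argmax(y))           # recognized weed-class index
```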
Third step: collect and pre-process the test images. Weed images to be recognized are captured with the acquisition device, more than 30 per class; the images are normalized to 256 × 256 pixels, yielding the test samples.
Fourth step: input the test samples into the trained deep stacking network model for automatic recognition of the weed images.
The basic principles, principal features, and advantages of the present invention have been shown and described above. Those skilled in the art should understand that the invention is not limited to the above embodiment; the embodiment and the description merely illustrate its principles. Various changes and improvements are possible without departing from the spirit and scope of the invention, and all such changes and improvements fall within the scope of the claimed invention, which is defined by the appended claims and their equivalents.
Claims (4)
1. A weed image recognition method based on a deep stacking network, characterized by comprising the following steps:
11) collecting and pre-processing training images: images of each weed class are collected, more than 100 images per class, as training images; all training images are normalized in size to 256 × 256 pixels, yielding a number of training samples;
12) constructing and training the deep stacking network model: a deep stacking network structure module is constructed, the training-sample pixel values are taken as input, and the module is trained to obtain the trained deep stacking network model;
13) collecting and pre-processing test images: weed images to be recognized are captured with an acquisition device, more than 30 images per class; the images are normalized to 256 × 256 pixels, yielding the test samples;
14) inputting the test samples into the trained deep stacking network model for automatic recognition of the weed images.
2. The weed image recognition method based on a deep stacking network according to claim 1, characterized in that constructing and training the deep stacking network model comprises the following steps:
21) constructing the deep stacking network architecture module: the number of network modules is set to 120 layers, each layer module being a special neural network assembled from three sublayers: a linear sublayer of linear input units, a non-linear sublayer of non-linear units, and an output sublayer of linear outputs;
22) training the deep stacking network architecture module, the specific steps being:
221) computing the weight matrix W with the restricted Boltzmann machine weight-learning method, i.e. the weight matrix W between the linear sublayer and the non-linear sublayer of each layer module;
222) computing the weight matrix U, i.e. the weight matrix U between the non-linear sublayer and the output sublayer of each layer module;
223) connecting the module layers through the W and U of each layer to train the deep stacking network, according to the formula
y_i = U^T δ(W^T x_i) = G_i(U, W),
where x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, y_i the output of the i-th module, δ(·) the sigmoid function, and G(·) expresses the functional relation between U and W.
3. The weed image recognition method based on a deep stacking network according to claim 2, characterized in that computing the weight matrix W with the restricted Boltzmann machine weight-learning method comprises the following steps:
31) randomly initializing the model parameters θ = {W, a, b} of the restricted Boltzmann machine, where W is the input-layer weight matrix, a the input-layer bias and b the first hidden-layer bias, and setting the learning rates of the three parameters to λ_W = λ_a = λ_b = 0.1;
32) forward-propagating the input v^(0) of the input layer and computing the output h^(0) of the first hidden layer, where v_i^(0) is the input of the i-th node of the visible layer;
33) back-propagating the hidden output h^(0) to obtain the reconstruction v^(1), where v_i^(1) is the input of the i-th reconstructed visible node;
34) forward-propagating v^(1) to obtain h^(1), where h_j^(1) is the output of the j-th hidden node;
35) updating the model parameters θ = {W, a, b} with the learning rate of each parameter; the parameter increments are ΔW = λ_W (E[v^(0) (h^(0))^T] − E[v^(1) (h^(1))^T]), Δa = λ_a (v^(0) − v^(1)), Δb = λ_b (h^(0) − h^(1)), where E[·] denotes the mathematical expectation;
36) changing the input v^(0) of the input layer and repeating steps 32) to 35) until convergence, which yields W.
4. The weed image recognition method based on a deep stacking network according to claim 2, characterized in that computing the weight matrix U comprises the following steps:
41) converting the training samples into the training vector set X = [x_1, …, x_i, …, x_N], where N is the number of samples and x_i = [x_{1i}, …, x_{ji}, …, x_{Di}]^T is the input vector of the i-th module, D being the dimension of the input vector;
42) since the output of the i-th module is y_i = U^T h_i, where h_i = σ(W^T x_i) is the hidden-layer vector of the i-th sample, U is an L × C upper-layer weight matrix and W a D × L lower-layer weight matrix, σ(·) being the sigmoid function, L the number of hidden units and C the dimension of the output vector;
43) computing the least-squares error E = ‖Y − T‖², where T = [t_1, …, t_i, …, t_N] are the labels of the N samples, E is the error to be minimized, Y is the output matrix of the whole network, y_i the output of the i-th module and t_i the training label of the i-th sample;
44) computing the partial derivative of E with respect to U and setting it to zero, which gives
U = (H H^T)^{-1} H T^T,
where H = [h_1, …, h_i, …, h_N] is the matrix of all hidden-layer activations, U the weight matrix, and T the matrix of labels of all samples.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710102806.5A CN107038410A (en) | 2017-02-24 | 2017-02-24 | A weed image recognition method based on a deep stacking network |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107038410A true CN107038410A (en) | 2017-08-11 |
Family
ID=59533606
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710102806.5A Pending CN107038410A (en) | 2017-02-24 | 2017-02-24 | A weed image recognition method based on a deep stacking network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107038410A (en) |
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106326899A (en) * | 2016-08-18 | 2017-01-11 | 郑州大学 | Tobacco leaf grading method based on hyperspectral image and deep learning algorithm |
Non-Patent Citations (2)
Title |
---|
YE, RUI: "Research on Face Detection Methods Based on Deep Learning", China Master's Theses Full-text Database, Information Science and Technology * |
GAO, YINGYING: "Research on Speech Emotion Modeling for Emotional Speech Synthesis", China Doctoral Dissertations Full-text Database, Information Science and Technology * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109461159A (en) * | 2018-11-20 | 2019-03-12 | 扬州工业职业技术学院 | A kind of image partition method of field crops weeds |
CN109961024A (en) * | 2019-03-08 | 2019-07-02 | 武汉大学 | Wheat weeds in field detection method based on deep learning |
CN110009043A (en) * | 2019-04-09 | 2019-07-12 | 广东省智能制造研究所 | A kind of pest and disease damage detection method based on depth convolutional neural networks |
CN110009043B (en) * | 2019-04-09 | 2021-08-17 | 广东省智能制造研究所 | Disease and insect pest detection method based on deep convolutional neural network |
CN111091138A (en) * | 2019-11-14 | 2020-05-01 | 远景智能国际私人投资有限公司 | Irradiation forecast processing method and stacked generalization model training method and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Uğuz et al. | Classification of olive leaf diseases using deep convolutional neural networks | |
CN103413151B (en) | Hyperspectral image classification method based on figure canonical low-rank representation Dimensionality Reduction | |
CN106845401A (en) | A kind of insect image-recognizing method based on many spatial convoluted neutral nets | |
Yadav et al. | Development and validation of standard area diagrams to aid assessment of pecan scab symptoms on fruit | |
CN107038410A (en) | A kind of weed images recognition methods that network is stacked based on depth | |
CN107464035A (en) | Chinese medicine performance rating method and system | |
CN106446942A (en) | Crop disease identification method based on incremental learning | |
CN104408466B (en) | Learn the high-spectrum remote sensing semisupervised classification method of composition based on local manifolds | |
Khan et al. | Deep learning for apple diseases: classification and identification | |
CN106997475A (en) | A kind of insect image-recognizing method based on parallel-convolution neutral net | |
CN104008551B (en) | A kind of Citrus Huanglongbing pathogen detection method based on visible images | |
CN106874688A (en) | Intelligent lead compound based on convolutional neural networks finds method | |
CN107067043A (en) | A kind of diseases and pests of agronomic crop detection method | |
CN106485227A (en) | A kind of Evaluation of Customer Satisfaction Degree method that is expressed one's feelings based on video face | |
CN106960243A (en) | A kind of method for improving convolutional neural networks structure | |
CN103839078B (en) | A kind of hyperspectral image classification method based on Active Learning | |
CN106991428A (en) | Insect image-recognizing method based on adaptive pool model | |
CN107368852A (en) | A kind of Classification of Polarimetric SAR Image method based on non-down sampling contourlet DCGAN | |
CN107945182A (en) | Maize leaf disease recognition method based on convolutional neural networks model GoogleNet | |
CN109902715A (en) | A kind of method for detecting infrared puniness target based on context converging network | |
CN106991666A (en) | A kind of disease geo-radar image recognition methods suitable for many size pictorial informations | |
CN110490227A (en) | A kind of few sample image classification method based on Feature Conversion | |
CN107688856A (en) | Indoor Robot scene active identification method based on deeply study | |
CN104091181A (en) | Injurious insect image automatic recognition method and system based on deep restricted Boltzmann machine | |
CN103839075B (en) | SAR image classification method based on united sparse representation |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20170811 |