CN110599457A - Citrus huanglongbing classification method based on BD capsule network - Google Patents
- Publication number: CN110599457A
- Application number: CN201910747697.1A
- Authority
- CN
- China
- Prior art keywords
- capsule
- layer
- image
- output
- vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
Abstract
The invention discloses a citrus greening disease classification method based on a BD capsule network, comprising the following steps. S1: image preprocessing, the images being images of citrus leaves. S2: the image is input into the BD capsule network through an input layer, and a convolutional layer extracts the primary features of the image. S3: feature conversion is performed: each primary feature undergoes a corresponding spatial dimension conversion in the primary capsule layer, and a new dimension is expanded to obtain the corresponding feature vector. S4: the feature vectors are analyzed: the digital capsule layer analyzes each feature vector with a dynamic routing algorithm and outputs 2 output vectors, from which the image category is determined. S5: the image is reconstructed: the output vector passes through fully connected layers with Batchnorm and Dropout operations, and the reconstructed image is output. The method requires no manual participation, reduces resource consumption, and achieves higher accuracy than traditional feature-extraction methods.
Description
Technical Field
The invention relates to the field of deep learning, in particular to a citrus greening disease classification method based on a BD capsule network.
Background
Citrus Huanglongbing (citrus greening disease, also called citrus yellow shoot disease) was first reported in southern China in 1919. It is a citrus tree disease closely associated with bacteria of the genus Candidatus Liberibacter, which comprises an Asian species, an African species, an American species, an African subspecies, and recently discovered new species that mainly harm plants such as tomato and potato; the disease is transmitted by psyllids. It has long been held that this group of bacteria is the cause of Huanglongbing, although Chen et al. point out that the direct pathological response of the bacteria to the citrus host has not been fully established. The disease goes by different names in different countries and regions: Huanglongbing ("yellow dragon disease") in mainland China, likubin (decline) in Taiwan, leaf mottle in the Philippines, dieback in India, greening disease in South Africa, and vein phloem degeneration in Indonesia.
In 1995, the 13th world citrus conference, held in Fuzhou, China, formally stipulated that the disease be referred to as HLB (Huanglongbing), in commemoration of Professor Lin Kongxiang, who first demonstrated that Huanglongbing is an infectious disease, and in recognition of the outstanding contribution of Chinese scientists to research on this worldwide citrus disease.
After citrus trees are infected with Huanglongbing, tree vigor declines rapidly, new shoots are short and sparse, and leaves show mottled yellowing or overall yellowing and withering. Diseased trees flower early and abundantly, the fruit-setting rate is low, and yield drops sharply; fruits become smaller and fail to color normally (commonly called "red-nose fruit"), and quality deteriorates. Once a tree is infected and symptoms appear, if rescue measures are not taken in time, the diseased area keeps enlarging, the inoculum source increases, the epidemic spreads ever faster, and losses grow rapidly. At present, citrus Huanglongbing is distributed in more than 40 countries and regions across Asia, Africa and America, and epidemics have been reported in Guangdong, Guangxi, Fujian, Zhejiang, Jiangxi, Hunan, Guizhou, Hainan, Yunnan, Sichuan, Taiwan, and elsewhere in China.
The spread of citrus Huanglongbing poses a serious threat to the stability and development of the world citrus industry. Although the disease has been known for nearly 100 years, an effective therapeutic agent and an ideal disease-resistant variety are still lacking. At present, the effective control measures are to dig out diseased trees promptly and to control the insect vector in time; the premise of prevention and treatment is timely and rapid detection and diagnosis.
Traditional classification of Huanglongbing generally falls into two categories. The first relies on physiological and structural characteristics: field diagnosis, indicator-plant identification, microscopic observation of the pathogen, serological identification, DNA-DNA hybridization, PCR detection, and so on. The second is image interpretation based on pattern recognition and machine learning. Machine-learning methods divide into classical methods and deep-learning methods; classical methods mainly include support vector machines, decision trees, logistic regression, random forests, and the K-nearest-neighbor algorithm, which need little training data, place low demands on hardware, and yield easily interpretable models. In classical machine learning, feature extraction is the crucial link that determines image-classification performance.
The first category requires a large amount of resources, wastes time and labor, and is costly in practice. The second suffers from low recognition accuracy and strong dependence on the image background, so its classification accuracy is insufficient.
Disclosure of Invention
The invention provides a citrus greening disease classification method based on a BD capsule network, which automates the classification task and improves model classification accuracy.
In order to solve the technical problems, the technical scheme of the invention is as follows:
A citrus greening disease classification method based on a BD capsule network. The BD capsule network comprises two modules, a classification module and a reconstruction module; the classification module comprises an input layer, a convolutional neural network layer, an initial capsule layer, a digital capsule layer and an output layer, and the reconstruction module comprises fully connected layers. The method comprises the following steps:
s1: image preprocessing, wherein the image is an image comprising citrus leaves;
S2: the image is input into the BD capsule network through the input layer of the BD capsule network, and the convolutional layer of the BD capsule network extracts the primary features of the image, providing more suitable high-level instances for the capsule layers;
S3: feature conversion is performed: each primary feature undergoes a corresponding spatial dimension conversion in the primary capsule layer, and a new dimension is expanded on top of the primary feature dimensions to obtain a feature vector for each primary feature. The number of values in the feature vector is unchanged; only one dimension is added, so that the feature vector can represent more parameters. This vector representation improves recognition accuracy and prepares the input for the next layer;
S4: the feature vectors are analyzed: the digital capsule layer is connected to the initial capsule layer vector to vector; the digital capsule layer analyzes each feature vector with a dynamic routing algorithm and outputs 2 output vectors, and the image category is determined from the output vectors. According to the onset of Huanglongbing, the images fall into two categories: diseased images and non-diseased images.
S5: and reconstructing an image, passing the output vector through a full connection layer, performing Batchnorm and Dropout operation, and outputting the reconstructed image.
Preferably, the image preprocessing in step S1 specifically includes:
and scaling the citrus leaf images by using a bilinear interpolation method, and unifying the sizes.
Preferably, the uniform size is 78 × 78 pixel value size.
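As a sketch of this preprocessing step, the bilinear scaling to a uniform 78 × 78 size can be implemented as follows. This is a minimal NumPy version for a single-channel image; the function and variable names are illustrative, not from the patent:

```python
import numpy as np

def bilinear_resize(img, out_h=78, out_w=78):
    """Resize a 2-D (single-channel) image to out_h x out_w via bilinear interpolation."""
    in_h, in_w = img.shape
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]   # vertical interpolation weights
    wx = (xs - x0)[None, :]   # horizontal interpolation weights
    a = img[np.ix_(y0, x0)]; b = img[np.ix_(y0, x1)]
    c = img[np.ix_(y1, x0)]; d = img[np.ix_(y1, x1)]
    top = a * (1 - wx) + b * wx
    bot = c * (1 - wx) + d * wx
    return top * (1 - wy) + bot * wy

leaf = np.arange(100.0).reshape(10, 10)   # stand-in for one channel of a leaf image
resized = bilinear_resize(leaf, 78, 78)
print(resized.shape)                      # (78, 78)
```

A color leaf image would apply the same interpolation per channel; libraries such as OpenCV or Pillow provide equivalent built-in bilinear resizing.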
Preferably, the digital capsule layer in step S4 comprises several capsules; each capsule is represented by a vector that contains the pose parameters of an object and the probability of belonging to the category.
Preferably, the specific calculation process using the dynamic routing algorithm in step S3 includes:
The prediction vector from capsule $i$ of the lower layer to capsule $j$ of the upper layer is

$$\hat{u}_{j|i} = W_{ij} u_i$$

where $u_i$ denotes the output of the $i$-th capsule, $\hat{u}_{j|i}$ denotes the prediction vector for the layer-$j$ output computed from the layer-$i$ capsule, and $W_{ij}$ denotes a weight matrix learned by back propagation.

The degree of association between the upper and lower capsule layers is updated by judging the weights, deciding the proportion each connection takes in the learning process: if the predicted result is close to the true value, the association value between the two layers is increased; if the predicted result is far from the true value, the association value is reduced:

$$c_{ij} = \frac{\exp(b_{ij})}{\sum_{k} \exp(b_{ik})}$$

where $c_{ij}$ is the association value between adjacent capsule layers $i$ and $j$, and $b_{ij}$ is the probability that the $i$-th capsule is selected by the $j$-th capsule; when the routing layer starts to execute, $b_{ij}$ is set to 0; $k$ is the total number of capsules in the digital capsule layer, $j \in [1, k]$.

The vector input to the $j$-th capsule, denoted $s_j$, is

$$s_j = \sum_i c_{ij} \hat{u}_{j|i}$$

and the output $v_j$ of the $j$-th capsule is obtained with the nonlinear activation function Squash:

$$v_j = \frac{\|s_j\|^2}{1 + \|s_j\|^2} \cdot \frac{s_j}{\|s_j\|}$$

which ensures that the output length of the capsule lies in the interval $[0, 1]$ and does not exceed the expected range.

The input $\hat{u}_{j|i}$ and the output $v_j$ update $b_{ij}$ through an inner product between the vectors, and the value of $b_{ij}$ in turn adjusts the degree of association between the capsule layers:

$$b_{ij} \leftarrow b_{ij} + \hat{u}_{j|i} \cdot v_j$$
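The routing iteration described above (prediction vectors, softmax coupling, weighted sum, Squash, agreement update) can be sketched in NumPy as follows. The capsule counts and dimensions are toy values, not the patent's actual layer sizes, and the fixed iteration count of 3 is an assumption:

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # v = (|s|^2 / (1 + |s|^2)) * s / |s|  keeps the vector length in [0, 1)
    sq = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq / (1.0 + sq)) * s / np.sqrt(sq + eps)

def dynamic_routing(u_hat, iterations=3):
    """u_hat: (num_in, num_out, dim_out) prediction vectors u_hat_{j|i} = W_ij u_i."""
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                           # b_ij starts at 0
    for _ in range(iterations):
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)  # c_ij = softmax over j
        s = (c[:, :, None] * u_hat).sum(axis=0)               # s_j = sum_i c_ij u_hat_{j|i}
        v = squash(s)                                         # v_j = Squash(s_j)
        b = b + (u_hat * v[None, :, :]).sum(axis=-1)          # b_ij += u_hat . v_j
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 2, 16))   # 8 primary capsules routed to 2 digit capsules
v = dynamic_routing(u_hat)
print(v.shape, np.linalg.norm(v, axis=-1))
```

In a real network the prediction tensor `u_hat` would come from learned weight matrices $W_{ij}$ applied to the primary-capsule outputs.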
preferably, in step S4, the image category is determined according to the output vector, specifically:
The L2 norm is calculated for the 2 output vectors to obtain the length of each; the output vector with the largest norm represents the category with the largest probability for the image. This probability corresponds to the probability of having Huanglongbing.
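A minimal sketch of this category decision: compute the L2 norm of each of the 2 output vectors and pick the longer one. The 0/1 labels follow the healthy/diseased convention used elsewhere in the document; the 3-dimensional vectors are illustrative only:

```python
import numpy as np

def classify(digit_caps):
    """digit_caps: (2, dim) output vectors; the longer one wins (0 = healthy, 1 = diseased)."""
    lengths = np.linalg.norm(digit_caps, axis=-1)   # L2 norm of each output vector
    return int(np.argmax(lengths)), lengths

caps = np.array([[0.1, 0.2, 0.0],    # "healthy" capsule output
                 [0.5, 0.4, 0.3]])   # "diseased" capsule output
label, lengths = classify(caps)
print(label)   # -> 1 (diseased)
```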
Preferably, after the digital capsule layer is output, the difference degree between the prediction classification and the actual classification is also calculated:
$$L_c = T_c \max(0, m^+ - \|v_c\|)^2 + \lambda (1 - T_c) \max(0, \|v_c\| - m^-)^2$$

where $v_c$ is the result vector output by the digital capsule layer, $L_c$ the loss function, $c$ the classified category, and $T_c$ the indicator function of the classification ($T_c = 1$ when the sample is diseased and 0 when it is not); $m^+$ is the upper bound penalizing false negatives; $m^-$ is the lower bound penalizing false positives; $\lambda$ is a proportionality coefficient that adjusts the weight between the two terms of the sum ($\lambda = 1$ means the two terms carry equal weight, i.e., the dataset contains equal numbers of diseased and healthy samples); $m^+$, $m^-$ and $\lambda$ are all hyper-parameters, preset to fixed values before BD capsule network learning.
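The margin loss above can be sketched as follows. The defaults $m^+ = 0.9$, $m^- = 0.1$, $\lambda = 0.5$ are the common capsule-network choices, used here only as illustrative hyper-parameters; the patent presets its own fixed values:

```python
import numpy as np

def margin_loss(v_lengths, T, m_pos=0.9, m_neg=0.1, lam=0.5):
    """L_c = T_c max(0, m+ - |v_c|)^2 + lam (1 - T_c) max(0, |v_c| - m-)^2.
    v_lengths: (num_classes,) capsule output lengths; T: one-hot targets."""
    pos = T * np.maximum(0.0, m_pos - v_lengths) ** 2          # penalize short true-class capsule
    neg = lam * (1 - T) * np.maximum(0.0, v_lengths - m_neg) ** 2  # penalize long wrong-class capsule
    return (pos + neg).sum()

# true class is 1 (diseased); its capsule length 0.8 is a bit short of m+ = 0.9
L = margin_loss(np.array([0.2, 0.8]), np.array([0.0, 1.0]))
print(L)   # -> 0.015
```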
Preferably, the fully connected layers comprise a first, a second and a third fully connected layer. The input of the first fully connected layer is the output vector; its output undergoes Batchnorm and Dropout operations and is fed to the second fully connected layer; the output of the second fully connected layer likewise undergoes Batchnorm and Dropout and is fed to the third fully connected layer, which outputs the reconstructed image. Between two fully connected layers, each neuron of the latter layer is connected to all neurons of the former layer. The fully connected layers act as the "reconstructor" of the BD capsule network: the capsule layers map the original data into a hidden feature space, and the fully connected layers map the learned distributed feature representation back to the sample space.
Preferably, the Batchnorm operation comprises the steps of:
finding out x ═ x in this batch1,x2,...xmMean value ofxAnd the batch data is output by the first full connection layer or the second full connection layer:
finding x ═ x1,x2,...xmThe variance of } in the mean square wave
III, normalizing x
Where ε represents the bias, giving a value close to zero in the programming, in order to prevent the denominator from being zero;
introducing scaling gamma and translation variable beta, and calculating output yi:
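Steps I to IV can be sketched directly in NumPy. This is a training-time version that normalizes over the batch axis; $\gamma$ and $\beta$ are scalars here for simplicity, whereas in a real layer they are learned per feature:

```python
import numpy as np

def batchnorm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Batchnorm steps I-IV: batch mean, variance, normalize, scale and shift."""
    mu = x.mean(axis=0)                       # I.   mu = (1/m) sum x_i
    var = ((x - mu) ** 2).mean(axis=0)        # II.  sigma^2 = (1/m) sum (x_i - mu)^2
    x_hat = (x - mu) / np.sqrt(var + eps)     # III. eps keeps the denominator nonzero
    return gamma * x_hat + beta               # IV.  y_i = gamma * x_hat_i + beta

batch = np.array([[1.0], [2.0], [3.0], [4.0]])   # toy batch of m = 4 activations
y = batchnorm(batch)
print(y.mean(), y.std())   # mean ~0, std ~1
```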
Preferably, the Dropout operation uses a Bernoulli random distribution to randomly drop neurons of the fully connected layer; all connections of a dropped neuron are removed during training.
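A minimal sketch of the Dropout operation: a Bernoulli keep-mask zeroes each neuron with probability $p$. The $1/(1-p)$ rescaling shown here is the common "inverted dropout" variant, an assumption on our part, since the patent does not specify a scaling scheme:

```python
import numpy as np

def dropout(x, p=0.5, rng=None, training=True):
    """Zero each activation with probability p; scale survivors by 1/(1-p)."""
    if not training:
        return x                             # dropout is disabled at inference
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p          # Bernoulli keep-mask (keep prob = 1 - p)
    return x * mask / (1.0 - p)

x = np.ones(10)
out = dropout(x, p=0.5)
print(out)   # each entry is either 0.0 (dropped) or 2.0 (kept and rescaled)
```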
Preferably, the reconstruction loss function of the BD capsule network is the mean square error shown below:

$$L = \frac{1}{n} \sum_{i=1}^{n} (O_i - Y_i)^2$$

where $O$ denotes the input image pixel values and $Y$ denotes the output pixel values after passing through the fully connected layers.
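The mean-square-error reconstruction loss can be sketched as follows, assuming $O$ and $Y$ are flattened pixel arrays; whether the patent sums or averages over pixels is not specified, so a mean is used here:

```python
import numpy as np

def mse_loss(O, Y):
    """Mean squared error between input pixels O and reconstructed pixels Y."""
    return float(np.mean((O - Y) ** 2))

O = np.array([1.0, 0.0, 1.0])   # toy input pixels
Y = np.array([0.5, 0.0, 1.0])   # toy reconstructed pixels
print(mse_loss(O, Y))           # ((0.5)^2 + 0 + 0) / 3
```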
Compared with the prior art, the technical scheme of the invention has the beneficial effects that:
In addition, the reconstruction module generates new citrus leaf images, which effectively addresses analysis under conditions of insufficient data. Compared with the original capsule network, the BD capsule network achieves higher accuracy and faster training, improving the overall performance of the original network.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
Fig. 2 is a schematic diagram of a BD capsule network structure according to the present invention.
Fig. 3 is a schematic diagram of a specific calculation process of the BD classification network.
Fig. 4 is a schematic diagram of the full connection layer principle.
Fig. 5 is a network structure of the fully connected layer after Dropout processing in the embodiment.
FIG. 6 is a comparison graph of the accuracy of the BD capsule network and the capsule network training in the embodiment.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the patent;
for the purpose of better illustrating the embodiments, certain features of the drawings may be omitted, enlarged or reduced, and do not represent the size of an actual product;
it will be understood by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical solution of the present invention is further described below with reference to the accompanying drawings and examples.
Example 1
The embodiment provides a citrus greening disease classification method based on a BD capsule network. As shown in FIG. 1, the BD capsule network comprises two modules, a classification module and a reconstruction module; as shown in FIG. 2, the classification module comprises an input layer, a convolutional neural network layer, an initial capsule layer, a digital capsule layer and an output layer, and the reconstruction module comprises fully connected layers. The method comprises the following steps:
s1: image preprocessing, wherein the image is an image comprising citrus leaves;
s2: the image is input into the BD capsule network through an input layer of the BD capsule network, and primary features corresponding to the image are extracted by using a convolution layer of the BD capsule network;
s3: performing characteristic conversion, namely performing corresponding spatial dimension conversion on each primary characteristic in the primary capsule layer, and expanding a new dimension on the basis of the primary characteristic dimension to obtain a characteristic vector corresponding to each primary characteristic;
s4: analyzing the feature vectors, analyzing each feature vector by the digital capsule layer by using a dynamic routing algorithm, outputting 2 output vectors, and determining the image category according to the output vectors;
s5: and reconstructing an image, passing the output vector through a full connection layer, performing Batchnorm and Dropout operation, and outputting the reconstructed image.
In step S1, the image preprocessing specifically includes:
and scaling the citrus leaf images by using a bilinear interpolation method, and unifying the sizes.
The uniform size is 78 x 78 pixel value size.
The digital capsule layer in step S4 comprises several capsules; each capsule is represented by a vector that contains the pose parameters of an object and the probability of belonging to the category.
The specific calculation process using the dynamic routing algorithm in step S3 is as shown in fig. 3, and includes:
$$\hat{u}_{j|i} = W_{ij} u_i$$

where $u_i$ denotes the output of the $i$-th capsule, $\hat{u}_{j|i}$ denotes the prediction vector for the layer-$j$ output computed from the layer-$i$ capsule, and $W_{ij}$ denotes a weight matrix learned by back propagation.

The degree of association between the upper and lower capsule layers is updated by judging the weights, deciding the proportion each connection takes in the learning process: if the predicted result is close to the true value, the association value is increased; if it is far from the true value, the association value is reduced:

$$c_{ij} = \frac{\exp(b_{ij})}{\sum_{k} \exp(b_{ik})}$$

where $c_{ij}$ is the association value between adjacent capsule layers $i$ and $j$, $b_{ij}$ the probability that the $i$-th capsule is selected by the $j$-th capsule (set to 0 when the routing layer starts to execute), $k$ the total number of capsules in the digital capsule layer, and $j \in [1, k]$.

The vector input to the $j$-th capsule is

$$s_j = \sum_i c_{ij} \hat{u}_{j|i}$$

and the output of the $j$-th capsule is

$$v_j = \frac{\|s_j\|^2}{1 + \|s_j\|^2} \cdot \frac{s_j}{\|s_j\|}$$

The input $\hat{u}_{j|i}$ and the output $v_j$ update $b_{ij}$ through an inner product between the vectors, and the value of $b_{ij}$ in turn adjusts the degree of association between the capsule layers:

$$b_{ij} \leftarrow b_{ij} + \hat{u}_{j|i} \cdot v_j$$
in step S4, the image category is determined according to the output vector, specifically:
The L2 norm is calculated for the 2 output vectors to obtain the length of each; the output vector with the largest norm represents the category with the largest probability for the image.
After the digital capsule layer is output, calculating the difference degree between the prediction classification and the actual classification:
$$L_c = T_c \max(0, m^+ - \|v_c\|)^2 + \lambda (1 - T_c) \max(0, \|v_c\| - m^-)^2$$

where $v_c$ is the result vector output by the digital capsule layer, $L_c$ the loss function, $c$ the classified category, and $T_c$ the indicator function of the classification ($T_c = 1$ for the true class $c$ and 0 otherwise); $m^+$ is the upper bound penalizing false negatives; $m^-$ is the lower bound penalizing false positives; $\lambda$ is a proportionality coefficient that adjusts the weight between the two terms of the sum; $m^+$, $m^-$ and $\lambda$ are all hyper-parameters, preset to fixed values before BD capsule network learning.
The full articulamentum includes the complete articulamentum of first complete articulamentum, second, the complete articulamentum of third, and wherein, the input of the first complete articulamentum does the output vector, and the output of the first complete articulamentum carries out the full articulamentum of input second after Batchnorm and the operation of Dropout, and the output of the full articulamentum of second carries out the full articulamentum of input third after Batchnorm and the operation of Dropout, and the full articulamentum of third outputs the rebuilt image. The original schematic diagram of the fully-connected layer is shown in fig. 4, and each neuron in the next layer is connected with all neurons in the previous layer.
The Batchnorm operation comprises the following steps:
finding out x ═ x in this batch1,x2,...xmMean value ofxAnd the batch data is output by the first full connection layer or the second full connection layer:
finding x ═ x1,x2,...xmThe variance of } in the mean square wave
III, normalizing x
In the formula, epsilon represents a deviation;
introducing scaling gamma and translation variable beta, calculating output yi:
The Dropout operation uses a Bernoulli random distribution to randomly drop neurons of the fully connected layer; all connections of a dropped neuron are removed during training.
With a drop probability of 0.5, the network structure of the fully connected layer after Dropout processing is shown in FIG. 5;
In FIG. 6, ID1 denotes the training accuracy of the BD capsule network and ID2 the training accuracy of the original capsule network. FIG. 6 shows that the BD capsule network stabilizes faster during training than the original capsule network. From Table 1 it can be concluded that the BD capsule network has higher accuracy than the capsule network.
TABLE 1
Model name | BD capsule network | Capsule network |
Testing accuracy | 89.50% | 88.80% |
In the specific implementation, the citrus leaf dataset is labeled: the first healthy citrus leaf image is labeled 0_00001, the second 0_00002, and so on; the first diseased citrus leaf image is labeled 1_00001, the second 1_00002, and so on.
The citrus leaf images are scaled by bilinear interpolation to a uniform size of 78 × 78 pixels.
The processed citrus leaf images are fed into the classification module of the BD capsule network, which carries out the series of operations described above.
After leaving the classification module, the output vector representation enters the reconstruction module of the BD capsule network, finally yielding a new reconstructed image.
The same or similar reference numerals correspond to the same or similar parts;
the terms describing positional relationships in the drawings are for illustrative purposes only and are not to be construed as limiting the patent;
it should be understood that the above-described embodiments of the present invention are merely examples for clearly illustrating the present invention, and are not intended to limit the embodiments of the present invention. Other variations and modifications will be apparent to persons skilled in the art in light of the above description. And are neither required nor exhaustive of all embodiments. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present invention should be included in the protection scope of the claims of the present invention.
Claims (10)
1. A citrus greening disease classification method based on a BD capsule network is characterized by comprising the following steps:
S1: image preprocessing, wherein the image is an image comprising citrus leaves;
S2: inputting the image into the BD capsule network through an input layer of the BD capsule network, and extracting primary features corresponding to the image using a convolution layer of the BD capsule network;
S3: performing feature conversion, namely performing a corresponding spatial dimension conversion on each primary feature in the primary capsule layer, and expanding a new dimension on the basis of the primary feature dimensions to obtain a feature vector corresponding to each primary feature;
S4: analyzing the feature vectors, wherein the digital capsule layer analyzes each feature vector using a dynamic routing algorithm and outputs 2 output vectors, and the image category is determined according to the output vectors;
S5: reconstructing an image, namely passing the output vectors through fully connected layers, performing Batchnorm and Dropout operations, and outputting the reconstructed image.
2. The BD capsule network-based citrus greening disease classification method according to claim 1, wherein the image preprocessing in the step S1 includes:
scaling the citrus leaf images using bilinear interpolation so that they have a uniform size.
3. The BD capsule network-based citrus greening disease classification method of claim 2, wherein the uniform size is 78 × 78 pixels.
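For illustration, a minimal NumPy sketch of bilinear resizing to the 78 × 78 size named in the claim. The helper name `bilinear_resize`, the pure-NumPy implementation, and the 2-D (grayscale) input are assumptions for the example, not part of the patent:

```python
import numpy as np

def bilinear_resize(img, out_h, out_w):
    """Resize a 2-D array with bilinear interpolation (illustrative sketch)."""
    in_h, in_w = img.shape
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    # Blend the four neighbouring source pixels.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

leaf = np.random.default_rng(0).random((120, 160))  # stand-in for a leaf image
resized = bilinear_resize(leaf, 78, 78)
print(resized.shape)  # (78, 78)
```

A practical pipeline would normally delegate this to an image library; the explicit version only shows what "bilinear interpolation" computes.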
4. The BD capsule network-based citrus greening disease classification method according to claim 1, wherein the digital capsule layer in step S4 comprises a plurality of capsules, each capsule is represented by a vector, and each capsule contains the pose parameters of an object together with the probability that the object belongs to the capsule's category.
5. The BD capsule network-based citrus greening disease classification method according to claim 4, wherein the specific calculation process using the dynamic routing algorithm in step S4 includes:
û_{j|i} = W_ij · u_i
in the formula, u_i denotes the output of the i-th capsule of the lower capsule layer, û_{j|i} denotes the prediction vector for capsule j of the upper layer computed from capsule i of the lower layer, and W_ij denotes a weight matrix used for learning and back propagation;
updating the degree of association between the upper and lower capsule layers by judging the weight values, and weighting the contribution of each lower capsule during the learning process: if the predicted result is close to the true value, the association value between the upper and lower capsules is increased, and if the predicted result is far from the true value, the association value is reduced:
c_ij = exp(b_ij) / Σ_{l=1}^{k} exp(b_il)
where c_ij is the association value between capsule i of the lower layer and capsule j of the upper layer, and b_ij denotes the log prior probability that the i-th capsule of the lower layer is selected by the j-th capsule of the upper layer; when the routing layer starts to execute, b_ij is set to 0; k is the total number of capsules in the digital capsule layer, j ∈ [1, k];
the vector input into the j-th capsule is denoted s_j:
s_j = Σ_i c_ij û_{j|i},   v_j = squash(s_j) = (‖s_j‖² / (1 + ‖s_j‖²)) · (s_j / ‖s_j‖)
where v_j represents the output of the j-th capsule;
the inner product between the input û_{j|i} and the output v_j is used to update b_ij, and the value of b_ij in turn adjusts the degree of association between the capsule layers:
b_ij ← b_ij + û_{j|i} · v_j
6. The BD capsule network-based citrus greening disease classification method according to claim 4, wherein in step S4 the image category is determined according to the output vectors, specifically:
calculating the L2 norm of each of the 2 output vectors to obtain its length, wherein the output vector with the largest norm value represents the category to which the image belongs with the largest probability.
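The decision rule of this claim fits in a few lines of NumPy; the example vectors and the healthy/diseased labelling of the two classes are made up for illustration:

```python
import numpy as np

# Two output vectors from the digital capsule layer (illustrative values).
v = np.array([[0.10, 0.20, 0.05],     # class 0: healthy (assumed labelling)
              [0.50, 0.40, 0.30]])    # class 1: diseased
lengths = np.linalg.norm(v, axis=1)   # L2 norm = length of each output vector
predicted = int(np.argmax(lengths))   # longest vector -> most probable class
print(predicted)  # 1
```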
7. The BD capsule network-based citrus greening disease classification method according to claim 5 or 6, wherein the degree of difference between the predicted classification and the actual classification is calculated from the output of the digital capsule layer:
L_c = T_c · max(0, m⁺ − ‖v_c‖)² + λ(1 − T_c) · max(0, ‖v_c‖ − m⁻)²
in the formula, v_c represents the result value output by the digital capsule layer for class c, L_c represents the loss function, and c represents the classified category; T_c is the indicator function of the classification, with T_c = 1 when the result is diseased and T_c = 0 when it is not; m⁺ is the upper bound, penalizing false negatives; m⁻ is the lower bound, penalizing false positives; λ is a proportionality coefficient adjusting the specific gravity between the two terms of the sum; m⁺, m⁻ and λ are all hyper-parameters whose values are preset before BD capsule network learning.
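The margin loss of this claim can be computed directly. The default values m⁺ = 0.9, m⁻ = 0.1, λ = 0.5 are the common choices from the capsule-network literature, assumed here since the patent only says they are preset hyper-parameters:

```python
import numpy as np

def margin_loss(v_norms, T, m_pos=0.9, m_neg=0.1, lam=0.5):
    """v_norms: ||v_c|| per class; T: one-hot indicator T_c.
    m_pos, m_neg, lam play the roles of m+, m-, λ in the claim."""
    present = T * np.maximum(0.0, m_pos - v_norms) ** 2        # false-negative term
    absent = lam * (1 - T) * np.maximum(0.0, v_norms - m_neg) ** 2  # false-positive term
    return float(np.sum(present + absent))

# A confident, correct prediction incurs zero loss.
loss = margin_loss(np.array([0.9, 0.1]), np.array([1.0, 0.0]))
print(loss)  # 0.0
```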
8. The BD capsule network-based citrus greening disease classification method according to claim 1, wherein the fully connected layers comprise a first fully connected layer, a second fully connected layer and a third fully connected layer, wherein the input of the first fully connected layer is the output vector, the output of the first fully connected layer is input into the second fully connected layer after being subjected to Batchnorm and Dropout operations, the output of the second fully connected layer is input into the third fully connected layer after being subjected to Batchnorm and Dropout operations, and the third fully connected layer outputs the reconstructed image.
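The three-layer reconstruction head of this claim can be sketched end to end. The layer widths (512, 1024), the grayscale 78 × 78 output taken from claim 3, and the random weights are assumptions; a trained network would learn the weights:

```python
import numpy as np

# Sketch of the claimed pipeline: output vector -> FC1 -> Batchnorm -> Dropout
# -> FC2 -> Batchnorm -> Dropout -> FC3 -> reconstructed image.
rng = np.random.default_rng(0)

def fc(x, w, b):
    return x @ w + b

def bn(x, eps=1e-5):
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def drop(x, p=0.5):
    return x * rng.binomial(1, 1 - p, x.shape) / (1 - p)

v = rng.standard_normal((8, 16))                         # batch of output vectors
w1, b1 = rng.standard_normal((16, 512)) * 0.1, np.zeros(512)
w2, b2 = rng.standard_normal((512, 1024)) * 0.1, np.zeros(1024)
w3, b3 = rng.standard_normal((1024, 78 * 78)) * 0.1, np.zeros(78 * 78)

h = drop(bn(fc(v, w1, b1)))                              # first FC + BN + Dropout
h = drop(bn(fc(h, w2, b2)))                              # second FC + BN + Dropout
recon = fc(h, w3, b3).reshape(-1, 78, 78)                # third FC -> image
print(recon.shape)  # (8, 78, 78)
```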
9. The BD capsule network-based citrus greening disease classification method according to claim 8, wherein the Batchnorm operation comprises the steps of:
I. computing the mean μ_x of this batch x = {x_1, x_2, ..., x_m}, wherein the batch data is the output of the first fully connected layer or the second fully connected layer:
μ_x = (1/m) Σ_{i=1}^{m} x_i
II. computing the variance σ_x² of x = {x_1, x_2, ..., x_m}:
σ_x² = (1/m) Σ_{i=1}^{m} (x_i − μ_x)²
III. normalizing x:
x̂_i = (x_i − μ_x) / √(σ_x² + ε)
in the formula, ε represents a small deviation added for numerical stability;
IV. introducing a scaling variable γ and a translation variable β, and calculating the output y_i:
y_i = γ · x̂_i + β
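Steps I–IV map directly onto a few NumPy lines. This is an illustrative training-time sketch (inference-time running statistics are omitted), and the batch shape is an assumption:

```python
import numpy as np

def batchnorm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Steps I-IV from the claim, applied to a batch x of shape (m, features)."""
    mu = x.mean(axis=0)                    # I.  batch mean μ_x
    var = x.var(axis=0)                    # II. batch variance σ²_x
    x_hat = (x - mu) / np.sqrt(var + eps)  # III. normalise with deviation ε
    return gamma * x_hat + beta            # IV. output y_i = γ·x̂_i + β

rng = np.random.default_rng(1)
x = rng.standard_normal((64, 16)) * 3 + 5  # stand-in for fully connected outputs
y = batchnorm(x)
print(y.shape)  # (64, 16)
```

With γ = 1 and β = 0 the output has per-feature mean ≈ 0 and standard deviation ≈ 1, which is the point of the operation.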
10. The BD capsule network-based citrus greening disease classification method according to claim 8, wherein the Dropout operation randomly drops neurons of a fully connected layer according to a Bernoulli distribution, all connections of the dropped neurons being removed during training.
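A minimal sketch of the Dropout operation of this claim. The inverted scaling by 1/(1 − p) is an assumption (a common convention that keeps the expected activation unchanged); the claim itself only specifies Bernoulli-random removal during training:

```python
import numpy as np

def dropout(x, p=0.5, rng=None, training=True):
    """Zero each neuron with Bernoulli probability p during training."""
    if not training or p == 0.0:
        return x                                   # inference: identity
    if rng is None:
        rng = np.random.default_rng(0)
    mask = rng.binomial(1, 1.0 - p, size=x.shape)  # 1 = keep, 0 = drop
    return x * mask / (1.0 - p)                    # inverted-dropout scaling

h = np.ones((4, 8))                  # toy fully connected activations
print(dropout(h, p=0.5).shape)       # (4, 8)
```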
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910747697.1A CN110599457B (en) | 2019-08-14 | 2019-08-14 | Citrus huanglongbing classification method based on BD capsule network |
Publications (2)
Publication Number | Publication Date |
---|---|
CN110599457A true CN110599457A (en) | 2019-12-20 |
CN110599457B CN110599457B (en) | 2022-12-16 |
Family
ID=68854127
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910747697.1A Active CN110599457B (en) | 2019-08-14 | 2019-08-14 | Citrus huanglongbing classification method based on BD capsule network |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110599457B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2948499A1 (en) * | 2016-11-16 | 2018-05-16 | The Governing Council Of The University Of Toronto | System and method for classifying and segmenting microscopy images with deep multiple instance learning |
CN108985316A (en) * | 2018-05-24 | 2018-12-11 | 西南大学 | A kind of capsule network image classification recognition methods improving reconstructed network |
CN108985377A (en) * | 2018-07-18 | 2018-12-11 | 太原理工大学 | A kind of image high-level semantics recognition methods of the multiple features fusion based on deep layer network |
CN109086872A (en) * | 2018-07-30 | 2018-12-25 | 东北大学 | Seismic wave recognizer based on convolutional neural networks |
CN109636791A (en) * | 2018-12-13 | 2019-04-16 | 华南农业大学 | A kind of Citrus Huanglongbing pathogen detection method based on deep learning, apparatus and system |
CN110084320A (en) * | 2019-05-08 | 2019-08-02 | 广东工业大学 | Thyroid papillary carcinoma Ultrasound Image Recognition Method, device, system and medium |
CN110110668A (en) * | 2019-05-08 | 2019-08-09 | 湘潭大学 | A kind of gait recognition method based on feedback weight convolutional neural networks and capsule neural network |
Non-Patent Citations (2)
Title |
---|
SARA SABOUR ET AL.: "Dynamic Routing Between Capsules", 《31ST ANNUAL CONFERENCE ON NEURAL INFORMATION PROCESSING SYSTEMS (NIPS)》 * |
XIANG CANQUN: "Research on the Application of Convolutional Neural Networks in Image Classification and Recognition", China Masters' Theses Full-text Database (Information Science and Technology) *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111814532A (en) * | 2020-05-09 | 2020-10-23 | 五邑大学 | Detection and spraying method for bacterial leaf blight of rice, control device and unmanned aerial vehicle |
CN112364920A (en) * | 2020-11-12 | 2021-02-12 | 西安电子科技大学 | Thyroid cancer pathological image classification method based on deep learning |
CN112364920B (en) * | 2020-11-12 | 2023-05-23 | 西安电子科技大学 | Thyroid cancer pathological image classification method based on deep learning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||