CN108564589A - A plant leaf segmentation method based on an improved fully convolutional neural network - Google Patents
A plant leaf segmentation method based on an improved fully convolutional neural network
- Publication number
- CN108564589A (application CN201810253196.3A)
- Authority
- CN
- China
- Prior art keywords
- neural network
- convolutional neural network
- layer
- fully convolutional
- leaf
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30181—Earth observation
- G06T2207/30188—Vegetation; Agriculture
Abstract
The invention discloses a plant leaf segmentation method based on an improved fully convolutional neural network (FCN). Because the original FCN with 8× upsampling has a large number of model parameters, three improved models are built to achieve fast and accurate leaf segmentation: a direct-connection structure is adopted, some layers are removed, and single-layer features are used, with deconvolution upsampling the reduced feature maps of the VGG-16 model back to the original image size according to their reduction factor. Six plant species are used, with 1762 images as training samples and 441 leaf images as test samples. The parameter memory of the direct-connection model with 4× upsampling is only 6.90 MB, 1/77 of the original FCN model, and on the test set its mean class accuracy and mean region overlap (mean IoU) reach 99.099% and 98.204% respectively. Compared with the conventional K-means clustering and Otsu thresholding methods, its mean IoU is higher by 3.704 and 4.295 percentage points; the method extracts complete leaves and is little affected by leaf surface color or uneven illumination.
Description
Technical field
The present invention relates to an image segmentation method, and in particular to a plant leaf segmentation method based on an improved fully convolutional neural network.
Background art
Plants are an important component of the ecosystem, and plant identification and plant disease identification are important tasks in plant protection. Plant leaf segmentation aims to locate and extract the leaf region in an image in order to reduce interference from background objects, and is an important step in plant identification and plant disease identification. The quality of leaf segmentation directly affects subsequent feature extraction and pattern recognition, so the problem of accurate leaf segmentation and extraction has received wide attention.
Existing leaf segmentation methods are mainly divided into threshold-based and clustering-based segmentation. Threshold segmentation includes the minimum-error method, the between-class maximum-variance method (Otsu) and the maximum-entropy method. Otsu, a typical thresholding method, can choose the optimal segmentation threshold automatically and is relatively simple to implement, but performs poorly when the foreground intensity distribution is wide. Clustering algorithms mainly compute the color similarity between regions and cluster similar colors, but when the color difference between foreground and background is small it is difficult to extract the foreground target. Because existing methods are strongly affected by environmental factors, some of them suffer from mis-segmentation, which adversely affects subsequent recognition; it is therefore necessary to propose an automatic and accurate segmentation method.
Summary of the invention
The main object of the present invention is to overcome the deficiencies of the prior art and provide a method that can accurately segment plant leaves from the background; even when a leaf contains lesions similar in color to the background, uneven illumination or shadowed areas, the method can overcome the interference and accurately extract the complete leaf. To solve the above technical problems, the technical solution adopted by the present invention is:
An image segmentation method based on an improved fully convolutional neural network, comprising the following steps:
(1) acquiring and processing test data;
(2) building the fully convolutional neural network;
(3) training the fully convolutional neural network;
(4) performing leaf segmentation using the trained fully convolutional neural network.
Process (1) specifically includes the following steps:
When selecting experimental samples, the diversity and complexity of the samples are considered: the image sizes in the set are inconsistent, the shooting illumination is uneven, some images contain black shadows, and diseased leaves are included in which some lesion colors are similar to the background.
Six plant species from the PlantVillage project (www.plantvillage.org), 2203 leaf images in total, are selected as experimental data, including healthy-leaf images of 5 species and diseased-leaf images of 2 species.
The images are annotated with a professional image annotation tool to obtain binary maps in which leaf and background are labeled 1 and 0 respectively. The annotated images are randomly divided into a training set and a test set at a ratio of 4:1, yielding 1762 training images and 441 test images.
Process (2) specifically includes the following steps:
The VGG-16 classification model is chosen as the base model to build the fully convolutional neural network.
The network includes parameter layers with transferred weights and parameter layers learned from scratch. The transferred layers are the first convolutional block C1 (Conv1_1, Conv1_2), the second block C2 (Conv2_1, Conv2_2), the third block C3 (Conv3_1, Conv3_2, Conv3_3) and the fourth block C4 (Conv4_1, Conv4_2, Conv4_3). The layers learned from scratch are the convolutional layers C5, C6, C7 and the deconvolution layers D5, D6, D7. C1 uses 64 convolution kernels of size 3 × 3 with stride 1; C2 uses 128 kernels of size 3 × 3 with stride 1; C3 uses 256 kernels of size 3 × 3 with stride 1; C4 uses 512 kernels of size 3 × 3 with stride 1. C5, C6 and C7 apply a nonlinear mapping to the features, each using 2 kernels of size 1 × 1 with stride 1. D5 uses 2 kernels of size 16 × 16 with stride 8; D6 uses 2 kernels of size 8 × 8 with stride 4; D7 uses 2 kernels of size 4 × 4 with stride 2. A crop layer is connected after each deconvolution layer to crop the upsampled feature map to the original image size, and finally a Softmax layer computes the loss.
The above fully convolutional neural network is designed as 3 models, Direct-FCN8s, Direct-FCN4s and Direct-FCN2s, where 8, 4 and 2 are the strides used by the deconvolution layers. Direct-FCN8s transfers the parameters of C1 through Conv4_3, then connects C5 and D5 to output the prediction. Direct-FCN4s transfers C1 through Conv3_3, then connects C6 and D6 to output the prediction. Direct-FCN2s transfers C1 through Conv2_2, then connects C7 and D7 to output the prediction.
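The need for the crop layer follows from the output-size formula of an unpadded transposed convolution, out = (in − 1) × stride + kernel: with each deconvolution's kernel set to twice its stride, the upsampled map always slightly overshoots the original size. A small sketch (the 256-pixel image side is an illustrative assumption, not fixed by the patent):

```python
def deconv_output_size(in_size, kernel, stride):
    """Spatial output size of an unpadded transposed convolution."""
    return (in_size - 1) * stride + kernel

image_side = 256  # illustrative input size
for name, kernel, stride in [("D5", 16, 8), ("D6", 8, 4), ("D7", 4, 2)]:
    feat = image_side // stride  # feature map reduced by the same factor
    up = deconv_output_size(feat, kernel, stride)
    # up > image_side, so the crop layer trims the map back to the input size
    print(f"{name}: {feat} -> {up}, cropped to {image_side}")
```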
Process (3) specifically includes the following steps:
The experimental images and the label images are put in one-to-one correspondence and placed in the corresponding folders according to the input format of the Caffe deep learning framework; the label images must be converted to MATLAB matrix format, and train.txt and val.txt files are additionally created containing the names of all training and test images respectively.
Under a GPU environment the model parameters are learned with stochastic gradient descent: the training batch size is set to 1, the fixed learning rate is 1e-5, the regularization coefficient is set to 0.0005, and the momentum of stochastic gradient descent is 0.99. With a batch size of 1 for both training and test images, 1762 training iterations traverse the training set once (one epoch), after which one test pass is performed; traversing the test set once takes 441 test iterations, and the maximum number of iterations is 176200 (100 epochs).
To evaluate model performance, with reference to internationally accepted semantic segmentation metrics, the mean class accuracy of the segmentation (mean acc) and the mean region overlap (mean intersection over union, mean IoU) are taken as segmentation accuracy indices. The model's parameter memory requirement (the storage space occupied by the model itself) and the average per-image segmentation time (35 test images were randomly selected, the segmentation program was run in CPU-only mode and in GPU mode, and the average single-image segmentation time was taken) are also considered.
mean acc = (1/n_cl) Σ_i (n_ii / t_i)   (1)
mean IoU = (1/n_cl) Σ_i n_ii / (t_i + Σ_j n_ji − n_ii)   (2)
where n_cl is the number of segmentation classes, t_i = Σ_j n_ij is the total number of pixels of class i, n_ii is the number of pixels of class i predicted as class i, and n_ji is the number of pixels of class j predicted as class i.
Process (4) specifically includes the following steps:
The Caffe deep learning framework environment is configured and the leaf image to be segmented is read. The trained model file (with suffix .caffemodel) is loaded through Python; after forward propagation the probability map of class 1 (the leaf) is computed. A probability of 0.5 is taken as the threshold: pixels above 0.5 are judged as leaf, otherwise as background. In the program, pixels of class 1 are assigned the value 255 and pixels of class 0 the value 0. After segmentation the model produces a binary image with white as foreground (leaf) and black as background.
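The thresholding and value assignment at the end of process (4) amounts to the following; this is a pure-Python stand-in for the post-processing of the network output, not the patent's actual program:

```python
def binarize(prob_map, threshold=0.5):
    """Map the leaf-class probability map to a binary image: pixels with
    probability above the threshold become leaf (white, 255), the rest
    background (black, 0)."""
    return [[255 if p > threshold else 0 for p in row] for row in prob_map]

# Toy 2x2 probability map for class 1 (leaf), values are illustrative.
probs = [[0.9, 0.4],
         [0.6, 0.1]]
mask = binarize(probs)  # [[255, 0], [255, 0]]
```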
Beneficial effects of the present invention:
The present invention segments plant leaves with a fully convolutional neural network; the trained improved model can effectively overcome the influence of uneven illumination, shadows and lesions on the leaf surface and avoid under-segmentation and over-segmentation. The model has a small number of parameters, can meet real-time requirements, and produces accurate and complete plant leaf segmentation results.
Description of the drawings
Fig. 1 is the structure diagram of the improved fully convolutional neural network of the present invention.
Fig. 2 shows the relationship between segmentation accuracy and the number of iterations for the improved models.
Fig. 3 shows the segmentation results of different methods on some test samples.
Detailed description of the embodiments
The invention is further described below with reference to the drawings and specific embodiments.
The plant leaf segmentation method based on an improved fully convolutional neural network proposed by the present invention includes the following steps:
(1) acquiring and processing test data;
(2) building the fully convolutional neural network;
(3) training the fully convolutional neural network;
(4) performing leaf segmentation using the trained fully convolutional neural network model.
Process (1) specifically includes the following steps:
When selecting experimental samples, the diversity and complexity of the samples are considered: the image sizes in the set are inconsistent, the shooting illumination is uneven, some images contain black shadows, and diseased leaves are included in which some lesion colors are similar to the background.
Six plant species from the PlantVillage project (www.plantvillage.org), 2203 leaf images in total, are selected as experimental data, including healthy-leaf images of 5 species and diseased-leaf images of 2 species.
The images are annotated with the professional image annotation tool Labelme to obtain binary maps in which leaf and background are labeled 1 and 0 respectively. The annotated images are randomly divided into a training set and a test set at a ratio of 4:1, yielding 1762 training images and 441 test images.
Process (2) specifically includes the following steps:
As shown in Fig. 1, the VGG-16 classification model is chosen as the base model to build the fully convolutional neural network.
The network includes parameter layers with transferred weights and parameter layers learned from scratch. The transferred layers are the first convolutional block C1 (Conv1_1, Conv1_2), the second block C2 (Conv2_1, Conv2_2), the third block C3 (Conv3_1, Conv3_2, Conv3_3) and the fourth block C4 (Conv4_1, Conv4_2, Conv4_3). The layers learned from scratch include the convolutional layers C5, C6, C7 and the deconvolution layers D5, D6, D7. C1 uses 64 convolution kernels of size 3 × 3 with stride 1; C2 uses 128 kernels of size 3 × 3 with stride 1; C3 uses 256 kernels of size 3 × 3 with stride 1; C4 uses 512 kernels of size 3 × 3 with stride 1. C5, C6 and C7 apply a nonlinear mapping to the features, each using 2 kernels of size 1 × 1 with stride 1. D5 uses 2 kernels of size 16 × 16 with stride 8; D6 uses 2 kernels of size 8 × 8 with stride 4; D7 uses 2 kernels of size 4 × 4 with stride 2. A crop layer is connected after each deconvolution layer to crop the upsampled feature map to the original image size, and finally a Softmax layer computes the loss.
The above fully convolutional neural network is designed as 3 models, Direct-FCN8s, Direct-FCN4s and Direct-FCN2s, where 8, 4 and 2 are the strides used by the deconvolution layers; Fig. 1 shows the structure of the improved fully convolutional network of the present invention. Direct-FCN8s transfers the parameters of C1 through Conv4_3, then connects C5 and D5 to output the prediction. Direct-FCN4s transfers C1 through Conv3_3, then connects C6 and D6 to output the prediction. Direct-FCN2s transfers C1 through Conv2_2, then connects C7 and D7 to output the prediction.
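The match between the number of transferred VGG-16 blocks and the deconvolution stride can be checked with simple arithmetic, assuming (as in VGG-16, though the patent does not state the pooling layout explicitly) a 2 × 2 max-pooling between blocks, with features taken at the end of the last transferred block before its pooling:

```python
# Under the stated assumption, n transferred blocks reduce the feature map by
# a factor of 2 ** (n - 1), which the deconvolution stride must undo.
transferred_blocks = {"Direct-FCN8s": 4, "Direct-FCN4s": 3, "Direct-FCN2s": 2}

for name, n_blocks in transferred_blocks.items():
    factor = 2 ** (n_blocks - 1)
    # The stride encoded in the model name equals the downsampling factor.
    assert name == f"Direct-FCN{factor}s"
    print(f"{name}: {n_blocks} blocks -> downsampled {factor}x, deconv stride {factor}")
```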
Process (3) specifically includes the following steps:
The experimental images and the label images are put in one-to-one correspondence and placed in the corresponding folders according to the input format of the Caffe deep learning framework; the label images must be converted to MATLAB matrix format, and train.txt and val.txt files are additionally created containing the names of all training and test images respectively.
Under a GPU environment the model parameters are learned with stochastic gradient descent: the training batch size is set to 1, the fixed learning rate is 1e-5, the regularization coefficient is set to 0.0005, and the momentum of stochastic gradient descent is 0.99. With a batch size of 1 for both training and test images, 1762 training iterations traverse the training set once (one epoch), after which one test pass is performed; traversing the test set once takes 441 test iterations, and the maximum number of iterations is 176200 (100 epochs).
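The iteration counts quoted above follow directly from the batch size of 1:

```python
train_images, test_images = 1762, 441
batch_size = 1

iters_per_epoch = train_images // batch_size  # one epoch = 1762 training iterations
test_iters = test_images // batch_size        # one test pass = 441 iterations
max_iter = 100 * iters_per_epoch              # 100 epochs -> 176200 iterations
```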
To evaluate model performance, with reference to internationally accepted semantic segmentation metrics, the mean class accuracy of the segmentation (mean acc) and the mean region overlap (mean intersection over union, mean IoU) are taken as segmentation accuracy indices. The model's parameter memory requirement (the storage space occupied by the model itself) and the average per-image segmentation time (35 test images were randomly selected, the segmentation program was run in CPU-only mode and in GPU mode, and the average single-image segmentation time was taken) are also considered.
mean acc = (1/n_cl) Σ_i (n_ii / t_i)   (1)
mean IoU = (1/n_cl) Σ_i n_ii / (t_i + Σ_j n_ji − n_ii)   (2)
where n_cl is the number of segmentation classes, t_i = Σ_j n_ij is the total number of pixels of class i, n_ii is the number of pixels of class i predicted as class i, and n_ji is the number of pixels of class j predicted as class i.
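Formulas (1) and (2) can be written as a minimal sketch over a confusion matrix, where conf[i][j] counts pixels of true class i predicted as class j; the 2-class matrix below is illustrative only, not data from the patent:

```python
def mean_acc(conf):
    """mean acc = (1/n_cl) * sum_i n_ii / t_i."""
    n_cl = len(conf)
    return sum(conf[i][i] / sum(conf[i]) for i in range(n_cl)) / n_cl

def mean_iou(conf):
    """mean IoU = (1/n_cl) * sum_i n_ii / (t_i + sum_j n_ji - n_ii)."""
    n_cl = len(conf)
    total = 0.0
    for i in range(n_cl):
        t_i = sum(conf[i])                             # pixels of true class i
        pred_i = sum(conf[j][i] for j in range(n_cl))  # pixels predicted as i
        total += conf[i][i] / (t_i + pred_i - conf[i][i])
    return total / n_cl

# Illustrative background/leaf confusion matrix (pixel counts).
conf = [[90, 10],
        [5, 95]]
```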
Fig. 2 shows how the segmentation accuracy of the models with 8×, 4× and 2× upsampling increases with the number of iterations. It can be seen that after one round of training the mean class accuracy and mean IoU of the models already reach 98% and 96% or more, and as the number of iterations continues to grow, the mean class accuracy and mean IoU of each model improve further.
Table 1 compares the performance of the models. Each model of the present invention reaches high accuracy; the 4×-upsampling model Direct-FCN4s is more accurate than the 2× and 8× upsampling models, with mean class accuracy and mean IoU reaching 99.099% and 98.204% respectively. Its parameter memory requirement is only 6.90 MB, 1/77 of the original model Skip-FCN8s-4096. In addition, the average per-image segmentation time of Direct-FCN4s is only 3.89 s in CPU-only mode and 0.06 s in GPU mode, one ninth of the segmentation time of the Skip-FCN4s-4096 model.
Table 1
By the method for the present invention (Direct-FCN4s and archetype Skip-FCN8s-4096) and K-means clustering procedures with
And threshold method Otsu dividing methods are compared.Select 4 pictures at random in test set, Fig. 3 is the segmentation under distinct methods
Effect exemplary plot, the first row artwork are cherry healthy leaves pictures, and background is simpler, including less dash area, three kinds of methods
Segmentation effect is all preferable;Second row artwork is blueberry health leaf picture, surface while blade picture includes bulk dash area
Intensity of illumination it is different, by observation as a result, finding two kinds of listed conventional methods to dash area high treating effect, but right
It is poor in the stronger part treatment effect of blade surface illumination, it is accidentally easily divided into background.It is opened unlike illustration from front two,
The original picture of third and fourth row is potato late blight and pepper bacterial leaf spot picture respectively, the scab color of blade surface with
Blade normal segments differ greatly, and the color of presentation is more similar to background.K-means is with Otsu split plot designs to the mistake of scab part
Segmentation rate is high, and blade complete characterization information is caused to be lost, and the optimal models and archetype of the present invention can realize that blade is complete
Whole segmentation.Experiment shows that conventional segmentation methods are preferable for the segmentation effect of the blade of simple background, K-means and the side Otsu
Owned by France in non-learning method, segmentation effect is easily influenced by environmental factor, and the present invention can be obtained automatically by learning priori
The feature for taking picture, it is good to complicated picture processing robustness, the automatic segmentation of complicated picture may be implemented.
The detailed descriptions listed above are only specific illustrations of feasible embodiments of the present invention and are not intended to limit the scope of protection of the invention; all equivalent implementations or changes made without departing from the technical spirit of the present invention shall be included within the protection scope of the present invention.
Claims (8)
1. A plant leaf segmentation method based on an improved fully convolutional neural network, characterized by comprising the following steps:
Step 1, acquiring and processing test data: obtaining test data samples of diverse complexity and dividing the data samples into training images and test images;
Step 2, choosing the VGG-16 classification model as the base model and building a fully convolutional neural network;
Step 3, training the fully convolutional neural network using the Caffe deep learning framework;
Step 4, segmenting leaves using the trained fully convolutional neural network model.
2. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 1, characterized in that the detailed process of Step 1 includes:
Step 1.1, selecting 6 plant species, 2203 leaf images in total, from the PlantVillage project as test data, including healthy-leaf images of 5 species and diseased-leaf images of 2 species;
Step 1.2, annotating the images with a professional image annotation tool to obtain binary maps in which leaf and background are labeled 1 and 0 respectively, and randomly dividing the annotated images into a training set and a test set at a ratio of 4:1, obtaining 1762 training images and 441 test images.
3. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 1, characterized in that the diverse complexity of the samples includes: inconsistent image sizes within the image set, uneven shooting illumination, black shadows in some images, and lesion colors in some images similar to the background color.
4. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 1, characterized in that the detailed process of Step 2 includes:
the fully convolutional neural network includes parameter layers with transferred weights and parameter layers learned from scratch; the transferred parameter layers are the first convolutional block Conv1, the second block Conv2, the third block Conv3 and the fourth block Conv4, and the parameter layers learned from scratch are the convolutional layers C5, C6, C7 and the deconvolution layers D5, D6, D7; a crop layer is connected after each deconvolution layer to crop the upsampled feature map to the original image size, and finally a Softmax layer computes the loss.
5. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 4, characterized in that:
Conv1 includes Conv1_1 and Conv1_2, using 64 convolution kernels of size 3 × 3 with stride 1;
Conv2 includes Conv2_1 and Conv2_2, using 128 convolution kernels of size 3 × 3 with stride 1;
Conv3 includes Conv3_1, Conv3_2 and Conv3_3, using 256 convolution kernels of size 3 × 3 with stride 1;
Conv4 includes Conv4_1, Conv4_2 and Conv4_3, using 512 convolution kernels of size 3 × 3 with stride 1;
C5, C6 and C7 apply a nonlinear mapping to the features, each using 2 convolution kernels of size 1 × 1 with stride 1;
deconvolution layer D5 uses 2 convolution kernels of size 16 × 16 with stride 8;
deconvolution layer D6 uses 2 convolution kernels of size 8 × 8 with stride 4;
deconvolution layer D7 uses 2 convolution kernels of size 4 × 4 with stride 2.
6. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 5, characterized in that the fully convolutional neural network is designed as 3 models, Direct-FCN8s, Direct-FCN4s and Direct-FCN2s, where 8, 4 and 2 are the strides used by the deconvolution layers; Direct-FCN8s transfers the parameters of Conv1 through Conv4_3 and then connects C5 and D5 to output the prediction; Direct-FCN4s transfers Conv1 through Conv3_3 and then connects C6 and D6 to output the prediction; Direct-FCN2s transfers Conv1 through Conv2_2 and then connects C7 and D7 to output the prediction.
7. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 1, characterized in that the detailed process of Step 3 includes:
putting the experimental images and the label images in one-to-one correspondence and placing them in the corresponding folders according to the input format of the Caffe deep learning framework, where the label images must be converted to MATLAB matrix format, and additionally creating train.txt and val.txt files containing the names of all training and test images respectively;
under a GPU environment, learning the model parameters with stochastic gradient descent, setting the training batch size to 1, the fixed learning rate to 1e-5, the regularization coefficient to 0.0005 and the momentum of stochastic gradient descent to 0.99; with a batch size of 1 for both training and test images, 1762 training iterations traverse the training set once, which is one epoch, after which one test pass is performed; traversing the test set once takes 441 test iterations, and the maximum number of iterations is 176200, i.e. 100 epochs;
determining the evaluation functions: the mean class accuracy of the segmentation (mean acc) and the mean region overlap (mean IoU) are taken as segmentation accuracy indices, and the model's parameter memory requirement and the average per-image segmentation time are also considered;
mean acc = (1/n_cl) Σ_i (n_ii / t_i)
mean IoU = (1/n_cl) Σ_i n_ii / (t_i + Σ_j n_ji − n_ii)
where n_cl is the number of segmentation classes, t_i = Σ_j n_ij is the total number of pixels of class i, n_ii is the number of pixels of class i predicted as class i, and n_ji is the number of pixels of class j predicted as class i.
8. The plant leaf segmentation method based on an improved fully convolutional neural network according to claim 1, characterized in that the detailed process of Step 4 includes:
configuring the Caffe deep learning framework environment, reading the leaf image to be segmented, computing the probability map of the leaf by forward propagation through the trained neural network model, taking a probability of 0.5 as the threshold, judging pixels above 0.5 as leaf and otherwise as background, assigning the value 255 to pixels of class 1 and the value 0 to pixels of class 0; after segmentation by the neural network model a binary image is generated with white as foreground and black as background.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201810253196.3A (CN108564589A) | 2018-03-26 | 2018-03-26 | A plant leaf segmentation method based on an improved fully convolutional neural network
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title
---|---|---|---
CN201810253196.3A (CN108564589A) | 2018-03-26 | 2018-03-26 | A plant leaf segmentation method based on an improved fully convolutional neural network
Publications (1)
Publication Number | Publication Date
---|---
CN108564589A (en) | 2018-09-21
Family
ID=63533275
Family Applications (1)
Application Number | Title | Priority Date | Filing Date
---|---|---|---
CN201810253196.3A (pending) | A plant leaf segmentation method based on an improved fully convolutional neural network | 2018-03-26 | 2018-03-26
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108564589A (en) |
Cited By (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109558787A (en) * | 2018-09-28 | 2019-04-02 | 浙江农林大学 | A bamboo pest recognition method based on a convolutional neural network model |
CN109754362A (en) * | 2018-12-24 | 2019-05-14 | 哈尔滨工程大学 | A method of marking sea cucumber object detection results with rotatable bounding boxes |
CN110309841A (en) * | 2018-09-28 | 2019-10-08 | 浙江农林大学 | A deep-learning-based recognition method for common hickory pests |
CN110378305A (en) * | 2019-07-24 | 2019-10-25 | 中南民族大学 | Tea leaf disease recognition method, device, storage medium and apparatus |
CN110443811A (en) * | 2019-07-26 | 2019-11-12 | 广州中医药大学(广州中医药研究院) | A fully automatic segmentation method for leaf images with complex backgrounds |
CN110457982A (en) * | 2018-12-28 | 2019-11-15 | 中国科学院合肥物质科学研究院 | A crop disease image recognition method based on feature transfer learning |
CN110969566A (en) * | 2018-09-29 | 2020-04-07 | 北京嘉楠捷思信息技术有限公司 | Deconvolution processing method and device, and image processing method and device |
CN111444924A (en) * | 2020-04-20 | 2020-07-24 | 中国科学院声学研究所南海研究站 | Method and system for detecting plant diseases and insect pests and analyzing disaster grades |
CN111553240A (en) * | 2020-04-24 | 2020-08-18 | 四川省农业科学院农业信息与农村经济研究所 | Corn disease grading method and system and computer equipment |
CN111862190A (en) * | 2020-07-10 | 2020-10-30 | 北京农业生物技术研究中心 | Method and device for automatically measuring the area of soft rot lesions on isolated plants |
CN112381835A (en) * | 2020-10-29 | 2021-02-19 | 中国农业大学 | Crop leaf segmentation method and device based on a convolutional neural network |
CN112949378A (en) * | 2020-12-30 | 2021-06-11 | 至微生物智能科技(厦门)有限公司 | A bacterial microscopic image segmentation method based on a deep learning network |
CN113313752A (en) * | 2021-05-27 | 2021-08-27 | 哈尔滨工业大学 | A leaf area index identification method based on machine vision |
CN111307805B (en) * | 2020-03-20 | 2023-07-18 | 湖南烟叶复烤有限公司郴州复烤厂 | Threshing quality detection device and detection method based on visual feature fusion |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107169974A (en) * | 2017-05-26 | 2017-09-15 | 中国科学技术大学 | Image segmentation method based on a multi-supervised fully convolutional neural network |
WO2018036293A1 (en) * | 2016-08-26 | 2018-03-01 | 杭州海康威视数字技术股份有限公司 | Image segmentation method, apparatus, and fully convolutional network system |
2018-03-26: Application CN201810253196.3A filed in China; published as CN108564589A, status Pending
Non-Patent Citations (1)
Title |
---|
EVAN SHELHAMER et al.: "Fully Convolutional Networks for Semantic Segmentation", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE * |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110309841A (en) * | 2018-09-28 | 2019-10-08 | 浙江农林大学 | Recognition method for common hickory nut insect pests based on deep learning |
CN109558787A (en) * | 2018-09-28 | 2019-04-02 | 浙江农林大学 | Bamboo insect pest recognition method based on a convolutional neural network model |
CN110969566A (en) * | 2018-09-29 | 2020-04-07 | 北京嘉楠捷思信息技术有限公司 | Deconvolution processing method and device, and image processing method and device |
CN109754362A (en) * | 2018-12-24 | 2019-05-14 | 哈尔滨工程大学 | Method for marking sea cucumber object detection results with rotatable bounding boxes |
CN110457982A (en) * | 2018-12-28 | 2019-11-15 | 中国科学院合肥物质科学研究院 | Crop disease image recognition method based on feature transfer learning |
CN110457982B (en) * | 2018-12-28 | 2023-04-11 | 中国科学院合肥物质科学研究院 | Crop disease image recognition method based on feature transfer learning |
CN110378305B (en) * | 2019-07-24 | 2021-10-12 | 中南民族大学 | Tea leaf disease recognition method, equipment, storage medium and device |
CN110378305A (en) * | 2019-07-24 | 2019-10-25 | 中南民族大学 | Tea leaf disease recognition method, equipment, storage medium and device |
CN110443811A (en) * | 2019-07-26 | 2019-11-12 | 广州中医药大学(广州中医药研究院) | Fully automatic segmentation method for leaf images with complex backgrounds |
CN111307805B (en) * | 2020-03-20 | 2023-07-18 | 湖南烟叶复烤有限公司郴州复烤厂 | Threshing quality detection device and detection method based on visual feature fusion |
CN111444924A (en) * | 2020-04-20 | 2020-07-24 | 中国科学院声学研究所南海研究站 | Method and system for detecting plant diseases and insect pests and analyzing disaster grades |
CN111444924B (en) * | 2020-04-20 | 2023-05-30 | 中国科学院声学研究所南海研究站 | Method and system for detecting plant diseases and insect pests and analyzing disaster grade |
CN111553240A (en) * | 2020-04-24 | 2020-08-18 | 四川省农业科学院农业信息与农村经济研究所 | Corn disease condition grading method and system and computer equipment |
CN111862190A (en) * | 2020-07-10 | 2020-10-30 | 北京农业生物技术研究中心 | Method and device for automatically measuring area of isolated plant soft rot disease spot |
CN111862190B (en) * | 2020-07-10 | 2024-04-05 | 北京农业生物技术研究中心 | Method and device for automatically measuring area of soft rot disease spots of isolated plants |
CN112381835A (en) * | 2020-10-29 | 2021-02-19 | 中国农业大学 | Crop leaf segmentation method and device based on convolutional neural network |
CN112949378A (en) * | 2020-12-30 | 2021-06-11 | 至微生物智能科技(厦门)有限公司 | Bacterial microscopic image segmentation method based on deep learning network |
CN113313752A (en) * | 2021-05-27 | 2021-08-27 | 哈尔滨工业大学 | Leaf area index identification method based on machine vision |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108564589A (en) | Plant leaf segmentation method based on an improved fully convolutional neural network | |
Karlekar et al. | SoyNet: Soybean leaf diseases classification | |
CN107977671B (en) | Tongue picture classification method based on multitask convolutional neural network | |
CN107480620B (en) | Remote sensing image automatic target identification method based on heterogeneous feature fusion | |
Mathur et al. | Crosspooled FishNet: transfer learning based fish species classification model | |
CN108734694A (en) | Automatic thyroid tumor ultrasound image recognition method based on Faster R-CNN |
Marchant et al. | Automated analysis of foraminifera fossil records by image classification using a convolutional neural network | |
CN106462746A (en) | Analyzing digital holographic microscopy data for hematology applications | |
CN111476170A (en) | Remote sensing image semantic segmentation method combining deep learning and random forest | |
Nawaz et al. | A robust deep learning approach for tomato plant leaf disease localization and classification | |
CN111753828A (en) | Horizontal text detection method for natural scenes based on a deep convolutional neural network |
Xiao et al. | Salient object detection based on eye tracking data | |
CN112347977B (en) | Automatic detection method, storage medium and device for induced pluripotent stem cells | |
CN109961096B (en) | Multimode hyperspectral image migration classification method | |
Li et al. | Example-based image colorization via automatic feature selection and fusion | |
Liu et al. | SemiText: Scene text detection with semi-supervised learning | |
Wu et al. | Scene text detection using adaptive color reduction, adjacent character model and hybrid verification strategy | |
Zhai et al. | A generative adversarial network based framework for unsupervised visual surface inspection | |
CN109685030A (en) | Mug rim defect detection and classification method based on convolutional neural networks |
CN110543906A (en) | Skin type automatic identification method based on data enhancement and Mask R-CNN model | |
Ju et al. | Classification of jujube defects in small data sets based on transfer learning | |
Sivaranjani et al. | Real-time identification of medicinal plants using machine learning techniques | |
CN103903015A (en) | Cell mitosis detection method | |
Asming et al. | Processing and classification of landsat and sentinel images for oil palm plantation detection | |
Naiemi et al. | Scene text detection using enhanced extremal region and convolutional neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20180921 |
|