AU2020103613A4 - CNN and transfer learning based disease intelligent identification method and system - Google Patents

CNN and transfer learning based disease intelligent identification method and system

Info

Publication number
AU2020103613A4
Authority
AU
Australia
Prior art keywords
layer
disease
plant
image
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2020103613A
Inventor
Chen CAI
Yan Cao
Peng HE
Liang Hu
Bo Lei
Yongbo Liu
Jiangyun Tang
Qingxiang Tang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Agricultural Information And Rural Economic Research Institute Of Sichuan Academy Of Agricultural Sciences
Original Assignee
Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science filed Critical Agricultural Information And Rural Economic Res Institute Of Sichuan Academy Of Agricultural Science
Priority to AU2020103613A priority Critical patent/AU2020103613A4/en
Application granted granted Critical
Publication of AU2020103613A4 publication Critical patent/AU2020103613A4/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/096 Transfer learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20084 Artificial neural networks [ANN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30096 Tumor; Lesion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation
    • G06T2207/30188 Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Molecular Biology (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention provides a CNN and transfer learning based disease intelligent identification method and system, which could lower the interference of picture background, achieve a high identification accuracy in the case of a limited quantity of samples and support multi-classification of training samples, and is thus high in operating efficiency. A disease image intelligent identification method comprises the following steps of: image preprocessing: normalizing the sizes of images, and quickly positioning a region of sign of disease using Faster-RCNN to exclude background interferences; image feature extraction: extracting image features using a triplet similarity measurement model, and then using the SIFT feature as a complementary feature to perform weight fusion; and disease classification and identification: learning a first image feature of normal plant images using a deep convolutional neural network, learning a second image feature of diseased plant images through transfer learning, and performing classification and identification in combination with the first image feature and the second image feature.

Description

[Drawing sheet 1/3 — Fig. 1: system architecture showing the Android APP, service layer, control layer and computing storage layer; Fig. 2: flow chart of the identification method from input through preprocessing, Faster-RCNN positioning and feature extraction to output.]
CNN AND TRANSFER LEARNING BASED DISEASE INTELLIGENT IDENTIFICATION METHOD AND SYSTEM
Technical Field
The present invention relates to the field of intelligent identification, in particular to a CNN and transfer learning based disease intelligent identification method and system.
Background
Currently, many technologies can be used in the field of computer vision, among which the two most commonly used in the field of agriculture are the image morphology based OpenCV vision algorithm library and the SVM (support vector machine).
The OpenCV vision algorithm utilizes the color and shape features of disease images as a basis, extracts disease features under visible light conditions and classifies images according to these features to complete diagnosis. Such a method places extremely high requirements on disease image quality, and the identification result may be affected by different light conditions and backgrounds. Therefore, this method is low in identification accuracy and anti-noise capacity and poor in the universality of disease image detection.
The SVM is based on machine learning principles, and can train image samples to achieve feature extraction so as to realize classification and identification. However, this method has two shortcomings: (1) it uses quadratic programming to solve for support vectors, so that in the case of a great quantity of samples, solving the matrix occupies a great amount of running memory and operating time, resulting in poor operating performance; (2) classic SVM algorithms only provide binary classification, but with respect to disease identification uses in the field of agriculture, the problem to be solved is usually multi-class classification; thus, the multi-classification problem can only be solved by combining the classic SVM algorithms with other algorithms, resulting in an increase of model complexity and development cost.
There is thus a need for improved methods for intelligent identification of disease to overcome the defects of the prior art, or at least provide a useful alternative to existing methods.
Summary of the Invention
Embodiments of the present invention may provide a CNN and transfer learning based disease intelligent identification method and system, which can lower the interference of the picture background, achieve a high identification accuracy in the case of a limited quantity of samples, support multi-classification of training samples, and thus operate with high efficiency.
The embodiments of the present invention are achieved as follows:
A CNN and transfer learning based disease intelligent identification method, the intelligent identification method comprises a disease image identification method, and the disease image identification method comprises the following steps of:
S1, image preprocessing: performing image binarization on acquired plant pictures, normalizing the sizes of the images, and quickly positioning a region of sign of disease using a Faster-RCNN multi-target detecting algorithm to exclude background interferences, wherein the plant pictures comprise normal plant images and diseased plant images;
S2, image feature extraction: extracting plant image features using a triplet similarity measurement model, and then using SIFT feature as a complementary feature to perform weight fusion; and
S3, disease classification and identification: performing multi-classification identification on diseases using a deep convolutional neural network model, comprising the following steps of:
S31, establishing and training a first convolutional neural network model;
S32, using the described obtained first convolutional neural network model to train the normal plant images and to form model parameters;
S33, transferring the model parameters to a new convolutional neural network model to obtain a second convolutional neural network model; and
S34, using the second convolutional neural network model to train diseased plant images needing to be identified, and labeling and classifying the images using Softmax, wherein the convolutional neural network model comprises the first convolutional neural network model and the second convolutional neural network model and comprises convolutional layers, full connected layers and a disease and pest classification layer.
In a preferred embodiment of the present invention, the described convolutional neural network model comprises 5 convolutional layers, 2 full connected layers and a disease and pest classification layer, wherein the disease and pest classification layer comprises a Softmax classifier, of which each class corresponds to one plant disease.
In a preferred embodiment of the present invention, S31 further comprises the following steps: initializing network parameters of the various convolutional layers and full connected layers of the trained first convolutional neural network, randomly initializing neuron parameters of the disease and pest classification layer using Gaussian distribution, and retraining the whole first convolutional neural network model using disease data.
In a preferred embodiment of the present invention, the whole convolutional neural network is trained using a BP algorithm, wherein the training process comprises information forward propagation and error backward propagation.
In a preferred embodiment of the present invention, a processing manner for the information forward propagation comprises the following steps: selecting a batch of plant samples $\{(x_1, y_1), (x_2, y_2), \dots, (x_m, y_m)\}$ from training samples, wherein $x_i$ are plant sample images and $y_i$ are plant disease categories corresponding to the plant samples; linearly arranging the RGB pixel values of the normal plant sample images as an input; calculating a second output of the current layer by using network parameters of the current layer and a first output of the preceding layer, the second output being used as the input of the next layer; and sequentially calculating layer by layer until multi-classification labels of plant diseases are output.
In a preferred embodiment of the present invention, a processing manner for the error backward propagation comprises the following steps:
a loss function formula for the Softmax classifier being as follows:

$$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y_i = j\}\log\frac{e^{\theta_j^{T}x_i}}{\sum_{l=1}^{k} e^{\theta_l^{T}x_i}}\right]$$

wherein m denotes the quantity of training samples, k denotes the number of classes of the attributive classifier, $1\{\cdot\}$ denotes an indicative function, $\theta_i$ and $\theta_j$ are model parameters and $x_i$ is the input sample; after an error of an output layer is calculated, it is reversely rolled back to the preceding layers to update the parameters;

if the next layer is a pooling layer, the error value $\delta_i^{(l)}$ thereof is shown in the formula

$$\delta_i^{(l)} = \mathrm{upsample}\!\left(\delta_i^{(l+1)}\right)\cdot h'\!\left(a_i^{(l)}\right),$$

wherein Layer l is a convolutional layer, Layer l+1 is a pooling layer, the error of the pooling layer is $\delta_i^{(l+1)}$, the node of the convolutional layer corresponds to the node of the upsampled pooling layer, and $h'(a_i^{(l)})$ indicates the derivative of the activation function corresponding to node i of Layer l;

if the next layer is a convolutional layer, the computing method for the error of channel i of the pooling layer of the current layer is as follows:

$$\delta_i^{(l)} = \sum_{j=1}^{M} \delta_j^{(l+1)} * k_j^{(l+1)},$$

wherein the pooling layer of the current layer has N feature maps, the convolutional layer has M kernels, and the layer numbers of the convolutional layer and the pooling layer are l and l+1, respectively. Each kernel of the convolutional layer has its corresponding error term, with which the parameters may be updated by means of the current network weight and learning rate.
In a preferred embodiment of the present invention, the operation of extracting plant image features using a triplet similarity measurement model comprises the following steps of:

S21: establishing a triplet $(x_i^a, x_i^p, x_i^n)$ according to the constraint specified in the formula

$$\left\|x_i^a - x_i^p\right\|_2^2 + threshold < \left\|x_i^a - x_i^n\right\|_2^2,$$

wherein threshold indicates a specific threshold, $x_i^a$ is the first image feature of a reference sample, $x_i^p$ is the second image feature belonging to the same class as the reference sample, and $x_i^n$ is the third image feature belonging to a different class from the reference sample; the reference sample is randomly selected from training data;

S22: combining the training set into a triplet in the form of $(d_i^a, d_i^p, d_i^n)$, wherein $d_i^a$ indicates a first photograph of the plant, $d_i^p$ indicates a second photograph of the same plant, and $d_i^n$ indicates a randomly selected photograph of another plant different from the plant; and

S23: inputting the training set into networks, with $d_i^a$ and $d_i^p$ to an upper network and $d_i^n$ to a lower network, transferring the processing results through to the last full connected layers respectively by means of convolution, pooling and other operations, extracting a corresponding feature of $(x_i^a, x_i^p, x_i^n)$, and then inputting the full connected layer of the upper network to a Triplet loss function together with the full connected layer of the lower network, computing a residual and adjusting the parameters of both networks at the same time.
In a preferred embodiment of the present invention, the weight fusion comprises the following steps of: finding extreme points based on a spatial scale; extracting positions, scales and rotation invariants of these extreme points; and taking these extreme points as feature points for feature matching.
The present invention provides a CNN and transfer learning based disease intelligent identification system, and the plant disease intelligent identification system comprises a service layer, a control layer and a computing storage layer; the service layer provides visual results for the user; the control layer is responsible for scheduling functional modules at the network backend; and the computing storage layer comprises a video image data processing module, an HDFS data storage module, a deep learning computing module and a database module, the video image data processing module preprocesses images and then stores data into the HDFS data storage module, the deep learning computing module reads data from the HDFS data storage module for training, and then loads a trained model into the database module for plant disease identification.
In a preferred embodiment of the present invention, the plant disease intelligent identification system works based on a MapReduce computing framework and adopts an offline batch processing, an online real-time processing and a stream computing method; the offline batch processing computes by means of a GPU and then writes the identified offline image data into HBase or MySQL to provide an external query service; and the online real-time processing and the stream computing method are used for processing image data in real time, storing the recognition results into the database module after processing, and directly providing the recognition results computed in real time for the service requester by means of an API of the service layer.
Embodiments of the present invention have the following beneficial effects:
(1) The disease identification method and system have low requirements for image quality and high reliability. The conventional OpenCV library provides abundant vision algorithm functions, but it needs to be used on high-quality images when identifying agricultural diseases, and its identification accuracy is easily affected by environmental interference (e.g., illumination and background). The Triplet similarity measurement model adopted in the system effectively improves the feature identification rate. With the model trained using the convolutional neural network, the system recognizes disease images at various resolutions with improved universality and reliability as well as decreased background interference.
(2) The disease identification method and system reach high operational efficiency through efficient computing. The conventional SVM model operates relatively slowly in the case of a large training sample size, while the system reads image data from HDFS, obtains the corresponding pixel labels from SQL, and iterates training quickly via a distributed computing platform by employing the deep convolutional neural network model. The well-trained model is loaded into the current model database to realize the identification function for plant diseases and pests. In this mode, the disease shown in an image is locally classified within 3 seconds.
(3) The disease identification method and system may achieve a relatively high identification accuracy with a limited sample size. With conventional deep learning methods, thousands of image samples are necessary for training to realize accurate image identification. The method introduces the transfer learning mode into the model to achieve higher identification accuracy with a smaller sample size of disease images for training. That is, the method forms model parameters by training with normal plant images, and transfers these parameters to a new model to facilitate the training of plant disease image data sets.
(4) Supporting multi-classification of training samples. As crops are threatened by various diseases, the system supports diagnosis of a plurality of common plant diseases. The conventional SVM model usually supports binary classification only for its training samples and cannot meet the multi-classification task with respect to crop diseases. After training of the convolutional neural network, the system inputs the images into the sub-network and propagates them forward to the Softmax layer to achieve the objective of classifying various plant diseases by means of Softmax multi-label classification method.
Brief Description of Figures
In order to illustrate the technical solutions in the embodiments of the present invention more clearly, figures used in the embodiments will be introduced below briefly. It should be understood that the figures described below only show some embodiments of the present invention, and they shall not be construed as restriction to the scope. Those of ordinary skill in the art can also obtain other figures based on those figures without creative work.
Fig. 1 is an architecture diagram of an intelligent identification system of an embodiment of the present invention;
Fig. 2 illustrates a flow chart of an intelligent identification method of an embodiment of the present invention;
Fig. 3 is visual images of convolutional features of an embodiment of the present invention;
Fig. 4 is a convolutional neural network model for identifying maize diseases and pests of an embodiment of the present invention;
Fig. 5 is a functional structure diagram of a client (APP) of an embodiment of the present invention;
Fig. 6 is an interface operation flow chart of the APP of an embodiment of the present invention; and
Fig. 7 consists of images of some diseased maize photographed in natural light of an embodiment of the present invention.
Detailed Description
The technical solutions of the embodiments of the present invention will be described clearly and completely as follows in combination with the figures of these embodiments for clear understanding of the purposes, technical solutions and advantages of the present invention. Apparently, the embodiments described are only some, not all of the embodiments of the present invention. Generally, the components in the embodiments of the present invention described and shown in the figures herein can be arranged and designed in various configurations.
Therefore, the detailed descriptions of the embodiments of the present invention provided in the figures are not intended to limit the scope of the invention, and the embodiments are only certain embodiments of the present invention. On the basis of the embodiments of the present invention, other embodiments obtained by those of ordinary skill in the art without creative work also belong to the protection scope of the present invention.
It should be noted that similar marks and letters generally indicate the similar items. Therefore, any item already defined in one figure is not necessarily further defined and explained in the subsequent figures.
For description of the present invention, it should be noted that orientation or position relations indicated by the terms "center", "above", "under", "left", "right", "vertical", "horizontal", "inside", "outside", etc. are based on the orientation or position relations shown in the figures or the commonly arranged orientation or position relations as used in the invention, and they are used to describe the invention and simplify the description herein instead of indicating or implying that the device or component indicated must have a specific orientation and be constructed and operated in a specific orientation. Therefore, the embodiments described herein shall not be construed as limitation hereto. In addition, the terms "first", "second", "third", etc. are only used to distinguish descriptions instead of being construed as indication or implication of relative importance.
In addition, the terms "horizontal", "vertical", "overhanging" etc. do not mean that the component is necessarily to be completely horizontal or overhanging, but allowing for a slight tilt. For example, "horizontal" only means a horizontal direction in relative to a "vertical" direction, which does not mean that the structure is necessarily to be completely horizontal, but allowing for a slight tilt.
In the description of the present invention, the terms "arrangement", "installation", "connected with" and "in communication with" shall be understood in a broad sense unless otherwise specified and defined; for example, the connection may be fixed connection, detachable connection or integrated connection; it may be mechanical connection or electric connection; and it may be direct connection or indirect connection through an intermediate, or internal connection between two elements. Those of ordinary skill in the art can understand the specific meaning of these terms in the present invention according to actual conditions.
First embodiment
A CNN and transfer learning based disease intelligent identification system provided by the embodiment is an algorithm model combining a convolutional neural network and transfer learning to realize intelligent image diagnosis of common plant diseases. Taking maize as an example, in this embodiment 10 common maize diseases (i.e., northern leaf blight, southern leaf blight, northern leaf spot, gray leaf spot, stem rot, common corn rust, sphacelotheca reiliana, ear rot, curvularia disease and maize sheath blight) are researched using the system. According to the recognition results, the system is able to identify these 10 common maize diseases with an accuracy above 90%, and provides efficient technical support for prevention and control of maize diseases.
Fig. 1 is an architecture flow chart of the identification system of the embodiment, which mainly comprises three layers:
service layer: presenting the result of a user request to the user in a visual manner, so that the user obtains information about the result directly and clearly. The layer mainly comprises a video data source transmission API and a workflow service API. Information is transferred to the network backend via HTTP, reaches a Handler endpoint through the Web Server, and then reaches the corresponding Action method of a work controller through routing lookup and distribution. Fig. 1 shows that the client and the workflow service API are in communication connection to enable automatic transmission of documents, information and tasks according to the preset rules throughout the process from the client sending a request to the result presentation, thus reducing manual intervention; the workflow service API and the RPC service are in communication connection to enable the client to remotely control and invoke the control layer;
control layer: being responsible for scheduling functional modules at the backend. The Master assigns Reduce tasks to Workers, maps the data generated by all Mappers, and assigns data with the same key to the same Worker. Fig. 1 shows that the control layer of the embodiment comprises a management module, a scheduling module, an operation module, a model module and a central control; the management module, the scheduling module, the operation module and the model module communicate with the RPC service and are connected with the central control, respectively; the scheduling module and the operation module are in communication connection; and the model module and the operation module are in communication connection; and
computing storage layer: being used for data processing and data storage, comprising an HDFS data storage module, a deep learning computing module, a video image data processing module and a database module. The video image data processing module is responsible for image denoising, white balance adjustment, image equalization and other operations to ensure image data normalization, and finally stores these data in the HDFS data storage module; the deep learning computing module is mainly used for model training, image feature extraction, image identification, etc. During model training, the system reads image data from the HDFS, obtains the corresponding pixel labels from the SQL database, and then quickly iterates training via the distributed computing platform. The well-trained model is loaded into the current model database to realize the identification function for maize diseases and pests.
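The preprocessing operations named for the video image data processing module (denoising, white balance adjustment, image equalization) could be implemented, for example, with OpenCV. The following is a minimal sketch under stated assumptions, not the patented implementation; the particular choices of non-local-means denoising, a gray-world white balance, and Y-channel histogram equalization are illustrative only.

```python
# Minimal preprocessing sketch (assumed implementation, not the patented code):
# denoise, apply a simple gray-world white balance, and equalize luminance.
import cv2
import numpy as np

def preprocess(image_bgr: np.ndarray) -> np.ndarray:
    # Denoise with non-local means (OpenCV built-in).
    denoised = cv2.fastNlMeansDenoisingColored(image_bgr, None, 10, 10, 7, 21)

    # Gray-world white balance: scale each channel toward the global mean.
    means = denoised.reshape(-1, 3).mean(axis=0)
    balanced = np.clip(denoised * (means.mean() / means), 0, 255).astype(np.uint8)

    # Equalize only the luminance channel to normalize illumination.
    ycrcb = cv2.cvtColor(balanced, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```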
In the process of maize disease and pest identification, a maize image may involve a plurality of disease targets, i.e., a single request may require identifying a plurality of targets at the same time; and since the identification model adopts a deep convolutional neural network, the network parameter size is considerable, and identifying a plurality of targets leads to dramatically increased computation. Therefore, the multi-target identification and computing tasks can be assigned to a plurality of servers instead of the central control to effectively improve the efficiency and the concurrency performance of the service.
The multi-target identification distributed computing system of the system works based on a MapReduce computing framework, and provides three computing processing modes: offline batch processing, online real-time processing and stream computing. The offline batch processing computes by means of a GPU and then writes the identified offline image data into HBase or MySQL to provide an external query service; and the online real-time processing and the stream computing method are used for processing image data in real time. All of these three computing processing methods store the recognition results into a data center after processing, and directly provide the recognition results computed in real time for the service requester by means of an API of the service layer.
The embodiment further provides a CNN and transfer learning based disease intelligent identification method used in combination with the plant disease intelligent identification system, the intelligent identification method mainly comprises a disease image identification method.
Fig. 2 shows that the disease image identification algorithm mainly comprises: image input, image preprocessing, feature location, feature extraction, image identification and classification and image output.
Image input: photographing the maize to be identified at the client (e.g., mobile phone
APP). In the embodiment, the image input side is an Android image identification front end, and an Android image acquisition front end may be installed in a device (e.g., mobile phone or tablet computer) in the form of an APP and mainly used for acquiring and inputting disease images.
Fig. 5 shows that the image identification front end comprises a disease data center module and a maize disease identification module, the disease data center is used to query disease data, and the maize disease identification module is used to identify diseases on images photographed on site or selected from an album.
Fig. 6 clearly shows the operation process at the client. When photographing or selecting an image from the mobile album, the user is prompted to position the region of the sign of disease so that the disease can be identified. If the user clicks upload, the system crops and correspondingly compresses the selected images, and then sends the processed images to the backend identification algorithm model for disease identification through a GET request.
Image preprocessing: preprocessing images at the computing storage layer of the system. As both healthy maize images and diseased maize images contain some irrelevant image information, image preprocessing is intended to improve the image identification reliability by enhancing the relevant information and minimizing the irrelevant information. Preprocessing is responsible for image denoising, white balance adjustment, image equalization and other operations to ensure image data normalization, and a Faster-RCNN multi-target detecting algorithm is used to quickly position the region of the sign of disease and exclude background interference.
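As an illustration of the Faster-RCNN positioning step, the sketch below uses torchvision's generic Faster-RCNN as a stand-in detector; the patent's own detector, training data and weights are not disclosed, so the model choice, the `score_thresh` value and the fall-back behavior are assumptions.

```python
# Sketch of Faster-RCNN based region positioning (assumption: a torchvision
# Faster-RCNN fine-tuned on lesion annotations would replace the generic
# pretrained weights used here). The highest-scoring box is used to crop the
# region of the disease sign and exclude the background.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

def crop_lesion_region(pil_image, detector=None, score_thresh=0.5):
    if detector is None:
        # Generic weights as a placeholder; a production detector would be fine-tuned.
        detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    detector.eval()
    with torch.no_grad():
        pred = detector([to_tensor(pil_image)])[0]   # boxes sorted by score
    if len(pred["boxes"]) == 0 or pred["scores"][0] < score_thresh:
        return pil_image                              # fall back to the full image
    x1, y1, x2, y2 = pred["boxes"][0].int().tolist()
    return pil_image.crop((x1, y1, x2, y2))
```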
Image feature extraction: the Triplet similarity measurement model is used to extract maize features in the embodiment. The Triplet similarity measurement learning method makes the image features of the same maize photographed in different scenes more similar by means of feature learning. A Triplet is established according to the constraint specified in Formula (2-1). To establish a Triplet, a reference sample is randomly selected from the training data, with its image feature denoted as $x_i^a$; a feature belonging to the same class as the reference sample is selected and denoted as $x_i^p$; a feature belonging to a different class from the reference sample is denoted as $x_i^n$; and $(x_i^a, x_i^p, x_i^n)$ is denoted as a triplet. The Triplet loss error function based on Triplet measurement learning is derived from the following principle:

$$\left\|x_i^a - x_i^p\right\|_2^2 + threshold < \left\|x_i^a - x_i^n\right\|_2^2 \qquad (2\text{-}1)$$

The threshold indicates a specific threshold, and the inequation essentially defines the relationship of feature distances between samples belonging to the same class and to different classes, i.e., the distance between samples of the same class plus the threshold should be less than the distance between samples of different classes. The training set is combined into a triplet in the form of $(d_i^a, d_i^p, d_i^n)$, wherein $d_i^a$ indicates a photograph of a maize plant, $d_i^p$ indicates another photograph of the same maize plant, and $d_i^n$ indicates a randomly selected photograph of a different maize plant. The training set is input into networks, with $d_i^a$ and $d_i^p$ fed to an upper network and $d_i^n$ to a lower network; the processing results are transferred through to the last full connected layers respectively by means of convolution, pooling and other operations, and a corresponding feature of $(x_i^a, x_i^p, x_i^n)$ is extracted. The full connected layer of the upper network is input to a Triplet loss function together with the full connected layer of the lower network to compute a residual and adjust the parameters of both networks at the same time. Fig. 3 visually shows the convolutional features of different layers.
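The triplet constraint of Formula (2-1) can be expressed directly as a hinge-style loss. The PyTorch sketch below is a minimal illustration under assumptions: the embedding network, the threshold value of 0.2 and the weight-shared "upper"/"lower" branches are illustrative, since the patent does not fix them.

```python
# Sketch of the triplet constraint as a loss (assumed PyTorch form). The anchor
# and positive pass through the "upper" branch and the negative through the
# "lower" branch; both branches share the same embedding weights.
import torch
import torch.nn.functional as F

def triplet_loss(f_a, f_p, f_n, threshold=0.2):
    """f_a, f_p, f_n: batches of embeddings for anchor, positive, negative."""
    d_pos = (f_a - f_p).pow(2).sum(dim=1)      # ||x_a - x_p||^2
    d_neg = (f_a - f_n).pow(2).sum(dim=1)      # ||x_a - x_n||^2
    # Penalize violations of: d_pos + threshold < d_neg
    return F.relu(d_pos - d_neg + threshold).mean()

def training_step(embed_net, d_a, d_p, d_n, optimizer):
    f_a, f_p = embed_net(d_a), embed_net(d_p)  # "upper" network branch
    f_n = embed_net(d_n)                       # "lower" network branch (shared weights)
    loss = triplet_loss(f_a, f_p, f_n)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```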
In order to make up for the deficiency that the features extracted by means of the deep convolutional neural network are insufficient in representing the texture details of images, we enhance the capability of feature description by employing Scale-Invariant Feature Transform (SIFT) features as complementary features and conducting weight fusion. SIFT is a local feature extraction algorithm that finds extreme points based on a spatial scale, extracts the positions, scales and rotation invariants of these extreme points, and matches these extreme points as feature points to achieve good matching results.
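One possible form of the SIFT complement and weight fusion is sketched below with OpenCV; the pooling of the 128-dimensional descriptors into one vector and the fusion weights `w_cnn` and `w_sift` are assumptions for illustration, as the patent does not specify them.

```python
# Sketch of SIFT-based complementary features and weighted fusion with the CNN
# feature (a minimal assumption-laden illustration, not the patented fusion).
# SIFT is available in the main OpenCV package from version 4.4 onward.
import cv2
import numpy as np

def fused_feature(gray_image: np.ndarray, cnn_feature: np.ndarray,
                  w_cnn: float = 0.7, w_sift: float = 0.3) -> np.ndarray:
    sift = cv2.SIFT_create()
    keypoints, descriptors = sift.detectAndCompute(gray_image, None)
    if descriptors is None:                      # no extreme points found
        sift_vec = np.zeros(128, dtype=np.float32)
    else:
        sift_vec = descriptors.mean(axis=0)      # simple pooling of 128-D descriptors

    # L2-normalize both parts, then concatenate with fusion weights.
    cnn_part = cnn_feature / (np.linalg.norm(cnn_feature) + 1e-12)
    sift_part = sift_vec / (np.linalg.norm(sift_vec) + 1e-12)
    return np.concatenate([w_cnn * cnn_part, w_sift * sift_part])
```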
Disease classification and identification: the disease classification and identification relates to the core algorithm model of the embodiment, i.e., the algorithm model combining the convolutional neural network and the transfer learning. Fig. 4 shows that a deep convolutional neural network model is designed for multi-classification identification of maize diseases and pests based on image features of maize diseases and pests.
The convolutional neural network model comprises 5 convolutional layers, 2 full connected layers and a disease and pest classification layer.
Convolutional layer: being used to obtain local features of maize images. As the convolutional neural network only receives fixed-size input, the input images should be converted into 224*224 fixed-size 3-channel RGB images. The input size to the network is 224*224*3, and the first convolutional layer C1 uses 96 filters (or kernels) with a size of 11*11 and a stride of 4 to conduct a convolutional operation on the input image block. The convolutional operation is followed by a Max Pooling operation, and all Max Pooling operations downsample the input by using filters with a size of 3*3 and a stride of 1. Layer C2 uses 256 kernels with a size of 5*5 and a stride of 1. Layer C3 uses 384 kernels with a size of 3*3 and a stride of 1. Layer C4 uses 384 kernels with a size of 3*3 and a stride of 1. Layer C5 uses 256 kernels with a size of 3*3 and a stride of 1.
Full connected layer: being used to reconnect all local features into a complete image through a weight matrix. The two full connected layers F1 and F2 comprise 4096 neurons, respectively. These neurons are connected with every neuron in the input and the output, respectively.
Disease and pest classification layer: being used to classify diseases and pests. In the embodiment, the disease and pest classification layer is a Softmax classifier, of which each class corresponds to one maize disease or pest, i.e., the quantity of neurons in the classifier corresponds to the quantity of classes of the maize diseases and pests.
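The layer sizes above (96/256/384/384/256 kernels, two 4096-neuron full connected layers) follow the AlexNet pattern, so a sketch of the model could look as below. The text fixes only the kernel counts and sizes; the padding values, ReLU activations, and the placement and stride of the pooling layers here follow the standard AlexNet configuration as assumptions, so that the 224x224 input reduces to 6x6 before F1.

```python
# AlexNet-style sketch of the described 5-conv / 2-FC / classification model
# (padding, pooling placement and stride, and ReLU are assumed, not specified).
import torch.nn as nn

def make_model(num_classes: int) -> nn.Sequential:
    return nn.Sequential(
        nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2), nn.ReLU(inplace=True),   # C1
        nn.MaxPool2d(kernel_size=3, stride=2),
        nn.Conv2d(96, 256, kernel_size=5, stride=1, padding=2), nn.ReLU(inplace=True),  # C2
        nn.MaxPool2d(kernel_size=3, stride=2),
        nn.Conv2d(256, 384, kernel_size=3, stride=1, padding=1), nn.ReLU(inplace=True), # C3
        nn.Conv2d(384, 384, kernel_size=3, stride=1, padding=1), nn.ReLU(inplace=True), # C4
        nn.Conv2d(384, 256, kernel_size=3, stride=1, padding=1), nn.ReLU(inplace=True), # C5
        nn.MaxPool2d(kernel_size=3, stride=2),
        nn.Flatten(),
        nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),                            # F1
        nn.Linear(4096, 4096), nn.ReLU(inplace=True),                                   # F2
        nn.Linear(4096, num_classes),        # classification layer; Softmax is applied in the loss
    )
```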
However, as the available images of maize diseases and pests are limited in quantity and additional images are difficult to acquire, training the deep convolutional neural network model with only a small data set of diseases and pests will lead to an over-fitted model with poor generalization ability that is unable to identify new images. In the embodiment, the convolutional neural network is combined with transfer learning to process the maize disease images. Transfer learning facilitates the completion of learning tasks in a new environment by exploiting the knowledge acquired in a previous environment. In deep learning training, well-trained model parameters may be transferred to a new model to assist in training on a new data set. Firstly, the deep convolutional neural network model is used for processing; secondly, the parameters of the well-trained deep convolutional neural network model are used to initialize the network parameters in the 5 convolutional layers and the 2 full connected layers; thirdly, Gaussian distribution is used to randomly initialize the neuron parameters of the disease and pest classification layer (the Gaussian distribution function is shown in Formula 2-2); finally, the whole network is trained again using the data set of maize diseases and pests.
$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}} \qquad (2\text{-}2)$$
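The parameter-transfer step could be sketched as follows, assuming an nn.Sequential model like the sketch above whose last module is the classification Linear layer; the Gaussian standard deviation of 0.01 is illustrative, since the patent specifies the distribution (Formula 2-2) but not its parameters.

```python
# Sketch of the transfer step: copy the trained convolutional and full connected
# parameters, Gaussian-initialize a new disease/pest classification layer, and
# return the model to be retrained on the disease data set.
import copy
import torch.nn as nn

def transfer_model(source_model: nn.Sequential, num_disease_classes: int) -> nn.Sequential:
    target = copy.deepcopy(source_model)                    # conv + FC parameters transferred as-is
    in_features = target[-1].in_features
    new_head = nn.Linear(in_features, num_disease_classes)
    nn.init.normal_(new_head.weight, mean=0.0, std=0.01)    # Gaussian initialization (Formula 2-2)
    nn.init.zeros_(new_head.bias)
    target[-1] = new_head                                   # replace the classification layer
    return target                                           # then retrain on disease/pest images
```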
When training these network parameters, we use Error Back Propagation (BP) algorithm to train the whole convolutional neural network. The training process comprises an information forward propagation and an error back propagation.
Forward propagation stage: selecting a batch of samples $\{(x_1, y_1), (x_2, y_2), \dots, (x_m, y_m)\}$ from the training samples, wherein $x_i$ denotes an image of a maize sample and $y_i$ denotes the class of disease or pest of the sample. The RGB pixel values of the sample image are linearly arranged as the input. For subsequent layers, the output of the current layer is computed from the current network parameters of the layer and the output of the preceding layer, and the output of the current layer is used as the input to the next layer. The iterative process continues until the results are transferred to the output layer, which outputs the multi-classification labels of maize diseases and pests.
Back propagation stage: the loss function of Softmax attributive classifier for multi-label classification is shown as Formula (2-3):
$$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y_i = j\}\log\frac{e^{\theta_j^{T}x_i}}{\sum_{l=1}^{k} e^{\theta_l^{T}x_i}}\right] \qquad (2\text{-}3)$$

wherein m denotes the quantity of training samples, k denotes the number of classes of the attributive classifier, $1\{\cdot\}$ denotes an indicative function, indicating the consistency between the output classification and the label; $\theta_i$ and $\theta_j$ are model parameters, $x_i$ is the input sample, and e is a constant.
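Formula (2-3) is the standard Softmax cross-entropy loss, so one training iteration covering the forward and backward stages could be sketched as below; the optimizer and batch shapes are assumptions, and PyTorch's autograd stands in for the hand-derived Formulas (2-4) and (2-5).

```python
# One training iteration sketch: forward propagation to multi-class labels and
# error back propagation with the Softmax loss of Formula (2-3).
# nn.CrossEntropyLoss combines log-softmax with the negative log-likelihood.
import torch
import torch.nn as nn

def train_batch(model, images, labels, optimizer):
    """images: (m, 3, 224, 224) float tensor; labels: (m,) class indices."""
    criterion = nn.CrossEntropyLoss()
    logits = model(images)            # forward propagation, layer by layer
    loss = criterion(logits, labels)  # Formula (2-3)
    optimizer.zero_grad()
    loss.backward()                   # error back propagation (cf. Formulas 2-4 and 2-5)
    optimizer.step()                  # update with the current weights and learning rate
    return loss.item()
```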
After an error of an output layer is calculated, it should be reversely rolled back to the preceding layers to update the parameters, i.e., error back propagation is required. If the next layer is a pooling layer, the error value $\delta_i^{(l)}$ thereof is shown in Formula (2-4), wherein Layer l is a convolutional layer, Layer l+1 is a pooling layer, and the error of the pooling layer is $\delta_i^{(l+1)}$:

$$\delta_i^{(l)} = \mathrm{upsample}\!\left(\delta_i^{(l+1)}\right)\cdot h'\!\left(a_i^{(l)}\right) \qquad (2\text{-}4)$$

In the formula above, $\cdot$ denotes the element-wise (dot) product of matrices; the node of the convolutional layer corresponds to the node of the upsampled pooling layer, so their subscripts are the same, and $h'(a_i^{(l)})$ indicates the derivative of the activation function corresponding to node i of Layer l. If the next layer is a convolutional layer, it is assumed that the pooling layer of the current layer has N feature maps, the convolutional layer has M kernels, and the layer numbers of the convolutional layer and the pooling layer are l and l+1, respectively. Each kernel of the convolutional layer has its corresponding error term. Therefore, the error of channel i in the pooling layer of the current layer is computed as shown in Formula (2-5):

$$\delta_i^{(l)} = \sum_{j=1}^{M} \delta_j^{(l+1)} * k_j^{(l+1)} \qquad (2\text{-}5)$$
According to the error computing, the parameters may be updated based on the current network weight and learning rate.
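As a small numerical illustration of the upsampling in Formula (2-4), the sketch below assumes non-overlapping 2x2 mean pooling and a ReLU activation for simplicity; with the max pooling described above, the error would instead be routed only to the positions that produced each window maximum.

```python
# Illustration of Formula (2-4): the pooling-layer error is upsampled back to
# the convolutional layer's spatial size and multiplied element-wise by the
# derivative of the activation function (assumed 2x2 mean pooling and ReLU).
import numpy as np

def conv_layer_error(delta_pool: np.ndarray, pre_activation: np.ndarray,
                     pool_size: int = 2) -> np.ndarray:
    # upsample: repeat each pooled error over its window; mean pooling
    # distributes the gradient evenly across the window.
    upsampled = np.kron(delta_pool, np.ones((pool_size, pool_size))) / pool_size**2
    relu_derivative = (pre_activation > 0).astype(float)    # h'(a) for ReLU
    return upsampled * relu_derivative
```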
After the network training is completed, the network can be used to identify maize diseases and pests. The well-trained convolutional neural network may be considered a multi-class classifier: an image is input into the sub-network and propagated forward to the Softmax layer, and the class corresponding to the maximum of the Softmax layer is the class of disease or pest of the input image.
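The inference step, taking the argmax of the Softmax layer, could be sketched as follows; the returned confidence value is an illustrative addition not required by the text.

```python
# Inference sketch: forward a preprocessed image through the trained network
# and take the argmax of the Softmax output as the predicted class.
import torch

def predict_class(model, image_tensor):
    """image_tensor: (3, 224, 224) preprocessed image."""
    model.eval()
    with torch.no_grad():
        logits = model(image_tensor.unsqueeze(0))   # add batch dimension
        probs = torch.softmax(logits, dim=1)
    return int(probs.argmax(dim=1)), float(probs.max())
```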
Image output: displaying the image analysis results at the client.
Finally, the disease identification system of the embodiment is tested. Some images used in the test are shown in Fig. 7. The maize disease images are divided into two groups herein: 314 images as training data and 112 images as test data. Meanwhile, 400 images of healthy maize are prepared. All of these images were photographed in the open field in natural light.
The trained deep convolutional neural network model is tested by identifying 112 images containing disease testing data of 10 diseases (i.e., northern leaf blight, southern leaf blight, northern leaf spot, gray leaf spot, stem rot, common corn rust, sphacelotheca reiliana, ear rot, curvularia disease and maize sheath blight). The results are shown in Table 1.

Table 1 Identification Results

Disease class | Training/testing samples | Correctly identified samples | Identification rate
Northern leaf blight | 52/16 | 15 | 93.75%
Southern leaf blight | 45/15 | 13 | 86.67%
Northern leaf spot | 17/6 | 5 | 83.33%
Gray leaf spot | 24/9 | 7 | 77.78%
Stem rot | 19/8 | 7 | 87.50%
Common corn rust | 30/10 | 10 | 100.00%
Sphacelotheca reiliana | 27/11 | 11 | 100.00%
Ear rot | 33/13 | 12 | 92.31%
Curvularia disease | 27/11 | 10 | 90.91%
Maize sheath blight | 40/13 | 13 | 100.00%
Total | 314/112 | 103 | 91.96%

The experimental results show that the method is able to effectively identify these 10 common maize diseases with an accuracy above 90%.
In conclusion, the CNN and transfer learning based disease intelligent identification method and system provided in the embodiment may identify maize diseases in the natural environment. The method and system take 10 common maize diseases as research objects: preprocessing images, learning maize image features using Triplet loss model structure, then extracting image texture details using SIFT algorithm, and finally labeling these images by means of Softmax for classification. The training set combines images of healthy maize and diseased maize, the deep similarity network is employed to learn the feature representation of healthy maize images, then the transfer learning method is used to learn the features of maize disease images, and finally these features are classified for identification. The results of this study showed that the method is able to identify these 10 common maize diseases with an accuracy above 90%, and provide efficient technical support for prevention and control of maize diseases.
While embodiments of the present invention have been illustrated and described in the specification, it is not intended that these embodiments illustrate and describe all possible forms of the present invention. It should be understood that the disclosed embodiments are merely exemplary of the invention that may be embodied in various and alternative forms. The drawings are not necessarily drawn to scale; some features may be exaggerated or minimized to show details of particular components. The specific structural and functional details disclosed should not be interpreted as definition, but merely as a representative basis on which those skilled in the art learn to implement the present invention in various forms. As those of ordinary skill in the art will understand, various features of the embodiments illustrated and described with reference to any one of the Figures may be combined with features illustrated in one or more other Figures to produce alternative embodiments that are not explicitly illustrated or described. The combinations of features illustrated provide representative embodiments for typical applications. However, various combinations and modifications of the features consistent with the teachings of the present invention may be desired for particular applications or implementations.
It will be understood that the terms "comprise" and "include" and any of their derivatives (e.g. comprises, comprising, includes, including) as used in this specification are to be taken to be inclusive of the features to which the term refers, and are not meant to exclude the presence of any additional features unless otherwise stated or implied.
The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement of any form of suggestion that such prior art forms part of the common general knowledge.
Only preferred embodiments of the invention are described above, which are not intended to limit the present invention. For a person skilled in the art, the invention may be subject to various alterations and changes. Any alteration, equivalent replacement, improvement, etc. made without departing from the spirit and principle of the present invention shall be included in the scope of protection of the invention.

Claims (10)

Claims
1. A CNN and transfer learning based disease intelligent identification method, characterized in that the intelligent identification method comprises a disease image identification method, and the disease image identification method comprises the following steps of:
S1, image preprocessing: performing image binarization on acquired plant pictures, normalizing the sizes of the images, and quickly positioning a region of sign of disease using a Faster-RCNN multi-target detecting algorithm to exclude background interferences, wherein the plant pictures comprise normal plant images and diseased plant images;
S2, image feature extraction: extracting image features of the described preprocessed plant pictures using a triplet similarity measurement model, and then using SIFT feature as a complementary feature to perform weight fusion; and
S3, disease classification and identification: learning a first image feature of the normal plant images using a deep convolutional neural network, learning a second image feature of diseased plant images through transfer learning, and performing classification and identification in combination with the first image feature and the second image feature,
wherein performing multi-classification identification on diseases using a deep convolutional neural network model comprises the following steps of:
S31, establishing and training a first convolutional neural network model;
S32, using the described obtained first convolutional neural network model to train the normal plant images and to form model parameters;
S33, transferring the model parameters to a new convolutional neural network model to obtain a second convolutional neural network model; and
S34, using the second convolutional neural network model to train diseased plant images needing to be identified, and labeling and classifying the images using Softmax, wherein the convolutional neural network model comprises the first convolutional neural network model and the second convolutional neural network model and comprises convolutional layers, full connected layers and a disease and pest classification layer.
2. The CNN and transfer learning based disease intelligent identification method according to claim 1, characterized in that the convolutional neural network model comprises 5 convolutional layers, 2 full connected layers and a disease and pest classification layer, wherein the disease and pest classification layer comprises a Softmax classifier, of which each class corresponds to one plant disease.
3. The CNN and transfer learning based disease intelligent identification method according to claim 1, characterized in that S31 further comprises the following steps: initializing network parameters of the various convolutional layers and full connected layers of the trained first convolutional neural network, randomly initializing neuron parameters of the disease and pest classification layer using Gaussian distribution, and retraining the whole first convolutional neural network model using disease data.
4. The CNN and transfer learning based disease intelligent identification method according to claim 1, characterized in that the whole convolutional neural network is trained using a BP algorithm, wherein the training process comprises information forward propagation and error backward propagation.
5. The CNN and transfer learning based disease intelligent identification method according to claim 4, characterized in that a processing manner for the information forward propagation comprises the following steps: selecting a batch of plant samples
$\{(x_1, y_1), (x_2, y_2), \dots, (x_m, y_m)\}$ from training samples, wherein $x_i$ are plant sample images and $y_i$ are plant disease categories corresponding to the plant samples; linearly arranging the RGB pixel values of the normal plant sample images as an input; calculating a second output of the current layer by using network parameters of the current layer and a first output of the preceding layer, the second output being used as the input of the next layer; and sequentially calculating layer by layer until multi-classification labels of plant diseases are output.
6. The CNN and transfer learning based disease intelligent identification method according to claim 4, characterized in that a processing manner for the error backward propagation comprises the following steps:
a loss function formula for the Softmax classifier being as follows:
$$J(\theta) = -\frac{1}{m}\left[\sum_{i=1}^{m}\sum_{j=1}^{k} 1\{y_i = j\}\log\frac{e^{\theta_j^{T}x_i}}{\sum_{l=1}^{k} e^{\theta_l^{T}x_i}}\right]$$

wherein m denotes the quantity of training samples, k denotes the number of classes of the attributive classifier, $1\{\cdot\}$ denotes an indicative function, $\theta_i$ and $\theta_j$ are model parameters and $x_i$ is the input sample; after an error of an output layer is calculated, reversely rolling back same to the preceding layers to update parameters;

if the next layer is a pooling layer, the error value $\delta_i^{(l)}$ thereof is shown in the formula $\delta_i^{(l)} = \mathrm{upsample}(\delta_i^{(l+1)})\cdot h'(a_i^{(l)})$, wherein Layer l is a convolutional layer, Layer l+1 is a pooling layer, the error of the pooling layer is $\delta_i^{(l+1)}$, the node of the convolutional layer corresponds to the node of the upsampled pooling layer, and $h'(a_i^{(l)})$ indicates the derivative of the activation function corresponding to node i of Layer l;

if the next layer is a convolutional layer, the computing method for the error of channel i of the pooling layer of the current layer is as follows:

$$\delta_i^{(l)} = \sum_{j=1}^{M} \delta_j^{(l+1)} * k_j^{(l+1)}$$

wherein the pooling layer of the current layer has N feature maps, the convolutional layer has M kernels, and the layer numbers of the convolutional layer and the pooling layer are l and l+1, respectively; each kernel of the convolutional layer has its corresponding error term, with which the parameters may be updated by means of the current network weight and learning rate.
7. The CNN and transfer learning based disease intelligent identification method according to claim 1, characterized in that the operation method of extracting plant image features using a Triplet similarity measurement model comprising the following steps of:
S21: establishing a triplet $(x_i^a, x_i^p, x_i^n)$ according to the constraint specified in the formula $\left\|x_i^a - x_i^p\right\|_2^2 + threshold < \left\|x_i^a - x_i^n\right\|_2^2$, characterized in that threshold indicates a specific threshold, $x_i^a$ is the first image feature of a reference sample, $x_i^p$ is the second image feature belonging to the same class as the reference sample, and $x_i^n$ is the third image feature belonging to a different class from the reference sample; the reference sample is randomly selected from training data;

S22: combining the training set into a triplet in the form of $(d_i^a, d_i^p, d_i^n)$, wherein $d_i^a$ indicates a first photograph of the plant, $d_i^p$ indicates a second photograph of the plant, and $d_i^n$ indicates a randomly selected photograph of another plant different from the plant; and

S23: inputting the training set into networks, with $d_i^a$ and $d_i^p$ to an upper network and $d_i^n$ to a lower network, transferring the processing results through to the last full connected layers respectively by means of a convolution, a pooling and other operations, extracting a corresponding feature of $(x_i^a, x_i^p, x_i^n)$, and then inputting the full connected layer of the upper network to a Triplet loss function together with the full connected layer of the lower network, computing a residual and adjusting the parameters between both networks at the same time.
8. The CNN and transfer learning based disease intelligent identification method according to claim 7, characterized in that the weight fusion comprises the following steps of: finding extreme points based on a spatial scale; extracting positions, scales and rotation invariants of these extreme points; and taking these extreme points as feature points for feature matching.
9. A CNN and transfer learning based disease intelligent identification system, characterized in that the plant disease intelligent identification system comprises a service layer, a control layer and a computing storage layer; the service layer provides visual results to the user; the control layer is responsible for scheduling functional modules at the network backend; and the computing storage layer comprises a video image data processing module, an HDFS data storage module, a deep learning computing module and a database module, the video image data processing module preprocesses images and then stores data into the HDFS data storage module, the deep learning computing module reads data from the HDFS data storage module for training, and then loads a trained model into the database module for plant disease identification.
10. The CNN and transfer learning based disease intelligent identification system according to claim 9, characterized in that the plant disease intelligent identification system works based on a MapReduce computing framework and adopts an offline batch processing, an online real-time processing and a stream computing method; the offline batch processing computes by means of a GPU and then writes the identified offline image data into HBase or
MySQL to provide an external query service; and the online real-time processing and the stream computing method are used for processing image data in real time, storing the recognition results into the database module after processing, and directly providing the recognition results computed in real time for the service requester by means of an API of the service layer.
AU2020103613A 2020-11-23 2020-11-23 Cnn and transfer learning based disease intelligent identification method and system Active AU2020103613A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020103613A AU2020103613A4 (en) 2020-11-23 2020-11-23 Cnn and transfer learning based disease intelligent identification method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
AU2020103613A AU2020103613A4 (en) 2020-11-23 2020-11-23 Cnn and transfer learning based disease intelligent identification method and system

Publications (1)

Publication Number Publication Date
AU2020103613A4 true AU2020103613A4 (en) 2021-02-04

Family

ID=74236471

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020103613A Active AU2020103613A4 (en) 2020-11-23 2020-11-23 Cnn and transfer learning based disease intelligent identification method and system

Country Status (1)

Country Link
AU (1) AU2020103613A4 (en)

CN114549962A (en) * 2022-03-07 2022-05-27 重庆锐云科技有限公司 Garden plant leaf disease classification method
CN114764827A (en) * 2022-04-27 2022-07-19 安徽农业大学 Mulberry leaf disease and insect pest detection method under self-adaptive low-illumination scene
CN114863242A (en) * 2022-04-26 2022-08-05 北京拙河科技有限公司 Deep learning network optimization method and system for image recognition
CN114972883A (en) * 2022-06-17 2022-08-30 平安科技(深圳)有限公司 Target detection sample generation method based on artificial intelligence and related equipment
CN115396831A (en) * 2021-05-08 2022-11-25 中国移动通信集团浙江有限公司 Interaction model generation method, device, equipment and storage medium
CN115563286A (en) * 2022-11-10 2023-01-03 东北农业大学 Knowledge-driven milk cow disease text classification method
CN115631847A (en) * 2022-10-19 2023-01-20 哈尔滨工业大学 Early lung cancer diagnosis system based on multi-omics features, storage medium and equipment
CN116246176A (en) * 2023-05-12 2023-06-09 山东建筑大学 Crop disease detection method and device, electronic equipment and storage medium
CN116311230A (en) * 2023-05-17 2023-06-23 安徽大学 Corn leaf disease identification method and device oriented to real scene
CN116385953A (en) * 2023-01-11 2023-07-04 哈尔滨市科佳通用机电股份有限公司 Railway wagon door hinge breaking fault image identification method
CN116584472A (en) * 2023-07-13 2023-08-15 四川省农业机械科学研究院 Multistage control-based crisp plum pesticide spraying method and system
CN116703897A (en) * 2023-08-02 2023-09-05 青岛兴牧畜牧科技发展有限公司 Pig weight estimation method based on image processing
CN116689310A (en) * 2023-08-08 2023-09-05 河南工学院 Automatic identification classification system for battery sorting and recycling
CN116825283A (en) * 2023-04-27 2023-09-29 清华大学 Nuclear medicine treatment individuation dosage evaluation method and device based on transfer learning
CN116863340A (en) * 2023-08-16 2023-10-10 安徽荃银超大种业有限公司 Rice leaf disease identification method based on deep learning
CN116991932A (en) * 2023-09-25 2023-11-03 济南卓鲁信息科技有限公司 Data analysis and management system and method based on artificial intelligence
CN117058492A (en) * 2023-10-13 2023-11-14 之江实验室 Two-stage training disease identification method and system based on learning decoupling
CN117078604A (en) * 2023-07-31 2023-11-17 台州道致科技股份有限公司 Unmanned laboratory intelligent management method and system
CN117290762A (en) * 2023-10-11 2023-12-26 北京邮电大学 Insect pest falling-in identification method, type identification method, device, insect trap and system
CN117330315A (en) * 2023-12-01 2024-01-02 智能制造龙城实验室 Rotary machine fault monitoring method based on online transfer learning
CN117351293A (en) * 2023-12-04 2024-01-05 天津医科大学口腔医院 Combined learning periodontal disease image classification method and device
CN117392552A (en) * 2023-12-13 2024-01-12 江西农业大学 Blade disease identification method and system based on dual-path convolutional neural network
CN117671395A (en) * 2024-02-02 2024-03-08 南昌康德莱医疗科技有限公司 Cancer cell type recognition device
CN117970224A (en) * 2024-03-29 2024-05-03 国网福建省电力有限公司 CVT error state online evaluation method, system, equipment and medium

Cited By (95)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113011262A (en) * 2021-02-18 2021-06-22 广州大学华软软件学院 Multi-size cell nucleus recognition device and method based on convolutional neural network
CN113011262B (en) * 2021-02-18 2023-10-13 广州大学华软软件学院 Multi-size cell nucleus identification device and method based on convolutional neural network
CN112949500A (en) * 2021-03-04 2021-06-11 北京联合大学 Improved YOLOv3 lane line detection method based on spatial feature coding
CN113066053A (en) * 2021-03-11 2021-07-02 紫东信息科技(苏州)有限公司 Model migration-based duodenum self-training classification method and system
CN113066053B (en) * 2021-03-11 2023-10-10 紫东信息科技(苏州)有限公司 Model migration-based duodenum self-training classification method and system
CN113240623A (en) * 2021-03-18 2021-08-10 中国公路工程咨询集团有限公司 Pavement disease detection method and device
CN113240623B (en) * 2021-03-18 2023-11-07 中国公路工程咨询集团有限公司 Pavement disease detection method and device
CN113177574A (en) * 2021-03-19 2021-07-27 华中科技大学 Visual model for material characterization image analysis and analysis method thereof
CN113132397A (en) * 2021-04-23 2021-07-16 信阳农林学院 Network encryption traffic identification method, device and equipment based on deep learning
CN113269051A (en) * 2021-04-30 2021-08-17 广州图匠数据科技有限公司 Commodity identification method, intelligent terminal and storage device
CN115396831A (en) * 2021-05-08 2022-11-25 中国移动通信集团浙江有限公司 Interaction model generation method, device, equipment and storage medium
CN113253248A (en) * 2021-05-11 2021-08-13 西北工业大学 Small sample vertical array target distance estimation method based on transfer learning
CN113253248B (en) * 2021-05-11 2023-06-30 西北工业大学 Small sample vertical array target distance estimation method based on transfer learning
CN113221749A (en) * 2021-05-13 2021-08-06 扬州大学 Crop disease remote sensing monitoring method based on image processing and deep learning
CN113313017A (en) * 2021-05-27 2021-08-27 中科院合肥技术创新工程院 Non-instrument physical training method and system
CN113378723B (en) * 2021-06-13 2023-08-01 国网福建省电力有限公司 Automatic safety recognition system for hidden danger of power transmission and transformation line based on depth residual error network
CN113378723A (en) * 2021-06-13 2021-09-10 国网福建省电力有限公司 Automatic safety identification system for hidden danger of power transmission and transformation line based on depth residual error network
CN113159238A (en) * 2021-06-23 2021-07-23 安翰科技(武汉)股份有限公司 Endoscope image recognition method, electronic device, and storage medium
CN113378967B (en) * 2021-06-28 2022-11-08 哈尔滨工业大学 Structural health monitoring multivariate data anomaly diagnosis method based on convolutional neural network and transfer learning
CN113378967A (en) * 2021-06-28 2021-09-10 哈尔滨工业大学 Structural health monitoring multivariate data anomaly diagnosis method based on convolutional neural network and transfer learning
CN113344911A (en) * 2021-07-06 2021-09-03 北京大都正隆医疗科技有限公司 Method and device for measuring size of calculus
CN113486959B (en) * 2021-07-07 2023-06-16 漳州卫生职业学院 Lung CT image classification method based on feature migration
CN113486959A (en) * 2021-07-07 2021-10-08 漳州卫生职业学院 Lung CT image classification method based on feature migration
CN113554151B (en) * 2021-07-07 2024-03-22 浙江工业大学 Attention mechanism method based on convolution interlayer relation
CN113554151A (en) * 2021-07-07 2021-10-26 浙江工业大学 Attention mechanism method based on convolution interlayer relation
CN113627074B (en) * 2021-07-13 2024-04-19 西安理工大学 Ground wave propagation delay prediction method based on transfer learning
CN113627074A (en) * 2021-07-13 2021-11-09 西安理工大学 Ground wave propagation delay prediction method based on transfer learning
CN113505856B (en) * 2021-08-05 2024-04-09 大连海事大学 Non-supervision self-adaptive classification method for hyperspectral images
CN113505856A (en) * 2021-08-05 2021-10-15 大连海事大学 Hyperspectral image unsupervised self-adaptive classification method
CN113627518B (en) * 2021-08-07 2023-08-08 福州大学 Method for realizing a neural network electroencephalogram emotion recognition model by utilizing transfer learning
CN113627518A (en) * 2021-08-07 2021-11-09 福州大学 Method for realizing multichannel convolution-recurrent neural network electroencephalogram emotion recognition model by utilizing transfer learning
CN114067311A (en) * 2021-08-11 2022-02-18 中国农业科学院茶叶研究所 Tea tree pest and disease prediction method based on intelligent recognition crowd-sourced data
CN113610820A (en) * 2021-08-12 2021-11-05 上海数依数据科技有限公司 Station target detection system based on deep learning algorithm
CN113780371A (en) * 2021-08-24 2021-12-10 上海电力大学 Insulator state edge recognition method based on edge calculation and deep learning
CN113642665A (en) * 2021-08-24 2021-11-12 广州市香港科大霍英东研究院 Relation network-based few-sample classification method and system
CN113707323A (en) * 2021-08-31 2021-11-26 平安科技(深圳)有限公司 Disease prediction method, device, equipment and medium based on machine learning
CN113707323B (en) * 2021-08-31 2024-05-14 平安科技(深圳)有限公司 Disease prediction method, device, equipment and medium based on machine learning
CN113706524B (en) * 2021-09-17 2023-11-14 上海交通大学 Convolutional neural network image-flipping detection system based on continuous learning method improvement
CN113706524A (en) * 2021-09-17 2021-11-26 上海交通大学 Convolutional neural network reproduction image detection system improved based on continuous learning method
CN113792514A (en) * 2021-09-18 2021-12-14 上海交通大学 Chemical mechanical polishing chip surface height prediction model modeling method based on transfer learning
CN113792514B (en) * 2021-09-18 2023-11-24 上海交通大学 Chemical mechanical polishing chip surface height prediction model modeling method based on transfer learning
CN113989833A (en) * 2021-09-30 2022-01-28 西安工业大学 Oral mucosal disease identification method based on EfficientNet network
CN113935377A (en) * 2021-10-13 2022-01-14 燕山大学 Pipeline leakage aperture identification method combining feature migration with time-frequency diagram
CN113935377B (en) * 2021-10-13 2024-05-07 燕山大学 Pipeline leakage aperture identification method combining characteristic migration with time-frequency diagram
CN113807324A (en) * 2021-11-02 2021-12-17 中国人民解放军32021部队 Sonar image recognition method and device, electronic equipment and storage medium
CN114067191B (en) * 2021-11-26 2024-04-05 成都泰盟软件有限公司 Image recognition-based disease medium biological recognition APP design method
CN114067191A (en) * 2021-11-26 2022-02-18 成都泰盟软件有限公司 Image recognition-based design method of medical media biological recognition APP
CN114140428A (en) * 2021-11-30 2022-03-04 东北林业大学 Method and system for detecting and identifying larch caterpillars based on YOLOv5
CN114297940B (en) * 2021-12-31 2024-05-07 合肥工业大学 Method and device for determining unsteady state reservoir parameters
CN114297940A (en) * 2021-12-31 2022-04-08 合肥工业大学 Method and device for determining unsteady reservoir parameters
CN114387627A (en) * 2022-01-11 2022-04-22 厦门大学 Small sample wireless device radio frequency fingerprint identification method and device based on depth measurement learning
CN114419341B (en) * 2022-01-20 2024-04-26 大连海事大学 Convolutional neural network image recognition method based on transfer learning improvement
CN114419341A (en) * 2022-01-20 2022-04-29 大连海事大学 Convolutional neural network image identification method based on transfer learning improvement
CN114549962A (en) * 2022-03-07 2022-05-27 重庆锐云科技有限公司 Garden plant leaf disease classification method
CN114863242B (en) * 2022-04-26 2022-11-29 北京拙河科技有限公司 Deep learning network optimization method and system for image recognition
CN114863242A (en) * 2022-04-26 2022-08-05 北京拙河科技有限公司 Deep learning network optimization method and system for image recognition
CN114764827B (en) * 2022-04-27 2024-05-07 安徽农业大学 Self-adaptive mulberry leaf disease and pest detection method in low-light scene
CN114764827A (en) * 2022-04-27 2022-07-19 安徽农业大学 Mulberry leaf disease and insect pest detection method under self-adaptive low-illumination scene
CN114972883B (en) * 2022-06-17 2024-05-10 平安科技(深圳)有限公司 Target detection sample generation method based on artificial intelligence and related equipment
CN114972883A (en) * 2022-06-17 2022-08-30 平安科技(深圳)有限公司 Target detection sample generation method based on artificial intelligence and related equipment
CN115631847B (en) * 2022-10-19 2023-07-14 哈尔滨工业大学 Early lung cancer diagnosis system based on multi-omics features, storage medium and equipment
CN115631847A (en) * 2022-10-19 2023-01-20 哈尔滨工业大学 Early lung cancer diagnosis system based on multi-omics features, storage medium and equipment
CN115563286B (en) * 2022-11-10 2023-12-01 东北农业大学 Knowledge-driven dairy cow disease text classification method
CN115563286A (en) * 2022-11-10 2023-01-03 东北农业大学 Knowledge-driven milk cow disease text classification method
CN116385953A (en) * 2023-01-11 2023-07-04 哈尔滨市科佳通用机电股份有限公司 Railway wagon door hinge breaking fault image identification method
CN116385953B (en) * 2023-01-11 2023-12-15 哈尔滨市科佳通用机电股份有限公司 Railway wagon door hinge breaking fault image identification method
CN116825283A (en) * 2023-04-27 2023-09-29 清华大学 Nuclear medicine treatment individuation dosage evaluation method and device based on transfer learning
CN116246176B (en) * 2023-05-12 2023-09-19 山东建筑大学 Crop disease detection method and device, electronic equipment and storage medium
CN116246176A (en) * 2023-05-12 2023-06-09 山东建筑大学 Crop disease detection method and device, electronic equipment and storage medium
CN116311230B (en) * 2023-05-17 2023-08-18 安徽大学 Corn leaf disease identification method and device oriented to real scene
CN116311230A (en) * 2023-05-17 2023-06-23 安徽大学 Corn leaf disease identification method and device oriented to real scene
CN116584472B (en) * 2023-07-13 2023-10-27 四川省农业机械科学研究院 Multistage control-based crisp plum pesticide spraying method and system
CN116584472A (en) * 2023-07-13 2023-08-15 四川省农业机械科学研究院 Multistage control-based crisp plum pesticide spraying method and system
CN117078604A (en) * 2023-07-31 2023-11-17 台州道致科技股份有限公司 Unmanned laboratory intelligent management method and system
CN117078604B (en) * 2023-07-31 2024-03-12 台州道致科技股份有限公司 Unmanned laboratory intelligent management method and system
CN116703897B (en) * 2023-08-02 2023-10-13 青岛兴牧畜牧科技发展有限公司 Pig weight estimation method based on image processing
CN116703897A (en) * 2023-08-02 2023-09-05 青岛兴牧畜牧科技发展有限公司 Pig weight estimation method based on image processing
CN116689310A (en) * 2023-08-08 2023-09-05 河南工学院 Automatic identification classification system for battery sorting and recycling
CN116689310B (en) * 2023-08-08 2023-10-20 河南工学院 Automatic identification classification system for battery sorting and recycling
CN116863340A (en) * 2023-08-16 2023-10-10 安徽荃银超大种业有限公司 Rice leaf disease identification method based on deep learning
CN116991932B (en) * 2023-09-25 2023-12-15 济南卓鲁信息科技有限公司 Data analysis and management system and method based on artificial intelligence
CN116991932A (en) * 2023-09-25 2023-11-03 济南卓鲁信息科技有限公司 Data analysis and management system and method based on artificial intelligence
CN117290762A (en) * 2023-10-11 2023-12-26 北京邮电大学 Insect pest falling-in identification method, type identification method, device, insect trap and system
CN117290762B (en) * 2023-10-11 2024-04-02 北京邮电大学 Insect pest falling-in identification method, type identification method, device, insect trap and system
CN117058492B (en) * 2023-10-13 2024-02-27 之江实验室 Two-stage training disease identification method and system based on learning decoupling
CN117058492A (en) * 2023-10-13 2023-11-14 之江实验室 Two-stage training disease identification method and system based on learning decoupling
CN117330315B (en) * 2023-12-01 2024-02-23 智能制造龙城实验室 Rotary machine fault monitoring method based on online transfer learning
CN117330315A (en) * 2023-12-01 2024-01-02 智能制造龙城实验室 Rotary machine fault monitoring method based on online transfer learning
CN117351293B (en) * 2023-12-04 2024-02-06 天津医科大学口腔医院 Combined learning periodontal disease image classification method and device
CN117351293A (en) * 2023-12-04 2024-01-05 天津医科大学口腔医院 Combined learning periodontal disease image classification method and device
CN117392552B (en) * 2023-12-13 2024-02-20 江西农业大学 Blade disease identification method and system based on dual-path convolutional neural network
CN117392552A (en) * 2023-12-13 2024-01-12 江西农业大学 Blade disease identification method and system based on dual-path convolutional neural network
CN117671395A (en) * 2024-02-02 2024-03-08 南昌康德莱医疗科技有限公司 Cancer cell type recognition device
CN117671395B (en) * 2024-02-02 2024-04-26 南昌康德莱医疗科技有限公司 Cancer cell type recognition device
CN117970224A (en) * 2024-03-29 2024-05-03 国网福建省电力有限公司 CVT error state online evaluation method, system, equipment and medium

Similar Documents

Publication Publication Date Title
AU2020103613A4 (en) Cnn and transfer learning based disease intelligent identification method and system
CN110148120B (en) Intelligent disease identification method and system based on CNN and transfer learning
CN109584248B (en) Infrared target instance segmentation method based on feature fusion and dense connection network
Chouhan et al. Applications of computer vision in plant pathology: a survey
US10977494B2 (en) Recognition of weed in a natural environment
WO2020192736A1 (en) Object recognition method and device
Lu et al. Canopy-attention-YOLOv4-based immature/mature apple fruit detection on dense-foliage tree architectures for early crop load estimation
Tian et al. Environmentally adaptive segmentation algorithm for outdoor image segmentation
WO2021155792A1 (en) Processing apparatus, method and storage medium
US7136524B1 (en) Robust perceptual color identification
US20220148291A1 (en) Image classification method and apparatus, and image classification model training method and apparatus
Wang et al. YOLOv3-Litchi detection method of densely distributed litchi in large vision scenes
Cai et al. Residual-capsule networks with threshold convolution for segmentation of wheat plantation rows in UAV images
CN111553240A (en) Corn disease condition grading method and system and computer equipment
CN108596195B (en) Scene recognition method based on sparse coding feature extraction
WO2020232942A1 (en) Method for constructing farmland image-based convolutional neural network model, and system thereof
CN112464983A (en) Small sample learning method for apple tree leaf disease image classification
CN112215795A (en) Intelligent server component detection method based on deep learning
CN111199245A (en) Rape pest identification method
Chen et al. CitrusYOLO: an algorithm for citrus detection under orchard environment based on YOLOV4
Selvakumar et al. Automated mango leaf infection classification using weighted and deep features with optimized recurrent neural network concept
Kumar et al. Drone-based apple detection: Finding the depth of apples using YOLOv7 architecture with multi-head attention mechanism
Nga et al. Combining binary particle swarm optimization with support vector machine for enhancing rice varieties classification accuracy
Patil et al. Sensitive crop leaf disease prediction based on computer vision techniques with handcrafted features
CN115330759B (en) Method and device for calculating distance loss based on Hausdorff distance

Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)