CN114463651A - Crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network


Info

Publication number: CN114463651A
Application number: CN202210012912.5A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 张晨晓
Assignee: Wuhan University (WHU)
Priority and filing date: 2022-01-07
Publication date: 2022-05-10
Legal status: Pending

Classifications

    • G06N3/045 — Computing arrangements based on biological models; neural networks; architectures, e.g. interconnection topology; combinations of networks
    • G06N3/08 — Computing arrangements based on biological models; neural networks; learning methods


Abstract

The invention discloses a crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network. The method uses depthwise separable convolution modules to extract high-dimensional image features efficiently, combines them with a spatial pyramid pooling layer to preserve local and global features, and feeds the result into a fully connected classifier for classification training. Compared with existing methods, the proposed network has fewer parameters, trains faster, and achieves classification accuracy comparable to complex neural networks on small sample sets. It places lower demands on hardware in practical application scenarios and is well suited to deployment on low-compute mobile platforms.

Description

Crop pest and disease identification method based on ultra-lightweight efficient convolutional neural network
Technical Field
The invention belongs to the technical field of image recognition, and particularly relates to a crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network.
Background
Crop pest and disease infection is a major factor affecting healthy plant growth and poses a serious threat to food security. Real-time, rapid, and accurate monitoring of crop pests and diseases over the whole growth cycle is important for protecting crops, securing yields, and maintaining food safety. However, because pests and diseases are highly varied, methods relying on manual visual inspection and experience-based interpretation are inefficient and prone to misdiagnosis, so the accuracy and timeliness of crop pest and disease monitoring are poor. Automatic detection of crop pests and diseases based on computer vision therefore offers an efficient, low-cost approach to real-time crop growth monitoring and management. In recent years, the explosive development of deep learning across many fields has demonstrated its excellent performance in image interpretation and understanding. Most existing deep-learning-based plant disease classification methods adopt convolutional neural networks originally developed for general image classification tasks. Although such architectures can be applied directly to crop pest and disease image recognition, they do not account for the particular characteristics of crop images: they typically carry a huge number of training parameters, place high demands on hardware computing capability, and require long training and inference times, which severely restricts their rapid and flexible deployment on compute-limited platforms such as mobile phones.
Disclosure of Invention
The technical problem to be solved by the invention is, in view of the shortcomings of existing methods, to provide an ultra-lightweight, high-efficiency convolutional neural network for image-based crop pest and disease identification, mainly addressing the huge parameter counts, long training and inference times, and high platform computing-power requirements of existing models.
The proposed network consists of two parts: a deep feature extraction module that employs residual depthwise convolution, and a classification module that receives multi-scale features enhanced by a spatial pyramid pooling layer. The network has a very compact design with only about 100,000 parameters, which directly serves the real-world need for lightweight models. Publicly available plant data sets were used in the experiments. Compared with state-of-the-art architectures, the proposed network shows clear advantages, achieving the lowest computational complexity with competitive classification performance.
In order to solve the above problems, the invention provides a crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network, which mainly comprises the following steps:
step 1, collecting image data of healthy, diseased, and pest-infested plants of different crop types;
step 2, preprocessing the collected crop image data set and dividing it into a training set, a validation set, and a test set in fixed proportions;
step 3, inputting the training set into the ultra-lightweight high-efficiency convolutional neural network for training;
the network consists of 5 basic modules: the first module comprises, in sequence, a convolutional layer, a batch normalization layer, and an activation function layer; the second module comprises a plurality of pooling layers; the third module comprises two residual depthwise separable convolution modules, one with stride 1 and one with stride 2; the fourth module comprises a convolutional layer, a batch normalization layer, and an activation function layer; the features output by the fourth module pass through spatial pyramid pooling into the fifth module, which is a single fully connected layer;
step 4, repeatedly feeding the validation set into the network during training to check results and evaluate performance;
step 5, repeating steps 3 and 4 until training finishes, keeping only the model that performs best on the validation set;
step 6, taking the finally trained network model and feeding the test set into this crop pest and disease identification model to obtain the final detection results.
Further, step 2 specifically comprises: removing blurred images, defocused images, and images in which the photographed subject is missing from the collected images, and then sequentially selecting 70%, 20%, and 10% of the data set as the training set, validation set, and test set, respectively.
Further, the ultra-lightweight high-efficiency convolutional neural network used in step 3 consists of 5 basic modules. The first module is composed, in sequence, of 1 convolutional layer with a 3 × 3 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer; the second module is composed of two max-pooling layers with 3 × 3 kernels and stride 2; the third module is composed of two residual depthwise separable convolution modules, one with stride 1 and one with stride 2; the fourth module is composed, in sequence, of 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer; the features output by the fourth module undergo a spatial pyramid pooling operation and are finally fed into the fifth module, which consists of a single fully connected layer with input dimension 2016 and output dimension 38.
Further, the residual depthwise separable convolution module with stride 1 copies the input features into two identical copies and passes one copy through a sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 1, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the features processed by the sub-module are concatenated with the original features and then passed through a channel recombination module to obtain the final output features.
Further, the residual depthwise separable convolution module with stride 2 copies the input features into two identical copies. One copy passes through a first sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the other copy passes through a second sub-module composed of 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, 1 ReLU activation layer, 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer. Finally, the features processed by the two sub-modules are concatenated and passed through a channel recombination module to obtain the final output features.
Furthermore, the depthwise separable convolution module borrows the idea of residual networks: the input features are copied into two identical groups, one of which undergoes feature transformation through a depthwise convolution layer and is then concatenated with the original features. Because depthwise separable convolution performs no information transformation across the channel dimension, a channel recombination module is placed after these features to strengthen the interaction of feature information across channels.
Furthermore, the channel recombination module divides the multi-channel features into N groups, transposes the feature groups along the grouping dimension, and then regroups and concatenates the transposed feature groups to form the final new feature map.
Further, the loss function used in the network training in step 3 is a cross-entropy loss function.
Further, the performance evaluation in step 4 includes accuracy, recall, and the F1 score.
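
For concreteness, a minimal sketch of this step-4 evaluation follows, assuming scikit-learn is available; macro averaging over the 38 classes is an assumption, since the averaging scheme is not stated in the text.

```python
from sklearn.metrics import accuracy_score, f1_score, recall_score

def evaluate(y_true, y_pred):
    """Compute the step-4 metrics: accuracy, recall, and F1 score."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        # Macro averaging across classes is an assumption, not stated in the text.
        "recall": recall_score(y_true, y_pred, average="macro"),
        "f1": f1_score(y_true, y_pred, average="macro"),
    }
```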
Compared with the prior art, the invention provides an ultra-lightweight, high-efficiency neural network classifier for identifying agricultural pest and disease images. It introduces depthwise separable convolution into the feature extractor, greatly reducing the number of training parameters required by a conventional feature extractor; to increase the information interaction of features across channel dimensions, it introduces a channel recombination module that strengthens multi-channel feature interaction; and the classifier applies a spatial pyramid pooling layer to preserve the local and global multi-scale information of the feature map. Compared with existing convolutional neural network classification frameworks, the proposed method has fewer network parameters, trains faster, and achieves classification accuracy comparable to complex neural networks on small sample sets. It therefore places lower demands on hardware in practical application scenarios and is well suited to deployment on low-compute mobile platforms such as mobile phones.
Drawings
FIG. 1 is a flow chart of an embodiment of the present invention.
Fig. 2 is a structural diagram of an ultra-lightweight high-efficiency convolutional neural network constructed in an embodiment of the present invention.
Fig. 3 is a block diagram of the residual depthwise separable convolution modules.
Detailed Description
The present invention will be further described with reference to the accompanying drawings.
As shown in FIG. 1, the invention provides a crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network, comprising the following steps:
step 1, collecting image data of healthy, diseased, and pest-infested plants of different crop types;
step 2, preprocessing the collected crop image data set and dividing it into a training set, a validation set, and a test set in fixed proportions;
step 3, inputting the training set into the ultra-lightweight high-efficiency convolutional neural network for training, with the cross-entropy loss function as the training loss (a minimal training-loop sketch is given after this list);
step 4, repeatedly feeding the validation set into the network during training to check results and evaluate performance;
step 5, repeating steps 3 and 4 until training finishes, keeping only the model that performs best on the validation set;
step 6, taking the finally trained network model and feeding the test set into this crop pest and disease identification model to obtain the final detection results.
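
A minimal PyTorch training-loop sketch for steps 3–5 is given below. The optimizer, learning rate, batch size, and epoch count are illustrative assumptions (the text specifies only the cross-entropy loss), and `model` stands for a network built as in the architecture sketches later in this description.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader

def train(model, train_set, val_set, epochs=100, lr=1e-3, device="cpu"):
    """Steps 3-5: train with cross-entropy, validate each epoch, and keep
    only the weights that perform best on the validation set."""
    model.to(device)
    opt = torch.optim.Adam(model.parameters(), lr=lr)  # optimizer is an assumption
    loss_fn = nn.CrossEntropyLoss()                    # the loss named in step 3
    train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
    val_loader = DataLoader(val_set, batch_size=32)
    best_acc = 0.0
    for epoch in range(epochs):
        model.train()
        for images, labels in train_loader:
            opt.zero_grad()
            loss = loss_fn(model(images.to(device)), labels.to(device))
            loss.backward()
            opt.step()
        # Step 4: check results on the validation set.
        model.eval()
        correct = total = 0
        with torch.no_grad():
            for images, labels in val_loader:
                preds = model(images.to(device)).argmax(dim=1).cpu()
                correct += (preds == labels).sum().item()
                total += labels.numel()
        acc = correct / total
        # Step 5: keep only the best-performing model.
        if acc > best_acc:
            best_acc = acc
            torch.save(model.state_dict(), "best_model.pt")
    return best_acc
```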
Further, step 2 specifically comprises: removing blurred images, defocused images, and images in which the photographed subject is missing from the collected images, and then sequentially selecting 70%, 20%, and 10% of the data set as the training set, validation set, and test set, respectively.
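
As a concrete illustration of this preprocessing step, the sketch below performs the sequential 70%/20%/10% split; the class-per-folder layout and the .jpg extension are assumptions, and the removal of blurred or defocused images is taken to have been done beforehand.

```python
from pathlib import Path

def split_dataset(image_dir: str):
    """Sequentially split a cleaned image set 70/20/10 into
    training, validation, and test lists, as described in step 2."""
    paths = sorted(Path(image_dir).glob("*/*.jpg"))  # assumes one folder per class
    n = len(paths)
    n_train, n_val = int(0.7 * n), int(0.2 * n)
    return (paths[:n_train],                  # 70% training set
            paths[n_train:n_train + n_val],   # 20% validation set
            paths[n_train + n_val:])          # 10% test set
```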
Further, as shown in FIG. 2, the ultra-lightweight high-efficiency convolutional neural network used in step 3 consists of 5 basic modules. The first module is composed, in sequence, of 1 convolutional layer with a 3 × 3 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer; the second module is composed of two max-pooling layers with 3 × 3 kernels and stride 2; the third module is composed of two residual depthwise separable convolution modules, one with stride 1 and one with stride 2; the fourth module is composed, in sequence, of 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer; the features output by the fourth module undergo a spatial pyramid pooling operation and are finally fed into the fifth module, which consists of a single fully connected layer with input dimension 2016 and output dimension 38.
Further, as shown in FIG. 3a, the residual depthwise separable convolution module with stride 1 copies the input features into two identical copies and passes one copy through a sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 1, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the features processed by the sub-module are concatenated with the original features and then passed through a channel recombination module to obtain the final output features.
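
A PyTorch sketch of this stride-1 block follows; the class and helper names are hypothetical. The `channel_shuffle` helper implements the channel recombination module described below (two groups assumed). Note that a literal reading of the text, with the processed copy concatenated to the untouched copy, doubles the channel count.

```python
import torch
import torch.nn as nn

def channel_shuffle(x: torch.Tensor, groups: int = 2) -> torch.Tensor:
    # Channel recombination: split channels into groups, transpose the group
    # dimension against the per-group dimension, then flatten back.
    n, c, h, w = x.shape
    x = x.view(n, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(n, c, h, w)

class ResidualDWBlockS1(nn.Module):
    """Stride-1 block, read literally from the text: one copy goes through
    DW3x3(s1)-BN -> PW1x1(s1)-BN-ReLU, is concatenated with the untouched
    copy, then channel-shuffled."""
    def __init__(self, in_ch: int):
        super().__init__()
        self.branch = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, 1, 1, groups=in_ch, bias=False),  # 3x3 depthwise, stride 1
            nn.BatchNorm2d(in_ch),
            nn.Conv2d(in_ch, in_ch, 1, 1, 0, bias=False),                # 1x1 pointwise, stride 1
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([x, self.branch(x)], dim=1)  # concatenation doubles channels
        return channel_shuffle(out, groups=2)
```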
Further, as shown in FIG. 3b, the residual depthwise separable convolution module with stride 2 first copies the input features into two identical copies. One copy passes through a first sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the other copy passes through a second sub-module composed of 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, 1 ReLU activation layer, 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer. Finally, the features processed by the two sub-modules are concatenated and passed through a channel recombination module to obtain the final output features.
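
The stride-2 block can be sketched the same way, reusing `channel_shuffle` and the imports from the stride-1 sketch above; both branches halve the spatial resolution, and the channel count again doubles on concatenation.

```python
class ResidualDWBlockS2(nn.Module):
    """Stride-2 block, read literally from the text: branch 1 is
    DW3x3(s2)-BN -> PW1x1-BN-ReLU; branch 2 is PW1x1-BN-ReLU ->
    DW3x3(s2)-BN -> PW1x1-BN-ReLU; outputs are concatenated and shuffled."""
    def __init__(self, in_ch: int):
        super().__init__()
        self.b1 = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 3, 2, 1, groups=in_ch, bias=False),  # 3x3 depthwise, stride 2
            nn.BatchNorm2d(in_ch),
            nn.Conv2d(in_ch, in_ch, 1, 1, 0, bias=False),                # 1x1 pointwise, stride 1
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
        )
        self.b2 = nn.Sequential(
            nn.Conv2d(in_ch, in_ch, 1, 1, 0, bias=False),                # 1x1 pointwise, stride 1
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
            nn.Conv2d(in_ch, in_ch, 3, 2, 1, groups=in_ch, bias=False),  # 3x3 depthwise, stride 2
            nn.BatchNorm2d(in_ch),
            nn.Conv2d(in_ch, in_ch, 1, 1, 0, bias=False),                # 1x1 pointwise, stride 1
            nn.BatchNorm2d(in_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([self.b1(x), self.b2(x)], dim=1)  # both branches halve H and W
        return channel_shuffle(out, groups=2)
```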
Furthermore, the depthwise separable convolution module borrows the idea of residual networks: the input features are copied into two identical groups, one of which undergoes feature transformation through a depthwise convolution layer and is then concatenated with the original features. Because depthwise separable convolution performs no information transformation across the channel dimension, a channel recombination module is placed after these features to strengthen the interaction of feature information across channels.
Furthermore, the channel recombination module divides the multi-channel features into N groups, transposes the feature groups along the grouping dimension, and then regroups and concatenates the transposed feature groups to form the final new feature map.
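
Putting the pieces together, the sketch below assembles the five modules with a spatial pyramid pooling layer, reusing the two block sketches above. The stem width of 24 and the SPP grid sizes (1, 2, 4) are assumptions chosen so that the classifier input matches the stated 2016 dimensions (96 channels × (1 + 4 + 16) bins = 2016); under these assumptions the parameter count also lands near the stated figure of about 100,000.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SPP(nn.Module):
    """Spatial pyramid pooling: pool the feature map to several fixed grid
    sizes and concatenate the flattened results into one vector."""
    def __init__(self, grids=(1, 2, 4)):
        super().__init__()
        self.grids = grids

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        n = x.size(0)
        return torch.cat(
            [F.adaptive_max_pool2d(x, g).reshape(n, -1) for g in self.grids], dim=1)

class UltraLightNet(nn.Module):
    """Five-module sketch; ResidualDWBlockS1/S2 come from the sketches above."""
    def __init__(self, num_classes: int = 38, width: int = 24):
        super().__init__()
        self.m1 = nn.Sequential(                      # module 1: 3x3 conv, s1 + BN + ReLU
            nn.Conv2d(3, width, 3, 1, 1, bias=False),
            nn.BatchNorm2d(width), nn.ReLU(inplace=True))
        self.m2 = nn.Sequential(                      # module 2: two 3x3 max pools, s2
            nn.MaxPool2d(3, 2, 1), nn.MaxPool2d(3, 2, 1))
        self.m3 = nn.Sequential(                      # module 3: residual DW blocks
            ResidualDWBlockS1(width),                 # width   -> 2*width
            ResidualDWBlockS2(2 * width))             # 2*width -> 4*width (= 96)
        self.m4 = nn.Sequential(                      # module 4: 1x1 conv, s1 + BN + ReLU
            nn.Conv2d(4 * width, 4 * width, 1, 1, 0, bias=False),
            nn.BatchNorm2d(4 * width), nn.ReLU(inplace=True))
        self.spp = SPP((1, 2, 4))                     # 96 * 21 = 2016 features
        self.m5 = nn.Linear(2016, num_classes)        # module 5: single FC layer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.m5(self.spp(self.m4(self.m3(self.m2(self.m1(x))))))

# Quick shape check: a 224x224 RGB image maps to 38 class logits.
# print(UltraLightNet()(torch.randn(1, 3, 224, 224)).shape)  # -> (1, 38)
```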
The proposed method and existing methods were each tested for classification accuracy and classification efficiency on the public PLANT-VILLAGE data set; the results are as follows:
TABLE 1 Classification accuracy comparison
[Table 1 is reproduced as an image in the original publication; its headline figures are quoted in the text below.]
TABLE 2 Classification efficiency comparison
[Table 2 is reproduced as an image in the original publication; its headline figures are quoted in the text below.]
VGG16, MobileNet_v2, Xception, and ShuffleNet_v2 in Table 1 are existing methods. Compared with them, the method of the invention achieves 98.13% accuracy and a 97.49% recall rate, and its F1 score of 0.9776 is very close to that of Xception, the best-performing method in overall classification accuracy. As Table 2 shows, the method of the invention has far fewer model parameters, lower memory occupation, and fewer floating-point operations than the other methods; in running speed it is 2.6 times faster than ShuffleNet_v2 and 380 times faster than Xception. In summary, compared with existing neural network classification frameworks, the method has fewer network parameters and trains faster, achieves classification accuracy close to the best network frameworks on a small sample set, improves operational efficiency by a factor of hundreds, and occupies only 6 Mb of memory. The method is therefore well suited to running on low-compute platforms.
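
To sanity-check claims of this kind on one's own hardware, the parameter count and a rough CPU latency can be measured directly. The snippet below uses the `UltraLightNet` sketch above; it will not reproduce the publication's exact figures, since those depend on the true architecture and test hardware.

```python
import time
import torch

model = UltraLightNet().eval()
n_params = sum(p.numel() for p in model.parameters())
print(f"parameters: {n_params:,}")  # roughly 1e5 under the sketch's assumptions

x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    model(x)                          # warm-up run
    t0 = time.perf_counter()
    for _ in range(100):
        model(x)
print(f"mean CPU latency: {(time.perf_counter() - t0) / 100 * 1e3:.1f} ms")
```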
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (8)

1. A crop pest and disease identification method based on an ultra-lightweight efficient convolutional neural network is characterized by comprising the following steps:
step 1, collecting image data of healthy, diseased, and pest-infested plants of different crop types;
step 2, preprocessing the collected crop image data set and dividing it into a training set, a validation set, and a test set in fixed proportions;
step 3, inputting the training set into the ultra-lightweight high-efficiency convolutional neural network for training;
wherein the network consists of 5 basic modules: the first module comprises, in sequence, a convolutional layer, a batch normalization layer, and an activation function layer; the second module comprises a plurality of pooling layers; the third module comprises two residual depthwise separable convolution modules, one with stride 1 and one with stride 2; the fourth module comprises a convolutional layer, a batch normalization layer, and an activation function layer; the features output by the fourth module pass through spatial pyramid pooling into the fifth module, which is a single fully connected layer;
step 4, repeatedly feeding the validation set into the network during training to check results and evaluate performance;
step 5, repeating steps 3 and 4 until training finishes, keeping only the model that performs best on the validation set;
step 6, taking the finally trained network model and feeding the test set into the crop pest and disease identification model to obtain the final detection results.
2. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that: in step 2, blurred images, defocused images, and images in which the photographed subject is missing are removed from the collected images, and 70%, 20%, and 10% of the data set are then selected sequentially as the training set, validation set, and test set, respectively.
3. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that: in step 3, the first module is formed by stacking, in sequence, 1 convolutional layer with a 3 × 3 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer;
the second module is formed by stacking two max-pooling layers with 3 × 3 kernels and stride 2;
the fourth module is formed by stacking, in sequence, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and a ReLU activation function layer;
the fifth module consists of a single fully connected layer with input dimension 2016 and output dimension 38.
4. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that the residual depthwise separable convolution module with stride 1 processes features as follows:
the input features are first copied into two identical copies; one copy passes through a sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 1, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the features processed by the sub-module are concatenated with the original features and then passed through a channel recombination module to obtain the final output features.
5. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that the residual depthwise separable convolution module with stride 2 processes features as follows:
the input features are first copied into two identical copies; one copy passes through a first sub-module composed of 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; the other copy passes through a second sub-module composed of 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, 1 ReLU activation layer, 1 depthwise separable convolution module with a 3 × 3 kernel and stride 2, 1 batch normalization layer, 1 convolutional layer with a 1 × 1 kernel and stride 1, 1 batch normalization layer, and 1 ReLU activation layer; finally, the features processed by the two sub-modules are concatenated and passed through a channel recombination module to obtain the final output features.
6. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 4 or 5, characterized in that: the channel recombination module divides the multi-channel features into N groups, transposes the feature groups along the grouping dimension, and then regroups and concatenates the transposed feature groups to form the final new feature map.
7. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that: the loss function used for network training in step 3 is a cross-entropy loss function.
8. The crop pest and disease identification method based on the ultra-lightweight efficient convolutional neural network according to claim 1, characterized in that: the performance evaluation in step 4 includes accuracy, recall, and the F1 score.
CN202210012912.5A — Crop pest and disease identification method based on ultra-lightweight efficient convolutional neural network — filed 2022-01-07, priority date 2022-01-07, status Pending

Publications (1)

CN114463651A (application CN202210012912.5A), published 2022-05-10. Family ID: 81409281.

Cited By (4)

  • CN115116054A — Insect pest identification method based on multi-scale lightweight network (Jiangsu University of Science and Technology; priority 2022-07-13, published 2022-09-27, granted as CN115116054B on 2024-05-24)
  • CN117122308A — Electrocardiogram measurement method and system based on mobile phone built-in acceleration sensor (Soochow University; priority 2023-07-24, published 2023-11-28, granted as CN117122308B on 2024-04-12)


Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination