CN115019175B - Pest identification method based on transfer meta-learning - Google Patents

Pest identification method based on transfer meta-learning

Info

Publication number
CN115019175B
CN115019175B (application CN202210738451.XA / CN202210738451A)
Authority
CN
China
Prior art keywords
learning
pest
sample
feature extraction
category
Prior art date
Legal status
Active
Application number
CN202210738451.XA
Other languages
Chinese (zh)
Other versions
CN115019175A (en)
Inventor
梁炜健
王春桃
郭庆文
Current Assignee
South China Agricultural University
Original Assignee
South China Agricultural University
Priority date
Filing date
Publication date
Application filed by South China Agricultural University
Priority to CN202210738451.XA
Publication of CN115019175A
Application granted
Publication of CN115019175B
Legal status: Active

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/188 - Vegetation
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 - Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 - Proximity, similarity or dissimilarity measures
    • G06V10/764 - Using classification, e.g. of video objects
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/774 - Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a pest identification method based on transfer meta-learning, which comprises the following steps: S1: constructing a pest data set and randomly sampling it to form a support set and a query set, and obtaining an improved feature extraction network; S2: updating the parameters of the improved feature extraction network; S3: obtaining the feature vectors of the samples in the support set and the query set; S4: calculating a prototype for each category in the support set; S5: calculating the class probability of each sample in the query set as the pest identification result. The method solves the problem that existing pest identification methods have low accuracy in distinguishing new pest categories from old ones.

Description

Pest identification method based on transfer meta-learning
Technical Field
The invention relates to the technical field of image recognition, and in particular to a pest identification method based on transfer meta-learning.
Background
With the development of deep learning technology, deep models are increasingly used in the intelligent control of agricultural diseases and pests to identify and classify pest species. Most existing pest identification methods based on small-sample (few-shot) learning first train a deep model on common (base) classes with sufficient samples and then transfer it to new classes with only a few examples. The network structure of such models is simple and often insufficient to express the information contained in pest images, so accuracy is low when distinguishing new pest categories from old ones.
In the prior art, for example, publication CN112464971A (published 2021-03-09) tests a convolutional attention model with a test set to construct a pest detection model with high detection precision and strong robustness; however, when new and old pest categories are mixed, it cannot accurately identify whether a pest category is known or unknown.
Disclosure of Invention
The invention provides a pest identification method based on transfer meta-learning to overcome the technical defect that existing pest identification methods have low accuracy in distinguishing new pest categories from old ones.
To solve the above technical problems, the technical solution of the invention is as follows:
A pest identification method based on transfer meta-learning comprises the following steps:
S1: acquiring pest image samples to construct a pest data set, and randomly sampling the pest data set to form a support set and a query set;
combining an existing feature extraction network with a spatial attention module to obtain an improved feature extraction network;
S2: updating the parameters of the improved feature extraction network by a transfer-based meta-learning method;
S3: inputting the support set and the query set into the updated feature extraction network to obtain the feature vector of each sample in the support set and the query set;
S4: computing a prototype for each category in the support set from the feature vectors of the support-set samples;
S5: computing the class probability of each sample in the query set from the prototypes of the support-set categories and the feature vectors of the query-set samples, to obtain the pest identification result.
In this scheme, introducing the spatial attention module improves the feature extraction capability of the existing feature extraction network and thus the pest image recognition performance; meanwhile, updating the parameters of the improved feature extraction network with the transfer-based meta-learning method preserves its feature extraction capability while giving it the ability to distinguish new-category features after transfer, improving the accuracy of distinguishing new pest categories from old ones.
Preferably, in step S1, the pest data set includes M categories of pest images; N categories (N < M) are randomly extracted from the pest data set, K pest images are randomly extracted from each extracted category to form the support set, and q non-repeated pest images are randomly extracted from the extracted categories to form the query set.
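As an illustration of this episodic sampling, the following Python sketch draws one N-way, K-shot episode. The dataset layout (a dict from category id to image list), the per-category query draw, and the function name sample_episode are assumptions for illustration, not part of the patented method.

import random

def sample_episode(dataset_by_class, n_way, k_shot, q_per_class):
    """Draw one episode: dataset_by_class maps category id -> list of images."""
    categories = random.sample(sorted(dataset_by_class), n_way)  # N of the M categories
    support, query = [], []
    for label, c in enumerate(categories):
        # sampling without replacement keeps the query images non-repeated
        images = random.sample(dataset_by_class[c], k_shot + q_per_class)
        support += [(img, label) for img in images[:k_shot]]  # K per category
        query += [(img, label) for img in images[k_shot:]]
    return support, query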
Preferably, the improved feature extraction network is obtained by combining ResNeSt101 with spatial attention modules, specifically: inserting a spatial attention module at the SAM position before the 3×3 global max pooling of ResNeSt101 and/or at the last SAM position of each neck of each layer of ResNeSt101.
Preferably, the spatial attention module is a NAM spatial attention module or a CBAM spatial attention module.
Preferably, the spatial attention module inserted at the SAM position before the 3×3 global max pooling of ResNeSt101 is of the same or a different type as the spatial attention module inserted at the last SAM position of each neck of each layer of ResNeSt101; and the spatial attention modules inserted at the last SAM positions of the necks of each layer of ResNeSt101 are of the same type.
In the above scheme, the improved feature extraction network SA-ResNeSt101 retains a network depth consistent with other residual networks and can fully express the information contained in pest images. Because the network uses several groups of 3×3 convolution, channel attention, and spatial attention modules, it attends not only to the information of each channel of the image but also to each pixel within each channel, so it can focus on the salient regions of pest images and improve the network's expression capability without greatly increasing the computation, thereby improving the expression capability of the feature extraction network.
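To make the spatial attention mechanism concrete, below is a minimal PyTorch sketch of a CBAM-style spatial attention module (one of the two SAM types named above). The class name and kernel size are illustrative assumptions, not the patent's exact implementation.

import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        # 2 input channels: the channel-wise average map and channel-wise max map
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):                      # x: (B, C, H, W)
        avg_map = x.mean(dim=1, keepdim=True)  # (B, 1, H, W)
        max_map = x.amax(dim=1, keepdim=True)  # (B, 1, H, W)
        weights = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * weights                     # re-weight every pixel of every channel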
Preferably, step S2 specifically includes:
A1: setting a learning batch and a learning rate;
A2: randomly extracting N₁ categories from the pest data set, randomly extracting K₁ pest images from each extracted category to form a learning support set, and randomly extracting q₁ non-repeated pest images from the extracted categories to form a learning query set;
A3: inputting the learning support set and the learning query set into the improved feature extraction network to obtain the feature vector of each sample in the learning support set and the learning query set;
A4: computing a prototype for each category in the learning support set from the feature vectors of the learning-support-set samples;
A5: computing the class probability of each sample in the learning query set from the prototypes of the learning-support-set categories and the feature vectors of the learning-query-set samples;
A6: calculating the loss value between the class probability of each sample and its corresponding class label;
A7: determining whether the loss value has converged;
if yes, the small-sample learning is complete;
if not, updating the parameters of the improved feature extraction network and returning to step A2.
In this scheme, unlike mainstream meta-learning and transfer-learning methods, the transfer-based meta-learning (small-sample learning) method not only preserves the feature extraction capability of SA-ResNeSt101 but also, after a meta-learning process over a small number of samples, gains the ability to distinguish new-category features after transfer, so it is more effective for pest identification than mainstream small-sample learning methods.
Preferably, a cross entropy loss function is used to calculate the loss value between the class probability of each sample and its corresponding class label.
Preferably, before the pest image samples are input into the feature extraction network, the method further comprises performing data enhancement on the pest image samples; the data enhancement includes: random aspect-ratio cropping, random scaling, random horizontal flipping, random vertical flipping, random rotation, color jittering, and Gaussian blurring.
Preferably, the prototype of category c is calculated by the following formula:

E_c = Avg({ f(x_i) | y_i = c }), c = 1, 2, ..., N

where x_i denotes the i-th sample of support-set category c, y_i denotes the class label of sample x_i, Avg(·) denotes averaging a set of vectors, and f(x_i) denotes the feature vector of sample x_i.
Preferably, the probability of category c is calculated by the following formula:

p(y_j = c | x_j) = exp(sim(f(x_j), E_c) / τ) / Σ_{n=1}^{N} exp(sim(f(x_j), E_n) / τ)

where x_j denotes the j-th sample in the query set, y_j denotes the class label of sample x_j, j = 1, 2, ..., q, f(x_j) denotes the feature vector of sample x_j, τ denotes a temperature hyper-parameter, sim denotes a function that computes the similarity between two vectors, and E_n denotes the prototype of support-set category n, n = 1, 2, ..., N.
Compared with the prior art, the technical solution of the invention has the following beneficial effects:
The invention provides a pest identification method based on transfer meta-learning. Introducing the spatial attention module improves the feature extraction capability of the existing feature extraction network and thus the pest image recognition performance; meanwhile, updating the parameters of the improved feature extraction network with the transfer-based meta-learning method preserves its feature extraction capability while giving it the ability to distinguish new-category features after transfer, improving the accuracy of distinguishing new pest categories from old ones.
Drawings
FIG. 1 is a flow chart of the steps performed in the technical scheme of the invention;
FIG. 2 is a pre-training flow chart of the improved feature extraction network of the present invention;
FIG. 3 is a schematic diagram of an improved feature extraction network in accordance with the present invention;
FIG. 4 is a schematic diagram of the NAM spatial attention module in the present invention;
FIG. 5 is a schematic diagram of a CBAM spatial attention module according to the present invention;
FIG. 6 is a flow chart of meta learning based on migration in the present invention.
Detailed Description
The drawings are for illustrative purposes only and are not to be construed as limiting the present patent;
For the purpose of better illustrating the embodiments, certain elements of the drawings may be omitted, enlarged or reduced and do not represent the actual product dimensions;
It will be appreciated by those skilled in the art that certain well-known structures in the drawings and descriptions thereof may be omitted.
The technical scheme of the invention is further described below with reference to the accompanying drawings and examples.
Example 1
As shown in FIG. 1, a pest identification method based on transfer meta-learning includes the following steps:
S1: acquiring pest image samples to construct a pest data set, and randomly sampling the pest data set to form a support set and a query set;
combining an existing feature extraction network with a spatial attention module to obtain an improved feature extraction network;
S2: updating the parameters of the improved feature extraction network by a transfer-based meta-learning method;
S3: inputting the support set and the query set into the updated feature extraction network to obtain the feature vector of each sample in the support set and the query set;
S4: computing a prototype for each category in the support set from the feature vectors of the support-set samples;
S5: computing the class probability of each sample in the query set from the prototypes of the support-set categories and the feature vectors of the query-set samples, to obtain the pest identification result.
In the specific implementation process, introducing the spatial attention module improves the feature extraction capability of the existing feature extraction network and thus the pest image recognition performance; meanwhile, updating the parameters of the improved feature extraction network with the transfer-based meta-learning method preserves its feature extraction capability while giving it the ability to distinguish new-category features after transfer, improving the accuracy of distinguishing new pest categories from old ones.
Example 2
A pest identification method based on transfer meta-learning comprises the following steps:
S1: acquiring pest image samples to construct a pest data set, and randomly sampling the pest data set to form a support set and a query set;
More specifically, the pest data set includes M categories of pest images; N categories (N < M) are randomly extracted from the pest data set, K pest images are randomly extracted from each extracted category to form the support set, and q non-repeated pest images are randomly extracted from the extracted categories to form the query set.
An improved feature extraction network is obtained by combining an existing feature extraction network with a spatial attention module.
As shown in FIG. 2, the improved feature extraction network is pre-trained: a fully connected layer is added at the output of SA-ResNeSt as a classifier, cross entropy is adopted as the loss function, and hyper-parameters such as the batch size and learning rate are set. A training pest image sample is first taken as input to obtain the corresponding prediction as output, then the loss value between the prediction and the sample's label is calculated, and this process is repeated until the loss value no longer decreases. After training, the classifier is removed to obtain SA-ResNeSt with initialized parameters.
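A hedged sketch of this pre-training stage follows. The backbone's out_dim attribute, the assumption that the backbone returns flattened feature vectors, the data loader, and the hyper-parameter values are placeholders for illustration, not the patent's actual code.

import torch
import torch.nn as nn

def pretrain(backbone, num_base_classes, loader, epochs=90, lr=0.1):
    # temporary fully connected classifier appended to SA-ResNeSt's output
    classifier = nn.Linear(backbone.out_dim, num_base_classes)
    model = nn.Sequential(backbone, classifier)
    criterion = nn.CrossEntropyLoss()  # cross entropy as the loss function
    optimizer = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):            # repeated until the loss stops decreasing
        for images, labels in loader:
            loss = criterion(model(images), labels)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return backbone                    # the classifier is removed after training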
More specifically, as shown in FIG. 3, the feature extraction network is improved by combining ResNeSt101 with spatial attention modules, specifically: inserting a spatial attention module at the SAM position before the 3×3 global max pooling of ResNeSt101 and/or at the last SAM position of each neck of each layer of ResNeSt101.
More specifically, as shown in FIGS. 4-5, the spatial attention module is a NAM spatial attention module or a CBAM spatial attention module.
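For reference, the NAM module re-weights features with the learned scaling factors of a normalization layer. The sketch below follows NAM's published channel formulation; the spatial ("pixel normalization") variant applies the same pattern per pixel. Everything here is an illustrative assumption rather than the patent's code.

import torch
import torch.nn as nn

class NAMAttention(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.bn = nn.BatchNorm2d(channels, affine=True)

    def forward(self, x):
        residual = x
        x = self.bn(x)
        # the learned BN scale factors measure each channel's importance
        w = self.bn.weight.abs() / self.bn.weight.abs().sum()
        x = x * w.view(1, -1, 1, 1)
        return torch.sigmoid(x) * residual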
More specifically, the spatial attention module inserted at the SAM position before the 3×3 global max pooling of ResNeSt101 is of the same or a different type as the spatial attention module inserted at the last SAM position of each neck of each layer of ResNeSt101; and the spatial attention modules inserted at the last SAM positions of the necks of each layer of ResNeSt101 are of the same type.
In the specific implementation process, the improved feature extraction network SA-ResNeSt101 retains a network depth consistent with other residual networks and can fully express the information contained in pest images. Because the network uses several groups of 3×3 convolution, channel attention, and spatial attention modules, it attends not only to the information of each channel of the image but also to each pixel within each channel, so it can focus on the salient regions of pest images and improve the network's expression capability without greatly increasing the computation, thereby improving the expression capability of the feature extraction network.
S2: updating the parameters of the improved feature extraction network by a transfer-based meta-learning method;
More specifically, as shown in fig. 6, step S2 specifically includes:
A1: setting a learning batch and a learning rate;
A2: randomly extracting N₁ categories from the pest data set, randomly extracting K₁ pest images from each extracted category to form a learning support set, and randomly extracting q₁ non-repeated pest images from the extracted categories to form a learning query set;
A3: inputting the learning support set and the learning query set into the improved feature extraction network to obtain the feature vector of each sample in the learning support set and the learning query set;
A4: computing a prototype for each category in the learning support set from the feature vectors of the learning-support-set samples;
More specifically, the prototype of category c is calculated by the following formula:

E_c = Avg({ f(x_i) | y_i = c }), c = 1, 2, ..., N

where x_i denotes the i-th sample of learning-support-set category c, y_i denotes the class label of sample x_i, the Avg(·) function denotes averaging multiple vectors, and f(x_i) denotes the feature vector of sample x_i.
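As a sketch, this prototype computation maps directly to a few tensor operations; the shapes and names here are illustrative assumptions.

import torch

def class_prototypes(features, labels, n_way):
    """features: (num_support, D) feature vectors; labels: (num_support,) ids."""
    # E_c = average of the feature vectors whose label equals c
    return torch.stack([features[labels == c].mean(dim=0) for c in range(n_way)])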
A5: computing the class probability of each sample in the learning query set from the prototype of each category in the learning support set and the feature vector of each sample in the learning query set;
More specifically, the probability of category c is calculated by the following formula:

p(y_j = c | x_j) = exp(sim(f(x_j), E_c) / τ) / Σ_{n=1}^{N} exp(sim(f(x_j), E_n) / τ)

where x_j denotes the j-th sample in the learning query set, y_j denotes the class label of sample x_j, j = 1, 2, ..., q, f(x_j) denotes the feature vector of sample x_j, τ denotes a hyper-parameter, sim denotes a function that computes the similarity between two vectors, and E_n denotes the prototype of learning-support-set category n, n = 1, 2, ..., N.
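A sketch of this temperature-scaled softmax over similarities follows; cosine similarity is assumed for sim, a choice the text leaves open, and the default τ is illustrative.

import torch
import torch.nn.functional as F

def class_probabilities(query_feats, prototypes, tau=0.1):
    """query_feats: (q, D); prototypes: (N, D); returns (q, N) probabilities."""
    sims = F.cosine_similarity(query_feats.unsqueeze(1), prototypes.unsqueeze(0), dim=-1)
    return F.softmax(sims / tau, dim=-1)  # p(y_j = c | x_j)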
A6: calculating a loss value between the class probability of each sample and the corresponding class label;
More specifically, a cross entropy loss function is used to calculate the loss value between the class probability of each sample and its corresponding class label.
In a specific implementation, the cross entropy loss function is:

L = -(1/n) Σ_{i=1}^{n} log p(y_i = y_i' | x_i)

where p(y_i = c | x_i) denotes the class probability of sample x_i, y_i' denotes the class label of sample x_i, n denotes the number of samples in the learning query set, and log(·) is the logarithmic function.
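In code, the reconstructed loss reduces to averaging the negative log probability of each query sample's true category; this hedged sketch reuses class_probabilities above.

import torch

def episode_loss(probs, labels):
    """probs: (q, N) class probabilities; labels: (q,) ground-truth indices."""
    return -torch.log(probs[torch.arange(len(labels)), labels]).mean()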
A7: determining whether the loss value has converged;
if yes, the small-sample learning is complete;
if not, updating the parameters of the improved feature extraction network and returning to step A2.
In the specific implementation process, unlike mainstream meta-learning and transfer-learning methods, the transfer-based meta-learning (small-sample learning) method not only preserves the feature extraction capability of SA-ResNeSt101 but also, after a meta-learning process over a small number of samples, gains the ability to distinguish new-category features after transfer, so it is more effective for pest identification than mainstream small-sample learning methods.
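Putting steps A1-A7 together, a minimal episodic fine-tuning loop might look like the sketch below, reusing the helpers sketched earlier. The optimizer choice, episode count, and the embed helper (a hypothetical function that runs the backbone over an episode's images and returns feature and label tensors) are assumptions for illustration.

import torch

def meta_finetune(backbone, embed, dataset_by_class, n_way=5, k_shot=5,
                  q_per_class=15, episodes=1000, lr=1e-3):
    optimizer = torch.optim.Adam(backbone.parameters(), lr=lr)               # A1
    for _ in range(episodes):                                                # loop until converged (A7)
        support, query = sample_episode(dataset_by_class, n_way, k_shot, q_per_class)  # A2
        s_feats, s_labels = embed(backbone, support)                         # A3
        q_feats, q_labels = embed(backbone, query)
        prototypes = class_prototypes(s_feats, s_labels, n_way)              # A4
        probs = class_probabilities(q_feats, prototypes)                     # A5
        loss = episode_loss(probs, q_labels)                                 # A6
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()                                                     # A7: update and repeat
    return backbone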
S3: inputting the support set and the query set into the updated feature extraction network to obtain the feature vector of each sample in the support set and the query set;
S4: computing a prototype for each category in the support set from the feature vectors of the support-set samples;
More specifically, the prototype of category c is calculated by the following formula:

E_c = Avg({ f(x_i) | y_i = c }), c = 1, 2, ..., N

where x_i denotes the i-th sample of support-set category c, y_i denotes the class label of sample x_i, Avg(·) denotes averaging a set of vectors, and f(x_i) denotes the feature vector of sample x_i.
S5: computing the class probability of each sample in the query set from the prototypes of the support-set categories and the feature vectors of the query-set samples, to obtain the pest identification result.
More specifically, the probability of category c is calculated by the following formula:

p(y_j = c | x_j) = exp(sim(f(x_j), E_c) / τ) / Σ_{n=1}^{N} exp(sim(f(x_j), E_n) / τ)

where x_j denotes the j-th sample in the query set, y_j denotes the class label of sample x_j, j = 1, 2, ..., q, f(x_j) denotes the feature vector of sample x_j, τ denotes a hyper-parameter, sim denotes a function that computes the similarity between two vectors, and E_n denotes the prototype of support-set category n, n = 1, 2, ..., N.
More specifically, before the pest image samples are input into the feature extraction network, the method further comprises performing data enhancement on the pest image samples; the data enhancement includes: random aspect-ratio cropping, random scaling, random horizontal flipping, random vertical flipping, random rotation, color jittering, and Gaussian blurring.
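One possible torchvision composition of the listed augmentations is sketched below; the parameter values are illustrative assumptions, not values taken from the patent.

from torchvision import transforms

augment = transforms.Compose([
    # random aspect-ratio cropping + random scaling
    transforms.RandomResizedCrop(224, scale=(0.5, 1.0), ratio=(3/4, 4/3)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomVerticalFlip(),
    transforms.RandomRotation(30),
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),  # color jittering
    transforms.GaussianBlur(kernel_size=5),
    transforms.ToTensor(),
])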
Example 3
In this example, two relatively complete pest data sets, aug-D0 and semi-IP102, were constructed, and each was divided into a training set, a test set, and a validation set; the specific statistics are shown in Table 1.
TABLE 1 statistics of aug-D0 and semi-IP102 pest data sets
Experiments on the pest identification method based on transfer meta-learning (the present method) were carried out on aug-D0 and semi-IP102, with the following results:
The comparison of small-sample learning results with traditional models on aug-D0 is shown in Table 2;
TABLE 2
During training, the number of categories extracted per episode is set to 5, the number of support-set samples extracted per category to 5, and the number of query-set samples to 15. During testing, the evaluation index is accuracy (%); the number of categories extracted per episode is set to 5, the number of support-set samples per category to 1, 5, and 10, the number of query-set samples to 15, and the confidence interval to 95%.
The average recognition accuracy is calculated as:

accuracy = (Num_correct / Num) × 100%

where Num_correct denotes the number of correctly predicted samples and Num denotes the total number of samples in the query set.
The comparison of small-sample learning results with traditional models on the semi-IP102 dataset is shown in Table 3;
TABLE 3 Table 3
Model name 1-shot 5-shot 10-shot
Matching network 42.66%±0.42% 55.06%±0.39% 60.61%±0.38%
Prototypical network 38.23%±0.36% 54.96%±0.36% 59.96%±0.34%
Relation network 37.74%±0.40% 51.35%±0.41% 55.61%±0.39%
Meta-baseline network 34.07%±0.35% 46.90%±0.55% 51.83%±0.38%
SA-ResNeSt101 57.62%±0.53% 72.90%±0.30% 76.80%±0.30%
During training, the number of categories extracted per episode is set to 5, the number of support-set samples extracted per category to 5, and the number of query-set samples to 15. During testing, the evaluation index is accuracy (%); the number of categories extracted per episode is set to 5, the number of support-set samples per category to 1, 5, and 10, the number of query-set images to 15, and the confidence interval to 95%.
As can be seen from Tables 2 and 3, SA-ResNeSt101 in this embodiment achieves higher recognition accuracy than the prior art on the datasets composed of the new categories of aug-D0 and semi-IP102.
Spatial attention modules are added at the SAM positions of ResNeSt101. In the attention-type column, ResNeSt101 denotes the ResNeSt101 network without any spatial attention module added, C-SAM denotes the CBAM spatial attention module, and N-SAM denotes the NAM spatial attention module. In the module-position column, B denotes adding a spatial attention module (SAM) after the three 3×3 convolutions of the ResNeSt101 main path (before the global max pooling), and E denotes adding a SAM at the last position of each neck of ResNeSt101. The training and testing settings are consistent with Tables 2 and 3.
The effectiveness of the attention mechanism on aug-D0 is shown in Table 4;
TABLE 4 Table 4
Attention type Spatial attention module location 1-shot 5-shot 10-shot
ResNeSt101 —— 78.83%±0.51% 91.53%±0.26% 93.48%±0.21%
N-SAM B 80.45%±0.32% 92.44%±0.17% 94.16%±0.15%
N-SAM E 79.18%±0.43% 91.89%±0.26% 93.80%±0.20%
C-SAM B 84.18%±0.43% 95.00%±0.14% 96.37%±0.10%
C-SAM E 77.56%±0.50% 90.25%±0.27% 92.31%±0.22%
N-SAM,N-SAM B,E 78.40%±0.43% 91.78%±0.28% 93.88%±0.21%
N-SAM,C-SAM B,E 78.19%±0.46% 92.13%±0.29% 94.11%±0.22%
C-SAM,N-SAM B,E 79.75%±0.46% 92.44%±0.26% 94.30%±0.22%
C-SAM,C-SAM B,E 78.09%±0.46% 91.11%±0.28% 93.10%±0.24%
The effectiveness of the attention mechanism on semi-IP102 is shown in Table 5;
TABLE 5
As can be seen from tables 4 and 5, the improved feature extraction network SA-ResNeSt101 enhances the expression capability of the pest image features, and effectively improves the recognition accuracy of the pest images.
The comparison of open-world learning results on aug-D0 is shown in Table 6;
TABLE 6
Model name 1-shot 5-shot 10-shot
Matching network 55.49%±0.18% 70.56%±0.09% 74%±0.08%
Prototypical network 47.77%±0.18% 67.39%±0.10% 70.17%±0.08%
Relation network 43.13%±0.18% 62.16%±0.10% 64.01%±0.08%
Meta-baseline network 12.33%±0.05% 34.18%±0.09% 37.39%±0.07%
SA-ResNeSt101 63.44%±0.13% 79.37%±0.07% 81.86%±0.07%
The comparison of open-world learning results on semi-IP102 is shown in Table 7;
TABLE 7
Model name 1-shot 5-shot 10-shot
Matching network 34.36%±0.13% 46.61%±0.09% 51.69%±0.08%
Prototypical network 36.82%±0.15% 53.19%±0.09% 55.6%±0.08%
Relation network 36.46%±0.15% 51.11%±0.08% 52.7%±0.07%
Meta-baseline network 17.92%±0.08% 30.23%±0.08% 33.74%±0.07%
SA-ResNeSt101 44.87%±0.18% 60.01%±0.08% 63.58%±0.07%
As can be seen from Tables 6 and 7, on the open-world datasets composed of the new and old categories of aug-D0 and semi-IP102, the present method distinguishes new and old categories more accurately than the prior art.
The comparison of the transfer-based meta-learning method of this embodiment with existing meta-learning on the datasets composed of the new categories of aug-D0 and semi-IP102 is shown in Table 8;
TABLE 8
The experimental results of the transfer-based meta-learning method of this embodiment and existing meta-learning on the open-world datasets composed of the new and old categories of aug-D0 and semi-IP102 are shown in Table 9;
TABLE 9
As can be seen from Tables 8 and 9, on the new-category datasets and the open-world datasets composed of semi-IP102 and aug-D0, the present method further improves pest identification accuracy over existing meta-learning.
It should be understood that the above examples of the present invention are provided by way of illustration only and do not limit the embodiments of the invention. Other variations or modifications based on the above description will be apparent to those of ordinary skill in the art; it is neither necessary nor possible to exhaustively list all embodiments here. Any modification, equivalent replacement, or improvement made within the spirit and principles of the invention shall fall within the protection scope of the claims of the invention.

Claims (7)

1. A pest identification method based on transfer meta-learning, characterized by comprising the following steps:
S1: acquiring pest image samples to construct a pest data set, and randomly sampling the pest data set to form a support set and a query set;
combining an existing feature extraction network with a spatial attention module to obtain an improved feature extraction network;
S2: updating the parameters of the improved feature extraction network by a transfer-based meta-learning method;
S3: inputting the support set and the query set into the updated feature extraction network to obtain the feature vector of each sample in the support set and the query set;
S4: computing a prototype for each category in the support set from the feature vectors of the support-set samples;
S5: computing the class probability of each sample in the query set from the prototypes of the support-set categories and the feature vectors of the query-set samples, as the pest identification result;
The step S2 specifically comprises the following steps:
A1: setting a learning batch and a learning rate;
A2: randomly extracting N₁ categories from the pest data set, randomly extracting K₁ pest images from each extracted category to form a learning support set, and randomly extracting q₁ non-repeated pest images from the extracted categories to form a learning query set;
A3: inputting the learning support set and the learning query set into the improved feature extraction network to obtain the feature vector of each sample in the learning support set and the learning query set;
A4: computing a prototype for each category in the learning support set from the feature vectors of the learning-support-set samples;
A5: computing the class probability of each sample in the learning query set from the prototypes of the learning-support-set categories and the feature vectors of the learning-query-set samples;
A6: calculating the loss value between the class probability of each sample and its corresponding class label;
A7: determining whether the loss value has converged;
if yes, the small-sample learning is complete;
if not, updating the parameters of the improved feature extraction network and returning to step A2;
wherein the prototype of category c is calculated by the following formula:

E_c = Avg({ f(x_i) | y_i = c }), c = 1, 2, ..., N

where x_i denotes the i-th sample of category c in the support set, y_i denotes the class label of sample x_i, the Avg(·) function denotes averaging multiple vectors, and f(x_i) denotes the feature vector of sample x_i;
and the probability of category c is calculated by the following formula:

p(y_j = c | x_j) = exp(sim(f(x_j), E_c) / τ) / Σ_{n=1}^{N} exp(sim(f(x_j), E_n) / τ)

where x_j denotes the j-th sample in the query set, y_j denotes the class label of sample x_j, j = 1, 2, ..., q, f(x_j) denotes the feature vector of sample x_j, τ denotes a hyper-parameter, sim denotes a function that computes the similarity between two vectors, and E_n denotes the prototype of category n in the support set, n = 1, 2, ..., N.
2. The pest identification method based on transfer meta-learning according to claim 1, wherein in step S1 the pest data set includes M categories of pest images; N categories are randomly extracted from the pest data set, N < M; K pest images are randomly extracted from each extracted category to form the support set; and q non-repeated pest images are randomly extracted from the extracted categories to form the query set.
3. The pest identification method based on transfer meta-learning according to claim 1, wherein the improved feature extraction network is obtained by combining ResNeSt101 with spatial attention modules, specifically: inserting a spatial attention module at the SAM position before the 3×3 global max pooling of ResNeSt101 and/or at the last SAM position of each neck of each layer of ResNeSt101.
4. The pest identification method based on transfer meta-learning according to claim 3, wherein the spatial attention module is a NAM spatial attention module or a CBAM spatial attention module.
5. The pest identification method based on transfer meta-learning according to claim 4, wherein the spatial attention module inserted at the SAM position before the 3×3 global max pooling of ResNeSt101 is of the same or a different type as the spatial attention module inserted at the last SAM position of each neck of each layer of ResNeSt101; and the spatial attention modules inserted at the last SAM positions of the necks of each layer of ResNeSt101 are of the same type.
6. The pest identification method based on transfer meta-learning according to claim 5, wherein a cross entropy loss function is used to calculate the loss value between the class probability of each sample and its corresponding class label.
7. The pest identification method based on transfer meta-learning according to claim 1, further comprising, before inputting the pest image samples into the feature extraction network, performing data enhancement on the pest image samples; the data enhancement includes: random aspect-ratio cropping, random scaling, random horizontal flipping, random vertical flipping, random rotation, color jittering, and Gaussian blurring.
CN202210738451.XA 2022-06-27 2022-06-27 Pest identification method based on transfer meta-learning Active CN115019175B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210738451.XA CN115019175B (en) Pest identification method based on transfer meta-learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210738451.XA CN115019175B (en) Pest identification method based on transfer meta-learning

Publications (2)

Publication Number Publication Date
CN115019175A CN115019175A (en) 2022-09-06
CN115019175B true CN115019175B (en) 2024-06-18

Family

ID=83076870

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210738451.XA Active CN115019175B (en) 2022-06-27 2022-06-27 Pest identification method based on transfer meta-learning

Country Status (1)

Country Link
CN (1) CN115019175B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117709394A (en) * 2024-02-06 2024-03-15 华侨大学 Vehicle track prediction model training method, multi-model migration prediction method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191229A (en) * 2021-04-20 2021-07-30 华南农业大学 Intelligent visual pest detection method
CN114511739A (en) * 2022-01-25 2022-05-17 哈尔滨工程大学 Task-adaptive small sample image classification method based on meta-migration learning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112270681B (en) * 2020-11-26 2022-11-15 华南农业大学 Method and system for detecting and counting yellow plate pests deeply
CN114639000A (en) * 2022-03-30 2022-06-17 浙江大学 Small sample learning method and device based on cross-sample attention aggregation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113191229A (en) * 2021-04-20 2021-07-30 华南农业大学 Intelligent visual pest detection method
CN114511739A (en) * 2022-01-25 2022-05-17 哈尔滨工程大学 Task-adaptive small sample image classification method based on meta-migration learning

Also Published As

Publication number Publication date
CN115019175A (en) 2022-09-06

Similar Documents

Publication Publication Date Title
CN113378632B (en) Pseudo-label optimization-based unsupervised domain adaptive pedestrian re-identification method
CN110443143B (en) Multi-branch convolutional neural network fused remote sensing image scene classification method
CN107506761B (en) Brain image segmentation method and system based on significance learning convolutional neural network
CN110334765B (en) Remote sensing image classification method based on attention mechanism multi-scale deep learning
CN108416370A (en) Image classification method, device based on semi-supervised deep learning and storage medium
CN102609681B (en) Face recognition method based on dictionary learning models
CN111079847B (en) Remote sensing image automatic labeling method based on deep learning
CN112800876B (en) Super-spherical feature embedding method and system for re-identification
CN111178120B (en) Pest image detection method based on crop identification cascading technology
CN111860596B (en) Unsupervised pavement crack classification method and model building method based on deep learning
CN109583379A (en) A kind of pedestrian&#39;s recognition methods again being aligned network based on selective erasing pedestrian
CN110457677B (en) Entity relationship identification method and device, storage medium and computer equipment
CN112862849B (en) Image segmentation and full convolution neural network-based field rice ear counting method
CN111461238A (en) Model training method, character recognition method, device, equipment and storage medium
CN108681735A (en) Optical character recognition method based on convolutional neural networks deep learning model
CN104268552B (en) One kind is based on the polygonal fine classification sorting technique of part
CN113887480B (en) Burma language image text recognition method and device based on multi-decoder joint learning
CN113486886B (en) License plate recognition method and device in natural scene
CN116258861B (en) Semi-supervised semantic segmentation method and segmentation device based on multi-label learning
CN111723815A (en) Model training method, image processing method, device, computer system, and medium
CN115019175B (en) Pest identification method based on transfer meta-learning
CN116612307A (en) Solanaceae disease grade identification method based on transfer learning
CN111310820A (en) Foundation meteorological cloud chart classification method based on cross validation depth CNN feature integration
CN113033345B (en) V2V video face recognition method based on public feature subspace
CN114037886A (en) Image recognition method and device, electronic equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant