CN112668490A - YOLOv4-based pest detection method, system, device and readable storage medium - Google Patents

YOLOv4-based pest detection method, system, device and readable storage medium

Info

Publication number
CN112668490A
Authority
CN
China
Prior art keywords
data set
pest
training
yolov4
pest detection
Prior art date
Legal status
Granted
Application number
CN202011611845.6A
Other languages
Chinese (zh)
Other versions
CN112668490B (en)
Inventor
Chen Yuyang
Zhu Xuhua
Wu Hongyang
Feng Jin
Yao Bo
Liu Zhimin
Shen Zhihui
Current Assignee
Zhejiang Top Cloud Agri Technology Co ltd
Original Assignee
Zhejiang Top Cloud Agri Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Zhejiang Top Cloud Agri Technology Co ltd
Priority to CN202011611845.6A
Publication of CN112668490A
Application granted
Publication of CN112668490B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a pest detection method based on YOLOv4, which comprises the following steps: obtaining pictures of various pest species to form an original data set; performing sample enhancement processing on the original data set and dividing it into a training data set, a verification data set and a test data set; modifying the anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying the loss function of the YOLOv4 network model based on a cross entropy loss function; inputting the training data set into the YOLOv4 network model for training, then taking the test data set and the verification data set as input to verify the training result, thereby obtaining an improved YOLOv4 pest detection model; and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result. The pest picture to be detected can be identified quickly, the gradient instability that arises as the network grows deeper can be effectively alleviated, and the detection accuracy of the algorithm is significantly improved.

Description

YOLOv4-based pest detection method, system, device and readable storage medium
Technical Field
The invention relates to the technical field of big data processing, and in particular to a pest detection method, system, device and readable storage medium based on YOLOv4.
Background
At present, agricultural and forestry crops cultivated over large areas provide abundant food for insects, so pests are numerous in both species and quantity. Descriptions of pest species in China record roughly 300 kinds of rice pests, more than 300 kinds of cotton pests, more than 160 kinds of apple pests and about 200 kinds of mulberry pests. Because pests are diverse and cause great harm to crops, the management and prevention of crop pests and diseases has a vital influence on crop yield and quality. Helping farmers identify field pests and diseases quickly and correctly improves the effectiveness and timeliness of control measures to the greatest extent, reduces the extent and area of damage, safeguards the safety of agricultural products and reduces economic loss.
Traditionally, field pests and diseases are identified manually. Manual identification is unreliable, however: entomology experts may need to spend a great deal of time and effort distinguishing insects, and ordinary agricultural workers who lack pest-related knowledge find identification difficult to carry out at all. An automated pest detection method can effectively improve the efficiency of researchers and, at the same time, provide ordinary agricultural workers with pest knowledge and popular-science information, helping all agricultural workers judge pest species and take effective pest control measures.
With the growing research on big data, deep learning based on convolutional neural networks has become a new research direction in the field of machine learning. A deep learning method learns the internal rules of sample data to obtain the internal information of the data, which is very helpful for interpreting data such as text, images and speech. In the prior art, deep learning has been combined with pest recognition, but the existing big-data-based recognition methods have relatively low accuracy and insufficiently precise classification.
Disclosure of Invention
The invention provides a pest detection method, system, device and readable storage medium based on YOLOv4, aiming at overcoming the defects in the prior art.
To solve this technical problem, the invention adopts the following technical scheme:
a pest detection method based on YOLOv4 comprises the following steps:
obtaining pictures of various insect pest species to form an original data set;
carrying out sample enhancement processing on an original data set and dividing the original data set into a training data set, a verification data set and a test data set;
modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying a loss function of the YOLOv4 network model based on a cross entropy loss function;
inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
As an implementation manner, the modifying the anchor frame parameter value of the YOLOv4 network model specifically includes:
and obtaining the optimal anchor points under three scales based on a k-means clustering algorithm, and using the anchor points as anchor frame parameter values for training a Yolov4 pest detection model.
As an implementation manner, the enhancement processing includes one or more of noise processing, image expansion processing, rotation processing, image fusion processing, and random erasure processing.
As an implementable embodiment, the loss function of the YOLOv4 network model is modified based on the cross-entropy loss function, and the modified loss function is expressed as:
FL(p_t) = -α_t (1 - p_t)^γ · log(p_t)
as an implementation, the method further comprises the following steps: and carrying out normalization processing on all data in the training data set, and randomly dividing the data after the normalization processing to obtain a training data set, a verification data set and a test data set.
As an implementation, the method further comprises the following steps: dividing the original data set into at least three categories of original data sets according to pest size and morphology, and obtaining an improved YOLOv4 pest detection model for each category based on the original data set of that category.
A pest detection system based on YOLOv4 comprises a data acquisition module, a processing and dividing module, a parameter modification module, a model training module and an image recognition module;
the data acquisition module is used for acquiring pictures of various insect pest species to form an original data set;
the processing and dividing module is used for performing sample enhancement processing on the original data set and dividing the original data set into a training data set, a verification data set and a test data set;
the parameter modification module is used for modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm and modifying loss functions of the YOLOv4 network model based on cross entropy loss functions;
the model training module is used for inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
the image identification module is used for identifying the pest image to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
As an implementation, the system further includes a classification module configured to divide the original data set into at least three categories of original data sets according to pest size and morphology, and to obtain an improved YOLOv4 pest detection model for each category based on the original data set of that category.
A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the following method steps:
obtaining pictures of various insect pest species to form an original data set;
carrying out sample enhancement processing on an original data set and dividing the original data set into a training data set, a verification data set and a test data set;
modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying a loss function of the YOLOv4 network model based on a cross entropy loss function;
inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
An apparatus for YOLOv4-based pest detection, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, the processor, when executing the computer program, implementing the following method steps:
obtaining pictures of various insect pest species to form an original data set;
carrying out sample enhancement processing on an original data set and dividing the original data set into a training data set, a verification data set and a test data set;
modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying a loss function of the YOLOv4 network model based on a cross entropy loss function;
inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
Due to the adoption of the above technical scheme, the invention achieves the following remarkable technical effects:
The method comprises: obtaining pictures of various pest species to form an original data set; performing sample enhancement processing on the original data set and dividing it into a training data set, a verification data set and a test data set; modifying the anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying the loss function of the YOLOv4 network model based on a cross entropy loss function; inputting the training data set into the YOLOv4 network model for training and, after training is finished, taking the test data set and the verification data set as input to verify the training result, thereby obtaining an improved YOLOv4 pest detection model; and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result. By means of the method, system and device, the pest picture to be detected can be identified rapidly to obtain a pest identification result, the gradient instability that arises as the network grows deeper can be effectively alleviated, and the detection accuracy of the algorithm is significantly improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic overall flow diagram of the process of the present invention;
FIG. 2 is a schematic diagram of the overall architecture of the system of the present invention;
FIG. 3 is a flow chart of the identification of the present invention;
FIG. 4 is a schematic view of the identification process of the present invention using improved YOLOv4 pest detection models for different categories;
FIG. 5 is a schematic diagram of several possible scenarios of sample enhancement processing;
FIG. 6 is a schematic structural view of Darknet-53;
fig. 7 is a schematic diagram of recognition result output.
Detailed Description
The present invention will be described in further detail with reference to examples, which are illustrative of the present invention and are not to be construed as being limited thereto.
With the increasingly wide application and research of big data, deep learning based on convolutional neural networks has become a new research direction in the field of machine learning and is being actively applied in various fields. In the prior art, deep learning is combined with pest identification, but the existing big-data-based identification methods are relatively inaccurate and their classification is not precise enough. A new detection method is therefore developed on the basis of the prior art, which improves detection accuracy and obtains identification results quickly.
Example 1:
A pest detection method based on YOLOv4, as shown in FIG. 1, comprises the following steps:
s100, obtaining pictures of various insect pest species to form an original data set;
s200, performing sample enhancement processing on the original data set and dividing the original data set into a training data set, a verification data set and a test data set;
s300, modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying a loss function of the YOLOv4 network model based on a cross entropy loss function;
s400, inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
s500, identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
The YOLOv4 network used here is slightly improved relative to the YOLOv2 network: its speed is not reduced, while the robustness of small-target detection and short-range object detection is improved. The main improvement lies in the loss function: two factors are added on the basis of the cross entropy, and the class loss function is modified into the cross-entropy-based Focal-Loss function, forming a new loss function that addresses the extreme foreground-background class imbalance of the training data during training of the target detector. In addition, the anchor frame parameter values of the YOLOv4 network model are modified. The YOLOv4 network adopts a model structure based on Darknet-53 (the Darknet-53 structure is shown in FIG. 6); a feature pyramid is added on top of this backbone so that features are output at 3 scales. The receptive fields differ across the three scales: with 32-fold downsampling the receptive field is largest, which is suitable for detecting large targets; 16-fold downsampling is suitable for detecting medium targets; and 8-fold downsampling has the smallest receptive field, which is suitable for detecting small targets. With a 416x416 input size, the final number of proposed boxes is (52x52 + 26x26 + 13x13) x 3 = 10647.
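The following is a minimal sketch, not the patent's actual implementation, of how anchor frame parameter values can be derived with k-means as described above: plain k-means over the (width, height) pairs of the annotated boxes with 1 - IoU as the distance, yielding nine anchors assigned to the three scales by area. The function names and the use of plain Python are assumptions; the snippet also checks the proposed-box count quoted above.

```python
import random

def iou_wh(box, centroid):
    """IoU between a (w, h) box and a (w, h) centroid, both anchored at the origin."""
    inter = min(box[0], centroid[0]) * min(box[1], centroid[1])
    union = box[0] * box[1] + centroid[0] * centroid[1] - inter
    return inter / union

def kmeans_anchors(boxes, k=9, iters=100, seed=0):
    """Plain k-means over (w, h) pairs using 1 - IoU as the distance measure."""
    rng = random.Random(seed)
    centroids = rng.sample(boxes, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for box in boxes:
            best = max(range(k), key=lambda i: iou_wh(box, centroids[i]))
            clusters[best].append(box)
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster went empty
                centroids[i] = (sum(w for w, _ in members) / len(members),
                                sum(h for _, h in members) / len(members))
    # Sorted by area: the 3 smallest anchors serve the 52x52 head (small targets),
    # the 3 largest serve the 13x13 head (large targets).
    return sorted(centroids, key=lambda c: c[0] * c[1])

# Sanity check of the proposed-box count at a 416x416 input:
assert (52 * 52 + 26 * 26 + 13 * 13) * 3 == 10647
```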
In the present invention, the data set currently contains 508 photographs of agricultural and forestry pests. The photographs include various kinds of pests under different illumination conditions, of different sizes and from different shooting angles. All photographs are first read to obtain the number of images in each category, the categories are balanced by up-sampling, and the number of samples is further increased with data enhancement methods such as flipping, rotation, Mixup image fusion, image dithering, image mirroring and adding noise, so as to increase data diversity; an example of the sample enhancement that may be used is shown in FIG. 5. All data are collected and sorted to form the original data set, which is then uniformly and randomly divided into a training data set, a verification data set and a test data set at a ratio of 8:1:1.
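As an illustration of the 8:1:1 split and of one of the enhancement operations mentioned above (Mixup image fusion), the sketch below uses plain Python and NumPy; the directory name and helper names are hypothetical, and the patent does not specify its augmentation pipeline at this level of detail.

```python
import os
import random

import numpy as np

def split_dataset(image_dir, ratios=(0.8, 0.1, 0.1), seed=42):
    """Randomly split the image files of the original data set at a ratio of 8:1:1."""
    files = sorted(f for f in os.listdir(image_dir)
                   if f.lower().endswith((".jpg", ".jpeg", ".png")))
    random.Random(seed).shuffle(files)
    n_train = int(len(files) * ratios[0])
    n_val = int(len(files) * ratios[1])
    return files[:n_train], files[n_train:n_train + n_val], files[n_train + n_val:]

def mixup(img_a, img_b, alpha=0.5):
    """Mixup image fusion: a weighted blend of two same-sized HxWxC uint8 images."""
    lam = np.random.beta(alpha, alpha)
    blended = lam * img_a.astype(np.float32) + (1.0 - lam) * img_b.astype(np.float32)
    return blended.astype(np.uint8)

train_files, val_files, test_files = split_dataset("pest_images")  # hypothetical folder
print(len(train_files), len(val_files), len(test_files))
```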
The invention is divided into a training part and a testing part. The development operating system of the training part can be Linux Ubuntu 18.04, with an Intel Core i7-8700 CPU, 64 GB of memory and three NVIDIA GeForce RTX 2080Ti cards as GPU support for model training; the graphics driver version is 430.40, the CUDA version is 10.0 and the cuDNN version is cudnn-v7.3.1. The main development languages are C and Python, and the OpenCV version used when configuring the darknet framework is 2.4.9. Other operating systems and development environments can also be used.
In one embodiment, the loss function of the YOLOv4 network model is modified based on the cross-entropy loss function and is expressed as:
FL(p_t) = -α_t (1 - p_t)^γ · log(p_t)
two factors are added on the basis of cross entropy, wherein gamma >0 reduces the loss of easy-to-classify samples. Making more focused on difficult, miscut samples, a gamma of 2 was found to be optimal experimentally. Another balance factor alpha, in the present invention, the number of the alpha is 0.25, which is used to balance the problem of non-uniform ratio of the positive and negative samples. The original loss under the darknet framework is composed of three loss functions of box (position), confidence (score) and class (category), wherein box-loss adopts MSE (mean square error), and confidence-loss and class-loss adopt cross entropy. The invention modifies the class-Loss function into the Focal-Loss function, and aims to solve the problem of extreme foreground and background class imbalance of training data of a target detector in the training process of the invention.
Based on the above improvements, the specific training process of the improved YOLOv4 network model is as follows: the darknet53.conv.74 model is used as the pre-training model, in which the anchor frame parameter values and the loss function have been improved as described above; the pre-training model is then fine-tuned, and the final model is obtained after 500 x 2000 = 1,000,000 iterations in total. The hyper-parameters during training are as follows: the momentum parameter, which affects how fast the gradient descends towards the optimum, is set to 0.9; the weight-decay regularization coefficient is set to 0.0005, which effectively prevents overfitting; the initial learning rate is 0.001 and is adjusted according to the configured policy after the first 1000 iterations; the maximum number of iterations is 1,000,000, the learning rate is decreased in 'steps' mode and is attenuated by a factor of 0.1 at 800,000, 900,000 and 950,000 iterations. During training, the IoU threshold for positive sample selection is 0.5: candidate boxes with IoU greater than 0.5 are selected as positive samples, and the others are negative samples or background.
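The 'steps' learning-rate schedule described above can be sketched as follows. The polynomial burn-in over the first 1000 iterations mirrors darknet's default behaviour and is an assumption; the decay points and the 0.1 factor are taken from the text.

```python
def learning_rate(iteration, base_lr=0.001, burn_in=1000, power=4,
                  steps=(800000, 900000, 950000), scale=0.1):
    """'steps' schedule: burn-in over the first 1000 iterations, then decay by 0.1
    at 800000, 900000 and 950000 iterations, up to the maximum of 1000000."""
    if iteration < burn_in:
        # Polynomial ramp used by darknet during burn-in (an assumption here).
        return base_lr * (iteration / burn_in) ** power
    lr = base_lr
    for step in steps:
        if iteration >= step:
            lr *= scale
    return lr

for it in (500, 1000, 800000, 900000, 950000):
    print(it, learning_rate(it))
```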
In other embodiments, the method further comprises the following step: deploying the improved YOLOv4 pest detection model on a cloud platform. After identification software or an APP on a mobile terminal or computer acquires the picture to be identified, the picture is uploaded to the cloud platform, the improved YOLOv4 pest detection model deployed on the cloud platform is called to identify the pest picture to be detected, and a pest identification result is obtained (see FIG. 7); the identification result is then fed back to the identification software or APP on the mobile terminal or computer. In general, the displayed result includes the pest category, a rectangular frame formed from the pest coordinates, the pest score, pest information, control measures and other information.
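A minimal sketch of the cloud-side interface described above is shown below: the client uploads a picture, the deployed detection model is invoked, and the category, rectangular frame and score are returned so the client software can display them together with pest information and control measures. Flask, the /detect route and the run_yolov4_detector helper are assumptions for illustration, not the platform actually used by the invention.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_yolov4_detector(image_bytes):
    """Placeholder for the deployed improved YOLOv4 pest detection model."""
    # A real deployment would decode the image and run inference here.
    return [{"category": "example_pest", "box": [120, 80, 260, 210], "score": 0.93}]

@app.route("/detect", methods=["POST"])
def detect():
    image_bytes = request.files["image"].read()
    detections = run_yolov4_detector(image_bytes)
    # The client app renders the rectangular frame, category and score, and looks up
    # the pest information and control measures for the returned category.
    return jsonify({"detections": detections})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```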
Based on all the above embodiments, a complete picture detection method can be formed; its flowchart is shown in FIG. 3.
In one embodiment, the original data set can be further divided into at least three categories of original data sets according to pest size and morphology, and an improved YOLOv4 pest detection model is obtained for each category based on the original data set of that category. The pest picture to be detected is then identified with the improved YOLOv4 pest detection model of the category to which it belongs, so as to obtain the identification result (see FIG. 4).
In addition, the method of the present application further comprises the following step: completing and processing the picture of the pest to be detected, which specifically comprises:
classifying the images in the original data set according to the body structure of the pests, and marking each class based on the structural features of that type of pest;
if the pest image in the picture to be detected is an incomplete image, extracting the structural features of the pest in the picture, and comparing the extracted features with the structural features of each class marked in the original data set to obtain comparison results, of which at least three groups are recommended;
and completing the missing part of the incomplete pest image based on the comparison results to obtain a pest picture to be detected containing a complete pest image.
This embodiment deals with identifying incomplete (damaged) pests. The body structure of the pests is classified according to features such as whether the pest has wings, spots on the wings, antennae, legs, a tail or a shell; the classification need not follow textbook taxonomy strictly, because the pest species is determined more carefully and accurately in the subsequent steps, and the more features that are marked, the better the matching result an incomplete image can obtain. If the pest image in the picture to be detected is incomplete, the pest is compared according to the marked features; the more structural features coincide, the better the comparison result. At least three results are finally selected from the comparison, and the missing part of the pest is completed according to each result to obtain a more complete pest image. It should be noted that, when completing the image, the supplemented part is made larger than the missing part so that the joined parts overlap. The several completed pest images with overlapping parts are then input into the YOLOv4 pest detection model for detection. If at least one of the three groups of results shows that only one pest is detected, the comparison result recommended in the earlier stage is correct, the pest species is judged based on the result in which only one pest is detected, and a specific pest identification result is obtained. If inputting the pest images stitched from the three results into the YOLOv4 pest detection model yields multiple recognition results, the recognition result that appears most frequently among them is taken as the final pest detection result.
Example 2:
A pest detection system based on YOLOv4, as shown in FIG. 2, includes a data acquisition module 100, a processing and dividing module 200, a parameter modification module 300, a model training module 400 and a picture recognition module 500;
the data acquisition module 100 is configured to acquire pictures of various pest species to form an original data set;
the processing and dividing module 200 is configured to perform sample enhancement processing on an original data set and divide the original data set into a training data set, a verification data set and a test data set;
the parameter modification module 300 is configured to modify anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm;
the model training module 400 is configured to input a training data set into the YOLOv4 network model for training, and after the training is completed, use a test data set and a verification data set as inputs to verify a training result, thereby obtaining an improved YOLOv4 pest detection model;
the picture recognition module 500 is configured to recognize a pest picture to be detected based on the improved YOLOv4 pest detection model, so as to obtain a pest recognition result.
In one embodiment, the system also includes a model deployment module 600 configured to deploy the improved YOLOv4 pest detection model on a cloud platform.
In one embodiment, the system further comprises a classification module configured to divide the original data set into at least three categories of original data sets according to pest size and morphology, and to obtain an improved YOLOv4 pest detection model for each category based on the original data set of that category.
Example 3:
a computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the following method steps:
obtaining pictures of various insect pest species to form an original data set;
carrying out sample enhancement processing on an original data set and dividing the original data set into a training data set, a verification data set and a test data set;
modifying anchor frame parameter values of a YOLOv4 network model based on a k-means clustering algorithm;
inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
In an embodiment, when the processor executes the computer program, the modifying of the anchor frame parameter value of the YOLOv4 network model is implemented by:
and obtaining the optimal anchor points under three scales based on a k-means clustering algorithm, and using the anchor points as anchor frame parameter values for training a Yolov4 pest detection model.
In one embodiment, the processor, when executing the computer program, performs the enhancement processing including one or more of a noise addition processing, an image expansion processing, a rotation processing, an image fusion processing, and a random erasure processing.
In one embodiment, when the processor executes the computer program, the loss function of the YOLOv4 network model is modified based on the cross-entropy loss function, and the modified loss function is expressed as:
FL(p_t) = -α_t (1 - p_t)^γ · log(p_t)
in one embodiment, the implementation further comprises, when the computer program is executed by the processor: and carrying out normalization processing on all data in the training data set, and randomly dividing the data after the normalization processing to obtain a training data set, a verification data set and a test data set.
In one embodiment, the implementation further comprises, when the computer program is executed by the processor: the improved YOLOv4 pest detection model is deployed in a cloud platform.
Example 4:
In one embodiment, a YOLOv4-based pest detection device is provided; it can be a server or a mobile terminal. The YOLOv4-based pest detection device includes a processor, a memory, a network interface and a database connected through a system bus. The processor of the device provides computing and control capabilities. The memory of the device comprises a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system, a computer program and a database, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database is used to store all data of the YOLOv4-based pest detection device. The network interface of the device is used for communicating with external terminals through a network connection. The computer program, when executed by the processor, implements the YOLOv4-based pest detection method.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing terminal to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that:
reference in the specification to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. Thus, the appearances of the phrase "one embodiment" or "an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention. In addition, it should be noted that the specific embodiments described in the present specification may differ in the shape of the components, the names of the components, and the like. All equivalent or simple changes of the structure, the characteristics and the principle of the invention which are described in the patent conception of the invention are included in the protection scope of the patent of the invention. Various modifications, additions and substitutions for the specific embodiments described may be made by those skilled in the art without departing from the scope of the invention as defined in the accompanying claims.

Claims (10)

1. A pest detection method based on YOLOv4 is characterized by comprising the following steps:
obtaining pictures of various insect pest species to form an original data set;
carrying out sample enhancement processing on an original data set and dividing the original data set into a training data set, a verification data set and a test data set;
modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm, and modifying a loss function of the YOLOv4 network model based on a cross entropy loss function;
inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
and identifying the pest picture to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
2. The YOLOv4-based pest detection method according to claim 1, wherein the modification of the anchor frame parameter values of the YOLOv4 network model specifically comprises:
obtaining the optimal anchor points at three scales based on a k-means clustering algorithm, and using these anchor points as the anchor frame parameter values for training the YOLOv4 pest detection model.
3. The YOLOv4-based pest detection method according to claim 1, wherein the enhancement processing comprises one or more of noise processing, image expansion processing, rotation processing, image fusion processing and random erasure processing.
4. The YOLOv4-based pest detection method according to claim 1, wherein the loss function of the YOLOv4 network model is modified based on the cross-entropy loss function, and the modified loss function is represented as:
FL(p_t) = -α_t (1 - p_t)^γ · log(p_t)
5. The YOLOv4-based pest detection method according to claim 1, further comprising the steps of: carrying out normalization processing on all data in the training data set, and randomly dividing the normalized data to obtain a training data set, a verification data set and a test data set.
6. The YOLOv4-based pest detection method according to claim 1, further comprising the steps of:
dividing the original data set into at least three categories of original data sets according to pest size and morphology, and obtaining an improved YOLOv4 pest detection model for each category based on the original data set of that category.
7. A pest detection system based on YOLOv4 is characterized by comprising a data acquisition module, a processing and dividing module, a parameter modification module, a model training module and an image recognition module;
the data acquisition module is used for acquiring pictures of various insect pest species to form an original data set;
the processing and dividing module is used for performing sample enhancement processing on the original data set and dividing the original data set into a training data set, a verification data set and a test data set;
the parameter modification module is used for modifying anchor frame parameter values of the YOLOv4 network model based on a k-means clustering algorithm and modifying loss functions of the YOLOv4 network model based on cross entropy loss functions;
the model training module is used for inputting a training data set into a YOLOv4 network model for training, taking a test data set and a verification data set as input after the training is finished, and verifying a training result to obtain an improved YOLOv4 pest detection model;
the image identification module is used for identifying the pest image to be detected based on the improved YOLOv4 pest detection model to obtain a pest identification result.
8. The YOLOv4-based pest detection system according to claim 7, further comprising a classification module configured to: divide the original data set into at least three categories of original data sets according to pest size and morphology, and obtain an improved YOLOv4 pest detection model for each category based on the original data set of that category.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method steps of one of claims 1 to 6.
10. An apparatus for YOLOv4-based pest detection, comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the computer program, implements the method steps of any one of claims 1 to 6.
CN202011611845.6A 2020-12-30 2020-12-30 YOLOv4-based pest detection method, system, device and readable storage medium Active CN112668490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011611845.6A CN112668490B (en) 2020-12-30 2020-12-30 YOLOv4-based pest detection method, system, device and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011611845.6A CN112668490B (en) 2020-12-30 2020-12-30 YOLOv4-based pest detection method, system, device and readable storage medium

Publications (2)

Publication Number Publication Date
CN112668490A true CN112668490A (en) 2021-04-16
CN112668490B CN112668490B (en) 2023-01-06

Family

ID=75412059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011611845.6A Active CN112668490B (en) 2020-12-30 2020-12-30 Yolov 4-based pest detection method, system, device and readable storage medium

Country Status (1)

Country Link
CN (1) CN112668490B (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU6781696A (en) * 1995-09-06 1997-03-27 McCarron Pty Ltd Termite detection system
CA2911801A1 (en) * 2002-03-22 2003-10-02 Greta Arnaut Novel bacillus thuringiensis insecticidal proteins
CN102930249A (en) * 2012-10-23 2013-02-13 四川农业大学 Method for identifying and counting farmland pests based on colors and models
CN103177266A (en) * 2013-04-07 2013-06-26 青岛科技大学 Intelligent stock pest identification system
CN111199245A (en) * 2019-12-20 2020-05-26 湖南城市学院 Rape pest identification method
GB202009146D0 (en) * 2020-06-16 2020-07-29 Dark Horse Tech Ltd System and method for crop monitoring
CN111709489A (en) * 2020-06-24 2020-09-25 广西师范大学 Citrus identification method based on improved YOLOv4
CN111914914A (en) * 2020-07-21 2020-11-10 上海理想信息产业(集团)有限公司 Method, device, equipment and storage medium for identifying plant diseases and insect pests

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
B. Dan et al.: "Diseases and Pests Identification of Lycium Barbarum Using SE-MobileNet V2 Algorithm", 2019 12th International Symposium on Computational Intelligence and Design, 22 May 2020 (2020-05-22) *
Liu Jun et al.: "Tomato disease and pest recognition algorithm based on YOLO", China Cucurbits and Vegetables, No. 09, 5 September 2020 (2020-09-05) *
Zhou Aiming et al.: "Automatic identification of butterfly specimen images at the family level based on deep learning", Acta Entomologica Sinica, Vol. 60, No. 11, 31 December 2017 (2017-12-31) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113362306A (en) * 2021-06-07 2021-09-07 中山大学 Packaged chip defect detection method based on deep learning
CN113269208A (en) * 2021-06-12 2021-08-17 四川虹美智能科技有限公司 Food material identification system based on Internet of things refrigerator
CN113569769A (en) * 2021-07-30 2021-10-29 仲恺农业工程学院 Red fire ant nest remote identification and positioning method based on deep neural network
CN113744226A (en) * 2021-08-27 2021-12-03 浙大宁波理工学院 Intelligent agricultural pest identification and positioning method and system
CN113744225A (en) * 2021-08-27 2021-12-03 浙大宁波理工学院 Intelligent detection method for agricultural pests
CN114677553A (en) * 2021-12-31 2022-06-28 广西慧云信息技术有限公司 Image recognition method for solving unbalanced problem of crop disease and insect pest samples
CN114677553B (en) * 2021-12-31 2024-05-14 广西慧云信息技术有限公司 Image recognition method for solving imbalance problem of crop disease and pest samples
CN114782373A (en) * 2022-04-26 2022-07-22 中国科学院海洋研究所 Intelligent detection method for planktonic larvae of bivalve shellfish based on deep learning
CN115443958A (en) * 2022-08-02 2022-12-09 华中农业大学 Box pest intelligence of board traps and kills monitoring devices

Also Published As

Publication number Publication date
CN112668490B (en) 2023-01-06

Similar Documents

Publication Publication Date Title
CN112668490B (en) YOLOv4-based pest detection method, system, device and readable storage medium
CN113705478B (en) Mangrove single wood target detection method based on improved YOLOv5
CN112926405A (en) Method, system, equipment and storage medium for detecting wearing of safety helmet
US20170177938A1 (en) Automated detection of nitrogen deficiency in crop
CN103839078A (en) Hyperspectral image classifying method based on active learning
EP3564857A1 (en) Pattern recognition method of autoantibody immunofluorescence image
CN112686862A (en) Pest identification and counting method, system and device and readable storage medium
Liu et al. Automated classification of stems and leaves of potted plants based on point cloud data
CN117576195A (en) Plant leaf morphology recognition method
CN113869098A (en) Plant disease identification method and device, electronic equipment and storage medium
Park et al. Insect classification using Squeeze-and-Excitation and attention modules-a benchmark study
Mashuk et al. Machine learning approach for bird detection
CN103617417B (en) Automatic plant identification method and system
CN114550017A (en) Pine wilt disease integrated early warning and detecting method and device based on mobile terminal
CN114724140A (en) Strawberry maturity detection method and device based on YOLO V3
CN113240640B (en) Colony counting method, apparatus and computer readable storage medium
CN109145955A (en) A kind of Wood Identification Method and system
Ajayi et al. Drone-based crop type identification with convolutional neural networks: an evaluation of the performance of RESNET architectures
CN117392382A (en) Single tree fruit tree segmentation method and system based on multi-scale dense instance detection
CN117437186A (en) Transparent part surface defect detection method and system based on deep learning algorithm
CN116310541A (en) Insect classification method and system based on convolutional network multidimensional learning
Nanditha et al. Classification of animals using toy images
Yang et al. FCBTYOLO: A Lightweight and High-Performance Fine Grain Detection Strategy for Rice Pests
CN115546639A (en) Forest weed detection method based on improved YOLOv5 model
Mwaffo et al. Assessing the predictive performance of two dnn models: A comparative analysis to support reusing training weights for autonomous aerial refueling missions

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
CB03 Change of inventor or designer information
Inventor after: Chen Yuyang, Zhu Xuhua, Wu Hongyang, Feng Jin, Yao Bo, Liu Zhimin, Shen Zhihui
Inventor before: Chen Yuyang, Zhu Xuhua, Wu Hongyang, Feng Jin, Yao Bo, Liu Zhimin, Shen Zhihui
GR01 Patent grant