CN115546186A - Agricultural pest and disease detection method and device based on YOLO v4 - Google Patents

Agricultural pest and disease detection method and device based on YOLO v4

Info

Publication number
CN115546186A
Authority
CN
China
Prior art keywords
agricultural
frame
pest detection
yolo
loss function
Prior art date
Legal status
Granted
Application number
CN202211338472.9A
Other languages
Chinese (zh)
Other versions
CN115546186B (en)
Inventor
罗长寿
赵瑞芳
魏清凤
陆阳
于峰
王富荣
余军
郑亚明
曹承忠
Current Assignee
Beijing Academy of Agriculture and Forestry Sciences
Original Assignee
Beijing Academy of Agriculture and Forestry Sciences
Priority date
Filing date
Publication date
Application filed by Beijing Academy of Agriculture and Forestry Sciences
Priority to CN202211338472.9A
Publication of CN115546186A
Application granted
Publication of CN115546186B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/762 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V 10/763 Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Abstract

The invention provides a method and a device for detecting agricultural pests and diseases based on YOLO v4. An image of the agricultural pests and diseases to be detected is input into an agricultural pest detection model based on YOLO v4, and a detection result is obtained from the model. The agricultural pest detection model based on YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of the prediction result output by the guide head and the prediction result output by the auxiliary head. Adding the auxiliary head increases the candidate classification results, avoids missed detections, and improves the accuracy of agricultural pest and disease detection.

Description

Agricultural pest and disease detection method and device based on YOLO v4
Technical Field
The invention relates to the technical field of agricultural intelligence and object detection software, and in particular to an agricultural pest and disease detection method and device based on YOLO v4.
Background
Agricultural pests and diseases are numerous and arise from complex environmental factors. If they are not prevented in advance, an outbreak can easily cause huge losses to farmers and seriously threaten the national harvest and grain security. At present, relying only on manual observation and empirical identification may lead to misjudgment of diseases, making the expected prevention and control effect difficult to achieve. Meanwhile, because current agricultural technicians lack experience and have limited ability to apply modern technologies in practice, diseases are difficult to detect and warn of in time during crop production. Therefore, systems that automatically detect agricultural pests and diseases based on deep learning algorithms, in which research on object detection via deep learning is applied to pest and disease identification, are widely used: they can identify not only the type of pest or disease but, more importantly, its specific position in the picture. In the related art, the YOLO v4 single-stage model is used to detect agricultural pests and diseases. The YOLO v4 single-stage model comprises a backbone feature network, a Neck module and a guide head (YOLO head) module, and the features are classified and predicted by the guide head. Because the number of candidate prior frames matched by the guide head is limited, missed detections occur easily, the detection results are inaccurate, and the user experience is poor.
Disclosure of Invention
The invention provides an agricultural pest detection method and device based on YOLO v4, which are used to overcome the defect of poor user experience caused by inaccurate prediction results when a traditional YOLO v4 model is used for agricultural pest and disease detection.
The invention provides an agricultural pest detection method, which comprises the following steps:
inputting an agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4;
obtaining a detection result according to the agricultural disease and insect pest detection model based on the YOLO v 4;
the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of a prediction result output by the guide head and a prediction result output by the auxiliary head.
The invention provides an agricultural pest detection method, which further comprises the following steps:
and generating a coarse label and a fine label according to the guide head, wherein the coarse label is used for the training of the auxiliary head and the fine label is used for the training of the guide head.
The invention provides an agricultural pest detection method in which the training process of the agricultural pest detection model based on YOLO v4 comprises the following steps:
calculating, according to a loss function, a first loss value between the prediction result output by the guide head and the real result;
guiding the guide head to output a prediction result according to the first loss value and the fine label;
calculating, according to a loss function, a second loss value between the prediction result output by the auxiliary head and the real result;
and guiding the auxiliary head to output a prediction result according to the second loss value and the coarse label.
Alternatively, the training process of the agricultural pest detection model based on YOLO v4 comprises the following steps:
calculating, according to a loss function, a first loss value between the prediction result output by the guide head and the real result;
guiding the guide head to output a prediction result according to the first loss value;
calculating, according to a loss function, a second loss value between the prediction result output by the auxiliary head and the real result;
and guiding the auxiliary head to output a prediction result according to the second loss value.
The invention provides an agricultural pest and disease detection method, wherein the loss function formula is:

$$L(t_p, t_{gt}) = \sum_{k=1}^{K} \lambda_k \sum_{i=1}^{S^2} \sum_{j=1}^{B} \mathbb{1}_{k,i,j}^{obj} \left[ \alpha_{box} L_{box}(t_p, t_{gt}) + \alpha_{obj} L_{obj}(t_p, t_{gt}) + \alpha_{cls} L_{cls}(t_p, t_{gt}) \right]$$

wherein $L(t_p, t_{gt})$ is the loss function; $K$, $S^2$ and $B$ are the number of output feature maps, the number of cells and the number of prior frames in each cell, respectively; $\alpha_{*}$ denotes the weight of each sub-loss function; $\mathbb{1}_{k,i,j}^{obj}$ indicates whether the $j$-th prior frame in the $i$-th cell of the $k$-th output feature map is a positive sample (1 if it is, 0 otherwise); $t_p$ and $t_{gt}$ are the prediction vector and the ground-truth vector; and $\lambda_k$ are balance coefficients used to balance the weights of the output feature maps of each scale.
According to the agricultural pest detection method provided by the invention, the sub-loss functions comprise:
a coordinate loss function, a target confidence loss function and a classification loss function, wherein the target confidence loss function and the classification loss function adopt a binary cross-entropy loss with logits, and the coordinate loss function adopts the CIoU loss function.
The agricultural pest detection model based on YOLO v4 is obtained by training according to a training sample set, and the generation process of the training sample set comprises the following steps:
clustering a plurality of GT frames in a training data set to obtain a plurality of prior frames;
matching each GT frame with each prior frame;
screening out a positive sample, a negative sample and a background according to a matching result;
and generating a training sample set according to the positive sample, the negative sample and the background.
According to the agricultural pest detection method provided by the invention, the training sample set generation process further comprises the following steps:
and taking a preset number of neighborhood grids nearest to the center position of the GT frame as positive samples.
The invention provides an agricultural pest detection method, wherein a positive sample, a negative sample and a background are screened out according to a matching result, and the method comprises the following steps:
respectively calculating the frame size ratio of each GT frame to each prior frame, wherein the frame size ratio comprises a frame width ratio and a frame height ratio;
if the maximum value of the frame size ratio is smaller than a preset ratio threshold, judging that the corresponding GT frame is a positive sample;
if the maximum value of the frame size ratio is not smaller than a preset ratio threshold, judging that the corresponding GT frame is a negative sample;
and if the frame size ratio of the GT frame to any prior frame is less than or equal to 1, judging that the GT frame is a background.
The invention provides an agricultural pest detection method in which, if the frame size ratios of a GT frame to a plurality of prior frames are all greater than the preset ratio threshold, the method comprises the following steps:
arranging the prior frames whose ratios exceed the preset ratio threshold in ascending order of matching cost, wherein the matching cost is obtained by adding the IoU loss and the classification loss calculated from the prediction result and the real result;
and screening, in that order, a preset threshold number of prior frames as positive samples.
The invention also provides an agricultural pest detection device, comprising:
the input module is used for inputting the agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4;
the output module is used for obtaining a detection result according to the agricultural pest detection model based on YOLO v 4; the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of a prediction result output by the guide head and a prediction result output by the auxiliary head.
The invention also provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the agricultural pest detection method described above.
The present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of agricultural pest detection as described in any one of the above.
According to the agricultural pest and disease detection method and device based on YOLO v4 provided by the invention, the image of the agricultural pests and diseases to be detected is input into an agricultural pest detection model based on YOLO v4, and a detection result is obtained from the model. The agricultural pest detection model based on YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of the prediction result output by the guide head and the prediction result output by the auxiliary head. Adding the auxiliary head increases the candidate classification results, avoids missed detections, and improves the accuracy of agricultural pest and disease detection.
Drawings
In order to more clearly illustrate the technical solutions of the present invention or the prior art, the drawings needed for the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
FIG. 1 is a schematic flow diagram of an agricultural pest detection method provided by the present invention;
FIG. 2 is a schematic structural diagram of an auxiliary training module of an agricultural pest detection model based on YOLO v4 provided by the invention;
FIG. 3 is a schematic diagram of the learning process of a guide head and an auxiliary head in the agricultural pest detection model based on YOLO v4 provided by the invention;
FIG. 4 is a flow chart of model training in the agricultural pest detection method provided by the present invention;
FIG. 5 is a flow chart for generating a model training sample set in the agricultural pest detection method provided by the invention.
FIG. 6 is a schematic structural diagram of an agricultural pest detection device provided by the present invention;
fig. 7 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
Fig. 1 is a flowchart of an agricultural pest detection method provided by an embodiment of the present invention, and as shown in fig. 1, the agricultural pest detection method provided by the embodiment of the present invention includes:
step 101, inputting an agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4;
Step 102, obtaining a detection result according to an agricultural pest detection model based on YOLO v4, wherein the agricultural pest detection model based on YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of the prediction result output by the guide head and the prediction result output by the auxiliary head.
In the embodiment of the invention, an additional head for auxiliary training, called the auxiliary head (Aux Head), is added to the middle layers of the network. The structure of the heads in the agricultural pest detection model based on YOLO v4 is shown in FIG. 2, and the working principle of the model comprises the following steps:
and respectively inputting the feature data output from some feature layers in the Neolo v4 model Neck module to different auxiliary heads, and respectively inputting the feature data output from other feature layers in the Neck module to different guide heads. The auxiliary head and the guide head need to carry out deep supervision on the target, and the auxiliary head and the self learning optimization are guided through the prediction of the guide head.
According to the embodiment of the invention, the prediction result of the guide head (Lead Head) is used as a guide to generate hierarchical labels from coarse to fine, which are used for the learning of the auxiliary head (Aux Head) and the guide head respectively. This changes the original single-head YOLO head output mode of YOLO v4 and improves the accuracy of the classification result.
According to the agricultural pest and disease detection method provided by the embodiment of the invention, the image of the agricultural pests and diseases to be detected is input into the agricultural pest detection model based on YOLO v4, and the detection result is obtained from the model. The auxiliary training module in the model comprises a guide head and an auxiliary head: the guide head is used to generate soft labels, and the auxiliary head and the guide head are trained based on the soft labels and a loss function, so that the network weights of the model are optimized according to the trained auxiliary head and the detection result is output according to the trained guide head. On the one hand, optimizing the network weights through the auxiliary head improves the effect of the auxiliary module, optimizes the performance of the model and improves the accuracy of the detection result; on the other hand, the prediction of the guide head guides both the auxiliary head and its own prediction, which can further improve the accuracy of agricultural pest and disease detection.
The auxiliary head and the guide head have different inputs; both perform deep supervision on the target, and both are guided by the prediction of the guide head. First, the prediction of the guide head is used as a guide to generate hierarchical labels from coarse to fine; the hierarchical labels are then used for the learning of the auxiliary head and the guide head respectively, as shown in FIG. 3. Generating the hierarchical labels requires a "label assigner", that is, a mechanism that redistributes soft labels after considering the network prediction result together with the real result.
In the embodiment of the present invention, the method further includes:
and generating a coarse label and a fine label according to the guide head, wherein the coarse label is used for the training of the auxiliary head and the fine label is used for the training of the guide head.
The fine label is the same as the soft label generated by the guide head through the label assigner in a traditional YOLO v4 model and is used to guide the output of the classification result;
the coarse labels are used for reducing the positive sample distribution constraint, more grids are allowed to be used as positive samples, and the accuracy of the classification result can be improved.
Introducing a coarse-to-fine guided label assignment strategy enables more accurate classification and localization of pests and diseases. Because the traditional YOLO v4 model only has a guide head for classification output, inaccurate label assignment of the prediction result occurs easily and the user experience is poor.
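As a simplified sketch of how a coarse assignment can admit more grid cells as positives than a fine assignment, the example below marks cells near a GT box center as positive candidates; the radius values and the square-neighborhood rule are assumptions used only for illustration:

```python
import torch

def positive_cell_mask(gt_centers, grid_size, radius):
    """Mark grid cells whose center lies within `radius` cells of a GT box
    center as positive candidates.  A small radius gives the strict (fine)
    assignment used for the guide head; a larger radius gives the relaxed
    (coarse) assignment used for the auxiliary head."""
    ys, xs = torch.meshgrid(
        torch.arange(grid_size, dtype=torch.float32),
        torch.arange(grid_size, dtype=torch.float32),
        indexing="ij",
    )
    cell_centers = torch.stack([xs, ys], dim=-1) + 0.5          # (S, S, 2)
    mask = torch.zeros(grid_size, grid_size, dtype=torch.bool)
    for cx, cy in gt_centers:                                    # GT centers in grid units
        dist = (cell_centers - torch.tensor([cx, cy])).abs().max(dim=-1).values
        mask |= dist <= radius
    return mask

gt = [(12.3, 7.8)]                                               # one GT center
fine_mask = positive_cell_mask(gt, grid_size=20, radius=1.0)     # guide head
coarse_mask = positive_cell_mask(gt, grid_size=20, radius=2.0)   # auxiliary head
```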
Based on any of the above embodiments, as shown in fig. 4, the training process of the agricultural pest detection model based on YOLO v4 includes:
Step 401, calculating, according to a loss function, a first loss value between the prediction result output by the guide head and the real result;
Step 402, guiding the guide head to output a prediction result according to the first loss value and the fine label;
Step 403, calculating, according to the loss function, a second loss value between the prediction result output by the auxiliary head and the real result;
Step 404, guiding the auxiliary head to output a prediction result according to the second loss value and the coarse label.
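A minimal sketch of one training step combining the two loss values is shown below; the model interface, the criterion and the auxiliary loss weight of 0.25 are assumptions, since the disclosure only states that the auxiliary weighting coefficient should not be too large:

```python
def training_step(model, images, targets, criterion, aux_loss_weight=0.25):
    """One optimization step of the YOLO v4 model with guide and auxiliary heads.

    The guide head is supervised with the fine (soft) labels and the auxiliary
    head with the coarse labels; the two loss values are combined, with a small
    weight on the auxiliary term so it does not dominate the guide head."""
    lead_preds, aux_preds = model(images)            # two sets of head outputs
    fine_targets, coarse_targets = targets           # produced by the label assigner

    loss_lead = criterion(lead_preds, fine_targets)      # first loss value
    loss_aux = criterion(aux_preds, coarse_targets)      # second loss value
    return loss_lead + aux_loss_weight * loss_aux
```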
In the embodiment of the present invention, the formula of the loss function is:

$$L(t_p, t_{gt}) = \sum_{k=1}^{K} \lambda_k \sum_{i=1}^{S^2} \sum_{j=1}^{B} \mathbb{1}_{k,i,j}^{obj} \left[ \alpha_{box} L_{box}(t_p, t_{gt}) + \alpha_{obj} L_{obj}(t_p, t_{gt}) + \alpha_{cls} L_{cls}(t_p, t_{gt}) \right]$$

wherein $L(t_p, t_{gt})$ is the loss function; $K$, $S^2$ and $B$ are the number of output feature maps, the number of cells and the number of prior frames in each cell, respectively; $\alpha_{*}$ denotes the weight of each sub-loss function; $\mathbb{1}_{k,i,j}^{obj}$ indicates whether the $j$-th prior frame in the $i$-th cell of the $k$-th output feature map is a positive sample (1 if it is, 0 otherwise); $t_p$ and $t_{gt}$ are the prediction vector and the ground-truth vector; and $\lambda_k$ are balance coefficients used to balance the weights of the output feature maps of each scale.
The sub-loss functions comprise:
a coordinate loss function, a target confidence loss function and a classification loss function, wherein the target confidence loss function and the classification loss function adopt a binary cross-entropy loss with logits, and the coordinate loss function adopts the CIoU loss function.
The expression of the coordinate loss function is as follows:

$$L_{CIoU} = 1 - IoU + \frac{\rho^2(b_p, b_{gt})}{c^2} + \alpha v$$

$$v = \frac{4}{\pi^2} \left( \arctan\frac{w_{gt}}{h_{gt}} - \arctan\frac{w_p}{h_p} \right)^2$$

wherein $\rho^2(b_p, b_{gt})$ is the squared Euclidean distance between the center points of the target frame and the prediction frame, $c$ is the diagonal distance of the smallest box enclosing the target frame and the prediction frame, $v$ is a parameter measuring the consistency of the aspect ratio, $\alpha = v / ((1 - IoU) + v)$ is its trade-off coefficient, $w_{gt}$, $h_{gt}$ are the width and height of the real target frame, and $w_p$, $h_p$ are the width and height of the prediction frame.
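A straightforward PyTorch implementation of the CIoU loss for boxes in (x1, y1, x2, y2) form is sketched below as one possible realization of the coordinate loss:

```python
import math
import torch

def ciou_loss(pred, target, eps=1e-7):
    """CIoU loss between prediction and target boxes given as (x1, y1, x2, y2)."""
    # Intersection and union
    inter_w = (torch.min(pred[..., 2], target[..., 2]) - torch.max(pred[..., 0], target[..., 0])).clamp(0)
    inter_h = (torch.min(pred[..., 3], target[..., 3]) - torch.max(pred[..., 1], target[..., 1])).clamp(0)
    inter = inter_w * inter_h
    area_p = (pred[..., 2] - pred[..., 0]) * (pred[..., 3] - pred[..., 1])
    area_t = (target[..., 2] - target[..., 0]) * (target[..., 3] - target[..., 1])
    union = area_p + area_t - inter + eps
    iou = inter / union

    # Squared distance between box centers
    cx_p, cy_p = (pred[..., 0] + pred[..., 2]) / 2, (pred[..., 1] + pred[..., 3]) / 2
    cx_t, cy_t = (target[..., 0] + target[..., 2]) / 2, (target[..., 1] + target[..., 3]) / 2
    rho2 = (cx_p - cx_t) ** 2 + (cy_p - cy_t) ** 2

    # Squared diagonal of the smallest enclosing box
    cw = torch.max(pred[..., 2], target[..., 2]) - torch.min(pred[..., 0], target[..., 0])
    ch = torch.max(pred[..., 3], target[..., 3]) - torch.min(pred[..., 1], target[..., 1])
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term
    w_p, h_p = pred[..., 2] - pred[..., 0], pred[..., 3] - pred[..., 1]
    w_t, h_t = target[..., 2] - target[..., 0], target[..., 3] - target[..., 1]
    v = (4 / math.pi ** 2) * (torch.atan(w_t / (h_t + eps)) - torch.atan(w_p / (h_p + eps))) ** 2
    alpha = v / (1 - iou + v + eps)

    return 1 - iou + rho2 / c2 + alpha * v

# Example with two box pairs
pred = torch.tensor([[10., 10., 50., 60.], [0., 0., 20., 20.]])
gt = torch.tensor([[12., 8., 48., 62.], [5., 5., 30., 30.]])
print(ciou_loss(pred, gt))
```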
The target confidence loss is calculated from the sample pairs obtained by positive sample matching: the target confidence score $p_o$ in the prediction frame is taken as the prediction, the IoU value between the prediction frame and its corresponding target (GT) frame is taken as the target, and the binary cross entropy between the two gives the final target confidence loss. The formula of the target confidence loss is:

$$L_{obj} = -\left[ p_{iou} \log(p_o) + (1 - p_{iou}) \log(1 - p_o) \right]$$

wherein $p_{iou}$ is the IoU value between the prediction frame and the corresponding GT frame.
the classification loss is similar to the target confidence loss, and is calculated through the class score of the prediction result and the one-hot performance of the target class, and the classification loss function is as follows:
Figure BDA0003915452420000096
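The two binary cross-entropy losses with logits can be realized, for example, as follows; the sample count and class number in the example are arbitrary:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

bce = nn.BCEWithLogitsLoss(reduction="mean")   # binary cross entropy with logits

def objectness_loss(pred_obj_logit, pred_box_iou):
    """Objectness target is the IoU between the matched prediction box and its
    GT box; the loss is binary cross entropy against the raw confidence logit."""
    return bce(pred_obj_logit, pred_box_iou.detach())

def classification_loss(pred_cls_logits, gt_class_index, num_classes):
    """Class targets are the one-hot encoding of the GT class."""
    one_hot = F.one_hot(gt_class_index, num_classes).float()
    return bce(pred_cls_logits, one_hot)

# Example with dummy values for 4 matched positive samples and 10 classes
obj_logits = torch.randn(4)
ious = torch.rand(4)
cls_logits = torch.randn(4, 10)
gt_cls = torch.randint(0, 10, (4,))
loss = objectness_loss(obj_logits, ious) + classification_loss(cls_logits, gt_cls, 10)
```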
the loss of the three parts is calculated by matching the obtained positive sample pairs, each output characteristic diagram is independent, and the final loss value of each part is obtained by directly adding.
The loss between each GT frame and its candidate prior frames is calculated according to the loss function, with the classification loss weight increased in the early stage of training and reduced in the later stage; the first K frames with the smallest loss are kept, and cases where the same prior frame is assigned to multiple GT frames are removed.
It should be noted that the loss functions of the auxiliary head and the guide head are the same, and that the weighting coefficient of the prediction result output by the auxiliary head cannot be too large, otherwise the accuracy of the detection result output by the model becomes low.
According to the agricultural pest and disease detection method provided by the embodiment of the invention, the original target detector framework without deep supervision is improved: an auxiliary head for auxiliary training is added and the weights of the auxiliary head and the guide head are fused, which improves the model performance; meanwhile, a new label assignment method is introduced in which the prediction of the guide head guides both the auxiliary head and the guide head itself, improving the accuracy of agricultural pest and disease detection.
Based on any of the above embodiments, as shown in fig. 5, the agricultural pest detection model based on YOLO v4 is obtained by training according to a training sample set, and the generation process of the training sample set includes:
step 501, clustering a plurality of GT frames in a training data set to obtain a plurality of prior frames;
in the embodiment of the invention, the clustering algorithm is, for example, a K-Means clustering algorithm, and 9 prior boxes are obtained.
Step 502, matching each GT frame with each prior frame;
step 503, screening out a positive sample, a negative sample and a background according to the matching result;
in the embodiment of the invention, the screening of the positive sample, the negative sample and the background according to the matching result comprises the following steps:
respectively calculating the frame size ratio of each GT frame to each prior frame, wherein the frame size ratio comprises a frame width ratio and a frame height ratio;
if the maximum value of the frame size ratio is smaller than a preset ratio threshold, judging that the corresponding GT frame is a positive sample;
in the embodiment of the invention, the ratio of the frame width to the frame height of each GT frame and each prediction frame is respectively calculated;
taking the maximum value of the ratios of the frame widths and the maximum value of the ratios of the frame heights;
and if the maximum value of the ratio of the frame width to the maximum value of the ratio of the frame height is smaller than a preset ratio threshold, judging that the corresponding prior frame is a positive sample.
If the maximum value of the frame size ratio is not smaller than a preset ratio threshold, judging that the corresponding GT frame is a negative sample;
and if the frame size ratio of the GT frame to any prior frame is less than or equal to 1, judging that the GT frame is a background.
If a GT frame has no matching prior frame, the GT frame is treated as background by default and does not participate in model training; the sizes of the prior frames can be adjusted by one skilled in the art.
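The screening by frame size ratio can be sketched as follows; the ratio threshold of 4.0 is an assumption, and the width/height ratios are evaluated in whichever direction is larger so that both oversized and undersized mismatches are rejected:

```python
import numpy as np

def screen_matches(gt_wh, prior_wh, ratio_thresh=4.0):
    """For every GT frame / prior frame pair, compute the width ratio and the
    height ratio (taking whichever direction is >= 1) and keep the pair as a
    positive match when the larger of the two stays below the threshold;
    otherwise the pair is a negative match.  A GT frame matched by no prior
    frame at all is treated as background."""
    r = gt_wh[:, None, :] / prior_wh[None, :, :]          # (num_gt, num_priors, 2)
    size_ratio = np.maximum(r, 1.0 / r).max(axis=-1)      # worst of width/height ratio
    positive = size_ratio < ratio_thresh
    background = ~positive.any(axis=1)
    return positive, background

gt = np.array([[30.0, 60.0], [400.0, 20.0]])
priors = np.array([[10.0, 13.0], [33.0, 62.0], [116.0, 90.0]])
pos, bg = screen_matches(gt, priors)
```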
And step 504, generating a training sample set according to the positive sample, the negative sample and the background.
In this embodiment of the present invention, the training sample set generating process further includes:
and taking the preset number neighborhood grids with the nearest central position of the GT frame as positive samples.
The preset number is, for example, 2, and the positive samples can be expanded by increasing the number of predicting grid cells. Taking the 2 grid cells nearest to the center of the GT frame as additional predicting cells, so that the GT frame is predicted by the prior frame in its own cell and in 2 neighboring cells, the number of positive samples becomes 3 times that of the conventional YOLO v4 model (the maximum number of matches in the embodiment of the present invention increases from 9 to 27).
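A sketch of selecting the two nearest neighboring grid cells (boundary clipping omitted for brevity) is given below:

```python
def expand_to_neighbor_cells(gt_center_xy, num_neighbors=2):
    """Besides the cell containing the GT center, take the nearest
    `num_neighbors` neighboring cells (the closer horizontal neighbor and the
    closer vertical neighbor), so each matched prior frame can be predicted
    from three cells instead of one."""
    cx, cy = gt_center_xy                                  # center in grid units
    cell = (int(cx), int(cy))
    frac_x, frac_y = cx - cell[0], cy - cell[1]
    neighbors = [
        (cell[0] + (1 if frac_x > 0.5 else -1), cell[1]),  # closer horizontal neighbor
        (cell[0], cell[1] + (1 if frac_y > 0.5 else -1)),  # closer vertical neighbor
    ]
    return [cell] + neighbors[:num_neighbors]

print(expand_to_neighbor_cells((12.3, 7.8)))   # [(12, 7), (11, 7), (12, 8)]
```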
According to the agricultural pest detection method provided by the invention, if the frame size ratios of a GT frame to a plurality of prior frames are all greater than the preset ratio threshold, the matching strategy uses the SimOTA method, which comprises the following steps:
arranging the prior frames whose ratios exceed the preset ratio threshold in ascending order of matching cost (cost), wherein the matching cost is obtained by adding the IoU loss and the classification loss calculated from the prediction result and the real result;
IoU (Intersection over Union) is a standard measure of the accuracy of detecting the corresponding object in a particular data set. IoU is obtained by dividing the overlapping area of two regions by the area of their union, and the computed IoU is compared with a set threshold. For example, the prediction frames obtained from the feature data are compared with the real frame, the predictions with the 10 largest IoU values are kept, and the sum of these 10 IoU values gives the number of positive samples, with a minimum value of 1.
The prior frames are then screened in this order, and a preset threshold number of them are taken as positive samples.
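The SimOTA-style selection for a single GT frame can be sketched as follows; taking the IoU loss simply as 1 - IoU and weighting it equally with the classification loss are assumptions made only for this illustration:

```python
import numpy as np

def simota_select(iou, cls_loss, max_candidates=10):
    """SimOTA-style selection of positive prior frames for one GT frame.

    iou:      IoU between each candidate prior's prediction and the GT frame.
    cls_loss: classification loss of each candidate's prediction.
    The matching cost is the IoU loss plus the classification loss; the dynamic
    number of kept candidates is the (floored) sum of the top-10 IoU values,
    with a minimum of 1."""
    cost = (1.0 - iou) + cls_loss
    topk_ious = np.sort(iou)[::-1][:max_candidates]
    dynamic_k = max(int(topk_ious.sum()), 1)
    order = np.argsort(cost)                 # smallest matching cost first
    return order[:dynamic_k]                 # indices of priors kept as positives

iou = np.array([0.82, 0.65, 0.40, 0.15])
cls_loss = np.array([0.30, 0.55, 0.90, 1.20])
print(simota_select(iou, cls_loss))          # keeps the 2 lowest-cost candidates
```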
In the embodiment of the invention, after model training is finished, the model is analyzed and compared through experiments in terms of recognition precision, time consumption and real-time performance on agricultural pest and disease images. The precision of the model is evaluated using the Precision-Recall (P-R) curve, AP (average precision) and mAP (the mean of the AP values over all categories), where the AP value is the area enclosed by the P-R curve and the coordinate axes and mAP is the mean of the APs of all categories. The formulas for the recall rate (R) and the precision rate (P) of the P-R curve are as follows:
$$R = \frac{T_P}{T_P + F_N}$$

$$P = \frac{T_P}{T_P + F_P}$$

wherein $T_P$ is a positive class judged as a positive class (true positive), $F_N$ is a positive class judged as a negative class (false negative), and $F_P$ is a negative class judged as a positive class (false positive). The prediction effect on the pest and disease images in the test set is compared and analyzed by the model prediction module to obtain the optimal detection model.
According to the agricultural disease and pest detection method provided by the embodiment of the invention, the model training sample set is screened, so that the learning performance of the model is improved, and the accuracy of agricultural disease and pest detection is improved.
The agricultural disease and pest detection device provided by the invention is described below, and the agricultural disease and pest detection device described below and the agricultural disease and pest detection method described above can be referred to correspondingly.
Fig. 6 is a schematic view of an agricultural pest detection device provided by an embodiment of the present invention, and as shown in fig. 6, the agricultural pest detection device provided by the embodiment of the present invention includes:
the input module 601 is used for inputting the agricultural disease and pest image to be detected into an agricultural disease and pest detection model based on YOLO v 4;
the output module 602 is configured to obtain a detection result according to an agricultural pest detection model based on YOLO v 4; the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of a prediction result output by the guide head and a prediction result output by the auxiliary head.
The agricultural disease and pest detection device provided by the embodiment of the invention inputs the agricultural disease and pest image to be detected into an agricultural disease and pest detection model based on YOLO v 4; obtaining a detection result according to an agricultural pest detection model based on YOLO v 4; the agricultural pest detection model based on YOLO v4 comprises a guide head and an auxiliary head, the detection result is obtained by weighted summation according to the prediction result output by the guide head and the prediction result output by the auxiliary head, the auxiliary head is added, the alternative classification result can be added, the missing detection is avoided, and the accuracy of agricultural pest detection is improved.
Fig. 7 illustrates a physical structure diagram of an electronic device, and as shown in fig. 7, the electronic device may include: a processor (processor) 710, a communication Interface (Communications Interface) 720, a memory (memory) 730, and a communication bus 740, wherein the processor 710, the communication Interface 720, and the memory 730 communicate with each other via the communication bus 740. Processor 710 may invoke logic instructions in memory 730 to perform a method of agricultural pest detection comprising: inputting an agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4; obtaining a detection result according to an agricultural pest detection model based on YOLO v 4; the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation according to a prediction result output by the guide head and a prediction result output by the auxiliary head.
In addition, the logic instructions in the memory 730 can be implemented in the form of software functional units and stored in a computer readable storage medium when the software functional units are sold or used as independent products. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In another aspect, the present invention also provides a non-transitory computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a method for agricultural pest detection provided by the above methods, the method comprising: inputting an agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4; obtaining a detection result according to an agricultural pest detection model based on YOLO v 4; the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation according to a prediction result output by the guide head and a prediction result output by the auxiliary head.
The above-described embodiments of the apparatus are merely illustrative, and the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. One of ordinary skill in the art can understand and implement it without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (12)

1. An agricultural pest detection method is characterized by comprising the following steps:
inputting an agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4;
obtaining a detection result according to the agricultural disease and insect pest detection model based on the YOLO v 4;
the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of a prediction result output by the guide head and a prediction result output by the auxiliary head.
2. An agricultural pest detection method according to claim 1 further including:
and generating a coarse label and a fine label according to the guide head, wherein the coarse label is used for the training of the auxiliary head and the fine label is used for the training of the guide head.
3. An agricultural pest detection method according to claim 2, wherein the training process of the agricultural pest detection model based on YOLO v4 includes:
calculating, according to a loss function, a first loss value between the prediction result output by the guide head and the real result;
guiding the guide head to output a prediction result according to the first loss value and the fine label;
calculating, according to a loss function, a second loss value between the prediction result output by the auxiliary head and the real result;
and guiding the auxiliary head to output a prediction result according to the second loss value and the coarse label.
4. An agricultural pest detection method according to claim 3, wherein the loss function formula is:

$$L(t_p, t_{gt}) = \sum_{k=1}^{K} \lambda_k \sum_{i=1}^{S^2} \sum_{j=1}^{B} \mathbb{1}_{k,i,j}^{obj} \left[ \alpha_{box} L_{box}(t_p, t_{gt}) + \alpha_{obj} L_{obj}(t_p, t_{gt}) + \alpha_{cls} L_{cls}(t_p, t_{gt}) \right]$$

wherein $L(t_p, t_{gt})$ is the loss function; $K$, $S^2$ and $B$ are the number of output feature maps, the number of cells and the number of prior frames in each cell, respectively; $\alpha_{*}$ denotes the weight of each sub-loss function; $\mathbb{1}_{k,i,j}^{obj}$ indicates whether the $j$-th prior frame in the $i$-th cell of the $k$-th output feature map is a positive sample (1 if it is, 0 otherwise); $t_p$ and $t_{gt}$ are the prediction vector and the ground-truth vector; and $\lambda_k$ are balance coefficients used to balance the weights of the output feature maps of each scale.
5. An agricultural pest detection method according to claim 4 wherein the sub-loss function includes:
a coordinate loss function, a target confidence loss function and a classification loss function, wherein the target confidence loss function and the classification loss function adopt a binary cross-entropy loss with logits, and the coordinate loss function adopts the CIoU loss function.
6. The agricultural pest detection method according to claim 1, wherein the agricultural pest detection model based on YOLO v4 is obtained by training according to a training sample set, and the generation process of the training sample set comprises the following steps:
clustering a plurality of GT frames in a training data set to obtain a plurality of prior frames;
matching each GT frame with each prior frame;
screening out a positive sample, a negative sample and a background according to a matching result;
and generating a training sample set according to the positive sample, the negative sample and the background.
7. An agricultural pest detection method according to claim 6 wherein the training sample set generation process further includes:
and taking a preset number of neighborhood grids nearest to the center position of the GT frame as positive samples.
8. An agricultural pest detection method according to claim 6, wherein screening out a positive sample, a negative sample and a background according to the matching result comprises:
respectively calculating the frame size ratio of each GT frame to each prior frame, wherein the frame size ratio comprises a frame width ratio and a frame height ratio;
if the maximum value of the frame size ratio is smaller than a preset ratio threshold, judging that the corresponding GT frame is a positive sample;
if the maximum value of the frame size ratio is not smaller than a preset ratio threshold, judging that the corresponding GT frame is a negative sample;
and if the frame size ratio of the GT frame to any prior frame is less than or equal to 1, judging that the GT frame is a background.
9. An agricultural pest detection method according to claim 6, wherein, if the frame size ratios of a GT frame to a plurality of prior frames are all greater than the preset ratio threshold, the method comprises:
arranging the prior frames whose ratios exceed the preset ratio threshold in ascending order of matching cost, wherein the matching cost is obtained by adding the IoU loss and the classification loss calculated from the prediction result and the real result;
and screening, in that order, a preset threshold number of prior frames as positive samples.
10. An agricultural pest and disease detection device, characterized by comprising:
the input module is used for inputting the agricultural disease and insect pest image to be detected into an agricultural disease and insect pest detection model based on YOLO v 4;
the output module is used for obtaining a detection result according to the agricultural disease and pest detection model based on the YOLO v 4; the agricultural pest detection model based on the YOLO v4 comprises a guide head and an auxiliary head, and the detection result is obtained by weighted summation of a prediction result output by the guide head and a prediction result output by the auxiliary head.
11. An electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the agricultural pest detection method of any one of claims 1 to 9.
12. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the agricultural pest detection method of any one of claims 1 to 9.
CN202211338472.9A 2022-10-28 2022-10-28 Agricultural pest detection method and device based on YOLO v4 Active CN115546186B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211338472.9A CN115546186B (en) 2022-10-28 2022-10-28 Agricultural pest detection method and device based on YOLO v4

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211338472.9A CN115546186B (en) 2022-10-28 2022-10-28 Agricultural pest detection method and device based on YOLO v4

Publications (2)

Publication Number Publication Date
CN115546186A true CN115546186A (en) 2022-12-30
CN115546186B CN115546186B (en) 2023-07-14

Family

ID=84718052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211338472.9A Active CN115546186B (en) 2022-10-28 2022-10-28 Agricultural pest detection method and device based on YOLO v4

Country Status (1)

Country Link
CN (1) CN115546186B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170307889A1 (en) * 2016-02-23 2017-10-26 Compedia Software and Hardware Development Ltd. Vision-assisted input within a virtual world
US20210366144A1 (en) * 2020-05-21 2021-11-25 Verizon Connect Ireland Limited Systems and methods for utilizing a deep learning model to determine vehicle viewpoint estimations
US11176384B1 (en) * 2020-09-18 2021-11-16 XMotors.ai Inc. Apparatus, system and method for object detection around vehicle and application of same
CN114220035A (en) * 2021-12-23 2022-03-22 中科合肥智慧农业协同创新研究院 Rapid pest detection method based on improved YOLO V4

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GUO Shouxiang et al.: "Yolo-C: Prohibited item detection in X-ray images based on a single-stage network", Laser & Optoelectronics Progress, vol. 58, no. 8, pages 1-10 *

Also Published As

Publication number Publication date
CN115546186B (en) 2023-07-14

Similar Documents

Publication Publication Date Title
CN111680706B (en) Dual-channel output contour detection method based on coding and decoding structure
CN111079602A (en) Vehicle fine granularity identification method and device based on multi-scale regional feature constraint
CN108830285A (en) A kind of object detection method of the reinforcement study based on Faster-RCNN
CN111553200A (en) Image detection and identification method and device
CN111832608B (en) Iron spectrum image multi-abrasive particle identification method based on single-stage detection model yolov3
CN111898432B (en) Pedestrian detection system and method based on improved YOLOv3 algorithm
CN102034107B (en) Unhealthy image differentiating method based on robust visual attention feature and sparse representation
Sabrol et al. Fuzzy and neural network based tomato plant disease classification using natural outdoor images
CN114841244B (en) Target detection method based on robust sampling and mixed attention pyramid
CN114758288A (en) Power distribution network engineering safety control detection method and device
CN109886146B (en) Flood information remote sensing intelligent acquisition method and device based on machine vision detection
CN110009628A (en) A kind of automatic testing method for polymorphic target in continuous two dimensional image
WO2024032010A1 (en) Transfer learning strategy-based real-time few-shot object detection method
CN115546187A (en) Agricultural pest and disease detection method and device based on YOLO v5
CN111723749A (en) Method, system and equipment for identifying wearing of safety helmet
CN113112498A (en) Grape leaf scab identification method based on fine-grained countermeasure generation network
CN112270671B (en) Image detection method, device, electronic equipment and storage medium
CN117132802A (en) Method, device and storage medium for identifying field wheat diseases and insect pests
CN107368847A (en) A kind of crop leaf diseases recognition methods and system
CN115620083B (en) Model training method, face image quality evaluation method, equipment and medium
CN115546186A (en) Agricultural pest and disease detection method and device based on YOLO v4
CN110659585A (en) Pedestrian detection method based on interactive attribute supervision
CN115631462A (en) AM-YOLOX-based strawberry disease and pest detection method and system
CN112465821A (en) Multi-scale pest image detection method based on boundary key point perception
CN115049870A (en) Target detection method based on small sample

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant