CN111753646B - Agricultural pest detection classification method fusing population seasonal fluctuation information - Google Patents

Agricultural pest detection classification method fusing population seasonal fluctuation information

Info

Publication number
CN111753646B
Authority
CN
China
Prior art keywords
pest, pests, frames, detection, IOU
Legal status
Active
Application number
CN202010395825.3A
Other languages
Chinese (zh)
Other versions
CN111753646A
Inventors
Cai Shuping (蔡舒平)
Sun Zhongming (孙仲鸣)
Shen Yue (沈跃)
Current Assignee
Jiangsu University
Original Assignee
Jiangsu University
Application filed by Jiangsu University
Priority: CN202010395825.3A
Publication of application CN111753646A
Application granted
Publication of granted patent CN111753646B
Legal status: Active

Classifications

    • G06V20/00 Scenes; scene-specific elements
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/25 Fusion techniques
    • G06N3/045 Combinations of networks
    • G06N3/08 Learning methods
    • G06V2201/07 Target detection


Abstract

The invention provides an agricultural pest detection and classification method fusing population seasonal fluctuation information, belongs to the technical field of agricultural pest target detection, and improves the classification regression function and the NMS strategy of the traditional YOLOv3 target detection algorithm. The seasonal fluctuation patterns of different pest populations are used to assist detection, which reduces the probability of misjudging pests with similar appearance and solves the traditional algorithm's problem of judging other objects that resemble pests to be pests, so that pesticide spraying is more accurate, pesticide is not wasted, and cost is reduced. The improved NMS strategy can also frame clustered pests of the same species with a single prediction box, helping the pesticide spraying vehicle spray over a large area and improving spraying efficiency.

Description

Agricultural pest detection classification method fusing population seasonal fluctuation information
Technical Field
The invention belongs to the technical field of agricultural pest target detection, and particularly relates to an agricultural pest detection and classification method fusing population seasonal fluctuation information.
Background
With the modernization of agriculture, planting scale keeps expanding, and crop disease and pest problems grow increasingly serious, with many species, wide impact and frequent outbreaks, bringing great losses to agricultural production. Using pesticides to kill pests is the most direct and effective solution, but agricultural pests are highly varied, and agricultural producers often lack the agricultural knowledge to judge the pest species and therefore cannot choose the correct pesticide. Blanket spraying wastes pesticide and pollutes the environment over a large area; at the same time, it is impractical to identify pests over a large area one by one manually and select a suitable pesticide for precise control.
In recent years, with the rapid development of machine vision, it has become possible to address pest problems in agricultural production by machine vision; in particular, target detection techniques based on deep learning have been widely used for pest detection and identification, the most representative algorithm being YOLOv3. Although YOLOv3 achieves accurate detection and classification of targets, it still has a shortcoming when applied to agricultural pest identification: image-based identification mainly considers color, shape and texture features, so when pests with very similar shapes, colors and textures are to be identified, the target pest is often misjudged as another similar pest from image features alone, and debris of similar shape is sometimes misjudged as a pest, producing errors that clearly contradict the actual situation. This defect causes an intelligent pesticide spraying device relying on a visual sensor to spray wrongly or in the wrong place, affecting the pest control effect.
When distinguishing pest categories, YOLOv3 performs regression adjustment of the prediction boxes. During this adjustment, the existing non-maximum suppression (NMS) strategy retains a separate prediction box for each pest even when clustered pests are detected, which forces the pesticide spraying device to execute a one-by-one cycle of identification, positioning and single-point spraying. This is clearly inefficient and needs improvement.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides an agricultural pest detection and classification method fusing population seasonal fluctuation information. An improved YOLOv3 algorithm is adopted to detect and identify agricultural pests, which effectively reduces the misjudgment rate for pests of similar appearance, lowers control cost, and detects cluster areas of pests of the same species so that pesticide spraying is faster and more efficient.
The present invention achieves the above technical object by the following means.
The agricultural pest detection and classification method fusing population seasonal fluctuation information comprises the following steps:
S1: investigating the number of target pests in a designated area, photographing, inputting the pest pictures into a target detection model and obtaining feature maps;
S2: detecting and classifying the feature maps with an improved classification regression function;
S3: making anchor boxes according to the feature maps, performing offset regression calculation on the anchor boxes, screening prediction bounding boxes according to the intersection-over-union (IOU) of the anchor boxes and the real boxes, and optimizing the prediction bounding boxes with an improved NMS strategy;
S4: determining the species and position information of the target pests according to the detection classification result and the prediction bounding box calculation result.
Further, investigating the number of target pests in the designated area in S1 specifically comprises: selecting 3 tea gardens in different environments and 5 plots of 1 m² in each garden, recording the number of target pests from the crown of the tea tree to the ground, and surveying continuously for 3 years, every 10 days from April to September and every 15 days from October to March of each year.
Further, the current-month population density of the target pests is calculated as:
D_i = C_i / S
wherein i denotes the pest species label; D_i1, D_i2, …, D_i12 denote the population density of the i-th pest in January, February, …, December respectively; N denotes the number of surveys in the current month; S denotes the total area surveyed in the current month; and C_i denotes the number of individuals of the i-th pest counted in the current month.
Further, the target detection model is an improved YOLOv3 model with MobileNetv as the backbone network.
Further, S1 further comprises: photographing and labeling the pests from multiple angles, making a data set from the labeled photographs, and dividing it into a training set and a test set at a ratio of 8:2 for training the YOLOv3 model.
Further, S2 is specifically:
The general form of the binary logistic regression function is:
P(y=1|x) = 1/(1 + e^(-(k_1·x_1 + k_2·x_2 + … + k_n·x_n)))
wherein x_n denotes the feature-variable vector and k_n the regression-coefficient vector;
The binary logistic regression function is expanded into a multi-class logistic regression function and combined with the pest seasonal-fluctuation function λ_i(t) to obtain the probability of each of the K candidate classes:
P(y=i|x) = λ_i(t)·e^(k_i·x) / Σ_{j=1..K} λ_j(t)·e^(k_j·x)
wherein i denotes the pest species label and λ_i(t) denotes the seasonal-fluctuation parameter of the i-th target pest at time t.
Further, the pest seasonal-fluctuation function λ_i(t) is calculated as follows:
when 0 < y < μ: λ_i(t) = 1
when y ≥ μ: λ_i(t) = α_i / Σ_{j=1..K} α_j
wherein y denotes the standard deviation of the population-density values of the pest species at time t; μ denotes the threshold on the degree of difference between the population densities; α_i denotes the population-density value of the i-th pest at time t; and Σ_{j} α_j denotes the sum of the population-density values of all species at time t.
Further, S3 specifically comprises: making anchor boxes of different sizes and aspect ratios according to the feature maps, performing linear regression between the anchor-box coordinates and the real-box coordinates, and screening out a batch of prediction bounding boxes according to the intersection-over-union (IOU) of the anchor boxes and the real boxes; judging each pair of prediction bounding boxes of the same class: when IOU > γ, the circumscribed rectangle of the two boxes is taken as the output prediction bounding box and the larger of the two confidences as the final output confidence; when IOU ≤ γ, both boxes are retained; wherein γ denotes the IOU threshold; the judging process is repeated until no overlapping prediction bounding boxes remain, the IOU value and the IOU threshold being changed during the repetition; finally, duplicate boxes with lower confidence values are eliminated.
Further, during the repetition, the ratio of the intersection area of the two prediction bounding boxes to the area of each of the two boxes is calculated, the larger ratio is taken as the new IOU value, and the IOU threshold is raised.
The invention has the following beneficial effects:
Compared with the prior art, the invention improves the traditional YOLOv3 target detection algorithm so that pest control is more accurate and intelligent. Whereas the traditional YOLOv3 algorithm detects and judges target pests from image features alone, the method assists detection with the seasonal fluctuation patterns of different pest populations, greatly reducing the probability of misjudging pests with similar appearance and solving the traditional algorithm's problem of judging other objects resembling pests to be pests, so that pesticide spraying is more accurate, pesticide waste is reduced and cost is lowered. The invention also improves the NMS strategy of the algorithm, framing clustered pests of the same species with a single prediction box, so that the pesticide spraying vehicle can automatically adjust its spraying range and improve spraying efficiency.
Drawings
FIG. 1 is a graph of the population density of the 4 pest species over time according to the present invention;
FIG. 2 is a flow chart of agricultural pest detection according to the present invention;
FIG. 3 shows detection classification results and prediction boxes for a picture of the leafhopper (Deltocephalus), where FIG. 3(a) is obtained by the traditional YOLOv3 algorithm model at t=4 and FIG. 3(b) by the improved YOLOv3 algorithm model at t=4;
FIG. 4 shows detection classification results and prediction boxes for collected pictures of the planthopper (Geisha), where FIG. 4(a) is obtained by the traditional YOLOv3 algorithm model at t=7 and FIG. 4(b) by the improved YOLOv3 algorithm model at t=7.
Detailed Description
The invention will be further described with reference to the drawings and the specific embodiments, but the scope of the invention is not limited thereto.
It was found that agricultural pests pass through stages from egg to larva to adult within one year, each stage quite distinct in appearance, and that the population density of pests of the same species and the same form fluctuates considerably over the 12 months of the year. Meanwhile, different species, and different forms of the same species, differ greatly in population density within one period. When a pest cannot be accurately distinguished from color, shape and texture features alone, its current theoretical occurrence probability can be estimated from these population-density differences; the difference in occurrence probability changes the confidence among pest varieties when classifying pest images, providing a discriminating effect and avoiding recognition results that contradict common sense. The specific process is as follows:
In this embodiment, the improved YOLOv3 algorithm with the lightweight convolutional neural network MobileNetv as the backbone network is selected as the target detection model, which greatly reduces the number of parameters without losing too much accuracy and shortens training and prediction time.
Step 1: survey sampling and data preparation;
For convenience of investigation, the survey subjects in this embodiment are preferably 4 pest species of similar appearance commonly found in tea gardens: the flatid planthopper (Geisha), the small green leafhopper (Empoasca), the large green leafhopper (Cicadella) and the leafhopper (Deltocephalus). 3 tea gardens in different environments are selected, 5 plots of 1 m² each are chosen at random in each garden, the number of target pests from the crown of the tea bushes to the ground in the designated plots is recorded, and the survey is continued for 3 years to reduce error, every 10 days from April to September and every 15 days from October to March of each year.
The current-month population density of each pest species is calculated as:
D_i = C_i / S
wherein i denotes the pest species label (i = 1, 2, 3, 4 in this embodiment); D_i1, D_i2, …, D_i12 denote the population density of the i-th pest in January, February, …, December respectively, in individuals/m²; N denotes the number of surveys in the current month; S denotes the total area surveyed in the current month, in m²; and C_i denotes the number of individuals of the i-th pest counted in the current month.
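The monthly density bookkeeping above can be sketched as follows. The original formula image does not survive in this text dump, so the sketch assumes D_i = C_i / S with S the total area covered by the month's N surveys (3 gardens × 5 plots of 1 m², i.e. 15 m² per survey); function and variable names are illustrative.

```python
def monthly_density(counts_per_survey, area_per_survey_m2=15.0):
    """Current-month population density D_i of one pest species.

    counts_per_survey: one count per survey in the month
    (3 gardens x 5 plots of 1 m^2 = 15 m^2 covered per survey).
    """
    n = len(counts_per_survey)            # N: surveys this month
    s = n * area_per_survey_m2            # S: total area surveyed (m^2)
    c_i = sum(counts_per_survey)          # C_i: individuals counted
    return c_i / s                        # D_i in individuals per m^2

# April is surveyed every 10 days, i.e. 3 surveys in the month:
d_april = monthly_density([45, 60, 30])   # -> 3.0 individuals/m^2
```

Averaging the three yearly values of each monthly density then gives the curves plotted against time.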
A graph of the population density of the 4 pest species over time, shown in FIG. 1, is drawn from the year-averaged population-density values.
While counting the specified pest species, the pests are photographed from multiple angles and the photographs are labeled. Where pests cannot be photographed, pest pictures are collected from the Internet. All labeled photographs, 500 in total, are made into a data set, which is divided into a training set and a test set at a ratio of 8:2 for training the YOLOv3 target detection model. The trained model is deployed in the detection system of a pesticide spraying vehicle; during pest control, the vehicle inputs the captured original pest picture into the detection system, performs data augmentation and resizing on the picture, and then feeds the adjusted picture into the YOLOv3 target detection model.
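The 8:2 division described above can be sketched as a simple shuffled split. Only the 500-picture count and the 8:2 ratio come from the text; the file names, the seed and the helper function are illustrative.

```python
import random

def split_dataset(image_ids, train_ratio=0.8, seed=0):
    """Shuffle the labeled pictures and cut them into train/test sets."""
    ids = list(image_ids)
    random.Random(seed).shuffle(ids)      # reproducible shuffle
    cut = int(len(ids) * train_ratio)     # 8:2 -> first 80% for training
    return ids[:cut], ids[cut:]

pictures = [f"pest_{i:03d}.jpg" for i in range(500)]   # 500 labeled photos
train_set, test_set = split_dataset(pictures)          # 400 / 100 pictures
```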
Step 2: accurately detect and classify the pests with the improved classification regression function. The specific steps are as follows:
YOLOv3 performs feature extraction on the fixed-size input pest picture with its convolutional neural network to obtain feature maps, and performs classification regression on the feature maps. The classification regression function in this embodiment is the logistic regression function (Logistic Regression), whose general form is:
P(y=1|x) = 1/(1 + e^(-(k_1·x_1 + k_2·x_2 + … + k_n·x_n)))
wherein x_n denotes the feature-variable vector; k_n is the regression-coefficient vector, representing the influence of the n feature variables on the result.
The logistic regression function is a binary classifier, and the YOLOv3 model classifies targets with several independent binary logistic regression functions; the invention therefore expands the binary logistic regression function into a multi-class logistic regression function and combines it with the pest seasonal-fluctuation function λ_i(t) to obtain the probability of each of the K candidate classes, from which the pest species in the picture is judged. The probability is calculated as:
P(y=i|x) = λ_i(t)·e^(k_i·x) / Σ_{j=1..K} λ_j(t)·e^(k_j·x)
wherein λ_i(t) denotes the seasonal-fluctuation parameter of the i-th target pest at time t; it reflects how the share of a given species in the total population density of the tested pests rises and falls with the seasons.
λ_i(t) is calculated as follows:
The population-density values of the pest species at time t are read from the density-versus-time graph, and their standard deviation y, which reflects the degree of difference among the current population densities, is computed. A difference threshold μ is set according to actual conditions (μ = 5 in this embodiment); the pest seasonal-fluctuation function λ_i(t) only takes effect when y ≥ μ, so λ_i(t) is calculated as:
when 0 < y < μ: λ_i(t) = 1 (4)
when y ≥ μ: λ_i(t) = α_i / Σ_{j=1..K} α_j (5)
wherein α_i denotes the population-density value of the i-th pest at time t, and Σ_{j} α_j denotes the sum of the population-density values of all species at time t.
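The seasonal-fluctuation weighting and the fused class probability can be sketched together as follows. The formula images are missing from this text dump, so the forms used here (λ_i(t) as the density share α_i/Σα_j once the standard deviation y reaches μ, with λ multiplying the softmax numerators) are a reconstruction from the surrounding variable definitions; all names are illustrative.

```python
import math

def seasonal_weights(densities, mu=5.0):
    """lambda_i(t) for every species from the current densities alpha_i."""
    k = len(densities)
    mean = sum(densities) / k
    y = math.sqrt(sum((a - mean) ** 2 for a in densities) / k)  # std dev
    if y < mu:                      # densities too similar: no reweighting
        return [1.0] * k
    total = sum(densities)
    return [a / total for a in densities]   # density share of each species

def fused_probabilities(scores, weights):
    """Softmax over raw class scores k_i . x, numerators scaled by lambda_i(t)."""
    exps = [w * math.exp(s) for s, w in zip(scores, weights)]
    z = sum(exps)
    return [e / z for e in exps]

# mu = 5 as in the embodiment; the density values are illustrative
w = seasonal_weights([20.0, 2.0, 2.0, 2.0])
p = fused_probabilities([1.2, 1.0, 0.8, 1.0], w)
```

With near-equal image scores, the class whose population density currently dominates receives the highest fused probability, which is the discriminating effect described above.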
Step 3: further optimize overlapping same-class prediction bounding boxes with the improved NMS strategy.
The existing YOLOv3 algorithm makes 9 anchor boxes (anchor boxes) of different sizes and aspect ratios around the center point of each predicted object in the feature-map grid, performs offset regression on the anchor boxes, that is, linear regression between the anchor-box coordinates and the real-box (ground-truth bbox) coordinates so that the anchor boxes approach the real boxes, then screens out a batch of prediction bounding boxes (bbox) according to the intersection-over-union (IOU) between the anchor boxes and the real boxes, and finally uses the non-maximum suppression (NMS) strategy to reject duplicate boxes with lower confidence values.
In actual detection, however, many pests cluster on crops, and separating clustered pests of the same category into countless individual detection boxes to be detected and located one by one makes pest control inefficient. The invention therefore improves the NMS strategy by merging multiple dense independent prediction boxes into one large box framing several pests of the same category, so that one spraying pass can cover a large designated area and spraying efficiency is improved. The specific improvement of the NMS strategy is as follows:
Judge each pair of prediction bounding boxes of the same class: if IOU > γ, take the circumscribed rectangle of the two boxes as the output prediction bounding box and the higher of the two confidences as its confidence; if IOU ≤ γ, retain both boxes; here γ denotes the IOU threshold, set to 0.4 in this embodiment. Repeat these steps until no overlapping prediction bounding boxes remain. During the repetition, compute the ratio of the intersection area of the two prediction bounding boxes to the area of each box, and take the larger ratio as the new IOU value; this filters out repeated detections of local positions inside a same-class prediction box, that is, the case of a large box containing a small box. The IOU threshold is also raised so that two prediction bounding boxes that are merely close to each other are not merged away. Finally, remove the boxes with lower confidence values.
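The merging step above can be sketched as follows, with γ = 0.4 as in the embodiment. The box format and helper names are illustrative, and the sketch uses plain IoU throughout; the refined ratio (intersection over each box's own area) and the raised threshold from the repetition step are left out of this minimal version.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def merge_same_class(boxes, confs, gamma=0.4):
    """Merge same-class boxes with IoU > gamma into their circumscribed
    rectangle, keeping the higher confidence; repeat until stable."""
    boxes, confs = list(boxes), list(confs)
    merged = True
    while merged:
        merged = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                if iou(boxes[i], boxes[j]) > gamma:
                    a, b = boxes[i], boxes[j]
                    boxes[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    confs[i] = max(confs[i], confs[j])
                    del boxes[j], confs[j]
                    merged = True
                    break
            if merged:
                break
    return boxes, confs

# Two overlapping same-class boxes are merged; the distant one is kept.
out_boxes, out_confs = merge_same_class(
    [(0, 0, 10, 10), (2, 2, 12, 12), (50, 50, 60, 60)], [0.9, 0.8, 0.7])
```

The resulting single large box is what lets the spraying vehicle cover a whole cluster in one pass instead of aiming at each small box in turn.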
Step 4: the agricultural pest detection flow chart is shown in FIG. 2. The position and extent of the pests are determined from the prediction-box result of step 3, the pest species is determined from the detection classification result of step 2, and the corresponding pesticide is then selected for spraying.
As shown in FIG. 3, FIG. 3(a) is the detection classification result and prediction boxes for a picture of the leafhopper (Deltocephalus) obtained by the traditional YOLOv3 algorithm model at t=4, and FIG. 3(b) the result obtained by the improved YOLOv3 algorithm model; the upper left corner of each prediction box shows the pest name and the confidence. As the figure shows, the traditional algorithm model misjudges the target leafhopper (Deltocephalus) as the planthopper (Geisha), while the improved algorithm model judges the target pest more accurately.
As shown in FIG. 4, FIG. 4(a) is the detection classification result and prediction boxes for collected pictures of the planthopper (Geisha) obtained by the traditional YOLOv3 algorithm model at t=7, and FIG. 4(b) the result obtained by the improved YOLOv3 algorithm model at t=7; the upper left corner of each prediction box shows the pest name and the confidence. As the figure shows, the traditional algorithm model frames closely spaced similar pests with separate, overlapping prediction boxes, so that the spraying vehicle sprays at the prediction boxes one by one, which is inefficient and wastes pesticide; the improved algorithm model frames the clustered target pests with a single prediction box, so that the vehicle can spray a large area at once with better effect, higher efficiency and less waste.
The examples are preferred embodiments of the present invention, but the present invention is not limited to the above-described embodiments, and any obvious modifications, substitutions or variations that can be made by one skilled in the art without departing from the spirit of the present invention are within the scope of the present invention.

Claims (7)

1. An agricultural pest detection and classification method fusing population seasonal fluctuation information, characterized by comprising the following steps:
S1: investigating the number of target pests in a designated area, photographing, inputting the pest pictures into a target detection model and obtaining feature maps;
S2: detecting and classifying the feature maps with an improved classification regression function;
S3: making anchor boxes according to the feature maps, performing offset regression calculation on the anchor boxes, screening prediction bounding boxes according to the intersection-over-union IOU of the anchor boxes and the real boxes, and optimizing the prediction bounding boxes with an improved NMS strategy;
S4: determining the species and position information of the target pests according to the detection classification result and the prediction bounding box calculation result;
The step S2 is specifically:
the binary logistic regression function has the form:
P(y=1|x) = 1/(1 + e^(-(k_1·x_1 + k_2·x_2 + … + k_n·x_n)))
wherein x_n denotes the feature-variable vector and k_n the regression-coefficient vector;
the binary logistic regression function is expanded into a multi-class logistic regression function and combined with the pest seasonal-fluctuation function λ_i(t) to obtain the probability of each of the K candidate classes:
P(y=i|x) = λ_i(t)·e^(k_i·x) / Σ_{j=1..K} λ_j(t)·e^(k_j·x)
wherein i denotes the pest species label and λ_i(t) denotes the seasonal-fluctuation parameter of the i-th target pest at time t;
the pest seasonal-fluctuation function λ_i(t) is calculated as follows:
when 0 < y < μ: λ_i(t) = 1
when y ≥ μ: λ_i(t) = α_i / Σ_{j=1..K} α_j
wherein y denotes the standard deviation of the population-density values of the pest species at time t; μ denotes the threshold on the degree of difference between the population densities; α_i denotes the population-density value of the i-th pest at time t; and Σ_{j} α_j denotes the sum of the population-density values of all species at time t.
2. The agricultural pest detection and classification method according to claim 1, wherein investigating the number of target pests in the designated area in S1 specifically comprises: selecting 3 tea gardens in different environments and 5 plots of 1 m² in each garden, recording the number of target pests from the crown of the tea tree to the ground, and surveying continuously for 3 years, every 10 days from April to September and every 15 days from October to March of each year.
3. The agricultural pest detection and classification method according to claim 2, wherein the current-month population density of the target pests is calculated as: D_i = C_i / S, wherein i denotes the pest species label; D_i1, D_i2, …, D_i12 denote the population density of the i-th pest in January, February, …, December respectively; N denotes the number of surveys in the current month; S denotes the total area surveyed in the current month; and C_i denotes the number of individuals of the i-th pest counted in the current month.
4. The agricultural pest detection and classification method according to claim 1, wherein the target detection model is an improved YOLOv3 model with MobileNetv as the backbone network.
5. The agricultural pest detection and classification method according to claim 4, wherein S1 further comprises: photographing and labeling the pests from multiple angles, making a data set from the labeled photographs, and dividing it into a training set and a test set at a ratio of 8:2 for training the YOLOv3 model.
6. The agricultural pest detection and classification method according to claim 1, wherein S3 is specifically: making anchor boxes of different sizes and aspect ratios according to the feature maps, performing linear regression between the anchor-box coordinates and the real-box coordinates, and screening a batch of prediction bounding boxes according to the intersection-over-union IOU of the anchor boxes and the real boxes; judging each pair of prediction bounding boxes of the same class: when IOU > γ, taking the circumscribed rectangle of the two boxes as the output prediction bounding box and the larger confidence as the final output confidence; when IOU ≤ γ, retaining both boxes; wherein γ denotes the IOU threshold; repeating the judging process until no overlapping prediction bounding boxes remain, the IOU value and the IOU threshold being changed during the repetition; and finally eliminating duplicate boxes with lower confidence values.
7. The agricultural pest detection and classification method according to claim 6, wherein, during the repetition, the ratio of the intersection area of the anchor box and the ground-truth box to the area of each of the two predicted bounding boxes is calculated respectively, each ratio is taken as a new IOU value, and the IOU threshold is increased.
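The merging procedure of claims 6 and 7 can be sketched as follows. This is a hedged reading, not the claims' exact algorithm: the box format, the step by which γ grows each pass, and the interpretation of the "new IOU" on later passes (intersection over each box's own area, taking the larger ratio) are all assumptions:

```python
def _inter(a, b):
    """Intersection area of axis-aligned boxes given as (x1, y1, x2, y2, ...)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(0.0, w) * max(0.0, h)

def _area(a):
    return (a[2] - a[0]) * (a[3] - a[1])

def merge_same_class(boxes, gamma=0.5, gamma_step=0.05):
    """Merge overlapping same-class predictions (sketch of claims 6-7).

    boxes: list of (x1, y1, x2, y2, confidence).
    First pass: standard IoU; if IoU > gamma, the pair is replaced by its
    circumscribed rectangle with the larger confidence.  Later passes: the
    overlap score is intersection over each box's own area (the larger of
    the two ratios -- an assumed interpretation of claim 7), and gamma is
    raised by gamma_step after each merging pass.
    """
    boxes = list(boxes)
    passes = 0
    while True:
        merged = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                a, b = boxes[i], boxes[j]
                inter = _inter(a, b)
                if passes == 0:
                    score = inter / (_area(a) + _area(b) - inter)  # standard IoU
                else:
                    score = max(inter / _area(a), inter / _area(b))
                if score > gamma:
                    # circumscribed rectangle, larger confidence kept
                    boxes[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]),
                                max(a[4], b[4]))
                    del boxes[j]
                    merged = True
                    break
            if merged:
                break
        if not merged:
            return boxes
        passes += 1
        gamma += gamma_step

# two heavily overlapping boxes collapse into their enclosing rectangle;
# the distant box survives untouched
out = merge_same_class([(0, 0, 10, 10, 0.9), (1, 1, 11, 11, 0.8),
                        (50, 50, 60, 60, 0.7)])
print(out)
```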
CN202010395825.3A 2020-05-12 2020-05-12 Agricultural pest detection classification method integrating population season collapse information Active CN111753646B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010395825.3A CN111753646B (en) 2020-05-12 2020-05-12 Agricultural pest detection classification method integrating population season collapse information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010395825.3A CN111753646B (en) 2020-05-12 2020-05-12 Agricultural pest detection classification method integrating population season collapse information

Publications (2)

Publication Number Publication Date
CN111753646A CN111753646A (en) 2020-10-09
CN111753646B CN111753646B (en) 2024-05-14

Family

ID=72673223

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010395825.3A Active CN111753646B (en) 2020-05-12 2020-05-12 Agricultural pest detection classification method integrating population season collapse information

Country Status (1)

Country Link
CN (1) CN111753646B (en)

Families Citing this family (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112528726B (en) * 2020-10-14 2022-05-13 石河子大学 Cotton aphid pest monitoring method and system based on spectral imaging and deep learning
CN112950605A (en) * 2021-03-15 2021-06-11 西安电子科技大学 Pole tower image detection method based on MYOLOv3 network
CN113221749A (en) * 2021-05-13 2021-08-06 扬州大学 Crop disease remote sensing monitoring method based on image processing and deep learning
CN113317295A (en) * 2021-05-14 2021-08-31 北京百瑞盛田环保科技发展有限公司 Drug administration supervision method, device and system
CN113326952A (en) * 2021-05-14 2021-08-31 北京百瑞盛田环保科技发展有限公司 Drug administration supervision method, device and system
CN113776165B (en) * 2021-09-10 2023-03-21 西安建筑科技大学 YOLOv5l algorithm-based multi-region artificial fog pipe network intelligent control method and system
CN114005029B (en) * 2021-10-20 2024-04-23 华南农业大学 Method and system for identifying disease and insect pests of bergamot based on improved yolov network
CN114612898B (en) * 2022-03-16 2024-05-10 华南农业大学 Method for detecting eclosion rate of litchi pedicel borers based on YOLOv network
CN115413634B (en) * 2022-10-08 2024-05-14 广东省农业科学院设施农业研究所 Deinsectization device for greenhouse

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107665355A (en) * 2017-09-27 2018-02-06 重庆邮电大学 A kind of agricultural pests detection method based on region convolutional neural networks
CN109919239A (en) * 2019-03-15 2019-06-21 尹显东 A kind of diseases and pests of agronomic crop intelligent detecting method based on deep learning


Also Published As

Publication number Publication date
CN111753646A (en) 2020-10-09

Similar Documents

Publication Publication Date Title
CN111753646B (en) Agricultural pest detection classification method integrating population season collapse information
Jia et al. Detection and segmentation of overlapped fruits based on optimized mask R-CNN application in apple harvesting robot
CN113538390B (en) Quick identification method for shaddock diseases and insect pests
Pang et al. Improved crop row detection with deep neural network for early-season maize stand count in UAV imagery
CN113392775B (en) Sugarcane seedling automatic identification and counting method based on deep neural network
CN112464971A (en) Method for constructing pest detection model
CN109447169A (en) The training method of image processing method and its model, device and electronic system
CN110479636B (en) Method and device for automatically sorting tobacco leaves based on neural network
WO2021255458A1 (en) System and method for crop monitoring
CN109740483A (en) A kind of rice growing season detection method based on deep-neural-network
CN110569747A (en) method for rapidly counting rice ears of paddy field rice by using image pyramid and fast-RCNN
CN109829425B (en) Farmland landscape small-scale ground feature classification method and system
CN109344738A (en) The recognition methods of crop diseases and pest crop smothering and device
CN114818909A (en) Weed detection method and device based on crop growth characteristics
CN110503140A (en) Classification method based on depth migration study and neighborhood noise reduction
CN114140665A (en) Dense small target detection method based on improved YOLOv5
CN111832448A (en) Disease identification method and system for grape orchard
CN106296702A (en) Cotton Images dividing method and device under natural environment
Kutyrev et al. Recognition and Classification Apple Fruits Based on a Convolutional Neural Network Model.
CN113313692B (en) Automatic banana young plant identification and counting method based on aerial visible light image
CN115690778A (en) Method for detecting, tracking and counting mature fruits based on deep neural network
CN116912265A (en) Remote sensing image segmentation method and system
CN117036926A (en) Weed identification method integrating deep learning and image processing
CN110705698A (en) Target counting depth network design method based on scale self-adaptive perception
Rony et al. BottleNet18: Deep Learning-Based Bottle Gourd Leaf Disease Classification

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant