CN113159125B - Drug application method and system

Drug application method and system

Info

Publication number
CN113159125B
CN113159125B (application CN202110290311.6A)
Authority
CN
China
Prior art keywords
disease
grid
image
visible light
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110290311.6A
Other languages
Chinese (zh)
Other versions
CN113159125A (en)
Inventor
陈立平
丁晨琛
张瑞瑞
李龙龙
唐青
张林焕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Research Center of Intelligent Equipment for Agriculture
Original Assignee
Beijing Research Center of Intelligent Equipment for Agriculture
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Research Center of Intelligent Equipment for Agriculture
Priority to CN202110290311.6A
Publication of CN113159125A
Application granted
Publication of CN113159125B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/217 Validation; Performance evaluation; Active pattern learning techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/20 Image preprocessing
    • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V 10/267 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion by performing operations on regions, e.g. growing, shrinking or watersheds

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computational Linguistics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Evolutionary Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

The invention provides a pesticide application method and system, comprising the following steps: dividing a target planting area into grids; acquiring a detection image group of the crop canopy in each grid, wherein the detection image group comprises a visible light image and a thermal infrared image; inputting the detection image group of the crop canopy in each grid into a disease recognition model, and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state comprises the disease type and the disease grade; generating an application prescription map according to the disease state of the crop in each grid; and applying pesticide to the target planting area based on the application prescription map. By collecting a visible light image and a thermal infrared image of the same crop canopy at the same time, the method and system can rapidly identify whether the target crop is diseased, how severe the disease is, and the specific area where the disease occurs, and control the application system to apply pesticide precisely according to the disease recognition result, thereby achieving early prevention of crop diseases, reduced pesticide usage with higher efficacy, and separation of people from pesticides.

Description

Drug application method and system
Technical Field
The invention relates to the technical field of intelligent pesticide application and irrigation, and in particular to a pesticide application method and system.
Background
Precision pesticide application is a key link in precision agriculture and ecological agriculture. With the continuous progress of new-generation technologies such as artificial intelligence and intelligent control in recent years, unmanned intelligent plant-protection pesticide application has become possible.
At present, in plant-protection pesticide application for facility agriculture or indoor agriculture, most commonly used application systems are manually controlled and cannot achieve precise, on-demand application according to the disease condition of the crops.
With the existing application control methods, because coarse blanket application is adopted, targeted application cannot be carried out according to the different disease states of different plots, which leads to defects such as excessive pesticide usage, poor control effect, and large investment of manpower and material resources.
Disclosure of Invention
Aiming at the problem of coarse pesticide application in the prior art, embodiments of the invention provide a pesticide application method and system.
The invention provides a pesticide application method, comprising: performing grid division on a target planting area; acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image; inputting the detection image group of the crop canopy in each grid into a disease recognition model, and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state includes the disease type and the disease grade; generating an application prescription map according to the disease state of the crop in each grid; and applying pesticide to the target planting area based on the application prescription map.
According to the application method provided by the invention, before the detection image group of the crop canopy in each grid is input into the disease recognition model, the method further comprises: acquiring a plurality of visible light images and a plurality of thermal infrared images of the canopy of sample crops infected with target diseases, constructing a visible light image sample set and a thermal infrared image sample set respectively, and labeling each visible light image sample and each thermal infrared image sample with the corresponding disease state label; selecting M different deep learning models; pre-training the M different deep learning models with the visible light image sample set and pre-training the M different deep learning models with the thermal infrared image sample set, so that 2M trained deep learning models are obtained in total; and determining the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model.
According to the application method provided by the invention, the M different deep learning models include at least ResNet-50, InceptionV3, and MobileNet.
According to the application method provided by the invention, determining the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model comprises: constructing, for the target disease, a verification sample set consisting of a preset number of image samples; carrying out recognition verification on the 2M trained deep learning models with all the image samples in the verification sample set, and obtaining the recognition accuracy, the average memory occupied during model operation, and the average model running time of each deep learning model; and determining the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model based on preset weights for the recognition accuracy, the average memory occupied during model operation, and the average model running time; wherein, in the verification sample set, the proportion of image samples related to the target disease is greater than 50%, and the image samples include visible light image samples and thermal infrared image samples.
According to the application method provided by the invention, inputting the detection image group of the crop canopy in each grid into the disease recognition model and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model comprises: determining, according to the verification result, the image type of the input image sample for which the disease recognition model has the highest recognition accuracy; when the image type is a visible light image, inputting the visible light image in the detection image group into the disease recognition model to acquire the disease state corresponding to the visible light image; and when the image type is a thermal infrared image, inputting the thermal infrared image in the detection image group into the disease recognition model to acquire the disease state corresponding to the thermal infrared image.
According to the application method provided by the invention, constructing the visible light image sample set and the thermal infrared image sample set respectively from the plurality of visible light images and the plurality of thermal infrared images of the canopy of sample crops infected with a target disease specifically comprises: amplifying each visible light image and each thermal infrared image by flipping, translation, or rotation, and normalizing all the amplified images to construct the visible light image sample set and the thermal infrared image sample set respectively.
According to the application method provided by the invention, after the target planting area is grid-divided, the method further comprises: assigning a grid number to each grid. Correspondingly, acquiring the detection image group of the crop canopy in each grid comprises: traversing and photographing the detection image group of the crop canopy in each grid. Generating the application prescription map according to the disease state of the crops in each grid comprises: determining the disease state corresponding to each grid number to determine the application rate corresponding to each grid number; and generating the application prescription map according to the position of each grid number in the target planting area and the corresponding application rate.
The invention also provides a pesticide application system, comprising: a grid dividing unit for grid-dividing a target planting area; an image acquisition unit for acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image; a disease recognition unit for inputting the detection image group of the crop canopy in each grid into a disease recognition model and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state includes the disease type and the disease grade; a prescription map making unit for generating an application prescription map according to the disease state of the crop in each grid; and an application execution unit for applying pesticide to the target planting area based on the application prescription map.
The invention also provides an electronic device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of any of the pesticide application methods described above.
The invention also provides a non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of any of the pesticide application methods described above.
According to the pesticide application method and system provided by the invention, a visible light image and a thermal infrared image of the same crop canopy are collected at the same time, so that whether the target crop is diseased, the severity of the disease, and the specific area where the disease occurs can be rapidly identified, and precise application by the application system is controlled according to the disease recognition result, thereby achieving early prevention of crop diseases, reduced pesticide usage with higher efficacy, and separation of people from pesticides.
Drawings
In order to more clearly illustrate the invention or the technical solutions of the prior art, the following description will briefly explain the drawings used in the embodiments or the description of the prior art, and it is obvious that the drawings in the following description are some embodiments of the invention, and other drawings can be obtained according to the drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic flow chart of the method of application provided by the present invention;
FIG. 2 is a schematic diagram of a meshing and inspection route provided by the present invention;
FIG. 3 is a schematic view of disease states of crops in each grid provided by the invention;
FIG. 4 is a second schematic flow chart of the pesticide application method provided by the present invention;
FIG. 5 is a schematic structural diagram of the pesticide application system provided by the present invention;
fig. 6 is a schematic structural diagram of an electronic device provided by the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the accompanying drawings, and it is apparent that the described embodiments are some embodiments of the present invention, not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be noted that in the description of embodiments of the present invention, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. The orientation or positional relationship indicated by the terms "upper", "lower", etc. are based on the orientation or positional relationship shown in the drawings, are merely for convenience of description and to simplify the description, and are not indicative or implying that the apparatus or elements in question must have a specific orientation, be constructed and operated in a specific orientation, and therefore should not be construed as limiting the present invention. Unless specifically stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
Methods and systems for administration provided by embodiments of the present invention are described below in conjunction with fig. 1-6.
FIG. 1 is a schematic flow chart of the pesticide application method provided by the present invention. As shown in FIG. 1, the method includes, but is not limited to, the following steps:
step 11: performing grid division on a target planting area;
step 12: acquiring a detection image group of crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image;
step 13: inputting the detection image group of the crop canopy in each grid into a disease identification model, and acquiring the disease state of the crop in each grid according to the output result of the disease identification model; the disease state includes disease type and disease grade;
step 14: generating an application prescription map according to the disease state of the crops in each grid;
step 15: applying pesticide to the target planting area based on the application prescription map.
The invention provides a precision pesticide application method. It mainly uses synchronously acquired visible light and thermal infrared images of the crop canopy and relies on image recognition to realize non-destructive detection and automatic diagnosis of crop diseases, so that the disease occurrence areas, the disease types in those areas, and the disease severity can be identified accurately. The application prescription is then generated according to the recognition results, and finally the relevant application device is controlled according to the prescription to apply pesticide to the crops precisely. The method can therefore provide accurate, efficient, and environmentally friendly precision application for facility agriculture or indoor application research.
Specifically, since the space of a target planting area in facility agriculture or an indoor agricultural environment is generally small, the target planting area may be grid-divided in advance.
FIG. 2 is a schematic diagram of the grid division and the inspection route provided by the present invention. As shown in FIG. 2, the whole target planting area can be evenly divided into grid areas of equal size, which can be done as follows:
First, a top-view image of the whole target planting area is captured from above, preferably from the center point of the target planting area; then, the image to be divided, containing only the target planting area, is extracted from the top-view image; finally, the image to be divided is grid-divided and the grids are labeled accordingly.
After the grid division of the whole target planting area is completed, any grid is taken as the starting point and a suitable inspection route is planned so that the route traverses all grids and returns to the starting point.
The crop planting area is inspected periodically along this 'start point - end point - start point' route, and images of the crop canopy in each grid are acquired during the inspection.
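As an illustration of the grid division and traversal just described, the following is a minimal Python sketch; the equal-area row-by-row split, the serpentine route, and the 3 x 4 layout are assumptions for illustration rather than details prescribed by the patent.

```python
# Minimal sketch (assumed grid layout and image shape): split a top-view image of the
# planting area into equal-area cells and build a route that visits every cell and
# returns to the starting grid ("start point - end point - start point").
import numpy as np

def divide_into_grids(image: np.ndarray, rows: int, cols: int) -> dict:
    """Return {grid_number: image_patch} for an H x W x C top-view image."""
    h, w = image.shape[:2]
    cell_h, cell_w = h // rows, w // cols
    grids = {}
    for r in range(rows):
        for c in range(cols):
            number = r * cols + c + 1  # grid numbers start at 1
            grids[number] = image[r * cell_h:(r + 1) * cell_h,
                                  c * cell_w:(c + 1) * cell_w]
    return grids

def serpentine_route(rows: int, cols: int) -> list:
    """Visit every grid once in boustrophedon order, then return to the starting grid."""
    route = []
    for r in range(rows):
        cells = range(1, cols + 1) if r % 2 == 0 else range(cols, 0, -1)
        route.extend(r * cols + c for c in cells)
    return route + [route[0]]

# With the 12-grid layout used later in the description (assumed here to be 3 x 4):
# serpentine_route(3, 4) -> [1, 2, 3, 4, 8, 7, 6, 5, 9, 10, 11, 12, 1]
```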
Optionally, in the application method provided by the invention, a suspended track can be built above the whole target planting area, and a disease detection module can travel along the suspended track.
The disease detection module may include a visible light and thermal infrared dual-channel camera, a self-stabilizing gimbal, and a data transmission module. As the disease detection module travels above each grid area, the dual-channel camera is driven to automatically collect a detection image group of the crop canopy in each grid, where the detection image group includes at least one visible light image and at least one thermal infrared image.
The self-stabilizing gimbal may be a three-axis self-stabilizing gimbal that clamps and fixes the dual-channel camera to keep it stable and hold the shooting angle; the dual-channel camera has an auto-focus function, and its resolution ensures the clarity of the acquired visible light and thermal infrared images; the data transmission module is mainly used to transmit the detection image groups acquired by the dual-channel camera to the disease recognition model.
Further, after an inspection cycle is completed, the collected detection image groups of the crop canopy in each grid can be input into the pre-trained disease recognition model, and model recognition is performed on each detection image group to obtain the disease state of the crop in each grid output by the disease recognition model.
The disease recognition model is obtained by training on visible light image samples and thermal infrared image samples carrying disease state labels. The disease state includes the disease type and the disease grade of that type.
Optionally, the inspection times are set according to the crop's disease-onset period; it is preferable to inspect 4 times per day (i.e., every 6 hours) during the disease-onset period.
FIG. 3 is a schematic view of the disease states of the crops in each grid provided by the invention. As shown in FIG. 3, the grids that require application, and the disease types and disease grades in each such grid, can be located based on the disease recognition result of each grid area. The application rate for each grid requiring application is then determined according to the disease type and disease grade, and an application prescription map is finally formed.
Taking a tomato planting area as an example, the disease recognition model provided by this application can identify the following disease types from an input detection image group: early blight, leaf mold, and Corynespora leaf spot; the disease grade of each disease includes three levels: mild, moderate, and severe.
As shown in FIG. 3, the grids requiring application can be screened out from all grids in the whole target planting area and the disease severity of each can be obtained, after which the application prescription map of the tomato planting area can be generated.
Finally, the corresponding application strategy is determined based on the application prescription map: no pesticide is applied to grid areas that require no application, and different doses are applied to grid areas with different disease severities.
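The decision above could be represented as a simple lookup, as in the following sketch; the dose values and the disease/grade keys are hypothetical placeholders, since the patent does not specify concrete application rates.

```python
# Minimal sketch, assuming illustrative (not patent-specified) doses for the tomato example:
# map each grid's recognized (disease type, disease grade) pair to an application rate.
from typing import Optional

DOSE_ML_PER_GRID = {
    ("early_blight", "mild"): 20, ("early_blight", "moderate"): 40, ("early_blight", "severe"): 60,
    ("leaf_mold", "mild"): 25, ("leaf_mold", "moderate"): 50, ("leaf_mold", "severe"): 75,
    ("corynespora_leaf_spot", "mild"): 20, ("corynespora_leaf_spot", "moderate"): 45,
    ("corynespora_leaf_spot", "severe"): 70,
}

def application_rate(disease: Optional[str], grade: Optional[str]) -> int:
    """Return 0 for healthy grids; otherwise look up the dose for the (disease, grade) pair."""
    if disease is None or grade is None:
        return 0  # grids without disease receive no application
    return DOSE_ML_PER_GRID[(disease, grade)]
```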
According to the application method provided by the invention, a visible light image and a thermal infrared image of the same crop canopy are collected at the same time, so that whether the target crop is diseased, the severity of the disease, and the specific area where the disease occurs can be rapidly identified, and precise application by the application system is controlled according to the disease recognition result, thereby achieving early prevention of crop diseases, reduced pesticide usage with higher efficacy, and separation of people from pesticides.
Based on the foregoing embodiment, as an alternative embodiment, before inputting the detected image group of the crop canopy in each grid into the disease recognition model, the method further includes:
acquiring a plurality of visible light images and a plurality of thermal infrared images of the canopy of sample crops infected with target diseases, constructing a visible light image sample set and a thermal infrared image sample set respectively, and labeling each visible light image sample and each thermal infrared image sample with the corresponding disease state label; selecting M different deep learning models; pre-training the M different deep learning models with the visible light image sample set and pre-training the M different deep learning models with the thermal infrared image sample set, so that 2M trained deep learning models are obtained in total; and determining the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model.
Taking M = 3 as an example, with a tomato planting area as the target planting area and the three deep learning models set as ResNet-50, InceptionV3, and MobileNet, the pre-training of the pre-constructed disease recognition model according to the present invention includes, but is not limited to, the following steps:
Step 1, collecting image samples: original image samples (including visible light images and thermal infrared images) of three key tomato diseases (early blight, leaf mold, and Corynespora leaf spot) at different disease grades are collected, and a sample label is manually assigned to each image sample, including the disease type and the disease grade. Each image sample and its corresponding sample label form a training sample.
Then, all training samples involving visible light images are assembled into the visible light image sample set, and all training samples involving thermal infrared images are assembled into the thermal infrared image sample set. The disease grades may include: no disease, mild disease, moderate disease, and severe disease.
Step 2: the three representative deep learning network models (ResNet-50, InceptionV3, and MobileNet) are selected and pre-trained with the visible light image sample set and the thermal infrared image sample set constructed in step 1, respectively, until the models converge and the recognition accuracy meets the preset requirement. A total of 6 trained deep network models is thus obtained, comprising 3 deep network models for visible light image recognition and 3 deep network models for thermal infrared image recognition.
Step 3: based on the 6 trained deep network models obtained in step 2, a recognition verification procedure is applied to each, and a suitable one of the 6 models is selected as the disease recognition model of the invention.
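A minimal sketch of the pre-training in steps 2 and 3 is given below, assuming PyTorch/torchvision as the framework (the patent names the architectures but not an implementation); `visible_loader` and `thermal_loader` are assumed DataLoaders built from the two sample sets, and torchvision's `mobilenet_v2` stands in for "MobileNet".

```python
# Minimal sketch: train each of the three architectures once per modality,
# yielding 2M = 6 trained models, as described in step 2.
import torch
import torch.nn as nn
from torchvision import models

def build_backbones(num_classes: int) -> dict:
    """The three architectures named above, classification heads resized to the disease classes."""
    resnet = models.resnet50(weights=None)
    resnet.fc = nn.Linear(resnet.fc.in_features, num_classes)
    inception = models.inception_v3(weights=None, aux_logits=False)  # expects 299 x 299 inputs
    inception.fc = nn.Linear(inception.fc.in_features, num_classes)
    mobilenet = models.mobilenet_v2(weights=None)
    mobilenet.classifier[1] = nn.Linear(mobilenet.last_channel, num_classes)
    return {"ResNet-50": resnet, "InceptionV3": inception, "MobileNet": mobilenet}

def pretrain_all(visible_loader, thermal_loader, num_classes, epochs=10, device="cpu") -> dict:
    """Return {(architecture, modality): trained model} for both sample sets."""
    trained = {}
    for modality, loader in [("visible", visible_loader), ("thermal", thermal_loader)]:
        for name, model in build_backbones(num_classes).items():
            model = model.to(device)
            optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
            loss_fn = nn.CrossEntropyLoss()
            for _ in range(epochs):  # in practice: train until convergence / accuracy target
                for images, labels in loader:
                    optimizer.zero_grad()
                    loss = loss_fn(model(images.to(device)), labels.to(device))
                    loss.backward()
                    optimizer.step()
            trained[(name, modality)] = model
    return trained
```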
Based on the foregoing embodiment, as an optional embodiment, the determining, as the disease recognition model, one of the 2M trained deep learning models having the highest disease recognition accuracy includes:
constructing, for the target disease, a verification sample set consisting of a preset number of image samples; carrying out recognition verification on the 2M trained deep learning models with all the image samples in the verification sample set, and obtaining the recognition accuracy, the average memory occupied during model operation, and the average model running time of each deep learning model; and determining the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model based on preset weights for the recognition accuracy, the average memory occupied during model operation, and the average model running time; wherein, in the verification sample set, the proportion of image samples related to the target disease is greater than 50%, and the image samples include visible light image samples and thermal infrared image samples.
The invention provides a method for selecting the optimal disease recognition model, comprising the following steps:
1000 image samples (500 visible light images and 500 thermal infrared images, each with a corresponding disease state label) are selected as a verification sample set (also called a model evaluation sample set). The 6 deep network models are then verified with this set: the 1000 image samples are input into each deep network model in turn, the recognition result of each model for each image sample is recorded and compared with the corresponding disease state label, and the proportion of correct results over the whole verification sample set is taken as that model's recognition accuracy. In this way, the recognition accuracy of the 6 deep network models on the same verification sample set can be calculated.
Further, the percentage of memory occupied by each deep network model while recognizing the verification sample set can be recorded as the average memory occupied during model operation; meanwhile, the total time each deep network model takes to recognize the entire verification sample set can be measured and used as the average model running time.
The 6 deep network models can then be compared according to the weights assigned to the recognition accuracy, the average memory occupied during model operation, and the average model running time, so that the optimal model is selected from the 6 deep network models as the disease recognition model of this application.
Optionally, different weights can be assigned to the recognition accuracy, the average memory occupied during model operation, and the average model running time, for example: a weight of 0.6 for the recognition accuracy, 0.3 for the average memory occupied, and 0.1 for the average running time. The score of each deep network model is then calculated, and the deep network model with the highest score is taken as the disease recognition model.
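A minimal sketch of this weighted comparison is shown below. The 0.6/0.3/0.1 weights come from the example above, while normalizing memory and time against the largest value among the candidates is an assumption, since the patent does not state how the three quantities are combined numerically.

```python
# Minimal sketch: score each candidate model so that higher accuracy is rewarded and
# higher memory usage / longer running time are penalized, then pick the best score.
def weighted_score(accuracy: float, avg_memory: float, avg_time: float,
                   max_memory: float, max_time: float,
                   w_acc: float = 0.6, w_mem: float = 0.3, w_time: float = 0.1) -> float:
    return (w_acc * accuracy
            + w_mem * (1.0 - avg_memory / max_memory)
            + w_time * (1.0 - avg_time / max_time))

def select_disease_model(candidates: dict) -> str:
    """candidates: {model_name: (accuracy, avg_memory, avg_time)} for the 2M trained models."""
    max_memory = max(mem for _, mem, _ in candidates.values())
    max_time = max(t for _, _, t in candidates.values())
    return max(candidates,
               key=lambda name: weighted_score(*candidates[name], max_memory, max_time))
```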
To ensure that the selected model has the highest recognition accuracy among all the deep network models, the verification sample set can be constructed per disease type. For example, when leaf mold needs to be specifically recognized, i.e., the deep learning model's ability to recognize that disease type needs to be strengthened, the proportion of image samples related to leaf mold in the whole verification sample set should be greater than 50%.
Based on the foregoing embodiments, as an optional embodiment, inputting the detection image group of the crop canopy in each grid into the disease recognition model and obtaining the disease state of the crop in each grid according to the output result of the disease recognition model includes: determining, according to the verification result, the image type of the input image sample for which the disease recognition model has the highest recognition accuracy; when the image type is a visible light image, inputting the visible light image in the detection image group into the disease recognition model to acquire the disease state corresponding to the visible light image; and when the image type is a thermal infrared image, inputting the thermal infrared image in the detection image group into the disease recognition model to acquire the disease state corresponding to the thermal infrared image.
For any target planting area, the planted crops are generally fixed, and the disease types that infect the crops in different periods generally follow a pattern. Therefore, in the application method provided by the invention, the disease types to be detected can be predetermined according to the crop types planted in the target planting area and the application period.
Then, according to the predetermined disease type, the deep learning network model with the highest recognition accuracy for that disease type can be selected from the 2M different deep learning models as the disease recognition model, so that the detection image groups of the crop canopy in each grid acquired in real time are recognized by this disease recognition model.
Further, in the process of selecting the disease recognition model with the verification sample set, it can be determined for which image type the selected disease recognition model achieves relatively higher detection accuracy. Thus, when a detection image group (including a visible light image and a thermal infrared image) is input to the disease recognition model, the image of the type recognized with higher accuracy by the current model is selected from the group as the detection object, and the other image, which is not the recognition object, is stored.
Specifically, when the image type is a visible light image, the visible light image in the detection image group is input into the disease recognition model to acquire the disease state corresponding to the visible light image; when the image type is a thermal infrared image, the thermal infrared image in the detection image group is input into the disease recognition model to acquire the disease state corresponding to the thermal infrared image.
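At inference time, the per-grid recognition could then look like the following sketch, assuming the chosen model carries a `preferred_modality` recorded during verification; the function and field names here are illustrative, not taken from the patent.

```python
# Minimal sketch: score only the image type the selected model recognizes best;
# the other image in the detection group is kept in storage rather than discarded.
import torch

@torch.no_grad()
def recognize_grid(model, preferred_modality: str, detection_group: dict, preprocess) -> int:
    """detection_group: {"visible": image, "thermal": image} for one grid."""
    image = detection_group[preferred_modality]       # "visible" or "thermal"
    logits = model(preprocess(image).unsqueeze(0))    # add the batch dimension
    return int(logits.argmax(dim=1))                  # index maps to (disease type, grade)
```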
According to the application method provided by the invention, a visible light image and a thermal infrared image of the same crop canopy are collected at the same time, so that whether the target crop is diseased, the severity of the disease, and the specific area where the disease occurs can be rapidly identified, and precise application by the application system is controlled according to the disease recognition result, thereby achieving early prevention of crop diseases, reduced pesticide usage with higher efficacy, and separation of people from pesticides.
Based on the content of the above embodiment, as an alternative embodiment, constructing the visible light image sample set and the thermal infrared image sample set respectively from the acquired visible light images and thermal infrared images of the canopy of sample crops infected with the target disease specifically includes: amplifying each visible light image and each thermal infrared image by flipping, translation, or rotation, and normalizing all the amplified images to construct the visible light image sample set and the thermal infrared image sample set respectively.
In the pesticide application method provided by the invention, preprocessing the training image sample sets in this way effectively augments the number of image samples, which enhances the recognition accuracy and robustness of the model.
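A minimal sketch of the flip/translate/rotate amplification plus normalization, using torchvision transforms, is given below; the concrete parameter values (rotation angle, translation fraction, target size, normalization statistics) are assumptions, and single-channel thermal infrared images may need separate channel handling.

```python
# Minimal sketch of the augmentation-plus-normalization pipeline described above.
from torchvision import transforms

augment_and_normalize = transforms.Compose([
    transforms.RandomHorizontalFlip(p=0.5),                    # flipping
    transforms.RandomAffine(degrees=0, translate=(0.1, 0.1)),  # translation
    transforms.RandomRotation(degrees=15),                     # rotation
    transforms.Resize((224, 224)),
    transforms.ToTensor(),                                     # scales pixel values to [0, 1]
    transforms.Normalize(mean=[0.5, 0.5, 0.5], std=[0.5, 0.5, 0.5]),
])
```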
Based on the content of the above embodiment, as an alternative embodiment, after grid-dividing the target planting area, the method further includes: assigning a grid number to each grid.
Correspondingly, acquiring the detection image group of the crop canopy in each grid includes: traversing and photographing the detection image group of the crop canopy in each grid. Generating the application prescription map according to the disease state of the crops in each grid includes: determining the disease state corresponding to each grid number to determine the application rate corresponding to each grid number; and generating the application prescription map according to the position of each grid number in the target planting area and the corresponding application rate.
Specifically, as shown in FIG. 3, 12 grids are obtained after the entire target planting area is grid-divided. Each grid can be numbered in turn along the inspection route with grid numbers 1-12, so that each grid corresponds to a unique grid number.
Further, when the visible light and thermal infrared dual-channel camera passes above each grid area during inspection, a detection image group (comprising a visible light image and a thermal infrared image) is obtained, and each detection image group is given the corresponding grid number 1-12 (equivalently, each visible light image and each thermal infrared image corresponds to one grid number).
Further, by recognizing the detection image group of each grid with the trained disease recognition model, the corresponding disease state is obtained; that is, each disease state also carries the grid number to which it corresponds.
Then, according to the grid numbers, the disease state corresponding to each grid in the target planting area can be obtained; for example, the grid area with grid number 2 is a moderately diseased area, the grid area with grid number 6 is a mildly diseased area, and so on.
FIG. 4 is a second schematic flow chart of the pesticide application method provided by the present invention. As shown in FIG. 4, the method mainly includes the following steps:
Before pesticide is applied, the application system collects the detection image group of the crop canopy in each grid during its inspection run, until all grids in the whole target planting area have been inspected.
Inspection then stops, and the detection image groups obtained for each grid are input into the pre-trained disease recognition model to recognize the disease state in the images and obtain a disease state analysis result for each grid area.
After the disease states of all grid areas in the whole target planting area have been recognized, i.e., the disease occurrence across the entire inspection area is known, the application rate for each grid area is determined and the corresponding application prescription map is generated.
According to the application prescription map, the application system is then moved to the grid areas that require application and applies pesticide until application over the whole target operation area is completed.
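The overall loop of FIG. 4 can be summarized in the following sketch, where `capture_detection_group`, `recognize`, `dose_for`, and `apply_dose` are hypothetical stand-ins for the dual-channel camera, the disease recognition model, the prescription lookup, and the spraying hardware.

```python
# Minimal sketch of one inspection-recognition-application cycle over the numbered grids.
def inspection_and_application_cycle(grid_numbers, capture_detection_group,
                                     recognize, dose_for, apply_dose) -> dict:
    # 1) inspection pass: traverse every grid and collect its detection image group
    detection_groups = {n: capture_detection_group(n) for n in grid_numbers}

    # 2) recognition pass: disease state (type, grade) for each grid
    disease_states = {n: recognize(group) for n, group in detection_groups.items()}

    # 3) prescription map: grid number -> application rate
    prescription_map = {n: dose_for(*state) for n, state in disease_states.items()}

    # 4) application pass: only treat grids whose dose is non-zero
    for n, dose in prescription_map.items():
        if dose > 0:
            apply_dose(n, dose)
    return prescription_map
```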
FIG. 5 is a schematic structural diagram of the pesticide application system provided by the present invention. As shown in FIG. 5, the system mainly includes a grid dividing unit 51, an image acquisition unit 52, a disease recognition unit 53, a prescription map making unit 54, and an application execution unit 55, wherein:
the grid dividing unit 51 is mainly used for grid-dividing the target planting area; the image acquisition unit 52 is mainly used for acquiring a detection image group of the crop canopy in each grid, where the detection image group consists of a visible light image and a thermal infrared image; the disease recognition unit 53 is mainly used for inputting the detection image group of the crop canopy in each grid into the disease recognition model and obtaining the disease state of the crop in each grid according to the output result of the disease recognition model, where the disease state includes the disease type and the disease grade; the prescription map making unit 54 is mainly used for generating an application prescription map according to the disease state of the crops in each grid; and the application execution unit 55 is mainly used for applying pesticide to the target planting area based on the application prescription map.
Furthermore, the pesticide application system provided by the invention can also comprise a power system unit for driving the motion of the system, which may consist of a battery, an electric control module, and a movable chassis.
The application system may further comprise a composite frame unit used for mounting the other units, which can be made of lightweight materials such as carbon fiber.
The pesticide application system provided by the invention can also include a suspended track above the crop planting area, along which the movable chassis of the power system unit moves to carry the application system.
Further, the application execution unit is used for spraying pesticide and consists of a pesticide tank, a pesticide pump, pesticide tubing, nozzles, and the like.
According to the pesticide application system provided by the invention, a visible light image and a thermal infrared image of the same crop canopy are collected at the same time, so that whether the target crop is diseased, the severity of the disease, and the specific area where the disease occurs can be rapidly identified, and precise application by the system is controlled according to the disease recognition result, thereby achieving early prevention of crop diseases, reduced pesticide usage with higher efficacy, and separation of people from pesticides.
It should be noted that, when the application system provided in the embodiment of the present invention is specifically implemented, the application system may be implemented based on the application method described in any one of the above embodiments, which is not described in detail in this embodiment.
FIG. 6 is a schematic structural diagram of an electronic device provided by the present invention. As shown in FIG. 6, the electronic device may include: a processor 610, a communication interface 620, a memory 630, and a communication bus 640, wherein the processor 610, the communication interface 620, and the memory 630 communicate with each other via the communication bus 640. The processor 610 may invoke logic instructions in the memory 630 to perform a pesticide application method comprising: performing grid division on a target planting area; acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image; inputting the detection image group of the crop canopy in each grid into a disease recognition model, and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state includes the disease type and the disease grade; generating an application prescription map according to the disease state of the crops in each grid; and applying pesticide to the target planting area based on the application prescription map.
Further, the logic instructions in the memory 630 may be implemented in the form of software functional units and, when sold or used as a stand-alone product, stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium, comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In another aspect, the present invention also provides a computer program product comprising a computer program stored on a non-transitory computer-readable storage medium, the computer program comprising program instructions which, when executed by a computer, enable the computer to perform the pesticide application method provided above, the method comprising: performing grid division on a target planting area; acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image; inputting the detection image group of the crop canopy in each grid into a disease recognition model, and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state includes the disease type and the disease grade; generating an application prescription map according to the disease state of the crops in each grid; and applying pesticide to the target planting area based on the application prescription map.
In yet another aspect, the present invention also provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, performing the pesticide application method provided by the above embodiments, the method comprising: performing grid division on a target planting area; acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image; inputting the detection image group of the crop canopy in each grid into a disease recognition model, and acquiring the disease state of the crop in each grid according to the output result of the disease recognition model, wherein the disease state includes the disease type and the disease grade; generating an application prescription map according to the disease state of the crops in each grid; and applying pesticide to the target planting area based on the application prescription map.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement this without inventive effort.
From the above description of the embodiments, it will be apparent to those skilled in the art that the embodiments may be implemented by means of software plus necessary general hardware platforms, or of course may be implemented by means of hardware. Based on this understanding, the foregoing technical solution may be embodied essentially or in a part contributing to the prior art in the form of a software product, which may be stored in a computer readable storage medium, such as ROM/RAM, a magnetic disk, an optical disk, etc., including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method described in the respective embodiments or some parts of the embodiments.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solution of the present invention, and are not limiting; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A pesticide application method, comprising:
performing grid division on a target planting area;
acquiring a detection image group of crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image;
inputting the detection image group of the crop canopy in each grid into a disease identification model, and acquiring the disease state of the crop in each grid according to the output result of the disease identification model; the disease state includes disease type and disease grade;
generating an application prescription chart according to the disease state of crops in each grid;
applying the drug to the target planting area based on the drug application prescription chart;
Before inputting the detection image group of the crop canopy in each grid into the disease identification model, the method further comprises:
acquiring a plurality of visible light images and a plurality of thermal infrared images of the canopy of sample crops infected with target diseases, constructing a visible light image sample set and a thermal infrared image sample set respectively, and labeling each visible light image sample and each thermal infrared image sample with the corresponding disease state label;
selecting M different deep learning models;
performing model pre-training on the M different deep learning models with the visible light image sample set and performing model pre-training on the M different deep learning models with the thermal infrared image sample set, so that 2M trained deep learning models are obtained in total;
determining one of the 2M trained deep learning models with highest disease recognition accuracy as the disease recognition model;
wherein the M different deep learning models include at least three learning models: ResNet-50, InceptionV3, and MobileNet;
the determining, as the disease recognition model, one of the 2M trained deep learning models having the highest disease recognition accuracy includes:
Aiming at the target disease, constructing a verification sample set consisting of a preset number of image samples;
respectively identifying and verifying the 2M trained deep learning models by utilizing all image samples in the verification sample set, and acquiring the identification accuracy of each deep learning model, the average memory occupied by model operation and the average model operation time;
determining one of the 2M trained deep learning models with highest disease recognition accuracy as the disease recognition model based on the recognition accuracy, the average memory occupied by the model operation and the preset weight ratio of the model operation average time;
wherein, in the verification sample set, the ratio of the image samples related to the target disease is greater than 50%; the image samples include a visible light image sample and a thermal infrared image sample.
2. The method according to claim 1, wherein the inputting the detected image group of the canopy of the crop in each grid into the disease recognition model, and obtaining the disease state of the crop in each grid according to the output result of the disease recognition model, comprises:
according to the verification result, determining the image type of the input image sample when the disease identification model has the highest identification precision;
When the image type is a visible light image, inputting the visible light image in the detection image group into the disease identification model to acquire a disease state corresponding to the visible light image;
and when the image type is a thermal infrared image, inputting the thermal infrared image in the detection image group into the disease identification model to acquire the disease state corresponding to the thermal infrared image.
3. The method of claim 1, wherein the steps of constructing a visible light image sample set and a thermal infrared image sample set respectively from a plurality of visible light images and a plurality of Zhang Regong external images of the canopy of the sample crop infected with the target disease comprise:
and amplifying each visible light image and each thermal infrared image in a turnover, translation or rotation mode, and carrying out normalization processing on all amplified images to respectively construct the visible light image sample set and the thermal infrared image sample set.
4. The method of claim 1, further comprising, after meshing the target planting area:
assigning a grid number to each grid;
Correspondingly, the acquiring the detection image group of the crop canopy in each grid comprises the following steps:
traversing and shooting a detection image group of the crop canopy in each grid;
the method for generating the medicine application prescription chart according to the disease state of crops in each grid comprises the following steps:
determining disease states corresponding to each grid number to determine the application rate corresponding to each grid number;
and generating the drug application prescription chart according to the positioning of each grid number in the target planting area and the corresponding drug application amount.
5. A pesticide application system, comprising:
the grid dividing unit is used for dividing the grid of the target planting area;
the image acquisition unit is used for acquiring a detection image group of the crop canopy in each grid, wherein the detection image group consists of a visible light image and a thermal infrared image;
the disease identification unit is used for inputting the detection image group of the crop canopy in each grid to the disease identification model, and acquiring the disease state of the crop in each grid according to the output result of the disease identification model, wherein the disease state comprises disease types and disease grades;
a prescription map making unit for generating an application prescription map according to the disease state of the crops in each grid;
an application execution unit for applying pesticide to the target planting area based on the application prescription map;
before inputting the detection image group of the crop canopy in each grid into the disease identification model, the method further comprises:
acquiring a plurality of visible light images and a plurality of thermal infrared images of the canopy of sample crops infected with target diseases, constructing a visible light image sample set and a thermal infrared image sample set respectively, and labeling each visible light image sample and each thermal infrared image sample with the corresponding disease state label;
selecting M different deep learning models;
performing model pre-training on the M different deep learning models by using the visible light image sample set, and performing model pre-training on the M different deep learning models by using the thermal infrared image sample set, so as to obtain 2M trained deep learning models in total;
determining one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model;
wherein the M different deep learning models at least include three models: ResNet-50, InceptionV3 and MobileNet;
the determining, as the disease recognition model, one of the 2M trained deep learning models having the highest disease recognition accuracy includes:
for the target disease, constructing a verification sample set consisting of a preset number of image samples;
performing recognition verification on the 2M trained deep learning models respectively by using all image samples in the verification sample set, and acquiring the recognition accuracy, the average memory occupied during model operation and the average model running time of each deep learning model;
determining, based on preset weight ratios of the recognition accuracy, the average memory occupied during model operation and the average model running time, the one of the 2M trained deep learning models with the highest disease recognition accuracy as the disease recognition model;
wherein, in the verification sample set, the proportion of image samples related to the target disease is greater than 50%, and the image samples include visible light image samples and thermal infrared image samples.
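The weighted model selection in claim 5 can be sketched as follows. The weights, memory budget and runtime budget are assumed placeholder values, and the candidate dictionaries (one per trained model and image modality) are hypothetical; the patent only specifies that preset weights combine accuracy, memory and runtime.

```python
def score_model(accuracy, avg_memory_mb, avg_runtime_s,
                weights=(0.6, 0.2, 0.2),
                mem_budget_mb=2048.0, time_budget_s=1.0):
    """Combine recognition accuracy with normalized resource costs using
    preset weights; higher is better. Weights and budgets are placeholders."""
    w_acc, w_mem, w_time = weights
    return (w_acc * accuracy
            - w_mem * (avg_memory_mb / mem_budget_mb)
            - w_time * (avg_runtime_s / time_budget_s))

def pick_disease_recognition_model(candidates):
    """candidates: list of dicts, one per trained model, e.g.
    {"name": "ResNet-50/visible", "accuracy": 0.94,
     "avg_memory_mb": 900.0, "avg_runtime_s": 0.12}
    Returns the name of the model with the highest weighted score."""
    best = max(candidates,
               key=lambda c: score_model(c["accuracy"],
                                         c["avg_memory_mb"],
                                         c["avg_runtime_s"]))
    return best["name"]
```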
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the drug application method according to any one of claims 1 to 4.
7. A non-transitory computer readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, implements the steps of the drug application method according to any one of claims 1 to 4.
CN202110290311.6A 2021-03-16 2021-03-16 Drug application method and system Active CN113159125B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110290311.6A CN113159125B (en) 2021-03-16 2021-03-16 Drug application method and system

Publications (2)

Publication Number Publication Date
CN113159125A (en) 2021-07-23
CN113159125B (en) 2024-04-05

Family

ID=76887849

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110290311.6A Active CN113159125B (en) 2021-03-16 2021-03-16 Drug application method and system

Country Status (1)

Country Link
CN (1) CN113159125B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116468671B (en) * 2023-03-21 2024-04-16 中化现代农业有限公司 Plant disease degree detection method, device, electronic apparatus, and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10957036B2 (en) * 2019-05-17 2021-03-23 Ceres Imaging, Inc. Methods and systems for crop pest management utilizing geospatial images and microclimate data

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107392091A (en) * 2017-06-09 2017-11-24 河北威远生物化工有限公司 A kind of agricultural artificial intelligence crop detection method, mobile terminal and computer-readable medium
WO2020012259A1 (en) * 2018-07-10 2020-01-16 Adroit Robotics Systems, devices, and methods for in-field diagnosis of growth stage and crop yield estimation in a plant area
WO2020047739A1 (en) * 2018-09-04 2020-03-12 安徽中科智能感知大数据产业技术研究院有限责任公司 Method for predicting severe wheat disease on the basis of multiple time-series attribute element depth features
CA3119812A1 (en) * 2018-12-10 2020-06-18 The Climate Corporation Mapping field anomalies using digital images and machine learning models
CN109859101A (en) * 2019-01-18 2019-06-07 黑龙江八一农垦大学 Crop canopy thermal infrared image recognition method and system

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Crop image feature extraction technology and its application in disease diagnosis; Ma Xiaodan; Zhu Kexin; Guan Haiou; Feng Jiarui; Liu Meng; Zheng Ming; Journal of Heilongjiang Bayi Agricultural University (02); full text *
Research progress on early crop disease detection and recognition technology based on infrared thermal imaging; Yang Chengya; Zhang Yan; Zhao Mingzhu; Zhu Yingyan; Laser Journal (06); full text *
Grading method for rice panicle blast severity based on hyperspectral imaging; Huang Shuangping; Qi Long; Ma Xu; Xue Kunnan; Wang Wenjuan; Transactions of the Chinese Society of Agricultural Engineering (01); full text *

Also Published As

Publication number Publication date
CN113159125A (en) 2021-07-23

Similar Documents

Publication Publication Date Title
CN111582055B (en) Unmanned aerial vehicle aviation pesticide application route generation method and system
CN109389161B (en) Garbage identification evolutionary learning method, device, system and medium based on deep learning
Yu et al. Detection of broadleaf weeds growing in turfgrass with convolutional neural networks
US20220107298A1 (en) Systems and methods for crop health monitoring, assessment and prediction
CN109840549B (en) Method and device for identifying plant diseases and insect pests
CN102084794B (en) Method and device for early detecting crop pests based on multisensor information fusion
CN110132989A (en) A kind of distress in concrete detection device, method and terminal system
CN114037552B (en) Method and system for polling physiological growth information of meat ducks
CN113159125B (en) Drug application method and system
CN107909492A (en) It is a kind of to survey damage identification method using the agricultural insurance of machine learning techniques
CN109886155A (en) Man power single stem rice detection localization method, system, equipment and medium based on deep learning
CN112580552A (en) Method and device for analyzing behavior of rats
CN115601585A (en) Agricultural pest and disease diagnosis method and device based on picture analysis
CN112580671A (en) Automatic detection method and system for multiple development stages of rice ears based on deep learning
EP3990913A1 (en) Automated plant monitoring systems and methods
Kolhe et al. Smart communication system for agriculture
CN115937795A (en) Method and device for acquiring farming activity record based on rural video
CN110327596A (en) A kind of motion analysis technique and system
CN117911939A (en) Real-time monitoring and early warning method and system for pine wood nematode disaster based on image segmentation
CN116277073A (en) Chicken breeding inspection robot equipment, control system and method
CN115956549A (en) Automatic medicine spraying robot based on machine vision
CN111652084B (en) Abnormal layer identification method and device
Browne et al. Cognitive robotics: new insights into robot and human intelligence by reverse engineering brain functions [from the guest editors]
CN214067797U (en) Crop disease detection device based on cooperation of multiple aircrafts
CN116584414B (en) Laying hen infectious disease development stage prediction system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant