CN114463649A - Soil insect pest determination method and device and pesticide formula generation method and device - Google Patents


Info

Publication number
CN114463649A
Authority
CN
China
Prior art keywords
soil
image
determining
land area
underground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111667411.2A
Other languages
Chinese (zh)
Other versions
CN114463649B (en)
Inventor
代双亮
Current Assignee
Guangzhou Xaircraft Technology Co Ltd
Original Assignee
Guangzhou Xaircraft Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Xaircraft Technology Co Ltd filed Critical Guangzhou Xaircraft Technology Co Ltd
Priority to CN202111667411.2A priority Critical patent/CN114463649B/en
Publication of CN114463649A publication Critical patent/CN114463649A/en
Application granted granted Critical
Publication of CN114463649B publication Critical patent/CN114463649B/en
Current legal status: Active


Classifications

    • A HUMAN NECESSITIES
        • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
            • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
                • A01M 17/00 Apparatus for the destruction of vermin in soil or in foodstuffs
    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
                • G06N 3/00 Computing arrangements based on biological models
                    • G06N 3/02 Neural networks
                        • G06N 3/04 Architecture, e.g. interconnection topology
                            • G06N 3/045 Combinations of networks
                        • G06N 3/08 Learning methods
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 Image analysis
                    • G06T 7/0002 Inspection of images, e.g. flaw detection
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10004 Still image; Photographic image
                    • G06T 2207/20 Special algorithmic details
                        • G06T 2207/20081 Training; Learning
                        • G06T 2207/20084 Artificial neural networks [ANN]
                    • G06T 2207/30 Subject of image; Context of image processing
                        • G06T 2207/30181 Earth observation
                            • G06T 2207/30188 Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Health & Medical Sciences (AREA)
  • Wood Science & Technology (AREA)
  • Insects & Arthropods (AREA)
  • Pest Control & Pesticides (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Zoology (AREA)
  • Food Science & Technology (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Catching Or Destruction (AREA)

Abstract

The application provides a method and a device for determining soil insect pests and a method and a device for generating a pesticide formula, and relates to the technical field of deep learning. The soil insect pest determination method comprises the following steps: determining M underground soil section image sets based on M image acquisition points of a target land area, wherein each underground soil section image set comprises at least two underground soil section images with a time-series association, and M is a positive integer; determining image difference data corresponding to the M underground soil section image sets by using an image difference detection model; and determining soil insect pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets. The technical scheme of the application can detect the occurrence of insect pests in the soil in real time, so that corresponding treatment can be taken according to the detected soil insect pest information, thereby avoiding the economic loss caused by large-area pest outbreaks.

Description

Soil insect pest determination method and device and pesticide formula generation method and device
Technical Field
The application relates to the technical field of deep learning, in particular to a method and a device for determining soil insect pests and a method and a device for generating a pesticide formula.
Background
The guiding principle of crop pest control is "prevention first, integrated control". According to the occurrence pattern of insect pests, effective and feasible measures are taken to achieve control before the pests cause damage, thereby reducing economic loss.
At present, pests on the ground surface are easy to find, and economic loss can be reduced by spraying pesticides in time. Underground pests, however, move below the ground surface and are usually difficult to find; generally, they are only noticed once crops show obvious symptoms, by which time great economic loss may already have been caused.
Disclosure of Invention
The present application is proposed to solve the above-mentioned technical problems. The embodiment of the application provides a method and a device for determining soil insect pests and a method and a device for generating a pesticide formula.
In a first aspect, the present application provides a method for determining soil insect pests, the method comprising: determining M underground soil section image sets based on M image acquisition points of a target land area, wherein each underground soil section image set comprises at least two underground soil section images with a time-series association, and M is a positive integer; determining image difference data corresponding to the M underground soil section image sets by using an image difference detection model; and determining soil insect pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets.
With reference to the first aspect, in certain implementations of the first aspect, the determining, by using an image difference detection model, image difference data corresponding to each of the M underground soil section image sets includes: for each underground soil section image set in the M underground soil section image sets, determining, by using the image difference detection model, vector data corresponding to the underground soil section images contained in the image set; and determining, by using the image difference detection model, cosine distance data between the underground soil section images contained in the image set based on the vector data corresponding to those images.
With reference to the first aspect, in certain implementations of the first aspect, determining soil pest information of the target land area based on the image difference data corresponding to each of the M underground soil section image sets includes: determining pest occurrence information corresponding to the M image acquisition points based on the cosine distance data corresponding to the M underground soil section image sets and a preset cosine distance threshold; and if the number of acquisition points where pests have occurred, determined based on the pest occurrence information corresponding to the M image acquisition points, is greater than or equal to a first preset number, determining the soil pest occurrence degree of the target land area to be high.
With reference to the first aspect, in certain implementations of the first aspect, determining soil pest information of the target land area based on the image difference data corresponding to each of the M underground soil section image sets further includes: if the number of acquisition points where pests have occurred, determined based on the pest occurrence information corresponding to the M image acquisition points, is less than the first preset number and greater than or equal to a second preset number, determining the soil pest occurrence degree of the target land area to be medium.
With reference to the first aspect, in certain implementations of the first aspect, determining soil pest information of the target land area based on the image difference data corresponding to each of the M underground soil section image sets further includes: if the number of acquisition points where pests have occurred, determined based on the pest occurrence information corresponding to the M image acquisition points, is less than the second preset number, determining the soil pest occurrence degree of the target land area to be low.
With reference to the first aspect, in certain implementations of the first aspect, before determining, by using an image difference detection model, image difference data corresponding to each of the M subsurface soil section image sets, the method further includes: determining a training data set, wherein the training data set comprises N underground soil section image sample sets and label data corresponding to the underground soil section image sample sets; and training an initial network model based on a training data set to obtain the image difference detection model.
With reference to the first aspect, in certain implementations of the first aspect, the N underground soil section image sample sets include at least one of soil section image samples before and after watering, soil section image samples of different types of soil samples, soil section image samples before and after crawling of different types of pests, and soil section image samples of the same soil section under different illumination intensities.
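The mixture of sample types listed in the claim suggests a contrastive labelling scheme in which benign changes (watering, soil type, illumination) teach the model to ignore non-pest variation. The sketch below is one hedged interpretation of how such a training set could be assembled; the labelling rule, field names, and file names are assumptions for illustration, not stated in the patent.

```python
# Hypothetical sketch: turn the sample sets listed above into labelled
# training pairs. Pairs whose only difference is a benign factor are
# labelled 0 ("no pest change"); before/after pest-crawling pairs are
# labelled 1 ("pest change"). All names here are illustrative.

BENIGN_CAUSES = {"watering", "soil_type", "illumination"}

def build_training_pairs(samples):
    """samples: list of dicts with 'before', 'after', 'cause' keys."""
    return [(s["before"], s["after"],
             0 if s["cause"] in BENIGN_CAUSES else 1)
            for s in samples]

samples = [
    {"before": "a0.png", "after": "a1.png", "cause": "watering"},
    {"before": "b0.png", "after": "b1.png", "cause": "pest_crawl"},
    {"before": "c0.png", "after": "c1.png", "cause": "illumination"},
]
pairs = build_training_pairs(samples)
```

Under this assumed scheme, the initial network model would learn that watering and lighting changes should map to nearby vectors while pest activity maps to distant ones.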
In a second aspect, the present application provides a method of generating a pesticide formula, the method comprising: determining soil pest information of the target land area, the soil pest information being determined based on the method of the first aspect; and determining a pesticide formula for the target land area based on the soil pest information, so as to generate a pesticide preparation task based on the pesticide formula.
In a third aspect, the present application provides a pesticide spraying task generating method, including: determining a pesticide formulation for the target land area based on soil pest information for the target land area, the pesticide formulation determined based on the method of the second aspect; and generating a pesticide spraying task corresponding to the target land area based on the pesticide formula so that the pesticide spraying equipment executes the pesticide spraying task.
In a fourth aspect, the present application provides a soil pest determination device, the device comprising: an image determining module, configured to determine M underground soil section image sets based on M image acquisition points of the target land area, each underground soil section image set comprising at least two underground soil section images with a time-series association, M being a positive integer; an image difference data determining module, configured to determine image difference data corresponding to the M underground soil section image sets by using the image difference detection model; and a pest information determining module, configured to determine the soil pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets.
In a fifth aspect, the present application provides a pesticide formula generating apparatus comprising: a first determination module, configured to determine soil pest information for a target land area, the soil pest information being determined based on the method of the first aspect; and a second determination module, configured to determine a pesticide formula for the target land area based on the soil pest information, so as to generate a pesticide preparation task based on the pesticide formula.
In a sixth aspect, the present application provides a pesticide spraying task generating device, including: a pesticide formula determining module for determining a pesticide formula of the target land area based on the soil pest information of the target land area, the pesticide formula being determined based on the method of the second aspect; and the spraying task generating module is used for generating a pesticide spraying task corresponding to the target land area based on the pesticide formula so that the pesticide spraying equipment can execute the pesticide spraying task.
In a seventh aspect, the present application provides a computer readable storage medium having stored thereon a computer program for executing the method of any one of the first to third aspects.
In an eighth aspect, the present application provides an electronic device, comprising: a processor; and a memory for storing processor-executable instructions; the processor is configured to perform the method of any one of the first to third aspects.
In a ninth aspect, the present application provides a soil pest determination system, comprising: the system comprises an image acquisition system, a data acquisition system and a data processing system, wherein the image acquisition system is used for shooting M underground soil tangent plane image sets corresponding to M image acquisition points of a target land area; and an electronic device as in the eighth aspect, the electronic device being connected to the image acquisition system.
According to the soil insect pest determination method provided by the embodiment of the application, the image difference data corresponding to the M underground soil section image sets, which correspond to the M image acquisition points of the target land area, are determined by using the image difference detection model, and the soil insect pest information of the target land area is determined from these data. The method can obtain the occurrence condition of pests in the soil, so that corresponding control measures can be taken to eliminate the pests at the earliest stage and avoid the economic loss caused by large-area pest outbreaks.
Drawings
The above and other objects, features and advantages of the present application will become more apparent by describing in more detail embodiments of the present application with reference to the attached drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. In the drawings, like reference numbers generally represent like parts or steps.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application.
Fig. 2 is a schematic flow chart of a soil pest determination method according to an embodiment of the present disclosure.
Fig. 3 is a schematic flow chart illustrating a process of determining image difference data corresponding to M underground soil section image sets according to an embodiment of the present application.
Fig. 4 is a schematic flow chart illustrating a process for determining soil pest information for a target land area according to an embodiment of the present disclosure.
Fig. 5 is a schematic flow chart of a soil pest determination method according to another embodiment of the present application.
Fig. 6 is a schematic flow chart of a method for generating a pesticide formulation according to an embodiment of the present disclosure.
Fig. 7 is a schematic flow chart of a method for generating a pesticide spraying task according to an embodiment of the present application.
Fig. 8 is a schematic structural view of a soil pest determination device according to an embodiment of the present application.
Fig. 9 is a schematic structural diagram of a pesticide formula generating apparatus according to an embodiment of the present application.
Fig. 10 is a schematic structural diagram of a pesticide spraying task generating device according to an embodiment of the present application.
Fig. 11 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 12 is a schematic structural diagram of an image acquisition system according to an embodiment of the present application.
Fig. 13 is a schematic structural diagram of an image acquisition system according to another embodiment of the present application.
Fig. 14 is a schematic structural diagram of an image acquisition system according to yet another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
At present, surface pests are easy to observe, and spraying pesticides promptly based on observation can reduce economic loss, but underground pests are difficult to manage. They move below the ground surface, where people can hardly observe them directly, and control measures are often not taken until crops show obvious symptoms, by which time some economic loss has usually already been caused.
Most soil pests are relatively large individuals with the ability to crawl. As they crawl, they form holes in the soil that are visible to the naked eye; these holes are usually obvious, and the movement tracks of the pests are more easily found near crop roots. Therefore, if an underground soil section view near the roots can be obtained, the occurrence of soil pests can be perceived earlier.
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application. As shown in fig. 1, the application scenario is a farmland scenario. The scene includes a farmland 110, a server 120, and an image capture system 130 connected to the server 120. The farmland 110 can be regarded as the target land area, and an image difference detection model that can determine image difference data is deployed in the server 120, wherein the image difference data comprises cosine distance data.
Illustratively, in practical applications, the image capture system 130 captures underground soil section images of the farmland 110 near the crop roots and uploads them to the server 120. After obtaining the underground soil section images, the server 120 uses the image difference detection model to determine the image difference data corresponding to each underground soil section image set.
In this application scenario, no tedious manual collection is required and little manual participation is needed; after the M underground soil section image sets are determined, the image difference data corresponding to each image set can be obtained by analysis with the image difference detection model. The approach is timely and efficient, and well suited to large-area farmland scenes.
It should be noted that the server may be replaced by other types of processing devices, for example an electronic device with processing capability such as a tablet computer or a personal computer; this is not limited here. A single device may also both perform the soil insect pest determination method and serve as the image acquisition system, which is not limited in this application.
Fig. 2 is a schematic flow chart of a soil pest determination method according to an embodiment of the present disclosure. Illustratively, the soil insect pest determination method can be executed in a processor of hardware equipment such as a farming machine and the like, and can also be executed in an associated server.
As shown in fig. 2, a soil pest determination method provided by the embodiment of the application includes the following steps.
Step 230, determining M underground soil section image sets based on the M image acquisition points of the target land area.
Specifically, each underground soil section image set comprises at least two underground soil section images with a time-series association, and M is a positive integer. For example, the two underground soil section images may be the images acquired at an image acquisition point at times T_n and T_(n+1).
Illustratively, a target land area refers to an area of land where pest detection is desired, such as a field where pest detection is desired for better planting.
Illustratively, the underground soil section image refers to a soil section image near the root of a crop in a target land area, for example, a soil section image obtained by photographing a photographing device buried under the ground at a distance of 50cm from the root of the crop.
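For illustration only, grouping the captured images into M time-ordered underground soil section image sets might be sketched as follows; the tuple layout, point identifiers, and file names are assumptions, not part of the patent.

```python
from collections import defaultdict

# Hypothetical sketch: group time-stamped photos from M acquisition points
# into M time-ordered "underground soil section image sets".

def build_image_sets(captures):
    """captures: iterable of (point_id, timestamp, image_path) tuples.
    Returns {point_id: [image_path, ...]} with paths in time order."""
    grouped = defaultdict(list)
    for point_id, ts, path in captures:
        grouped[point_id].append((ts, path))
    return {pid: [path for _, path in sorted(items)]
            for pid, items in grouped.items()}

captures = [
    ("p1", 2, "p1_t2.png"), ("p1", 1, "p1_t1.png"),
    ("p2", 1, "p2_t1.png"), ("p2", 2, "p2_t2.png"),
]
image_sets = build_image_sets(captures)
```

Each value in the resulting dictionary plays the role of one underground soil section image set with a time-series association between its images.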
And 240, determining image difference data corresponding to the M underground soil section image sets by using the image difference detection model.
In some embodiments, the image difference detection model is a deep neural network model. Illustratively, the input to the deep neural network model is an underground soil section image set of the target land area, the output is the cosine distance data corresponding to that image set, and the cosine distance data is determined as the image difference data corresponding to the image set.
And 250, determining soil insect pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets.
Specifically, soil insect pest information of the target land area is determined according to respective corresponding cosine distances of M underground soil section image sets output by the image difference detection model.
In practical applications, the image difference detection model can be deployed on an edge computing device in the field for real-time computation, or deployed on a cloud server; the embodiment of the application does not limit this.
According to the soil insect pest determining method provided by the embodiment of the application, through determining M underground soil section image sets corresponding to M image collecting points of the target land area, image difference analysis is performed on the underground soil section images contained in each underground soil section image set by using an image difference detection model, and insect pest information of the target land area is determined based on an analysis result. According to the method, after the underground soil section image is determined, the soil insect pest information can be detected by using the image difference detection model, so that the soil insect pest occurrence condition of a target land area which is difficult to directly observe is obtained, the method is timely and efficient, and is suitable for land (farmland) scenes with large areas, so that people can timely take further prevention and control measures according to the soil insect pest occurrence condition, kill the soil insect pest in the germination stage, and avoid economic loss caused by large-area occurrence of the soil insect pest.
Fig. 3 is a schematic flow chart illustrating a process of determining image difference data corresponding to M subsurface soil slice image sets according to an embodiment of the present application. The embodiment shown in fig. 3 is extended based on the embodiment shown in fig. 2, and the differences between the embodiment shown in fig. 3 and the embodiment shown in fig. 2 will be emphasized below, and the descriptions of the same parts will not be repeated.
As shown in fig. 3, in the method for determining soil insect damage provided in the embodiment of the present application, the image difference data includes cosine distance data, and the step of determining the image difference data corresponding to each of the M underground soil section image sets by using the image difference detection model includes the following steps.
And 241, determining vector data corresponding to the underground soil section images contained in the underground soil section image set by using an image difference detection model aiming at each underground soil section image set in the M underground soil section image sets.
Illustratively, if the underground soil section image set includes two underground soil section images, shot at times T_n and T_(n+1), the two images are input into the deep neural network model, and the model correspondingly generates two vectors, x_n and x_(n+1).
And 242, determining cosine distance data between the underground soil section images contained in the underground soil section image set based on the vector data corresponding to the underground soil section images contained in the underground soil section image set by using the image difference detection model.
Specifically, the cosine distance is calculated as:

d(x_n, x_(n+1)) = 1 - (x_n · x_(n+1)) / (||x_n|| ||x_(n+1)||)

If the underground soil section images input for times T_n and T_(n+1) are identical, the output cosine distance is 0; if the images at the two times are entirely different, the output cosine distance is 1.
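For illustration only, the cosine distance computation described above can be sketched in plain Python as follows; a real implementation would operate on the model's output tensors, and the function name is ours, not the patent's.

```python
import math

# Minimal sketch: cosine distance between two embedding vectors
# x_n and x_(n+1) produced by the image difference detection model.

def cosine_distance(x, y):
    """Return 1 - cosine similarity: 0 for identical directions,
    1 for orthogonal (entirely dissimilar) vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    norm = (math.sqrt(sum(a * a for a in x)) *
            math.sqrt(sum(b * b for b in y)))
    return 1.0 - dot / norm
```

Vectors pointing in the same direction give a distance near 0, and orthogonal vectors give a distance of 1, matching the behaviour described for identical and entirely different soil section images.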
In the embodiment of the application, the image difference detection model can directly output the cosine distance data corresponding to the input underground soil section image set. The method is simple, efficient, and highly reusable, and can analyze underground soil section images collected from various target land areas.
Fig. 4 is a schematic flow chart illustrating a process for determining soil pest information for a target land area according to an embodiment of the present disclosure. The embodiment shown in fig. 4 is extended based on the embodiment shown in fig. 3, and the differences between the embodiment shown in fig. 4 and the embodiment shown in fig. 3 will be emphasized below, and the descriptions of the same parts will not be repeated.
As shown in fig. 4, in the soil pest determination method provided in the embodiment of the present application, the step of determining soil pest information of the target land area based on the image difference data corresponding to each of the M subsurface soil section image sets includes the following steps.
Step 251, determining pest occurrence information corresponding to the M image acquisition points based on the cosine distance data corresponding to the M underground soil section image sets and a preset cosine distance threshold.
For example, if the preset cosine distance threshold is 0.6, then when the cosine distance data of underground soil section image set B is greater than or equal to 0.6, it is determined that a pest has occurred at the image acquisition point b corresponding to image set B. The pest occurrence information corresponding to each of the M image acquisition points is judged in turn against the cosine distance threshold of 0.6.
And 252, judging whether the number of the acquisition points with insect pests in the M image acquisition points is less than a first preset number or not based on the insect pest occurrence information corresponding to the M image acquisition points.
Specifically, a first preset number k is set according to the total number of image acquisition points, and it is judged whether the number of acquisition points where pests have occurred among the M image acquisition points is less than the first preset number k.
For example, if there are 20 image acquisition points and the first preset number k1 is 13, it is judged whether the number of acquisition points where pests have occurred among the 20 image acquisition points is less than 13.
Illustratively, if the judgment result in the step 252 is negative, that is, the number of the collection points with insect pests in the M image collection points is greater than or equal to a first preset number, the step 253 is executed; if the determination result in the step 252 is yes, that is, the number of the collection points with insect pests in the M image collection points is smaller than the first preset number, the step 254 is executed.
And step 253, determining the occurrence degree of the soil insect pests in the target land area to be high.
Following the example of step 252, if pests have occurred at 15 of the 20 image acquisition points, the soil pest occurrence degree of the target land area is determined to be high.
And step 254, judging whether the number of the acquisition points with insect pests in the M image acquisition points is less than a second preset number.
Following the example of step 252, if the second preset number k2 is 7, it is judged whether the number of acquisition points where pests have occurred among the 20 image acquisition points is less than 7.
If the result of the determination in step 254 is affirmative, that is, the number of pest-infested acquisition points among the M image acquisition points is less than the second preset number, step 255 is executed; if the result is negative, that is, that number is greater than or equal to the second preset number, step 256 is executed.
Step 255: determine that the degree of soil pest occurrence in the target land area is low.
Following the example of step 254, if 5 of the 20 image acquisition points are pest-infested, the degree of soil pest occurrence in the target land area is determined to be low.
Step 256: determine that the degree of soil pest occurrence in the target land area is medium.
Following the example of step 254, if the number of collection points that have infested a target area of 20 image collection points is 10, then the extent of soil infestation of the target area is determined to be medium.
In the soil pest determination method described above, the pest occurrence information of the image acquisition point corresponding to each underground soil section image set is determined by comparing that set's cosine distance data against a preset cosine distance threshold, and the degree of pest occurrence in the target land area is determined by comparing the number of pest-infested acquisition points among the M image acquisition points against the first and second preset numbers. This reflects the pest situation of the target land area more intuitively, making it convenient to take corresponding control measures according to the degree of pest occurrence.
Fig. 5 is a schematic flow chart of a soil pest determination method according to another embodiment of the present application. The embodiment shown in fig. 5 is extended based on the embodiment shown in fig. 2, and the differences between the embodiment shown in fig. 5 and the embodiment shown in fig. 2 will be emphasized below, and the descriptions of the same parts will not be repeated.
As shown in fig. 5, the method for determining soil pests according to the embodiment of the present application further includes the following steps before determining the image difference data corresponding to each of the M sets of sectional images of the subsurface soil by using the image difference detection model.
At step 210, a training data set is determined.
Specifically, the training data set includes N subsurface soil section image sample sets and label data corresponding to each sample set.
Illustratively, the N underground soil section image sample sets include at least one of: soil section image samples taken before and after watering, soil section image samples of different types of soil, soil section image samples taken before and after crawling of different kinds of pests, and soil section image samples of the same soil section under different illumination intensities.
Specifically, a capturing device can be embedded in advance into different types of soil samples to photograph soil section images of those soil types; soil section images can be photographed before and after watering each soil type; and the crawling tracks of different kinds of insects can be simulated, with soil section images photographed before and after the insects crawl.
In practical applications, the image capturing device stays buried underground for long periods. To obtain clear images, supplementary lighting is generally required for shooting; that is, the image capturing device works together with a fill-light module to capture soil section images. The fill-light module wears out over time, which affects its fill-light intensity, so even under the same power control the actual fill-light intensity can differ. Therefore, before training the image difference detection model, the embodiment of the present application also needs to acquire multiple sample images of the same soil section under different illumination intensities.
Illustratively, for an underground soil section image sample set whose image contour change is caused by insect crawling, the label data is recorded as 1. For a sample set whose image contour change is not caused by insect crawling, the label data is recorded as 0; for example, sample sets photographed before and after watering, and sample sets photographed under different illumination intensities, are labeled 0.
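The labeling rule can be illustrated with a tiny helper. The function name label_for and the sample-type strings are hypothetical, used only to show that insect-caused contour changes are labeled 1 while all other causes (watering, illumination drift) are labeled 0.

```python
def label_for(change_cause):
    """Return label 1 if the image contour change was caused by insect
    crawling, and 0 for all other causes (watering, illumination change)."""
    return 1 if change_cause == "insect_crawl" else 0

# Label a few hypothetical sample sets:
for cause in ("insect_crawl", "watering", "illumination_change"):
    print(cause, label_for(cause))
```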
And step 220, training the initial network model based on the training data set to obtain an image difference detection model.
It is understood that the initial network model is a pre-established neural network model; the initial network model and the image difference detection model differ only in their model parameters. That is, the training data set is used to adjust the model parameters of the initial network model until a converged image difference detection model is obtained.
Specifically, the method adopts the cosine distance as the loss function of the image difference detection model:

d(x_n, x_{n+1}) = 1 − (x_n · x_{n+1}) / (‖x_n‖ ‖x_{n+1}‖)

where x_n and x_{n+1} respectively denote the vectors corresponding to two soil section images in an underground soil section image sample set. The cosine distance is 0 when the two vectors are identical, and 1 when the two vectors are completely different.
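The cosine distance can be computed as follows. This is a minimal standard-library sketch, assuming the two images have already been mapped to equal-length feature vectors; the function name cosine_distance is ours, not from the patent.

```python
import math

def cosine_distance(x_n, x_n1):
    """Cosine distance between two image feature vectors: numerically 0
    when the vectors point in the same direction, 1 when orthogonal."""
    dot = sum(a * b for a, b in zip(x_n, x_n1))
    norm_n = math.sqrt(sum(a * a for a in x_n))
    norm_n1 = math.sqrt(sum(b * b for b in x_n1))
    return 1.0 - dot / (norm_n * norm_n1)

# Identical vectors -> distance ~0; orthogonal vectors -> distance 1.
print(cosine_distance([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(cosine_distance([1.0, 0.0], [0.0, 1.0]))
```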
In some embodiments, the soil section image is in the RGB three-channel image format. RGB is an industry color standard in which the red (R), green (G) and blue (B) channels are varied and superimposed on one another; it covers almost all colors perceivable by human vision and is one of the most widely used color systems.
In some embodiments, the soil section images in the underground soil section image sample sets are preprocessed, and the preprocessed images are added to the training set, where the preprocessing includes rotation and/or cropping operations. Specifically, the same soil section image is rotated and/or cropped to obtain new soil section images, and all of the new images are added to the training data set. This increases sample diversity and improves the accuracy of the resulting image difference detection model.
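The rotation/cropping preprocessing can be sketched on an image represented as a nested list of pixel values. The helper names (rotate90, crop, augment) are assumptions, and a real pipeline would operate on image tensors rather than lists.

```python
def rotate90(img):
    """Rotate an image (a list of pixel-value rows) 90 degrees clockwise."""
    return [list(row) for row in zip(*img[::-1])]

def crop(img, top, left, height, width):
    """Crop a height x width window starting at (top, left)."""
    return [row[left:left + width] for row in img[top:top + height]]

def augment(img):
    """Generate augmented variants of one soil section image: the three
    non-trivial 90-degree rotations plus one centre crop."""
    variants = []
    rotated = img
    for _ in range(3):
        rotated = rotate90(rotated)
        variants.append(rotated)
    h, w = len(img), len(img[0])
    variants.append(crop(img, h // 4, w // 4, h // 2, w // 2))
    return variants

sample = [[1, 2], [3, 4]]
print(augment(sample)[0])  # first 90-degree rotation: [[3, 1], [4, 2]]
```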
Based on such an enriched training data set, the embodiment of the present application can improve the prediction accuracy of the trained image difference detection model.
Fig. 6 is a schematic flow chart of a method for generating a pesticide formulation according to an embodiment of the present disclosure. As shown in fig. 6, the method for generating a pesticide formulation provided in the examples of the present application includes the following steps.
Step 610, determining soil pest information of the target land area.
Illustratively, the soil pest information of the target land area is determined based on the soil pest determination method mentioned in any of the above embodiments.
Step 620, determining a pesticide formula of the target land area based on the soil pest information.
Illustratively, the pesticide formulation of the corresponding target land area is determined according to the occurrence degree of the soil insect pests of the target land area.
Illustratively, if the target land area has a low incidence of soil pests, a low concentration pesticide formulation may be prepared.
The embodiment of the application can reasonably and scientifically prepare the corresponding pesticide formula based on the occurrence degree of the soil insect pests.
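The mapping from occurrence degree to formulation can be sketched as a lookup table. The concentration values below are placeholders for illustration only, not agronomic recommendations, and the table and function names are assumptions.

```python
# Hypothetical concentration table (mass fraction); real formulations
# would come from agronomic guidance for the pest and crop in question.
CONCENTRATION_BY_DEGREE = {
    "low": 0.05,
    "medium": 0.10,
    "high": 0.20,
}

def pesticide_formula(degree):
    """Map a soil pest occurrence degree to a pesticide formulation."""
    return {"degree": degree, "concentration": CONCENTRATION_BY_DEGREE[degree]}

print(pesticide_formula("low"))  # low occurrence -> low-concentration formulation
```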
Fig. 7 is a schematic flow chart of a method for generating a pesticide spraying task according to an embodiment of the present application. As shown in fig. 7, the method for generating a pesticide spraying task provided in the embodiments of the present application includes the following steps.
Step 710, determining a pesticide formulation of the target land area based on the soil pest information of the target land area.
And 720, generating a pesticide spraying task corresponding to the target land area based on the pesticide formula.
Method embodiments of the present application are described in detail above in conjunction with fig. 2-7, and apparatus embodiments of the present application are described below in conjunction with fig. 8-14. It is to be understood that the description of the method embodiments corresponds to the description of the apparatus embodiments, and therefore reference may be made to the preceding method embodiments for parts not described in detail.
Fig. 8 is a schematic structural view of a soil pest determination device according to an embodiment of the present application. As shown in fig. 8, a soil pest determination device provided by an embodiment of the present application includes:
the image determining module 8010 is configured to determine M underground soil section image sets based on the M image acquisition points of the target land area, where each underground soil section image set includes at least two underground soil section images having a time sequence association relationship, and M is a positive integer;
the data determining module 8020 is configured to determine, by using the image difference detection model, image difference data corresponding to each of the M underground soil section image sets;
the information determining module 8030 is configured to determine soil pest information of the target land area based on image difference data corresponding to the M underground soil section image sets.
In some embodiments, the data determining module 8020 is further configured to determine, for each subsurface soil slice image set of the M subsurface soil slice image sets, vector data corresponding to each of the subsurface soil slice images included in the subsurface soil slice image set by using the image difference detection model; and determining cosine distance data between the underground soil section images contained in the underground soil section image set based on vector data corresponding to the underground soil section images contained in the underground soil section image set by using the image difference detection model.
In some embodiments, the information determining module 8030 is further configured to: determine pest occurrence information corresponding to each of the M image acquisition points based on the cosine distance data corresponding to each of the M underground soil section image sets and a preset cosine distance threshold; determine the degree of soil pest occurrence in the target land area to be high if the number of pest-infested acquisition points, determined from the pest occurrence information corresponding to the M image acquisition points, is greater than or equal to a first preset number; determine the degree to be medium if that number is less than the first preset number and greater than or equal to a second preset number; and determine the degree to be low if that number is less than the second preset number.
In some embodiments, the data determining module 8020 is further configured to: determine a training data set, where the training data set includes N underground soil section image sample sets and label data corresponding to each sample set; and train an initial network model based on the training data set to obtain the image difference detection model. The N underground soil section image sample sets include at least one of soil section image samples before and after watering, soil section image samples of different types of soil, soil section image samples before and after crawling of different kinds of pests, and soil section image samples of the same soil section under different illumination intensities.
Fig. 9 is a schematic structural diagram of a pesticide formula generating apparatus according to an embodiment of the present application. As shown in fig. 9, a pesticide formulation generating device provided by the embodiment of the present application includes a first determining module 9010, configured to determine information about soil pests in a target land area. A second determining module 9020, configured to generate a pesticide formulation for the target land area based on the soil pest information, so as to generate a pesticide preparation task based on the pesticide formulation.
Fig. 10 is a schematic structural diagram of a pesticide spraying task generating device according to an embodiment of the present application. As shown in fig. 10, the pesticide spraying task generating device provided by the embodiment of the application includes a pesticide formulation determining module 1010 for determining a pesticide formulation of a target land area based on the information of soil insect damage of the target land area. And the spraying task generating module is used for generating a pesticide spraying task corresponding to the target land area based on the pesticide formula so that the pesticide spraying equipment can execute the pesticide spraying task.
Fig. 11 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present application. Next, an electronic apparatus according to an embodiment of the present application is described with reference to fig. 11.
As shown in fig. 11, the electronic device 70 includes one or more processors 701 and memory 702.
The processor 701 may be a Central Processing Unit (CPU) or other form of processing unit having data processing capabilities and/or instruction execution capabilities, and may control other components in the electronic device 70 to perform desired functions.
Memory 702 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, Random Access Memory (RAM), cache memory (cache), and/or the like. The non-volatile memory may include, for example, Read Only Memory (ROM), hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer-readable storage medium and executed by the processor 701 to implement the methods described above in connection with the various embodiments of the application and/or other desired functions. Various contents such as a set of subsurface soil sectional images, cosine distances, tag data, and the like may also be stored in the computer-readable storage medium.
In one example, the electronic device 70 may further include: an input device 703 and an output device 704, which are interconnected by a bus system and/or other form of connection mechanism (not shown).
The input device 703 may include, for example, a keyboard, a mouse, and the like.
The output device 704 may output various information to the outside, including the subsurface soil profile image set, cosine distance, tag data, and the like. The output devices 704 may include, for example, a display, speakers, a printer, and a communication network and remote output devices connected thereto, among others.
Of course, for the sake of simplicity, only some of the components of the electronic device 70 relevant to the present application are shown in fig. 11, and components such as buses, input/output interfaces, and the like are omitted. In addition, the electronic device 70 may include any other suitable components, depending on the particular application.
In addition to the above-described methods and apparatus, embodiments of the present application may also be a computer program product comprising computer program instructions that, when executed by a processor, cause the processor to perform the steps in the related methods according to the various embodiments of the present application described above in this specification.
The computer program product may be written with program code for performing the operations of embodiments of the present application in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, and conventional procedural programming languages such as the "C" programming language. The program code may execute entirely on the user's computing device, partly on the user's device as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server.
Furthermore, embodiments of the present application may also be a computer-readable storage medium having stored thereon computer program instructions that, when executed by a processor, cause the processor to perform the steps in the related methods according to the various embodiments of the present application described above in the present specification.
The computer-readable storage medium may take any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may include, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In some embodiments, a soil pest determination system is further provided, and the soil pest determination system comprises an image acquisition system and the electronic device mentioned in the above embodiments, wherein the image acquisition system and the electronic device are in communication connection. The image acquisition system is used for shooting underground soil section images which correspond to the M image acquisition points of the target land area and are concentrated in the M underground soil section images.
In some embodiments, an image acquisition system comprises: the soil contact mechanism is provided with a transparent window used in soil; and a light path is formed between a camera of the image acquisition mechanism and the transparent window.
The image acquisition system is illustrated below in conjunction with fig. 12-14.
Referring to fig. 12, fig. 12 is a schematic structural diagram of a first alternative of an image capturing device according to an embodiment of the present disclosure. As shown in fig. 12, the soil contact mechanism 1 is a container having a housing chamber and is entirely buried in the soil 5. The soil contact mechanism 1 comprises a protective cover, one side of which is provided with a transparent window 2. The image acquisition mechanism 3 is mounted in the protective cover with its camera facing the transparent window 2, and a cable 4 is used to transmit the underground soil photographs captured by the image acquisition mechanism 3.
Referring to fig. 13, fig. 13 is a schematic structural diagram of a second alternative of an image capturing device according to another embodiment of the present disclosure. As shown in fig. 13, the soil contact mechanism 1 may be a periscope having a first periscope opening and a second periscope opening; the first periscope opening serves as a soil observation opening and the second as a camera observation opening. The transparent window 2 is formed by the soil observation opening, or the transparent window 2 is hermetically mounted at the soil observation opening, and the camera of the image acquisition mechanism 3 faces the camera observation opening. In addition, during operation of the image acquisition device, the camera observation opening and the image acquisition mechanism 3 are located outside the soil, i.e. above ground, which facilitates operation and maintenance.
Optionally, referring to fig. 14, fig. 14 is a schematic structural diagram of a third alternative of the image capturing device according to another embodiment of the present application. As shown in fig. 14, the inside diameter of the periscope passage between the two plane mirrors increases from the camera observation opening toward the soil observation opening; that is, the passage between the first and second periscope openings is a flared tube of increasing inside diameter. The soil observation opening is disposed at the larger-diameter end of the flared tube, and that end is embedded in the soil 5 to enlarge the shooting field of view.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
The foregoing describes the general principles of the present application in conjunction with specific embodiments, however, it is noted that the advantages, effects, etc. mentioned in the present application are merely examples and are not limiting, and they should not be considered essential to the various embodiments of the present application. Furthermore, the foregoing disclosure of specific details is for the purpose of illustration and description and is not intended to be limiting, since the foregoing disclosure is not intended to be exhaustive or to limit the disclosure to the precise details disclosed.
The block diagrams of devices, apparatuses and systems referred to in this application are given only as illustrative examples and are not intended to require or imply that the connections, arrangements and configurations must be made in the manner shown in the block diagrams. As will be appreciated by those skilled in the art, these devices, apparatuses and systems may be connected, arranged and configured in any manner. Words such as "including," "comprising," "having," and the like are open-ended words that mean "including, but not limited to," and are used interchangeably therewith. The word "or" as used herein means, and is used interchangeably with, the term "and/or," unless the context clearly dictates otherwise. The word "such as" is used herein to mean, and is used interchangeably with, the phrase "such as but not limited to".
It should also be noted that in the devices, apparatuses, and methods of the present application, the components or steps may be decomposed and/or recombined. These decompositions and/or recombinations are to be considered as equivalents of the present application.
The previous description of the disclosed aspects is provided to enable any person skilled in the art to make or use the present application. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the scope of the application. Thus, the present application is not intended to be limited to the aspects shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description has been presented for purposes of illustration and description. Furthermore, the description is not intended to limit embodiments of the application to the form disclosed herein. While a number of example aspects and embodiments have been discussed above, those of skill in the art will recognize certain variations, modifications, alterations, additions and sub-combinations thereof.

Claims (16)

1. A soil insect pest determination method is characterized by comprising the following steps:
determining M underground soil section image sets based on M image acquisition points of a target land area, wherein each underground soil section image set comprises at least two underground soil section images with a time sequence incidence relation, and M is a positive integer;
determining image difference data corresponding to the M underground soil section image sets by using an image difference detection model;
and determining soil insect pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets.
2. A soil pest determination method according to claim 1, wherein the image difference data includes cosine distance data, and determining the image difference data corresponding to each of the M subsurface soil sectional image sets using an image difference detection model includes:
determining vector data corresponding to the underground soil section images contained in the underground soil section image sets by using the image difference detection model aiming at each underground soil section image set in the M underground soil section image sets;
and determining cosine distance data between the underground soil section images contained in the underground soil section image set based on vector data corresponding to the underground soil section images contained in the underground soil section image set by using the image difference detection model.
3. The method of determining soil pests according to claim 2, wherein determining soil pest information for the target land area based on the image difference data corresponding to each of the M subsurface soil profile image sets comprises:
determining pest occurrence information corresponding to each of the M image acquisition points based on the cosine distance data corresponding to each of the M underground soil section image sets and a preset cosine distance threshold;
and if the number of the collecting points with the insect pests is determined to be larger than or equal to a first preset number based on the insect pest occurrence information corresponding to the M image collecting points, determining the occurrence degree of the insect pests in the soil of the target land area to be high.
4. The method of determining soil pests according to claim 3, wherein determining soil pest information for the target land area based on the image difference data corresponding to each of the M subsurface soil profile image sets further comprises:
and if the number of the collecting points with the pests is determined to be less than the first preset number and greater than or equal to the second preset number based on the pest occurrence information corresponding to the M image collecting points, determining the occurrence degree of the pests in the soil of the target land area as middle.
5. A soil pest determination method according to claim 3 or 4, wherein determining soil pest information for the target land area based on image difference data corresponding to each of the M subsurface soil profile image sets further comprises:
and if the number of the collecting points with the insect pests is determined to be smaller than the second preset number based on the insect pest occurrence information corresponding to the M image collecting points, determining the occurrence degree of the insect pests in the soil of the target land area to be low.
6. A soil pest determination method according to any one of claims 1 to 3, wherein before said determining image difference data corresponding to each of said M subsurface soil sectional image sets using said image difference detection model, further comprising:
determining a training data set, wherein the training data set comprises N underground soil section image sample sets and label data corresponding to the underground soil section image sample sets;
and training an initial network model based on the training data set to obtain the image difference detection model.
7. A soil pest determination method according to claim 6, wherein the N sets of subsurface soil section image samples comprise at least one of soil section image samples before and after watering, soil section image samples for different types of soil samples, soil section image samples before and after crawling of different types of pests, and soil section image samples for the same soil section under different illumination intensities.
8. A method for forming a pesticide formulation, comprising:
determining soil pest information for a target land area, the soil pest information determined based on the method of any one of claims 1 to 7;
determining a pesticide formulation for the target land area based on the soil pest information to generate a pesticide preparation task based on the pesticide formulation.
9. A pesticide spraying task generation method is characterized by comprising the following steps:
determining a pesticide formulation for a target land area based on soil pest information for the target land area, the pesticide formulation determined based on the method of claim 8;
and generating a pesticide spraying task corresponding to the target land area based on the pesticide formula so that the pesticide spraying equipment executes the pesticide spraying task.
10. A soil pest determination device, comprising:
the image determining module is used for determining M underground soil section image sets based on M image acquisition points of a target land area, each underground soil section image set comprises at least two underground soil section images with a time sequence incidence relation, and M is a positive integer;
the data determining module is used for determining image difference data corresponding to the M underground soil section image sets by using an image difference detection model;
and the information determining module is used for determining the soil insect pest information of the target land area based on the image difference data corresponding to the M underground soil section image sets.
11. A pesticide formula generating device, comprising:
a first determination module for determining soil pest information for a target land area, the soil pest information determined based on the method of any one of claims 1 to 7;
a second determination module to generate a pesticide formulation for the target land area based on the soil pest information to generate a pesticide preparation task based on the pesticide formulation.
12. A pesticide spraying task generating device, comprising:
a pesticide formulation determination module for determining a pesticide formulation for a target land area based on soil pest information for the target land area, the pesticide formulation determined based on the method of claim 8;
and a spraying task generating module for generating a pesticide spraying task corresponding to the target land area based on the pesticide formulation, so that a pesticide spraying device can execute the pesticide spraying task.
13. A computer-readable storage medium, characterized in that the storage medium stores a computer program for executing the method of any one of claims 1 to 9.
14. An electronic device, characterized in that the electronic device comprises:
a processor;
a memory for storing instructions executable by the processor;
wherein the processor is configured to perform the method of any one of claims 1 to 9.
15. A soil pest determination system, comprising:
an image acquisition system for capturing M underground soil section image sets corresponding to M image acquisition points of a target land area; and
the electronic device of claim 14, connected to the image acquisition system.
16. The soil pest determination system of claim 15, wherein the image acquisition system comprises:
a soil contact mechanism provided with a transparent window for use in soil;
and an image acquisition mechanism, wherein an optical path is formed between a camera of the image acquisition mechanism and the transparent window.
CN202111667411.2A 2021-12-30 2021-12-30 Soil insect pest determination method and device and pesticide formula generation method and device Active CN114463649B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111667411.2A CN114463649B (en) 2021-12-30 2021-12-30 Soil insect pest determination method and device and pesticide formula generation method and device

Publications (2)

Publication Number Publication Date
CN114463649A true CN114463649A (en) 2022-05-10
CN114463649B CN114463649B (en) 2023-02-14

Family

ID=81407752

Country Status (1)

Country Link
CN (1) CN114463649B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4445788A (en) * 1982-04-30 1984-05-01 The Board Of Regents Of The University Of Nebraska Soil probe and method of obtaining moisture, temperature and root distribution of a soil profile
US20160050902A1 (en) * 2014-08-19 2016-02-25 Lisi Global, Llc Method and Apparatus for the Management of a Soil Pest
CN106778897A (en) * 2016-12-29 2017-05-31 Xijing University Two-pass plant species recognition method based on cosine distance and center contour distance
CN107873340A (en) * 2016-09-29 2018-04-06 Zhang Ruiqin Technique for preventing and controlling ginger pests and diseases
CN108040997A (en) * 2017-12-25 2018-05-18 Zhongkai University of Agriculture and Engineering Machine-vision-based insect pest monitoring method
CN109446958A (en) * 2018-10-18 2019-03-08 Guangzhou Xaircraft Technology Co Ltd Method, device and system for determining pesticide application effect
CN109872301A (en) * 2018-12-26 2019-06-11 Zhejiang Tsinghua Yangtze River Delta Research Institute Color image preprocessing method for rice pest identification and counting
CN110516689A (en) * 2019-08-30 2019-11-29 Beijing Dajia Internet Information Technology Co Ltd Image processing method and device, electronic device, and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
WAQAR ISLAM et al.: "Silicon-mediated plant defense against pathogens and insect pests", Pesticide Biochemistry and Physiology *
LI SHENGLONG: "Distribution of soil macropores and characteristics of water and nitrogen leakage in the paddy field-ridge transition zone", China Master's Theses Full-text Database, Agricultural Science and Technology *

Similar Documents

Publication Publication Date Title
Roosjen et al. Deep learning for automated detection of Drosophila suzukii: potential for UAV‐based monitoring
Liu et al. Estimating wheat green area index from ground-based LiDAR measurement using a 3D canopy structure model
McCarthy et al. Applied machine vision of plants: a review with implications for field deployment in automated farming operations
Tsouros et al. Data acquisition and analysis methods in UAV-based applications for Precision Agriculture
CA2740503C (en) Variable rate sprayer system and method of variably applying agrochemicals
Mahmud et al. Development of a LiDAR-guided section-based tree canopy density measurement system for precision spray applications
Hočevar et al. Flowering estimation in apple orchards by image analysis
Diago et al. On‐the‐go assessment of vineyard canopy porosity, bunch and leaf exposure by image analysis
Chazette et al. Basic algorithms for bee hive monitoring and laser-based mite control
Yano et al. Identification of weeds in sugarcane fields through images taken by UAV and Random Forest classifier
CN107346424A Light-trap insect identification and counting method and system
Miao et al. Efficient tomato harvesting robot based on image processing and deep learning
Giakoumoglou et al. White flies and black aphids detection in field vegetable crops using deep learning
Lippi et al. A data-driven monitoring system for the early pest detection in the precision agriculture of hazelnut orchards
Ozguven et al. The technology uses in the determination of sugar beet diseases
CN114463649B (en) Soil insect pest determination method and device and pesticide formula generation method and device
McCarthy et al. Automated variety trial plot growth and flowering detection for maize and soybean using machine vision
Liu et al. Development of a proximal machine vision system for off-season weed mapping in broadacre no-tillage fallows
CN113418509A (en) Automatic target-aiming detection device and detection method for agriculture
KR20180133612A (en) Insect pest image analyzing method for insect pest prediction system of cash crops
Tripathy et al. Image processing techniques aiding smart agriculture
Olsen Improving the accuracy of weed species detection for robotic weed control in complex real-time environments
US20230206626A1 (en) Plant disease and pest control method using spectral remote sensing and artificial intelligence
Negrete Artificial vision in mexican agriculture for identification of diseases, pests and invasive plants
Bonaria Grapevine yield estimation using image analysis for the variety Arinto

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant