WO2023217618A1 - Improvements in the use of crop protection products and/or nutrients - Google Patents


Info

Publication number
WO2023217618A1
Authority
WO
WIPO (PCT)
Prior art keywords
deposits
crop protection
nutrient
image
protection product
Application number
PCT/EP2023/061746
Other languages
French (fr)
Inventor
Malcolm Andrew Faers
Sebastian NIEDENFUEHR
Original Assignee
Bayer Aktiengesellschaft
Application filed by Bayer Aktiengesellschaft
Publication of WO2023217618A1

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M 7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M 7/0089 Regulating or controlling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/10 Terrestrial scenes
    • G06V 20/188 Vegetation

Definitions

  • Another example of a type of deposits is deposits having frayed edges. While coffee-ring deposits are compact and have well-defined edges, frayed deposits are often smeared in one or more directions and their edges exhibit fractal geometry.
  • One model or several models can be used to determine the performance value.
  • a model may be a heuristic model, a mechanistic model, a statistical model, a machine learning model, and/or some other/further model and/or a combination of different models.
  • Such model is also referred to herein as performance value prediction model.
  • a feature vector is an n-dimensional vector of numerical features representing an object (e.g., an application process), where n is an integer greater than 0.
  • The term "feature vector" also includes scalar values, matrices, tensors, and the like. Examples of methods for generating feature vectors can be found in various textbooks and scientific publications (see, e.g., G.A. Tsihrintzis, L.C. Jain: Machine Learning Paradigms: Advances in Deep Learning-based Technological Applications, in Learning and Analytics in Intelligent Systems Vol. 18, Springer Nature, 2020, ISBN: 9783030497248; K. Grzegorczyk: Vector representations of text data in deep learning, Doctoral Dissertation, 2018, arXiv:1901.01695v1 [cs.CL]).
  • the input neurons serve to receive the input data.
  • the input data constitute or comprise an image
  • the output neurons serve to output the output data.
  • a CNN is a class of deep neural networks, most commonly applied to analyzing visual imagery (such as OCT scans and fluorescein angiography images).
  • a CNN comprises an input layer with input neurons, an output layer with at least one output neuron, as well as multiple hidden layers between the input layer and the output layer.

Abstract

Systems, methods, and computer programs disclosed herein relate to the characterization and/or optimization of crop protection products and/or nutrients and their use in crop production based on images showing residues of the crop protection products and/or nutrients in the form of deposit structures on plant parts.

Description

Improvements in the use of crop protection products and/or nutrients
FIELD
Systems, methods, and computer programs disclosed herein relate to the characterization and/or optimization of crop protection products and/or nutrients and their use in crop production based on images showing residues of the crop protection products and/or nutrients in the form of deposit structures on plant parts.
BACKGROUND
Crop cultivation usually involves the use of crop protection products and/or nutrients.
The term "crop protection product" refers to a composition which is used to protect crops from harmful organisms or to prevent such exposure, to destroy unwanted plants or plant parts, to inhibit unwanted growth of plants or to prevent such growth, and/or, in a manner other than as a nutrient, to influence the life processes of crops (e.g., growth regulators). Growth regulators are employed, for example, for increasing the lodging resistance in cereals by shortening the culm length (culm shorteners or, better, internode shorteners), improving the rooting of nursery plants, reducing plant height by stunting in horticulture, or preventing the germination of potatoes. Other examples of crop protection products are herbicides, fungicides and pesticides (e.g., insecticides).
"Nutrients" are those organic and inorganic compounds from which crops are able to derive the elements of which their bodies are made. Depending on the location of the crops, the nutrients are taken from the air, the water and/or the soil. These nutrients are often not in the quantity and form in which they can best be utilized. Either they are not present in sufficient quantity naturally, or they are displaced, for example, by leaching in the soil, or are withdrawn from the soil in considerable quantities by the harvested products. These withdrawals of nutrients can be replaced only by supplying plant nutrients through fertilization. Fertilization therefore improves the nutrition of the plant, promotes plant growth, increases yield, improves the quality of the harvested products, and, lastly, maintains and promotes soil fertility. In this description the term "nutrients" is used synonymously with the term "fertilizers".
When using crop protection products and/or nutrients, one goal is to use these products as efficiently as possible, i.e., to use only as much of the products as is necessary to achieve a desired effect and to use the products in a way that maximizes their effect. To do this, it is necessary to understand the relationships between the properties of the products, the parameters of the application of the products and the effects of the products in the field.
This and other tasks are addressed by the present disclosure.
SUMMARY
In a first aspect, the present disclosure provides a computer-implemented method, the method comprising: receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient, identifying one or more types of deposit structures in the image, determining features of the one or more types of deposit structures, determining a performance value based on the features, outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
In another aspect, the present disclosure provides a computer system comprising: a processor; and a memory storing an application program configured to perform, when executed by the processor, an operation, the operation comprising: receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient, identifying one or more types of deposit structures in the image, determining features of the one or more types of deposit structures, determining a performance value based on the features, outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
In another aspect, the present disclosure provides a non-transitory computer readable medium having stored thereon software instructions that, when executed by a processor of a computer system, cause the computer system to execute the following steps: receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient, identifying one or more types of deposit structures in the image, determining features of the one or more types of deposit structures, determining a performance value based on the features, outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
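The claimed operation can be pictured as a small processing pipeline. The following Python sketch is illustrative only: the function names, the simple thresholding, and the coverage-based performance value are assumptions standing in for the models described in the detailed description, not part of the disclosure.

```python
import numpy as np

def identify_deposits(image: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Identify deposit pixels by simple gray-value thresholding (placeholder)."""
    return image > threshold

def determine_features(mask: np.ndarray) -> dict:
    """Determine features of the deposit structures (here only: coverage)."""
    return {"coverage": float(mask.mean())}

def determine_performance_value(features: dict) -> float:
    """Map features to a performance value (placeholder heuristic model)."""
    return features["coverage"]

def run_pipeline(image: np.ndarray) -> float:
    mask = identify_deposits(image)
    features = determine_features(mask)
    return determine_performance_value(features)

# toy 10x10 image: a bright 3x3 deposit on a dark background
img = np.zeros((10, 10))
img[2:5, 2:5] = 1.0
print(run_pipeline(img))  # coverage of the 3x3 deposit: 0.09
```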
DETAILED DESCRIPTION
The invention will be more particularly elucidated below without distinguishing between the aspects of the disclosure (method, computer system, computer-readable storage medium). On the contrary, the following elucidations are intended to apply analogously to all the aspects of the disclosure, irrespective of in which context (method, computer system, computer-readable storage medium) they occur.
If steps are stated in an order in the present description or in the claims, this does not necessarily mean that the disclosure is restricted to the stated order. On the contrary, it is conceivable that the steps can also be executed in a different order or in parallel to one another, unless one step builds upon another step, which absolutely requires that the building step be executed subsequently (this being, however, clear in the individual case). The stated orders are thus preferred embodiments of the invention.
As used herein, the articles “a” and “an” are intended to include one or more items and may be used interchangeably with “one or more” and “at least one.” As used in the specification and the claims, the singular form of “a”, “an”, and “the” include plural referents, unless the context clearly dictates otherwise. Where only one item is intended, the term “one” or similar language is used. Also, as used herein, the terms “has”, “have”, “having”, or the like are intended to be open-ended terms. Further, the phrase “based on” is intended to mean “based at least partially on” unless explicitly stated otherwise. Further, the phrase “based on” may mean “in response to” and be indicative of a condition for automatically triggering a specified operation of an electronic device (e.g., a controller, a processor, a computing device, etc.) as appropriately referred to herein.
Some implementations of the present disclosure will be described more fully hereinafter with reference to the accompanying drawings, in which some, but not all implementations of the disclosure are shown. Indeed, various implementations of the disclosure may be embodied in many different forms and should not be construed as limited to the implementations set forth herein; rather, these example implementations are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
The present disclosure provides means for characterizing and/or optimizing the application of a crop protection product and/or nutrient.
The characterization and/or optimization is based on one or more images.
The term "image" as used herein means a data structure that represents a spatial distribution of a physical signal. The spatial distribution may be of any dimension, for example 2D, 3D or 4D. The spatial distribution may be of any shape, for example forming a grid and thereby defining pixels, the grid being possibly irregular or regular. The physical signal may be any signal, for example color, level of gray, depth, surface or volume occupancy, such that the image may be a 2D or 3D RGB/grayscale/depth image, or a 3D surface/volume occupancy model.
For simplicity, the invention is described herein mainly on the basis of two-dimensional images consisting of a rectangular array of pixels. However, this is not to be understood as limiting the invention to such images. Those skilled in image processing and/or image analysis will know how to apply the invention to image data comprising more dimensions and/or being in a different format.
Images can be generated with the aid of a commercially available (digital) camera. Such a camera typically comprises at least one sensor (e.g., a charge-coupled device (CCD) and/or a complementary metal-oxide-semiconductor (CMOS)) on which a plant or part of a plant can be imaged and which converts the image into electrical signals that can be digitized and stored. In addition, a camera usually comprises lenses, apertures and/or other optical elements that provide a sharp image of the plant or part of the plant on the sensor(s).
Preferably, images are provided in digital form. The term "digital" means that the image can be processed by a machine, usually a computer system. "Processing" is understood to mean the known methods for electronic data processing (EDP).
Digital images can be processed, edited and reproduced with computer systems and software, and converted into standardized data formats, such as JPEG (graphics format of the Joint Photographic Experts Group), PNG (Portable Network Graphics), TIFF (Tag Image File Format) or SVG (Scalable Vector Graphics). Digital images can be visualized with suitable display devices, such as computer monitors, projectors and/or printers.
The at least one image shows a plant or a part of a plant after an application of a crop protection product and/or a nutrient.
The plant may be a crop. The term "crop" is understood to mean a plant which is specifically grown as a useful plant by human intervention. Parts of the crop being grown are suitable for human and/or animal consumption.
The plant may be a weed. The term "weed" is understood to mean spontaneously accompanying vegetation plants in crops of crop plants, grassland or gardens that are not being specifically grown there and develop, for example, from the seed potential of the soil or are blown in. Weeds can compete with crop plants for natural resources, such as water, nutrients, and/or sunlight, and are therefore often undesirable.
The part of the plant may be, for example, a leaf or a flower or a twig or a stem or any other part. Preferably, the part of the plant is a leaf.
The at least one image shows one or more deposits on the plant or plant part. “Deposits” are residues of a crop protection product and/or nutrient that remain on plant parts after application, for example by spraying, of the crop protection product and/or nutrient and after drying. Such deposits form characteristic structures. The term "deposit structure" is understood to mean the two- or three-dimensional distribution of components of the crop protection product and/or nutrient on one or more plant parts. Typically, deposits on plant parts, especially on leaves of plants, are not uniformly distributed. Instead, there are areas where no deposits occur and areas where deposits are present. The amounts of substances contained in deposits can vary.
Fig. 1 shows an example of deposits of a formulation on the leaf of a plant. The deposits are shown in white. The black areas are free of deposits. The contour of the leaf of the plant can be seen in Fig. 1. The deposits form characteristic structures. They form islands of different sizes and shapes.
The at least one image shows the plant or part of the plant a pre-defined time after application of the crop protection product and/or nutrient, such that volatile components such as water have volatilized. Residues of the crop protection product and/or nutrient in the form of deposits remain on the plant (parts) after the pre-defined time.
The pre-defined time is usually from several minutes to one or more hours and can be determined empirically. The pre-defined time usually depends on environmental conditions such as temperature, humidity, wind, and solar radiation.
The crop protection product and/or nutrient is preferably in the form of a formulation comprising particles, such as a suspension concentrate formulation, a suspo-emulsion formulation, an oil dispersion formulation, a wettable powder formulation, a water-dispersible granule formulation, or a foliar fertilizer formulation. However, the present disclosure is also applicable to other formulations such as emulsifiable concentrate (EC) formulations and/or soluble liquid (SL) concentrate formulations.
The crop protection product and/or nutrient is preferably applied to the plant as a spray using common spray application techniques.
A dye may be added to the crop protection product and/or nutrient to more clearly identify and image the deposit structures. The dye is deposited on the plant part together with other components of the formulation of the crop protection product and/or nutrient and provides a higher contrast between deposits and surrounding areas. Preferably, a fluorescent dye is used and at least one image is generated of the plant part upon irradiation of the plant part with electromagnetic radiation resulting in fluorescence excitation of the fluorescent dye. Preferably, a camera sensor capable of detecting the electromagnetic radiation emitted by the fluorescent dye is used to generate the at least one image. The dye could also be in the form of pigment particles or even as an oil soluble dye in emulsion droplets. Furthermore, two or more dyes could be used where they are present in different physical forms (e.g., water soluble, particles and oil soluble) and with different emission wavelengths allowing different components in the deposits to be identified.
Preferably, one or more images are generated in which deposit structures with a mean (e.g., arithmetically averaged) diameter in the order of 0.005 mm to 1 cm are shown. This may require the use of magnification optics to image the part of the plant in a magnified view on the at least one camera sensor. It is possible to generate microscopic images with an x-fold magnification, where x is the factor by which an object is magnified in the image compared to the original size. For example, if an object has an extension of 1 mm and its extension in the image is 5 mm, then a fivefold magnification is present. Preferably, the magnification factor is in the range of 1 to 40. Preferably, the magnification is chosen such that the minimum diameter or the (e.g., arithmetically averaged) mean diameter of the deposits covers a minimum number of image elements (e.g., pixels), e.g., at least 9 image elements, preferably at least 16 image elements, even more preferably at least 25 image elements.
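The relationship between magnification, sensor pixel pitch, and the number of image elements covered by a deposit can be estimated with simple geometry. In the sketch below, the pixel pitch of 0.005 mm and the deposit diameter of 0.02 mm are assumed example values, not values from the disclosure.

```python
import math

def pixels_covered(deposit_diameter_mm: float, magnification: float,
                   pixel_pitch_mm: float) -> float:
    """Approximate number of sensor pixels covered by a circular deposit."""
    image_diameter = deposit_diameter_mm * magnification
    pixels_across = image_diameter / pixel_pitch_mm
    return math.pi / 4.0 * pixels_across ** 2  # area of the imaged disk in pixels

def min_magnification(deposit_diameter_mm: float, pixel_pitch_mm: float,
                      min_pixels: int = 25) -> float:
    """Smallest magnification at which the deposit covers at least min_pixels pixels."""
    return (pixel_pitch_mm / deposit_diameter_mm) * math.sqrt(4.0 * min_pixels / math.pi)

# assumed sensor pixel pitch of 0.005 mm and a 0.02 mm deposit
m = min_magnification(0.02, 0.005, min_pixels=25)
print(round(m, 2))                       # required magnification, ~1.41
print(pixels_covered(0.02, m, 0.005))    # ≈ 25 pixels at that magnification
```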
The at least one image shows the result of applying a crop protection product and/or nutrient in the form of deposit structures. In a further step, the deposit structures are identified in the at least one image.
For example, the at least one image can be converted into a binary image in which white image elements (e.g., pixels or voxels) represent, for example, deposits and black image elements represent areas (e.g., plant parts) without deposits. The generation of a binary image can be done using a gray-scale thresholding method (also called binarization). The threshold value or values are chosen such that the gray-scale thresholding procedure generates a binary image on which the deposits are shown, e.g., white, and areas without deposits are shown, e.g., black. The deposits usually have a different color and/or shade of gray than plant parts and can thus be easily identified. This is especially true when a fluorescent dye is used and the image is captured under electromagnetic radiation that results in fluorescence excitation. The threshold value or values used during binarization can be determined empirically and transferred to other (new) images.
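Such a binarization step can be sketched in a few lines; the threshold of 128 is an assumed, empirically determined value, and the toy image is hypothetical.

```python
import numpy as np

def binarize(gray: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Binarize a gray-scale image: bright deposits -> 1, background -> 0.

    The threshold is determined empirically, e.g., on calibration images,
    and can then be transferred to new images.
    """
    return (gray >= threshold).astype(np.uint8)

# hypothetical 3x3 gray-scale image with four bright deposit pixels
gray = np.array([[10, 200, 30],
                 [250, 40, 180],
                 [20, 20, 220]], dtype=np.uint8)
mask = binarize(gray)
print(mask.sum())  # 4 deposit pixels
```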
To increase the contrast between the deposit and the plant part (e.g., leaf), the color or multispectral image can first be divided into different color or wavelength bands and the most appropriate color/wavelength bands can be selected for further analysis. When two or more dyes with different emission wavelengths are used, two or more image color/wavelength bands can be used to analyze the distribution of the different dyes in the deposit and infer the distribution of the different components in the deposit.
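One simple way to select the most appropriate band is to compare per-band contrast. The heuristic below (standard deviation of the gray values as a contrast proxy, assuming a band in which deposits stand out varies more strongly) is an assumption for illustration, not a method prescribed by the disclosure.

```python
import numpy as np

def best_band(image: np.ndarray) -> int:
    """Return the index of the color/wavelength band with the highest contrast.

    Contrast is approximated here by the per-band standard deviation of the
    gray values across all pixels.
    """
    stds = image.reshape(-1, image.shape[-1]).std(axis=0)
    return int(np.argmax(stds))

# toy 2x2 image with 3 bands; band 1 has the strongest variation
img = np.array([[[100, 10, 50], [100, 240, 50]],
                [[100, 20, 55], [100, 230, 52]]], dtype=float)
print(best_band(img))  # band index 1
```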
It is conceivable that the at least one image is subjected to one or more filtering operations and/or other transformations before and/or after binarization to more clearly highlight the deposits and/or reduce or remove noise in the image. Such image processing steps are known to those skilled in the art of image processing and are described in various publications (see, for example: J. Ohser: Angewandte Bildverarbeitung und Bildanalyse, Fachbuchverlag Leipzig, 2018, ISBN: 978-3-446-44933-6; A. Erhardt: Einführung in die Digitale Bildverarbeitung, Vieweg + Teubner, 2008, ISBN: 978-3-519-00478-3; P. Soille: Morphologische Bildverarbeitung, Springer 1998, ISBN: 978-3-642-72191-5).
It is also possible to train a machine learning model to automatically identify and segment deposits in the at least one image. The term "segmentation" refers to the process of dividing an image into segments, also known as image segments, image regions, or image objects. Segmentation is typically used to locate objects and boundaries (lines, curves, etc.) in images. From a segmented image, the localized objects can be separated from the background, visually highlighted (e.g.: colored), measured, counted, or otherwise quantified.
In segmentation, each image element (e.g., pixel, voxel) of the image is assigned a label, so that image elements with the same label have certain features in common, e.g., represent deposits. A machine learning model can be trained using, for example, training images that have been manually segmented by a human, i.e., in which deposition structures have been labeled by the human.
Such a “machine learning model”, as used herein, may be understood as a computer implemented data processing architecture. The machine learning model can receive input data and provide output data based on that input data and on parameters of the machine learning model. The machine learning model can learn a relation between input data and output data through training. In training, parameters of the machine learning model may be adjusted in order to provide a desired output for a given input.
The process of training a machine learning model involves providing a machine learning algorithm (that is the learning algorithm) with training data to learn from. The term “trained machine learning model” refers to the model artifact that is created by the training process. The training data must contain the correct answer, which is referred to as the target. The learning algorithm finds patterns in the training data that map input data to the target, and it outputs a trained machine learning model that captures these patterns.
In the training process, training data are inputted into the machine learning model and the machine learning model generates an output. The output is compared with the (known) target. Parameters of the machine learning model are modified in order to reduce the deviations between the output and the (known) target to a (defined) minimum.
In general, a loss function can be used for training, where the loss function can quantify the deviations between the output and the target. The loss function may be chosen in such a way that it rewards a wanted relation between output and target and/or penalizes an unwanted relation between an output and a target. Such a relation can be, e.g., a similarity, or a dissimilarity, or another relation.
In terms of segmentation, this means that a machine learning model can be trained to segment deposits in images using training data. The training data comprise unsegmented and segmented images. The unsegmented images are inputted into the machine learning model and the machine learning model generates an output image. The deviations between the output image and the segmented image (target) can be quantified using a loss function. The aim of the training is to reduce the loss (e.g., to a pre-defined minimum). In the training, parameters of the model are modified, and a gradient descent optimization procedure can be used to reduce the deviations, e.g., to a defined minimum. The loss function can be, e.g., a cross-entropy loss.
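The training loop described above (forward pass, loss quantifying the deviation from the target, gradient descent update) can be illustrated with a deliberately minimal stand-in: a per-pixel logistic model on gray values, trained with a cross-entropy loss. A real segmentation model would be far richer; all names and values here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(x, target, lr=0.5, steps=2000):
    """Fit weight and bias of a per-pixel logistic model by gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        p = sigmoid(w * x + b)              # model output per pixel
        grad_w = np.mean((p - target) * x)  # d(cross-entropy)/dw
        grad_b = np.mean(p - target)        # d(cross-entropy)/db
        w -= lr * grad_w                    # gradient descent update
        b -= lr * grad_b
    return w, b

# unsegmented "image" (gray values) and its manually segmented target mask
x = np.array([0.1, 0.2, 0.15, 0.8, 0.9, 0.85])
target = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
w, b = train(x, target)
pred = (sigmoid(w * x + b) > 0.5).astype(float)
print(pred.tolist())  # segmented mask matches the target
```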
Once the deposits are identified and/or segmented in the at least one image, features of the deposits that characterize the deposit structures are determined.
These features can be one or more of the following:
- coverage, e.g., in terms of a ratio between areas (plant surfaces) covered with deposits and areas (plant surfaces) without deposits, or in terms of percentage of areas (plant surfaces) covered with deposits,
- size of the individual deposits (e.g., size of the area of the deposits projected onto the image plane),
- extent of individual deposits (e.g., in terms of the ratio between the size of the deposit and the area of a bounding box surrounding the deposit),
- perimeter of individual deposits,
- circularity of individual deposits (e.g., calculated according to the formula 4·π·A / P², wherein π is the circular number, A the area size of a deposit, and P its perimeter),
- roundness of individual deposits (e.g., calculated according to the formula 4·A / (π·dmax²), wherein π is the circular number, A the area size of a deposit, and dmax its maximum extent in one direction),
- solidity of individual deposits (e.g., calculated according to the formula A / CA, wherein A is the area size of a deposit and CA the area size of its convex hull, wherein the convex hull is the smallest hull which fulfills the following requirements: (1) it is convex, (2) it includes the deposit),
- compactness of individual deposits (e.g., calculated according to the formula √(4·A/π) / dmax, wherein π is the circular number, A the area size of a deposit, and dmax its maximum extent in one direction),
- maximum Feret diameter of individual deposits,
- minimum Feret diameter of individual deposits,
- ratio of the maximum Feret diameter to the minimum Feret diameter of individual deposits,
- distance from one deposit to one or more further (preferably adjacent) deposits,
- and/or other quantities derived therefrom.
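The shape descriptors above can be implemented directly from their formulas. The sketch below is a minimal illustration that evaluates them for an ideal circular deposit, for which circularity, roundness, solidity, and compactness all equal 1.

```python
import math

def circularity(area: float, perimeter: float) -> float:
    """4*pi*A / P^2 -- equals 1 for an ideal circle."""
    return 4.0 * math.pi * area / perimeter ** 2

def roundness(area: float, d_max: float) -> float:
    """4*A / (pi * d_max^2) -- equals 1 for an ideal circle."""
    return 4.0 * area / (math.pi * d_max ** 2)

def solidity(area: float, convex_area: float) -> float:
    """A / CA -- equals 1 for a convex deposit."""
    return area / convex_area

def compactness(area: float, d_max: float) -> float:
    """sqrt(4*A/pi) / d_max -- equals 1 for an ideal circle."""
    return math.sqrt(4.0 * area / math.pi) / d_max

# ideal circular deposit of radius r: all four descriptors evaluate to 1
r = 3.0
A, P, d = math.pi * r**2, 2.0 * math.pi * r, 2.0 * r
print(circularity(A, P), roundness(A, d), solidity(A, A), compactness(A, d))
```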
The mentioned features can be obtained with known methods of image analysis. There is also commercially available and freely available software that can be used to determine the mentioned features and other features. As an example, the software BioVoxxel (https://www.biovoxxel.de/) is mentioned.
If features of individual deposits are determined, statistical values of these features can be determined.
Thus, for the features of individual deposits, the statistical distribution for one or more images can be determined, and statistical quantities describing the distribution can be derived from it.
For example, the sizes of the individual deposits in an image are often distributed according to a Gaussian distribution, or can be described/approximated by such a distribution. Statistical quantities for the description of a Gaussian distribution are the expected value and the standard deviation or variance. Thus, the expected value and the standard deviation and/or variance can be determined for the size distribution of deposits in the at least one image.
In the case of asymmetric distributions, for example, the skewness of the distribution can be determined as a measure of asymmetry. More about distribution functions and statistical quantities to describe them can be found in publications on this topic (see, e.g.: R.A. Rigby et al.: Distributions for Modeling Location, Scale, and Shape, CRC Press 2019, ISBN: 9781000699968).
Preferred statistical quantities are:
- expected value and/or variance of the size (area) of the deposits,
- expected value and/or variance of the perimeter of the deposits,
- expected value and/or variance of the solidity of the deposits,
- expected value and/or variance of the roundness of the deposits,
- expected value and/or variance of the distance of adjacent deposits from each other.
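The statistical quantities listed above can be estimated from per-deposit measurements of a feature, for example the areas of all deposits in an image. A minimal Python sketch (illustrative only, not part of the original disclosure):

```python
import math
from statistics import fmean, pvariance

def feature_statistics(values):
    """Estimate expected value, variance and skewness of a deposit
    feature (e.g., the area of each deposit in an image)."""
    mu = fmean(values)
    var = pvariance(values, mu)
    sd = math.sqrt(var)
    # Fisher skewness: close to 0 for symmetric (e.g., Gaussian-like) data
    skew = sum((v - mu) ** 3 for v in values) / (len(values) * sd ** 3)
    return mu, var, skew
```

A symmetric set of deposit sizes yields a skewness near zero, while a distribution with a few unusually large deposits yields a positive skewness.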
In a preferred embodiment, different types of deposits are identified in the at least one image and features and/or statistical quantities of the features are determined separately for each type.
An example of a type of deposits is deposits having a coffee-ring structure.
Often, in images of plant parts taken after application of a crop protection product and/or nutrient, compact islands of deposits are found with larger amounts of substances at the edges of the island than in the center of the island. These variations in quantity are reminiscent of a coffee-ring, i.e., an annular stain left by a drop of coffee beverage after drying on a smooth surface (see, e.g.: R.D. Deegan et al.: Capillary flow as the cause of ring stains from dried liquid drops, Nature 389 (1997) 827-829; M.A. Faers and R. Pontzen: Factors influencing the association between active ingredient and adjuvant in the leaf deposit of adjuvant-containing suspo-emulsion formulations, Pest Management Science 64 (2008) 820-833).
Deposits having coffee-ring structure are in particular observed after the application of a particulate based formulation (e.g., suspension concentrate formulation, water-dispersible granules) of a crop protection product and/or nutrient. Fig. 2 (a) shows an example of an image of a leaf on which deposits appear in the form of coffee-ring structures.
When the spray deposit additionally contains penetration formulants/adjuvants it has been found that such annular quantity variations can have an impact on the efficacy of the crop protection product and/or nutrient. Image features representing such quantity variations can be, for example, a gray value distribution that has a different value at the edge of a deposit than in the area inside the edge. Features of coffee-ring deposits can be, for example, the number of coffee-ring deposits per unit area, their size, perimeter, the thickness of the edge, the size of the area within the edge, the geometric shape of the edge, the relative intensity between the periphery and center, and/or other features.
As already described above, two different dyes with different colours can be added to the formulation of the crop protection product and/or nutrient, one as particles and one representing the penetration adjuvant. Then, the relative distribution of both components can be analysed and their degree of association (important for biodelivery) can be determined.
Another example of a type of deposits is deposits having frayed edges. While coffee-ring deposits are compact and have well-defined edges, frayed deposits are often smeared in one direction or more directions and the edges exhibit fractal geometry.
Deposits having frayed edges are in particular observed after the application of a formulation of a crop protection product and/or nutrient comprising one or more spreading adjuvants. Fig. 2 (b) shows an example of an image of a leaf with deposits having frayed edges.
Features of deposits with frayed edges can be, for example, the number of such deposits per unit area, their size, the geometric shape of the edge, and/or other features.
Another example of a type of deposits is deposits having irregular shapes. These deposits are characterized by islands delimited from each other with defined edges in which the amounts of substances may be slightly increased (but not as high as in deposits with coffee-ring structures). The islands are larger than in the coffee-ring structures and less compact; the roundness is greater than in coffee-ring structures. Deposits having irregular shapes are often observed at higher spray volume where the adjuvant concentration is lower. Furthermore, here the deposit shapes are mostly governed by coalescence of adjacent spray droplets. Fig. 2 (c) shows an example of an image of a leaf with deposits having irregular shapes.
Features of deposits having irregular shapes can be, for example, the number of deposits per unit area, their size, perimeter, the thickness of the edge, the size of the area within the edge, the geometric shape of the edge, the roundness, the circularity, the solidity, the uniformity across the deposit, and/or other features.
Different types of deposit structures can also be identified using a trained machine learning model. Such a machine learning model can be trained on training images, the training images being labeled (e.g., by one or more experts) and showing different types of deposit structures. The label indicates the respective type of deposits. The machine learning model may be trained to identify different types of deposits as well as segment the different types. Such model is also referred to herein as deposit structure identification, classification, characterization and/or segmentation model.
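As a lightweight stand-in for the trained machine learning model described above (which in practice may be a convolutional neural network operating on images), deposit types can also be illustrated with a nearest-centroid classifier over the shape features discussed earlier (e.g., solidity and roundness). The sketch below is purely illustrative; the labels and feature values are hypothetical:

```python
from statistics import fmean

def train_centroids(samples):
    """samples: dict mapping a deposit type label (e.g., 'coffee-ring',
    'frayed', 'irregular') to a list of feature vectors.
    Returns one mean feature vector (centroid) per type."""
    return {label: [fmean(col) for col in zip(*vecs)]
            for label, vecs in samples.items()}

def classify(centroids, features):
    """Assign a deposit to the type whose centroid is closest."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: sq_dist(centroids[lbl], features))
```

For example, compact deposits with high solidity and roundness would be assigned to the coffee-ring type, while low-solidity deposits would be assigned to the frayed type.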
The type of deposits usually influences the effect of the applied crop protection product and/or nutrient; in other words, different types of deposits differ in their effect.
The effect here refers to the active ingredient(s) contained in the crop protection product or the respective nutrient.
If the crop protection product is a herbicide, the effect usually means the herbicidal effect on weeds, i.e. how efficiently and/or selectively weeds are controlled.
If the crop protection product is a fungicide, the effect usually means the fungicidal effect on fungi, i.e. how efficiently and/or selectively fungi are controlled.
If the crop protection product is a pesticide (e.g., an insecticide), the effect usually means the pesticidal effect on pests (e.g., insects), i.e., how efficiently and/or selectively pests (e.g., insects) are controlled.
The term "control" is understood to mean prevention of infestation of a field/a crop plant or a portion thereof with one or more harmful organisms and/or prevention of the spread of one or more harmful organisms and/or reduction in the amount of harmful organisms present.
If a nutrient is applied, usually the effect of the nutrient on the growth of a crop plant and/or the quality and/or appearance of the crop plant is meant.
However, it is also conceivable that the effect is an undesirable effect such as undesirable toxicity, an effect on the environment and/or other organisms (e.g., beneficial insects such as bees), side effects and/or the like. Based on the features of the (different types of) deposits, a performance value is determined.
A “performance value” is a quantity that quantifies one or more effects that the crop protection product and/or nutrient exerts on the crop and/or on pests of the crop and/or on other plants (e.g., weed) that compete with the crop for resources (e.g., water, nutrients, sunlight) and/or on the environment (including other organisms) of the crop.
The performance value can be, for example, a (biological) efficacy and/or selectivity.
The performance value can indicate the extent to which the above goal is achieved when applying crop protection products and/or nutrients: to use these products as efficiently as possible, i.e., to use only as much of the products as is necessary to achieve a desired effect and to use the products in a way that maximizes their effect.
The performance value is determined based on the features of the (different types of) deposit structures.
One model or several models can be used to determine the performance value. Such a model may be a heuristic model, a mechanistic model, a statistical model, a machine learning model, and/or some other/further model and/or a combination of different models. Such model is also referred to herein as performance value prediction model.
For example, a performance value prediction model can be used that sums the features of individual types of deposit structures (additive model). This can be explained with an example: It is assumed that in the at least one image of a plant part three different types a, b and c of deposit structures can be found. The total coverage CT of the plant part with a crop protection product and/or nutrient can be calculated via the following sum of the individual coverages Ca, Cb, and Cc:
CT = Σi Ca,i + Σi Cb,i + Σi Cc,i (i = 1, ..., n)
wherein CT is the total coverage, Ca, Cb, and Cc are the coverages of the deposit structures of types a, b, and c, respectively, i is an index that indicates individual deposits, and n is the number of deposits.
Similarly, for example, a penetration of the crop protection product and/or nutrient into the part of the plant can be determined by a sum over the penetration proportions of the different types of deposition structures:
PT = Σi Pa,i + Σi Pb,i + Σi Pc,i (i = 1, ..., n)
wherein PT is the total penetration, Pa, Pb, and Pc are the penetrations of the deposit structures of types a, b, and c, respectively, i is an index that indicates individual deposits, and n is the number of deposits.
In this case, the individual penetrations Pa, Pb, and Pc of the deposit structure types a, b, and c can be determined empirically, for example, by generating only deposit structures of a certain type, e.g., in the laboratory, and measuring the respective penetration.
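The additive model described above can be sketched in a few lines of Python (illustrative only; the coverage values and penetration fractions used in the example are hypothetical):

```python
def total_coverage(coverage_by_type):
    """C_T: sum over all types of the summed per-deposit coverages."""
    return sum(sum(deposits) for deposits in coverage_by_type.values())

def total_penetration(coverage_by_type, penetration_fraction):
    """P_T: each type's summed coverage weighted by its empirically
    determined penetration fraction for that deposit structure type."""
    return sum(penetration_fraction[t] * sum(deposits)
               for t, deposits in coverage_by_type.items())
```

For instance, with per-deposit coverages for three types a, b and c and per-type penetration fractions measured in the laboratory, the two sums give the total coverage CT and the total penetration PT.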
The performance value prediction model which relates features of the deposit structures depicted in the at least one image to one or more performance value(s) may also be or comprise a simulation of the behavior of a pest, such as described in: T.A. Ebert et al.: Deposit structure and efficacy of pesticide application. 1: Interactions between deposit size, toxicant concentration and deposit number, Pestic Sci 55:783-792 (1999).
In a preferred embodiment, the performance value prediction model is a trained machine learning model. The machine learning model can be configured as a regression model or a classification model, for example, and it can be trained to predict one or more performance value(s) on the basis of the features of the deposit structures. The machine learning model can be trained using training data, the training data comprising sets of features determined from images after application of a crop protection product and/or nutrient and performance values which may be determined experimentally by measuring the performance of the crop protection product and/or nutrient after its application. The sets of features serve as input data of the machine learning model and the performance values serve as target data. The machine learning model may be trained to map the features to the performance values.
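As a much simplified stand-in for such a trained regression model (not part of the original disclosure), the mapping from a single deposit feature to a performance value can be illustrated with ordinary least squares; the feature and performance values in the example are hypothetical:

```python
def fit_linear(features, performance):
    """Ordinary least squares for one feature (e.g., total coverage)
    against a measured performance value (e.g., efficacy)."""
    n = len(features)
    mx = sum(features) / n
    my = sum(performance) / n
    sxx = sum((x - mx) ** 2 for x in features)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(features, performance))
    slope = sxy / sxx
    return slope, my - slope * mx  # slope and intercept

def predict(model, x):
    """Predict a performance value for a new feature value."""
    slope, intercept = model
    return slope * x + intercept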
The one or more determined performance value(s) can be outputted, e.g., displayed on a monitor, printed with a printer, stored on a data storage device, and/or transmitted to a separate computer system.
Preferably, a recommendation is determined and issued.
The recommendation can indicate how the formulation of the crop protection product and/or nutrient can be modified in order to increase the performance value.
The recommendation can indicate how parameters of the application of the crop protection product and/or nutrient can be changed to increase the performance value.
This may require creating a model that relates formulation parameters and/or application parameters to deposit structures. Such a model is referred to herein as deposit structure prediction model.
The deposit structure prediction model can be a heuristic model, a mechanistic model, a statistical model, a machine learning model, and/or some other/further model and/or a combination of different models.
In a preferred embodiment of the present disclosure, the deposit structure prediction model is a trained machine learning model. The machine learning model can be configured and trained to predict features of deposition structures of a crop protection product and/or nutrient on a plant part based on application parameters (and optionally further input data).
Application parameters are usually one or more of the following: application rate (e.g., in terms of liters of a formulation of a crop protection product and/or nutrient applied per hectare of agricultural land), concentration of one or more active ingredients and/or nutrients in the formulation, spraying technique (e.g. compressed air, spinning disc), type of spray nozzle, relative speed of spray head to spray target (to the treated crop part), distance and orientation of spray nozzle to treated crop part, droplet size distribution, mean (e.g., arithmetically averaged) droplet size, standard deviation of droplet size distribution and/or other/further parameters.
Environmental conditions prevailing during application of the crop protection product and/or nutrient can also have an influence on deposit structures. Examples of such environmental conditions include temperature (e.g., at a height of one meter above the ground), humidity, air pressure, relative air movement during application (wind, driving wind), solar radiation, and/or the like. Such parameters can also be included in predicting features of deposit structures.
In addition to the application parameters, the respective applied formulations and the treated plant parts usually also have an influence on the deposit structures. It is therefore possible that information on the applied formulations and/or the treated crop parts is also included in the prediction of the features of the deposit structures.
If the machine learning model is to be trained to predict features of deposit structures of different formulations, the formulations may be represented by their composition and/or their physical and/or chemical properties. The composition may be represented by a recipe in which the individual ingredients, or at least some of the ingredients, are specified together with the amounts present (for example, in the form of a concentration or weight fraction of the ingredient in the formulation). The individual ingredients themselves may be specified, for example, by categorical variables.
Such a categorical variable can, for example, specify what kind of substance it is. For example, it is possible to define/specify one or more of the following categories of substances: active ingredient, nutrient, solvent, dispersant, emulsifier, lubricant, surfactant, colorant, preservative, thickener, spreading agent, wetting agent, penetrating agent, buffer, antifoaming agent, and/or other. It is also possible to specify formulations by their type, such as suspension concentrate (SC) formulations, suspo-emulsions (SE), oil dispersions (OD), wettable powders (WP), water-dispersible granules (WG), foliar fertilizers, and/or formulations containing one or more spreading adjuvants, penetration adjuvants, and/or drift-reducing agents.
It is also possible to represent components of a formulation by one-hot encodings (see, e.g., European patent application Nr. 22169685.9, the content of which is incorporated by reference in its entirety in this description).
One or more components of a formulation can also be specified by molecular descriptors, for example.
Examples of molecular descriptors are SMILES codes (SMILES: simplified molecular-input line-entry system, see, for example: D. Weininger et al.: SMILES. 2. Algorithm for generation of unique SMILES notation, J Chem Inf Comput Sci 1989, 29(2):97-101), SELFIES codes (see, e.g., M. Krenn et al.: Self-referencing embedded strings (SELFIES): A 100% robust molecular string representation, Mach. Learn. Sci. Technol. 1 (2020) 045024, https://doi.org/10.1088/2632-2153/aba947) and molecular graphs (see, e.g., S.C. Basak et al.: Determining structural similarity of chemicals using graph-theoretic indices, Discrete Applied Mathematics 19 (1988) 17-44).
Examples of physical and/or chemical properties of a formulation are dynamic viscosity, kinematic viscosity, and density.
If the machine learning model is to be trained to predict features of deposit structures on different plant parts, the plant parts can be represented by their plant variety, type, developmental stage, and/or physical and/or chemical properties.
The plant part may be specified by the type (e.g., leaf, upper side of leaf (i.e., side facing the sun), lower side of leaf (i.e., side facing away from the sun), flower, stem, twig, fruit, and/or the like).
Preferably, features of deposit structures on the upper side and/or lower side of leaves of one or more plants are predicted.
The size of plant parts that are treated, such as the (e.g., arithmetically averaged) size of the leaf surface, can also be included in the prediction of features of deposit structures.
The structure and/or size and/or spatial orientation of plant parts, such as a canopy, can also be used in predicting features of deposit structures.
An important property of a plant part is its wettability. Wettability can be specified by a category (e.g., heavy wettability, medium wettability, light wettability), by a physically measurable parameter (e.g., the contact angle when wetted by a drop of water and/or a drop of the formulation), and/or by structural parameters (e.g., surface roughness).
Feature vectors can be generated from the values of the parameters and/or categorical variables to be included in the prediction of deposit structures.
A feature vector is an n-dimensional vector of numerical features representing an object (e.g., an application process), where n is an integer greater than 0. The term "feature vector" also includes scalar values, matrices, tensors, and the like. Examples of methods for generating feature vectors can be found in various textbooks and scientific publications (see, e.g., G.A. Tsihrintzis, L.C. Jain: Machine Learning Paradigms: Advances in Deep Learning-based Technological Applications, in Learning and Analytics in Intelligent Systems Vol. 18, Springer Nature, 2020, ISBN: 9783030497248; K. Grzegorczyk: Vector representations of text data in deep learning, Doctoral Dissertation, 2018, arXiv:1901.01695vl [cs.CL]).
Different feature vectors can be generated for application parameters, properties of formulations, properties of plant parts and/or environmental conditions. Different features can also be combined into one feature vector. Different feature vectors are preferably combined and/or aggregated into a single feature vector. Known methods for combining and/or aggregating feature vectors may be used (see, e.g.: C. Zhang et al.: Multimodal Intelligence: Representation Learning, Information Fusion, and Applications, https://doi.org/10.48550/arXiv.1911.03977).
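The construction of such a combined feature vector can be sketched as follows (illustrative only; the formulation-type category list and the parameter values are hypothetical). Numerical application parameters, a one-hot encoding of the formulation type, and environmental conditions are concatenated into one vector:

```python
# illustrative subset of formulation types mentioned in the text
FORMULATION_TYPES = ["SC", "SE", "OD", "WP", "WG"]

def one_hot(value, categories):
    """One-hot encoding of a categorical variable."""
    return [1.0 if value == c else 0.0 for c in categories]

def build_feature_vector(application, formulation_type, environment):
    """Concatenate numerical application parameters, a one-hot
    formulation-type encoding, and environmental conditions."""
    return (list(application)
            + one_hot(formulation_type, FORMULATION_TYPES)
            + list(environment))
```

For example, an application rate and active-ingredient concentration, the formulation type "OD", and a temperature and humidity reading could be aggregated into a single nine-dimensional vector.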
One or more recommendations can be generated using the deposit structure prediction model in combination with the performance value prediction model.
In a first step, one or more performance values can be predicted using the performance value prediction model. The prediction is based on at least one image showing deposit structures of a crop protection product and/or nutrient on a plant part. The at least one image is the result of an application process. The one or more performance values indicate the success of the application process. To determine how to increase success, performance maps can be generated. Such a performance map can, for example, show the influence of application parameters on the performance value. Fig. 3 shows an example of a performance map. The performance map depicted in Fig. 3 shows the dependence of a performance value (e.g., efficacy) on the application parameters spray droplet size and spray volume.
Such a performance map can be generated using the deposit structure prediction model and the performance value prediction model: Features of deposit structures can be predicted for different values of application parameters using the deposit structure prediction model. The predicted features of deposit structures can then be used as input data of the performance value prediction model. The performance value prediction model outputs one or more performance values. The relationships between the values of the application parameters and the one or more performance values can be made visible in one or more performance maps.
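The chaining described above amounts to evaluating the two models on a grid of application parameter values. The sketch below is illustrative only; the two "toy" model functions are hypothetical stand-ins for the trained deposit structure prediction model and performance value prediction model:

```python
def performance_map(deposit_model, performance_model,
                    droplet_sizes, spray_volumes):
    """Evaluate the chained models on a grid of application parameters.
    Returns {(droplet_size, spray_volume): performance_value}."""
    grid = {}
    for d in droplet_sizes:
        for v in spray_volumes:
            features = deposit_model(d, v)   # predicted deposit features
            grid[(d, v)] = performance_model(features)
    return grid

# hypothetical stand-ins for the two trained models
def toy_deposit_model(droplet_size, spray_volume):
    coverage = spray_volume / (1.0 + droplet_size)  # invented relation
    return {"coverage": coverage}

def toy_performance_model(features):
    return min(1.0, features["coverage"])           # invented efficacy
```

The resulting grid can then be rendered as a two-dimensional map such as the one shown in Fig. 3, from which the parameter combination with the highest predicted performance can be read off.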
It is possible to show a user the success of an application process by means of a performance map. In the performance map, the success of the application process can be indicated by a point or an area in a graphic as shown in Fig. 3. The user can then directly read from the performance map which application parameters he/she has to change in which way to achieve an improved performance (a higher performance value).
Any machine learning model described herein may be or comprise, for example, an artificial neural network.
An artificial neural network (ANN) is a biologically inspired computational model. An ANN usually comprises at least three layers of processing elements: a first layer with input neurons (nodes), a k-th layer with at least one output neuron (node), and k-2 inner layers, where k is an integer greater than 2.
In such a network, the input neurons serve to receive the input data. If the input data constitute or comprise an image, there is usually one input neuron for each pixel/voxel of the input image; there can be additional input neurons for additional input data such as data about the object represented by the input image, measurement conditions, data about the subject/patient and/or the like. The output neurons serve to output the output data.
The processing elements of the layers are interconnected in a predetermined pattern with predetermined connection weights therebetween. Each network node usually represents a (simple) calculation of the weighted sum of inputs from prior nodes and a non-linear output function. The combined calculation of the network nodes relates the inputs to the outputs.
When trained, the connection weights between the processing elements in the ANN contain information regarding the relationship between the input data and the output data which can be used to predict new output data from new input data.
Separate networks can be developed for each property measurement, or groups of properties can be included in a single network. Preferably, different dimensions of the input data are combined at the end of the algorithm. Training estimates network weights that allow the network to calculate output value(s) close to the measured output value(s). A supervised training method can be used in which the output data is used to direct the training of the network weights. The network weights can be initialized with small random values or with the weights of a prior partially trained network. The training data inputs are applied to the network and the output values are calculated for each training sample. The network output values are compared to the measured output values. A backpropagation algorithm can be applied to correct the weight values in directions that reduce the error between measured and calculated outputs. The process is iterated until no further reduction in error can be made or until a predefined prediction accuracy has been reached.
A cross-validation method can be employed to split the data into training and validation data sets. The training data set is used in the backpropagation training of the network weights. The validation data set is used to verify that the trained network generalizes to make good predictions. The best network weight set can be taken as the one that best predicts the outputs of the training data. Similarly, the number of hidden nodes can be optimized by varying it and selecting the network that performs best on the data sets.
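The supervised training procedure described above (small random weight initialization, iterative error-reducing updates, and a split into training and validation sets) can be sketched for the simplest possible network, a single linear unit; this is a toy illustration, not the networks actually used:

```python
import random

def train(inputs, targets, epochs=1000, lr=0.01, seed=0):
    """Gradient-descent training of a single linear unit y = w*x + b,
    with a simple split into training and validation sets."""
    rnd = random.Random(seed)
    data = list(zip(inputs, targets))
    rnd.shuffle(data)
    split = int(0.8 * len(data))
    train_set, val_set = data[:split], data[split:]
    # initialize the weights with small random values
    w, b = rnd.uniform(-0.1, 0.1), rnd.uniform(-0.1, 0.1)
    for _ in range(epochs):
        for x, y in train_set:
            err = (w * x + b) - y   # forward pass and error
            w -= lr * err * x       # error-reducing weight update
            b -= lr * err
    # validation error verifies that the fit generalizes
    val_error = sum(((w * x + b) - y) ** 2
                    for x, y in val_set) / len(val_set)
    return w, b, val_error
```

On noiseless linear data the unit recovers the underlying slope and intercept and achieves a small validation error, mirroring the iterate-until-accuracy criterion described above.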
Any machine learning model disclosed herein can be or comprise a convolutional neural network (CNN).
A CNN is a class of deep neural networks, most commonly applied to analyzing visual imagery (such as OCT scans and fluorescein angiography images). A CNN comprises an input layer with input neurons, an output layer with at least one output neuron, as well as multiple hidden layers between the input layer and the output layer.
The hidden layers of a CNN typically consist of convolutional layers, ReLU (Rectified Linear Unit) layers, i.e., activation functions, pooling layers, fully connected layers, and normalization layers.
The nodes in the CNN input layer are organized into a set of "filters" (feature detectors), and the output of each set of filters is propagated to nodes in successive layers of the network. The computations for a CNN include applying the convolution mathematical operation to each filter to produce the output of that filter. Convolution is a specialized kind of mathematical operation performed on two functions to produce a third function that is a modified version of one of the two original functions. In convolutional network terminology, the first function of the convolution can be referred to as the input, while the second function can be referred to as the convolution kernel. The output may be referred to as the feature map. For example, the input to a convolution layer can be a multidimensional array of data that defines the various color components or grey scale values of an input image. The convolution kernel can be a multidimensional array of parameters, where the parameters are adapted by the training process for the neural network.
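The operation of sliding a kernel over an input to produce a feature map can be sketched in a few lines (illustrative only; like most deep-learning frameworks, this applies the kernel without flipping it, i.e., it computes a cross-correlation):

```python
def convolve2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1): slides the
    kernel over the input and returns the resulting feature map."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            # weighted sum of the input patch under the kernel
            row.append(sum(image[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out
```

For example, a horizontal-difference kernel [[1, -1]] responds only where neighbouring grey values change, which is how trained kernels come to act as feature detectors.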
By analysis of the CNN, one can reveal patterns in the data which are not obvious and were preferred (i.e., weighted more strongly) by the CNN while analyzing the training data. This explainable approach helps to generate trust in the performance of the predictive machine learning model.
The operations in accordance with the teachings herein may be performed by at least one computer system specially constructed for the desired purposes or at least one general-purpose computer system specially configured for the desired purpose by at least one computer program stored in a typically non-transitory computer readable storage medium.
The term “non-transitory” is used herein to exclude transitory, propagating signals or waves, but to otherwise include any volatile or non-volatile computer memory technology suitable to the application.
A “computer system” is a system for electronic data processing that processes data by means of programmable calculation rules. Such a system usually comprises a “computer”, that unit which comprises a processor for carrying out logical operations, and also peripherals. In computer technology, “peripherals” refer to all devices which are connected to the computer and serve for the control of the computer and/or as input and output devices. Examples thereof are monitor (screen), printer, scanner, mouse, keyboard, drives, camera, microphone, loudspeaker, etc. Internal ports and expansion cards are also considered peripherals in computer technology.
Computer systems of today are frequently divided into desktop PCs, portable PCs, laptops, notebooks, netbooks and tablet PCs and so-called handhelds (e.g. smartphone); all these systems can be utilized for carrying out the invention.
The term “process” as used above is intended to include any type of computation or manipulation or transformation of data represented as physical, e.g. electronic, phenomena which may occur or reside e.g. within registers and/or memories of at least one computer or processor. The term processor includes a single processing unit or a plurality of distributed or remote such units.
Fig. 4 illustrates a computer system (1) according to some example implementations of the present disclosure in more detail.
Generally, a computer system of exemplary implementations of the present disclosure may be referred to as a computer and may comprise, include, or be embodied in one or more fixed or portable electronic devices. The computer may include one or more of each of a number of components such as, for example, processing unit (20) connected to a memory (50) (e.g., storage device).
The processing unit (20) may be composed of one or more processors alone or in combination with one or more memories. The processing unit (20) is generally any piece of computer hardware that is capable of processing information such as, for example, data, computer programs and/or other suitable electronic information. The processing unit (20) is composed of a collection of electronic circuits some of which may be packaged as an integrated circuit or multiple interconnected integrated circuits (an integrated circuit at times more commonly referred to as a “chip”). The processing unit (20) may be configured to execute computer programs, which may be stored onboard the processing unit (20) or otherwise stored in the memory (50) of the same or another computer.
The processing unit (20) may be a number of processors, a multi-core processor or some other type of processor, depending on the particular implementation. Further, the processing unit (20) may be implemented using a number of heterogeneous processor systems in which a main processor is present with one or more secondary processors on a single chip. As another illustrative example, the processing unit (20) may be a symmetric multi-processor system containing multiple processors of the same type. In yet another example, the processing unit (20) may be embodied as or otherwise include one or more ASICs, FPGAs or the like. Thus, although the processing unit (20) may be capable of executing a computer program to perform one or more functions, the processing unit (20) of various examples may be capable of performing one or more functions without the aid of a computer program. In either instance, the processing unit (20) may be appropriately programmed to perform functions or operations according to example implementations of the present disclosure.
The memory (50) is generally any piece of computer hardware that is capable of storing information such as, for example, data, computer programs (e.g., computer-readable program code (60)) and/or other suitable information either on a temporary basis and/or a permanent basis. The memory (50) may include volatile and/or non-volatile memory, and may be fixed or removable. Examples of suitable memory include random access memory (RAM), read-only memory (ROM), a hard drive, a flash memory, a thumb drive, a removable computer diskette, an optical disk, a magnetic tape or some combination of the above. Optical disks may include compact disk - read only memory (CD-ROM), compact disk - read/write (CD-R/W), DVD, Blu-ray disk or the like. In various instances, the memory may be referred to as a computer-readable storage medium. The computer-readable storage medium is a non-transitory device capable of storing information, and is distinguishable from computer-readable transmission media such as electronic transitory signals capable of carrying information from one location to another. Computer-readable medium as described herein may generally refer to a computer-readable storage medium or computer-readable transmission medium.
The at least one image, any machine learning model, and/or trained machine learning model and/or training data may be stored in the memory (50).
In addition to the memory (50), the processing unit (20) may also be connected to one or more interfaces for displaying, transmitting and/or receiving information. The interfaces may include one or more communications interfaces and/or one or more user interfaces. The communications interface(s) may be configured to transmit and/or receive information, such as to and/or from other computer(s), network(s), database(s) or the like. The communications interface may be configured to transmit and/or receive information by physical (wired) and/or wireless communications links. The communications interface(s) may include interface(s) (41) to connect to a network, such as using technologies such as cellular telephone, Wi-Fi, satellite, cable, digital subscriber line (DSL), fiber optics and the like. In some examples, the communications interface(s) may include one or more short-range communications interfaces (42) configured to connect devices using short-range communications technologies such as NFC, RFID, Bluetooth, Bluetooth LE, ZigBee, infrared (e.g., IrDA) or the like.
The user interfaces may include a display (30). The display may be configured to present or otherwise display information to a user, suitable examples of which include a liquid crystal display (LCD), light-emitting diode display (LED), plasma display panel (PDP) or the like. The user input interface(s) (11) may be wired or wireless, and may be configured to receive information from a user into the computer system (1), such as for processing, storage and/or display. Suitable examples of user input interfaces include a microphone, image or video capture device, keyboard or keypad, joystick, touch-sensitive surface (separate from or integrated into a touchscreen) or the like. In some examples, the user interfaces may include automatic identification and data capture (AIDC) technology (12) for machine-readable information. This may include barcode, radio frequency identification (RFID), magnetic stripes, optical character recognition (OCR), integrated circuit card (ICC), and the like. The user interfaces may further include one or more interfaces for communicating with peripherals such as printers and the like.
As indicated above, program code instructions (60) may be stored in memory (50), and executed by processing unit (20) that is thereby programmed, to implement functions of the systems, subsystems, tools and their respective elements described herein. As will be appreciated, any suitable program code instructions (60) may be loaded onto a computer or other programmable apparatus from a computer-readable storage medium to produce a particular machine, such that the particular machine becomes a means for implementing the functions specified herein. These program code instructions (60) may also be stored in a computer-readable storage medium that can direct a computer, processing unit or other programmable apparatus to function in a particular manner to thereby generate a particular machine or particular article of manufacture. The instructions stored in the computer-readable storage medium may produce an article of manufacture, where the article of manufacture becomes a means for implementing functions described herein. The program code instructions (60) may be retrieved from a computer-readable storage medium and loaded into a computer, processing unit or other programmable apparatus to configure the computer, processing unit or other programmable apparatus to execute operations to be performed on or by the computer, processing unit or other programmable apparatus.
Retrieval, loading and execution of the program code instructions (60) may be performed sequentially such that one instruction is retrieved, loaded and executed at a time. In some example implementations, retrieval, loading and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Execution of the program code instructions (60) may produce a computer-implemented process such that the instructions executed by the computer, processing circuitry or other programmable apparatus provide operations for implementing functions described herein. Execution of instructions by the processing unit, or storage of instructions in a computer-readable storage medium, supports combinations of operations for performing the specified functions. In this manner, a computer system (1) may include a processing unit (20) and a computer-readable storage medium or memory (50) coupled to the processing circuitry, where the processing circuitry is configured to execute computer-readable program code instructions (60) stored in the memory (50). It will also be understood that one or more functions, and combinations of functions, may be implemented by special purpose hardware-based computer systems and/or processing circuitry which perform the specified functions, or combinations of special purpose hardware and program code instructions.

Claims

1. A computer-implemented method, the method comprising:
- receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient,
- identifying one or more types of deposit structures in the image,
- determining features of the one or more types of deposit structures,
- determining a performance value based on the features,
- outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
2. The method according to claim 1, wherein the at least one image shows a leaf of a crop plant or a leaf of a weed or a part of a leaf.
3. The method according to claim 1 or 2, wherein the crop protection product and/or nutrient is in the form of a formulation comprising particles, preferably a suspension concentrate formulation, a suspo-emulsion formulation, an oil dispersion formulation, a wettable powder formulation, a water-dispersible granule formulation, and/or a foliar fertilizer formulation.
4. The method according to any one of claims 1 to 3, wherein the crop protection product and/or nutrient comprises one or more dyes.
5. The method according to claim 4, wherein the one or more dyes comprise(s) one or more fluorescent dyes and the at least one image is generated upon irradiation of the plant or plant part with electromagnetic radiation resulting in fluorescence excitation of the fluorescent dye(s).
6. The method according to claim 4 or 5, wherein the crop protection product and/or nutrient comprises at least two different dyes, wherein the dyes differ in their solubility in water and/or oil and/or in their physical form.
7. The method according to any one of claims 1 to 6, wherein one or more of the following features are determined:
- coverage of the plant or plant part with deposits,
- size of deposits,
- extent of deposits,
- perimeter of deposits,
- circularity of deposits,
- roundness of deposits,
- solidity of deposits,
- compactness of deposits,
- maximum Feret diameter of deposits,
- minimum Feret diameter of deposits,
- ratio of the maximum Feret diameter to the minimum Feret diameter of deposits,
- distance from one deposit to one or more further deposits.
8. The method according to any one of claims 1 to 7, wherein one or more of the following types of deposit structures are identified:
- deposits having coffee-ring structure,
- deposits having frayed edges,
- deposits having irregular shapes.
9. The method according to any one of claims 1 to 8, wherein the performance value quantifies the efficacy and/or selectivity of the crop protection product and/or nutrient.
10. The method according to any one of claims 1 to 9, wherein the performance value is a sum of contributions of the different types of deposit structures.
11. The method according to any one of claims 1 to 10, wherein the performance value is generated by a trained machine learning model that is trained to predict one or more performance value(s) on the basis of the features of the deposit structures.
12. The method according to any one of claims 1 to 11, further comprising:
- providing a deposit structure prediction model, wherein the deposit structure prediction model is configured to predict one or more features of deposit structures based on parameters of a formulation of the crop protection product and/or nutrient and/or on application parameters,
- determining parameters of the formulation and/or application parameters resulting in an increased performance value using the deposit structure prediction model,
- outputting the determined parameters and/or applying a formulation of a crop protection product and/or nutrient according to the determined parameters.
13. The method according to any one of claims 1 to 12, further comprising:
- generating a performance map, the performance map showing the influence of parameters of a formulation of the crop protection product and/or nutrient and/or application parameters on the performance value,
- outputting the performance map.
14. A computer system comprising: a processor; and a memory storing an application program configured to perform, when executed by the processor, an operation, the operation comprising:
- receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient,
- identifying one or more types of deposit structures in the image,
- determining features of the one or more types of deposit structures,
- determining a performance value based on the features,
- outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
15. A non-transitory computer readable medium having stored thereon software instructions that, when executed by a processor of a computer system, cause the computer system to execute the following steps:
- receiving at least one image, the image showing a plant or a part of a plant after an application of a crop protection product and/or a nutrient,
- identifying one or more types of deposit structures in the image,
- determining features of the one or more types of deposit structures,
- determining a performance value based on the features,
- outputting the performance value and/or a recommendation to improve the crop protection product and/or nutrient and/or the application of the crop protection product and/or nutrient.
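The shape descriptors recited in claim 7 (e.g., circularity and maximum Feret diameter) and the additive performance value of claim 10 can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the claimed implementation: the outline-point representation, function names, and numeric per-type weights are hypothetical and are not disclosed in the application.

```python
import math

def polygon_area(pts):
    # Shoelace formula: area of a simple polygon given its outline points.
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def polygon_perimeter(pts):
    # Sum of edge lengths around the closed outline.
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def circularity(pts):
    # 4*pi*A / P^2: equals 1.0 for a perfect circle and decreases for
    # ragged outlines such as deposits with frayed edges.
    a, p = polygon_area(pts), polygon_perimeter(pts)
    return 4.0 * math.pi * a / (p * p)

def max_feret(pts):
    # Maximum Feret diameter: longest distance between any two outline points.
    return max(math.dist(p, q) for i, p in enumerate(pts) for q in pts[i + 1:])

# Hypothetical per-type contribution weights (the application discloses the
# additive principle of claim 10, not these numbers).
TYPE_WEIGHTS = {"coffee_ring": 0.4, "frayed_edges": 0.9, "irregular": 0.6}

def performance_value(type_counts):
    # Performance value as a sum of contributions of the different
    # deposit-structure types (cf. claim 10).
    return sum(TYPE_WEIGHTS[t] * n for t, n in type_counts.items())

# A 4x4 square deposit outline: area 16, perimeter 16, circularity pi/4.
square = [(0.0, 0.0), (4.0, 0.0), (4.0, 4.0), (0.0, 4.0)]
print(round(circularity(square), 4))   # 0.7854
print(round(max_feret(square), 3))     # 5.657 (the diagonal, 4*sqrt(2))
print(round(performance_value({"coffee_ring": 3, "frayed_edges": 1}), 2))  # 2.1
```

In practice such descriptors would be computed per deposit from a segmented image mask rather than from hand-written outlines; the square outline here only makes the arithmetic easy to verify.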
PCT/EP2023/061746 2022-05-10 2023-05-04 Improvements in the use of crop protection products and/or nutrients WO2023217618A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP22172449 2022-05-10
EP22172449.5 2022-05-10

Publications (1)

Publication Number Publication Date
WO2023217618A1 true WO2023217618A1 (en) 2023-11-16

Family

ID=81597953

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/061746 WO2023217618A1 (en) 2022-05-10 2023-05-04 Improvements in the use of crop protection products and/or nutrients

Country Status (1)

Country Link
WO (1) WO2023217618A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020225277A1 (en) * 2019-05-08 2020-11-12 Bayer Aktiengesellschaft A low volume spray application vehicle
EP3741214A1 (en) * 2019-05-20 2020-11-25 BASF Agro Trademarks GmbH Method for plantation treatment based on image recognition

Non-Patent Citations (13)

* Cited by examiner, † Cited by third party
Title
A. ERHARDT, EINFÜHRUNG IN DIE DIGITALE BILDVERARBEITUNG, 2008, ISBN: 978-3-519-00478-3
C. ZHANG ET AL., MULTIMODAL INTELLIGENCE: REPRESENTATION LEARNING, INFORMATION FUSION, AND APPLICATIONS, Retrieved from the Internet <URL:https://doi.org/10.48550/arXiv.1911.03977>
D. WEININGER ET AL.: "SMILES. 2. Algorithm for generation of unique SMILES notation", J CHEM INF COMP SCI, vol. 29, no. 2, 1989, pages 97-101
G.A. TSIHRINTZIS, L.C. JAIN: "Machine Learning Paradigms: Advances in Deep Learning-based Technological Applications", LEARNING AND ANALYTICS IN INTELLIGENT SYSTEMS, vol. 18, 2020, ISBN: 9783030497248
J. OHSER, ANGEWANDTE BILDVERARBEITUNG UND BILDANALYSE, 2018, ISBN: 978-3-446-44933-6
K. GRZEGORCZYK: "Vector representations of text data in deep learning", DOCTORAL DISSERTATION, 2018
M. A. FAERS, R. PONTZEN: "Factors influencing the association between active ingredient and adjuvant in the leaf deposit of adjuvant-containing suspo-emulsion formulations", PEST MANAGEMENT SCIENCE, vol. 64, 2008, pages 820 - 833
M. KRENN ET AL.: "Self-referencing embedded strings (SELFIES): A 100% robust molecular string representation", MACH. LEARN. SCI. TECHNOL, vol. 1, 2020, pages 045024, Retrieved from the Internet <URL:https://doi.org/10.1088/2632-2153/aba947>
P. SOILLE: "Morphologische Bildverarbeitung", 1998, SPRINGER
R. D. DEEGAN ET AL.: "Capillary flow as the cause of ring stains from dried liquid drops", NATURE, vol. 389, 1997, pages 827 - 829, XP037417636, DOI: 10.1038/39827
R.A. RIGBY ET AL.: "Distributions for Modeling Location, Scale, and Shape", 2019, CRC PRESS
S.C. BASAK ET AL.: "Determining structural similarity of chemicals using graph-theoretic indices", DISCRETE APPLIED MATHEMATICS, vol. 19, 1988, pages 17 - 44
T.A. EBERT ET AL.: "Deposit structure and efficacy of pesticide application. 1: Interactions between deposit size, toxicant concentration and deposit number", PESTIC SCI, vol. 55, 1999, pages 783 - 792

Similar Documents

Publication Publication Date Title
US11703855B2 (en) Adaptive cyber-physical system for efficient monitoring of unstructured environments
US20220327815A1 (en) System and method for identification of plant species
US20230165235A1 (en) Image monitoring for control of invasive grasses
Ajayi et al. Effect of varying training epochs of a faster region-based convolutional neural network on the accuracy of an automatic weed classification scheme
Latif et al. Deep learning based intelligence cognitive vision drone for automatic plant diseases identification and spraying
US20230073541A1 (en) System and method for performing machine vision recognition of dynamic objects
Genaev et al. Application of neural networks to image recognition of wheat rust diseases
Abouzahir et al. Iot-empowered smart agriculture: A real-time light-weight embedded segmentation system
WO2023217618A1 (en) Improvements in the use of crop protection products and/or nutrients
Altınbaş et al. Detecting defected crops: Precision agriculture using haar classifiers and UAV
Negrete Artificial vision in mexican agriculture for identification of diseases, pests and invasive plants
Raval et al. Computer vision and machine learning in agriculture
Mahenthiran et al. Smart pest management: an augmented reality-based approach for an organic cultivation
Malik et al. Elimination of Herbicides after the Classification of Weeds Using Deep Learning
Levanon et al. Abiotic stress prediction from rgb-t images of banana plantlets
Su et al. AI, sensors and robotics in plant phenotyping and precision agriculture, volume II
Kumar K et al. Harnessing Computer Vision for Agricultural Transformation: Insights, Techniques, and Applications
Terzi et al. Automatic detection of grape varieties with the newly proposed CNN model using ampelographic characteristics
WO2023208619A1 (en) Prediction of deposition structures of pesticides and/or nutrients on parts of plants
van Helfteren Comparing UAV-based Image Resolution to Deep-learning Weed-detection Performance
Wieme et al. Ultra-high-resolution UAV-imaging and supervised deep learning for accurate detection of Alternaria solani in potato fields
Dohare et al. Plant Health Monitoring System Using Machine Learning
Nelms Tomato Flower Detection and Three-Dimensional Mapping for Precision Pollination
Telkar et al. Detection of Weed plant in Farm and Removing using Unmanned Ground Vehicle
Mahmud Development of a machine vision system for strawberry powdery mildew disease detection

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23724772

Country of ref document: EP

Kind code of ref document: A1