WO2021229248A1 - Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence - Google Patents

Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence

Info

Publication number
WO2021229248A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixels
crop
classification
thermal
coloring
Prior art date
Application number
PCT/GR2021/000024
Other languages
French (fr)
Inventor
Georgios FEVGAS
Original Assignee
Fevgas Georgios
Priority date
Filing date
Publication date
Application filed by Fevgas Georgios filed Critical Fevgas Georgios
Publication of WO2021229248A1 publication Critical patent/WO2021229248A1/en

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N25/00 - Investigating or analyzing materials by the use of thermal means
    • G01N25/20 - Investigating or analyzing materials by the use of thermal means by investigating the development of heat, i.e. calorimetry, e.g. by measuring specific heat, by measuring thermal conductivity
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/188 - Vegetation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J5/00 - Radiation pyrometry, e.g. infrared or optical thermometry
    • G01J5/48 - Thermography; Techniques using wholly visual means
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 - Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/24 - Classification techniques
    • G06F18/241 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 - Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 - Distances to prototypes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0004 - Industrial image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/0012 - Biomedical image inspection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/10 - Segmentation; Edge detection
    • G06T7/194 - Segmentation; Edge detection involving foreground-background segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/90 - Determination of colour characteristics
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/28 - Quantising the image, e.g. histogram thresholding for discrimination between background and foreground patterns
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/762 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using clustering, e.g. of similar faces in social networks
    • G06V10/763 - Non-hierarchical techniques, e.g. based on statistics of modelling distributions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G06V20/17 - Terrestrial scenes taken from planes or by drones
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 - Machine learning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10024 - Color image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10032 - Satellite or aerial image; Remote sensing
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10048 - Infrared image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20084 - Artificial neural networks [ANN]
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30108 - Industrial image inspection
    • G06T2207/30128 - Food products


Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Medical Informatics (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Quality & Reliability (AREA)
  • Remote Sensing (AREA)
  • Probability & Statistics with Applications (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Analytical Chemistry (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Chemical & Material Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biochemistry (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The Otsu method is applied to the grayscale thermal image (Fig. 1, Fig. 2). The pixel group of the OTSU method representing the crop is found by selecting the group whose corresponding pixels in the aligned digital photo contain the most shades of green (Fig. 3). From the metadata of the thermal photograph, an array containing the coordinates and temperature of each pixel of the selected group is constructed, and k-Means clustering is applied to group the temperatures. Each pixel is assigned a specific color depending on the group it belongs to (Fig. 4). Then, the coloring and the coordinates of the pixels representing the highest temperatures are selected. The pseudo-coloring is placed on the corresponding (same coordinates) pixels of the digital photo (Fig. 5). The digital pseudo-color photo is fed to a trained Convolutional Neural Network, which performs classification of stressed sub-areas within the crop with an outline and verbal visualization (Fig. 6, C1 stress).

Description

DESCRIPTION
METHOD FOR DETECTION AND CLASSIFICATION OF BIOTIC-ABIOTIC STRESS IN CROPS FROM THERMAL PHOTOGRAPHS USING ARTIFICIAL INTELLIGENCE.
The present invention relates to a method for detection and classification of biotic-abiotic stress in crops from thermal photographs using Artificial Intelligence, and in particular to an automated method for processing thermal photographs and classifying them with Convolutional Neural Networks (CNNs), in order to construct digital photographs of the crop in standardized pseudo-color, with demarcated contours of the sub-areas and a verbal representation of the stress classification.
Due to rapid technological development in recent years, various sensors have been developed that can be used to collect data on crops. These include the digital camera (RGB), spectral sensors, Light Detection and Ranging (LiDAR), and the thermal camera. Some of their applications are measuring plant height, biomass, Leaf Area Index (LAI), and other physiological characteristics (Rahaman et al., 2015. Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Front. Plant Sci. 6.; Zhang and Kovacs J.M., 2012. The application of small unmanned aerial systems for precision agriculture: a review. Precis. Agric. 13, 693-712).
The use of digital cameras (RGB) is more common than that of other sensors because of their low cost, light weight, and simple data processing. Their disadvantages are low radiometric resolution and lack of proper calibration (Ballesteros et al., 2014. Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing. Precis. Agric. 15, 579-592.; Bendig et al., 2014. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 6, 10395-10412.). They can be used for the rapid acquisition of color photographs to calculate the height of the crop, the LAI, and the color of the leaves, so that damaged "dry" leaves can be detected through already developed photo-processing algorithms. However, this method lags in obtaining phenotype information and crop characteristics because a digital camera cannot capture the invisible spectrum (Al Hiary et al., 2011a. Fast and Accurate Detection and Classification of Plant Diseases. Int. J. Comput. Appl. 17, 31-38; Singh and Misra, 2017b. Detection of plant leaf diseases using image segmentation and soft computing techniques. Inf. Process. Agric. 4, 41-49.).
Spectral sensors have the ability to receive radiation in the visible and non-visible spectrum, so they can be used to obtain the phenotype of a crop (Candiago et al., 2015. Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images. Remote Sens. 7, 4026-4047.). Their main disadvantages are complex data processing and sensitivity to weather conditions (Colomina and Molina, 2014. Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS J. Photogramm. Remote Sens. 92, 79-97; Nasi et al., 2015. Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level. Remote Sens. 7, 15467-15493; Thorp et al., 2015. Proximal hyperspectral sensing and data analysis approaches for field-based plant phenomics. Comput. Electron. Agric. 118, 225-236; Zarco-Tejada et al., 2013. Spatio-temporal patterns of chlorophyll fluorescence and physiological and structural indices acquired from hyperspectral imagery as compared with carbon fluxes measured with eddy covariance. Remote Sens. Environ. 133, 102-115.).
The Light Detection and Ranging (LiDAR) sensor determines ranges by targeting an object with a laser and uses photoelectric detection. It can be used to measure biomass and plant height. Its advantage is the effective acquisition of the high-precision horizontal and vertical structure of the vegetation. Its disadvantages are the high cost of acquisition and the large amounts of data to be processed (Ota et al., 2015. Aboveground Biomass Estimation Using Structure from Motion Approach with Aerial Photographs in a Seasonal Tropical Forest. Forests 6, 3882-3898.; Wallace et al., 2012. Development of a UAV-LiDAR System with Application to Forest Inventory. Remote Sens. 4, 1519-1543.).
The analysis methodology and control of chlorophyll content, the LAI, and the leaf nitrogen content using remote sensing is fully developed (Ballesteros et al., 2014. Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing. Precis. Agric. 15, 579-592.; Bendig et al., 2014. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 6, 10395-10412.). Therefore, we can have accurate plant growth information because the leaves' spectral characteristics are directly related to the above indices.
The thermal camera senses infrared radiation. It can therefore be used to measure the canopy temperature of a crop, the rate at which water vapor leaves the leaves, and the rate at which carbon dioxide (CO2) enters them (Baluja et al., 2012. Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV). Irrig. Sci. 30, 511-522.). In this way, it is possible to determine the growth status of the crop indirectly. Disadvantages of the thermal camera are its sensitivity to weather conditions and the difficulty of eliminating the soil's effect (Gonzalez-Dugo et al., 2015. Using High-Resolution Hyperspectral and Thermal Airborne Imagery to Assess Physiological Condition in the Context of Wheat Phenotyping. Remote Sens. 7, 13586-13605.; Jones et al., 2009. Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field. Funct. Plant Biol. 36, 978.; Sugiura et al., 2007. Correction of Low-altitude Thermal Images applied to estimating Soil Water Status. Biosyst. Eng. 96, 301-313.).
The treatment method applicable to a thermal image of a crop (over the whole cultivated area, not each plant separately) is the Crop Water Stress Index (CWSI), which determines the water availability of a crop through infrared temperature measurements (thermal photography) of the plant crown. The crown temperature is an indicator of the crop's water status because the stomata of the foliage close in response to soil water depletion, causing a decrease in transpiration and an increase in leaf temperature. Conversely, adequate water in the soil keeps the stomata open and transpiration strong, resulting in a foliage temperature lower than the atmospheric temperature above the crop (Idso et al., 1981. Normalizing the stress-degree-day parameter for environmental variability. Agric. Meteorol. 24, 45-55.). Finally, plants under biotic or abiotic stress have been shown to exhibit higher canopy temperatures than healthy plants (Zarco-Tejada et al., 2012. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 117, 322-337.; Zarco-Tejada et al., 2018. Previsual symptoms of Xylella fastidiosa infection revealed in spectral plant-trait alterations. 7, 432-439.).
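The CWSI described above is conventionally computed by normalizing the measured canopy temperature between a fully transpiring ("wet") and a non-transpiring ("dry") reference temperature. A minimal sketch; the function name and example temperatures are illustrative, not taken from the patent:

```python
def cwsi(t_canopy, t_wet, t_dry):
    """Crop Water Stress Index: 0 = well-watered canopy, 1 = fully stressed.

    t_wet and t_dry are reference temperatures of a fully transpiring and a
    non-transpiring leaf under the same ambient conditions.
    """
    return (t_canopy - t_wet) / (t_dry - t_wet)


# A canopy exactly halfway between the wet and dry references scores 0.5.
print(cwsi(30.0, 25.0, 35.0))
```

A canopy at the wet reference yields 0 (no stress), and at the dry reference yields 1 (maximal stress).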
Considering the above, a method was developed to detect biotic-abiotic stress in crops from thermal photographs using Artificial Intelligence, and in particular Convolutional Neural Networks. The automated thermal photo processing begins by converting the thermal photo (pseudo-coloring) to grayscale (Fig. 1). Then the OTSU method is used to isolate the background, i.e., the soil (Fig. 2). The OTSU algorithm assumes that the image contains two classes of pixels following a bimodal histogram (foreground pixels and background pixels), and then calculates the optimal threshold separating the two classes so that their combined intra-class variance is minimal, or equivalently (because the sum of pairwise squared distances is constant) so that their between-class variance is maximal.
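The Otsu thresholding step can be sketched directly from its definition: choose the gray level that maximizes the between-class variance (equivalently, minimizes the combined intra-class variance). A minimal NumPy sketch, not the patent's implementation:

```python
import numpy as np

def otsu_threshold(gray):
    """Return the gray level maximizing Otsu's between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                  # P(class 0) for each threshold
    mu = np.cumsum(prob * np.arange(256))    # cumulative mean gray level
    mu_t = mu[-1]                            # global mean
    # Between-class variance for every candidate threshold.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)         # 0/0 where a class is empty
    return int(np.argmax(sigma_b))
```

Pixels at or below the returned threshold form one class and pixels above it the other; which class is soil and which is crop is resolved in the next step via the aligned RGB photo.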
Finding the class of pixels of the OTSU method representing the crop is done by comparing the OTSU method's two classes of pixels with the corresponding (same coordinates) pixels of the aligned digital photograph (Fig. 3). More specifically, a comparison is made between the two classes of pixels of the aligned digital photo (RGB), which has been converted to the HSV color model (Hue, Saturation, Value), in order to find the class that has the most pixels in shades of green. After finding the class with the most green pixels, the corresponding class of pixels of the thermal photograph is selected. Then the metadata of the thermal image are exported, namely the temperature of each pixel of the class selected by the above procedure. In this way, a table is constructed that contains the coordinates and the temperature of each pixel.
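The selection of the crop class via the aligned RGB photo can be sketched as follows; the green hue band (roughly 70°-170°) and the saturation cutoff are illustrative assumptions, not values given in the patent:

```python
import colorsys
import numpy as np

def count_green(rgb, mask):
    """Count pixels under `mask` whose HSV hue falls in a green band."""
    n = 0
    for r, g, b in rgb[mask] / 255.0:
        h, s, _ = colorsys.rgb_to_hsv(r, g, b)
        if 70 / 360 <= h <= 170 / 360 and s > 0.2:  # assumed green band
            n += 1
    return n

def crop_mask(rgb, otsu_mask):
    """Return whichever Otsu class (mask or its complement) is greener."""
    if count_green(rgb, otsu_mask) >= count_green(rgb, ~otsu_mask):
        return otsu_mask
    return ~otsu_mask
```

The chosen mask is then applied to the thermal image's per-pixel temperature metadata to build the coordinate-and-temperature table.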
Next, the k-Means clustering method is applied to the data in the table above. The k-Means clustering method groups the pixel temperatures into a specific number of groups. Clustering is the process of organizing patterns (temperatures) into groups whose members are similar. Each pixel is assigned a specific color depending on the group it belongs to (e.g., a pixel belonging to the group with the lowest temperature is assigned dark green). The purpose of using standard pseudo-coloring for crop pixels is to normalize the temperatures of any thermal photograph depicting a crop or crops.
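The temperature grouping can be sketched as one-dimensional k-means (Lloyd's algorithm); the quantile initialization and the three-color palette below are illustrative choices, not specified in the patent:

```python
import numpy as np

def kmeans_1d(temps, k=3, iters=20):
    """Group scalar temperatures into k clusters (Lloyd's algorithm).

    Returns (centers, labels) with clusters relabeled coolest-first.
    """
    temps = np.asarray(temps, dtype=float)
    # Spread initial centers across the temperature range (assumed init).
    centers = np.quantile(temps, np.linspace(0.0, 1.0, k))
    for _ in range(iters):
        labels = np.argmin(np.abs(temps[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = temps[labels == j].mean()
    order = np.argsort(centers)        # relabel so group 0 is the coolest
    remap = np.empty(k, dtype=int)
    remap[order] = np.arange(k)
    return centers[order], remap[labels]

# Illustrative standard palette: coolest group -> dark green, hottest -> red.
PALETTE = [(0, 100, 0), (255, 255, 0), (255, 0, 0)]
```

Each pixel in the coordinate-and-temperature table then receives the palette color of its cluster, giving the standardized pseudo-coloring.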
Using the coordinates and the coloring of each pixel, a new photo is created. The new normalized photo shows only the crop's temperature fluctuation in standard pseudo-coloring (Fig. 4). Then, the coloring and the coordinates of the pixels representing the highest temperatures are selected. The pseudo-coloring is placed on the corresponding (same coordinates) pixels of the digital photo (Fig. 5). The pseudo-color digital photo is then fed to a trained Convolutional Neural Network, which classifies stressed sub-areas within the crop, delimiting each stressed region with a contour.
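Painting the pseudo-colors of the hottest group back onto the aligned digital photo, using the coordinate-and-temperature table, might look like this (function name and palette are hypothetical):

```python
import numpy as np

def overlay_hottest(photo, coords, labels, palette, hot_label):
    """Paint pixels of the hottest temperature group onto the aligned photo.

    coords/labels come from the clustered coordinate-temperature table;
    palette maps a group label to an (R, G, B) pseudo-color.
    """
    out = photo.copy()                 # leave the original photo untouched
    for (row, col), lab in zip(coords, labels):
        if lab == hot_label:
            out[row, col] = palette[lab]
    return out
```

The resulting pseudo-color photo is what would then be fed to the trained CNN for stress classification.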
The final result is a digital photograph of the crop in standardized pseudo-color, with delimited outlines of the sub-areas and a verbal representation of the stress classification (Fig. 6, C1 stress).

Claims

1. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photograph processing using Convolutional Neural Networks (CNNs), which includes the following phases: a. conversion of the thermal photograph (pseudo-coloring) to grayscale (Fig. 1). b. use of the OTSU method to isolate the background, which is the soil (Fig. 2); the OTSU algorithm assumes that the image contains two classes of pixels following a bimodal histogram (foreground pixels and background pixels). c. finding the class of pixels of the OTSU method representing the crop by comparing the OTSU method's two classes of pixels with the corresponding (same coordinates) pixels of the aligned digital photograph (Fig. 3); more specifically, a comparison is made between the two classes of pixels of the aligned digital photo (RGB) converted to the HSV color model (Hue, Saturation, Value), to find the class that has the most pixels in shades of green; after finding the class with the most green pixels, the corresponding class of thermal photograph pixels is selected. d. exporting the metadata of the thermal image, which is the temperature of each pixel of the class selected by the above procedure. e. construction of an array that contains the coordinates and the temperature of each pixel. f. application of the k-means clustering method to the data of the above array; the k-means clustering method groups the pixel temperatures into a specific number of groups; clustering is the process of grouping similar objects (temperatures) into different groups, or more precisely, the partitioning of a data set into subsets so that the data in each subset are similar according to some defined distance measure. g. assigning a specific color to each pixel depending on the group it belongs to; for example, the pixel that belongs to the group with the lowest temperature is assigned dark green; the purpose of using standard pseudo-coloring for crop pixels is to normalize the temperatures of any thermal photograph depicting a crop or crops. h. constructing a new photo using the coordinates and coloring of each pixel; the new normalized photo (Fig. 4) depicts only the crop in standard pseudo-coloring. i. selection of the coloring and the coordinates of the pixels representing the highest temperatures and placement of the pseudo-color on the corresponding (same coordinates) pixels of the digital photograph (Fig. 5). j. feeding of the digital pseudo-color photo to a trained Convolutional Neural Network. k. classification of stressed sub-areas within the crop with a specific delimitation of the stressed region with a contour. l. illustration of the digital photograph of the crop in standardized pseudo-color with demarcated contours of the sub-areas and a verbal representation of the stress classification (Fig. 6, C1 stress).
2. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photographs processing using Convolutional Neural Networks - CNNs according to claim 1 wherein the crop is a vineyard.
3. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photographs processing using Convolutional Neural Networks - CNNs according to claim 1 wherein the crop is treelike.
4. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photographs processing using Convolutional Neural Networks - CNNs according to claim 1 wherein the crop is an industrial crop.
5. Method for detection and classification of biotic-abiotic stress in crops with automated thermal photographs processing using Convolutional Neural Networks - CNNs according to claim 1 wherein the crop is vegetables.
PCT/GR2021/000024 2020-05-12 2021-04-28 Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence WO2021229248A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GR20200100243 2020-05-12
GR20200100243A GR1009898B (en) 2020-05-12 2020-05-12 Method of detection and evaluation of the biotic-abiotic stress in cultivations via thermal photographs and use of artificial intelligence

Publications (1)

Publication Number Publication Date
WO2021229248A1 true WO2021229248A1 (en) 2021-11-18

Family

ID=75107708

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GR2021/000024 WO2021229248A1 (en) 2020-05-12 2021-04-28 Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence

Country Status (2)

Country Link
GR (1) GR1009898B (en)
WO (1) WO2021229248A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115200722A (en) * 2022-09-16 2022-10-18 江苏宸洋食品有限公司 Temperature measuring method and refrigerator car temperature measuring system applying same
KR102631597B1 (en) * 2023-06-29 2024-02-02 주식회사 리플로그 Strawberry stress index calculation method and cultivation management system using chlorophyll fluorescence value

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1263962A1 (en) * 2000-02-25 2002-12-11 Avestha Gengraine Technologies PVT Ltd A process for constructing dna based molecular marker for enabling selection of drought and diseases resistant germplasm screening
CN101727665B (en) * 2008-10-27 2011-09-07 广州飒特电力红外技术有限公司 Method and device for fusing infrared images and visible light images
KR101729169B1 (en) * 2014-10-29 2017-05-11 서울대학교산학협력단 Method of diagnosing responses of plants to abiotic stress or herbicide using thermal image
CN105719304B (en) * 2016-01-25 2018-04-13 中山大学 A kind of flower image dividing method based on Otsu
CN105678793B (en) * 2016-02-26 2019-01-15 浙江大学 A kind of method of early diagnosis and device of the Prospect on Kiwifruit Bacterial Canker based on image co-registration
KR101830056B1 (en) * 2017-07-05 2018-02-19 (주)이지팜 Diagnosis of Plant disease using deep learning system and its use
FR3069940B1 (en) * 2017-08-03 2019-09-06 Universite D'orleans METHOD AND SYSTEM FOR CARTOGRAPHY OF THE HEALTH CONDITION OF CULTURES
CN107767364B (en) * 2017-09-12 2021-03-23 中国林业科学研究院林业研究所 Method for accurately extracting temperature of tree canopy based on infrared thermal image
CN108537777A (en) * 2018-03-20 2018-09-14 西京学院 A kind of crop disease recognition methods based on neural network

Non-Patent Citations (21)

* Cited by examiner, † Cited by third party
Title
AL HIARY ET AL.: "Fast and Accurate Detection and Classification of Plant Diseases", INT. J. COMPUT. APPL., vol. 17, 2011, pages 31 - 38
BALLESTEROS ET AL.: "Applications of georeferenced high-resolution images obtained with unmanned aerial vehicles. Part I: Description of image acquisition and processing", PRECIS. AGRIC., vol. 15, 2014, pages 579 - 592
BALUJA ET AL.: "Assessment of vineyard water status variability by thermal and multispectral imagery using an unmanned aerial vehicle (UAV", IRRIG. SCI., vol. 30, 2012, pages 511 - 522, XP035128494, DOI: 10.1007/s00271-012-0382-9
BENDIG ET AL.: "Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging", REMOTE SENS, vol. 6, 2014, pages 10395 - 10412
CANDIAGO ET AL.: "Evaluating Multispectral Images and Vegetation Indices for Precision Farming Applications from UAV Images", REMOTE SENS, vol. 7, 2015, pages 4026 - 4047
COLOMINA; MOLINA: "Unmanned aerial systems for photogrammetry and remote sensing: A review", ISPRS J. PHOTOGRAMM. REMOTE SENS., vol. 92, 2014, pages 79 - 97
GONZALEZ-DUGO ET AL.: "Using High-Resolution Hyperspectral and Thermal Airborne Imagery to Assess Physiological Condition in the Context of Wheat Phenotyping", REMOTE SENS, vol. 7, 2015, pages 13586 - 13605
IDSO ET AL.: "Normalizing the stress-degree-day parameter for environmental variability", AGRIC. METEOROL., vol. 24, 1981, pages 45 - 55
JONES ET AL.: "Thermal infrared imaging of crop canopies for the remote diagnosis and quantification of plant responses to water stress in the field", FUNCT. PLANT BIOL., vol. 36, 2009, pages 978
MAZARE ALIN GH ET AL: "Embedded system for real time analysis of thermal images for prevention of water stress on plants", 2018 41ST INTERNATIONAL SPRING SEMINAR ON ELECTRONICS TECHNOLOGY (ISSE), IEEE, 16 May 2018 (2018-05-16), pages 1 - 6, XP033391581, DOI: 10.1109/ISSE.2018.8443604 *
NASI ET AL.: "Using UAV-Based Photogrammetry and Hyperspectral Imaging for Mapping Bark Beetle Damage at Tree-Level", REMOTE SENS, vol. 7, 2015, pages 15467 - 15493
OTA ET AL.: "Above ground Biomass Estimation Using Structure from Motion Approach with Aerial Photographs in a Seasonal Tropical Forest", FORESTS, vol. 6, 2015, pages 3882 - 3898
RAHAMAN ET AL.: "Advanced phenotyping and phenotype data analysis for the study of plant growth and development", FRONT. PLANT SCI., vol. 6, 2015
SINGH; MISRA: "Detection of plant leaf diseases using image segmentation and soft computing techniques", INF. PROCESS. AGRIC., vol. 4, 2017, pages 41 - 49
SUGIURA ET AL.: "Correction of Low-altitude Thermal Images applied to estimating Soil Water Status", BIOSYST. ENG., vol. 96, 2007, pages 301 - 313, XP005887797, DOI: 10.1016/j.biosystemseng.2006.11.006
THORP ET AL.: "Proximal hyperspectral sensing and data analysis approaches for field-based plant phenomics", COMPUT. ELECTRON. AGRIC., vol. 118, 2015, pages 225 - 236
WALLACE ET AL.: "Development of a UAV-LiDAR System with Application to Forest Inventory", REMOTE SENS, vol. 4, 2012, pages 1519 - 1543
ZARCO-TEJADA ET AL.: "Previsual symptoms of Xylella fastidiosa infection revealed in spectral plant-trait alterations", vol. 7, 2018, pages 432 - 439
ZARCO-TEJADA ET AL.: "Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera", REMOTE SENS. ENVIRON., vol. 117, 2012, pages 322 - 337, XP028123576, DOI: 10.1016/j.rse.2011.10.007
ZARCO-TEJADA ET AL.: "Spatio-temporal patterns of chlorophyll fluorescence and physiological and structural indices acquired from hyperspectral imagery as compared with carbon fluxes measured with eddy covariance", REMOTE SENS. ENVIRON., vol. 133, 2013, pages 102 - 115, XP055523366, DOI: 10.1016/j.rse.2013.02.003
ZHANG; KOVACS, J.M.: "The application of small unmanned aerial systems for precision agriculture: a review", PRECIS. AGRIC., vol. 13, 2012, pages 693 - 712, XP035134045, DOI: 10.1007/s11119-012-9274-5


Also Published As

Publication number Publication date
GR1009898B (en) 2021-01-08

Similar Documents

Publication Publication Date Title
US20230292647A1 (en) System and Method for Crop Monitoring
Xu et al. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping
Stanton et al. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment
Solano et al. A methodology based on GEOBIA and WorldView-3 imagery to derive vegetation indices at tree crown detail in olive orchards
Onishi et al. Automatic classification of trees using a UAV onboard camera and deep learning
CN109325431B (en) Method and device for detecting vegetation coverage in feeding path of grassland grazing sheep
Chauhan et al. Wheat lodging assessment using multispectral UAV data
de Oca et al. The AgriQ: A low-cost unmanned aerial system for precision agriculture
Ponti et al. Precision agriculture: Using low-cost systems to acquire low-altitude images
Shirzadifar et al. Field identification of weed species and glyphosate-resistant weeds using high resolution imagery in early growing season
WO2021229248A1 (en) Method for detection and classification of biotic - abiotic stress in crops from thermal photographs using artificial intelligence
Izzuddin et al. Analysis of multispectral imagery from unmanned aerial vehicle (UAV) using object-based image analysis for detection of ganoderma disease in oil palm
Tian et al. Machine learning-based crop recognition from aerial remote sensing imagery
Hanapi et al. A review on remote sensing-based method for tree detection and delineation
Preethi et al. An comprehensive survey on applications of precision agriculture in the context of weed classification, leave disease detection, yield prediction and UAV image analysis
Silva et al. Mapping two competing grassland species from a low-altitude Helium balloon
Lobitz et al. Grapevine remote sensing analysis of phylloxera early stress (GRAPES): remote sensing analysis summary
Ghasemloo et al. Vegetation species determination using spectral characteristics and artificial neural network (SCANN)
Cimtay et al. A new vegetation index in short-wave infrared region of electromagnetic spectrum
Yano et al. Weed identification in sugarcane plantation through images taken from remotely piloted aircraft (RPA) and KNN classifier
CN112577954B (en) Urban green land biomass estimation method
Aziz et al. Detection of Bacterial Leaf Blight Disease Using RGB-Based Vegetation Indices and Fuzzy Logic
Jurišić et al. The evaluation of the RGB and multispectral camera on the unmanned aerial vehicle (UAV) for the machine learning classification of Maize
Yang et al. Using multispectral imagery and linear spectral unmixing techniques for estimating crop yield variability
Papić et al. On Olive Groves Analysis using UAVs

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21726448

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21726448

Country of ref document: EP

Kind code of ref document: A1