WO2023095041A1 - Method and system for automatically identifying a crop infestation - Google Patents

Method and system for automatically identifying a crop infestation

Info

Publication number
WO2023095041A1
Authority
WO
WIPO (PCT)
Prior art keywords
crop
images
user
infestation
identification system
Prior art date
Application number
PCT/IB2022/061368
Other languages
English (en)
Inventor
Aher PRAJAKTA
Mohammad Shahbaz HUSSAIN
Kedia VEDANSH
Original Assignee
Upl Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Upl Limited filed Critical Upl Limited
Publication of WO2023095041A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/04Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/01Customer relationship services
    • G06Q30/015Providing customer assistance, e.g. assisting a customer within a business location or via helpdesk
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0201Market modelling; Market analysis; Collecting market data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0281Customer communication at a business location, e.g. providing product or service information, consulting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0623Item investigation
    • G06Q30/0625Directed, with specific intent or strategy
    • G06Q30/0629Directed, with specific intent or strategy for generating comparisons
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0631Item recommendations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/06Buying, selling or leasing transactions
    • G06Q30/0601Electronic shopping [e-shopping]
    • G06Q30/0639Item locations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02Agriculture; Fishing; Forestry; Mining
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/72Data preparation, e.g. statistical preprocessing of image or video features
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/98Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993Evaluation of the quality of the acquired pattern
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/188Vegetation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30181Earth observation
    • G06T2207/30188Vegetation; Agriculture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/10Recognition assisted with metadata

Definitions

  • the present subject matter is, in general, related to identifying crop infestation in the agricultural crops and more particularly, but not exclusively, to a method and system for automatically identifying crop infestation in a real-time agricultural environment.
  • the existing AI-based approaches use data collected from a controlled environment to train the learning models. Consequently, the existing AI-based approaches fail to accurately diagnose the diseases that occur in a real-world agricultural environment. Also, the existing approaches are limited to diagnosis of the diseases, and do not assist farmers in managing the diseases.
  • the method comprises receiving, by an identification system, a user selection on at least one option from a plurality of options provided to the user, wherein each of the plurality of options relate to a type of the crop infestation. Further, the method comprises receiving one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Further, the method comprises verifying each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images. Furthermore, the method comprises predicting an infected region in the one or more images according to the option selected by the user. Thereafter, the method comprises analyzing the infected region using the pre-trained neural network for identifying the crop infestation in the infected region.
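  • As a minimal sketch of how these claimed steps might compose in software (a hypothetical illustration; all function and object names below are assumptions, not taken from the disclosure):

```python
# Hypothetical sketch of the claimed flow; `model` stands in for the
# pre-trained neural network described in the disclosure.

def identify_infestation(option, images, metadata, model):
    """option: the user-selected infestation type ('diseases', 'weeds' or 'insects')."""
    findings = []
    for image, meta in zip(images, metadata):
        # Verify usability: reject blurry, distorted or skewed images.
        if not model.is_usable(image):
            continue  # the application would prompt the user to recapture
        # Predict the infected region according to the selected option.
        region = model.predict_infected_region(image, option)
        # Analyze the region to identify the crop infestation in it.
        findings.append(model.classify_infestation(region, option))
    return findings
```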
  • the present disclosure relates to an identification system for automatically identifying a crop infestation in a real-time environment.
  • the identification system comprises a processor and a memory.
  • the memory is communicatively coupled to the processor and stores processor-executable instructions, which on execution, cause the processor to receive a user selection on at least one option from a plurality of options provided to the user. Each of the plurality of options relates to a type of the crop infestation.
  • the instructions cause the processor to receive one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Further, the instructions cause the processor to verify each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images.
  • the instructions cause the processor to predict an infected region in the one or more images according to the option selected by the user. Thereafter, the instructions cause the processor to analyze the infected region using the pre-trained neural network for identifying the crop infestation in the infected region.
  • FIG. 1 provides an overview of the proposed solution for automatically identifying a crop infestation in a real-time environment in accordance with some embodiments of the present disclosure.
  • FIG.2 shows a detailed block diagram of an identification system for automatically identifying the crop infestation in accordance with some embodiments of the present disclosure.
  • FIG. 3 shows a flowchart illustrating a method for automatically identifying the crop infestation in accordance with some embodiments of the present disclosure.
  • FIG. 4 shows a flowchart indicating various steps involved in identifying the crop infestation and recommending products to a user in accordance with some embodiments of the present disclosure.
  • FIG. 5 illustrates a block diagram of an exemplary computer system for implementing embodiments consistent with the present disclosure.
  • FIG. 1 provides an overview of automatically identifying a crop infestation in a real-time environment in accordance with some embodiments of the present disclosure.
  • the user 101 may include, without limiting to, a farmer, a researcher, a landlord, a worker and the like.
  • the identification system 107 may be any computing system, which may be configured for automatically identifying a crop infestation 109 in a real-time environment of the crop, in accordance with embodiments of the present disclosure.
  • the identification system 107 may include a dedicated computing unit such as, without limiting to, a smartphone, a laptop, a computer and the like.
  • the proposed solution may be provided as an application platform, which may be downloaded and installed on the user device 103, and/or may be installed on a remote server and accessed through the user device 103 over the Internet.
  • when the user 101 is using the proposed solution or the application for the first time, the user 101 may be instructed to register with the identification system 107 and create a personalized user account. After successful registration, the user 101 may login to the application using the pre-registered login credentials. After login, the identification system 107 may provide a list of options to the user on a User Interface (UI) of the user device 103.
  • each of the plurality of options provided to the user 101 may be related to a type of the crop infestation 109 that the user intends to identify from his crops.
  • the crop infestation 109 may comprise at least one of diseases, weeds, and insects.
  • the user 101 may select one of the options among the plurality of options displayed on the UI of the application.
  • the user 101 may select the ‘diseases’ option on the UI.
  • the user 101 may select the ‘weeds’ option to determine if the crop contains any weeds.
  • receiving the user selection on one of the plurality of options related to type of the crop infestation helps in optimizing both the accuracy and speed of predicting the crop infestation.
  • the identification system 107 receives one or more images 105 of an affected crop from the user 101 through the user device 103.
  • the user 101 may capture the one or more images 105 of the affected crop in real-time using a camera of the user device 103.
  • the identification system 107 may activate an on-device camera of the user device 103 and prompt the user to capture the one or more real-time images 105 of the affected crop.
  • the user 101 may choose to upload the one or more images 105 of the affected crop from a storage space associated with the user device 103.
  • the identification system 107 may allow the user to upload the one or more images 105 of the crop from a local storage of the user device 103.
  • the identification system 107 may also collect metadata relating to each of the one or more images 105.
  • the metadata may include, without limitation, at least one of information related to location of the affected crop, resolution of the one or more images, time and date of capturing the one or more images, and a label associated with each of the one or more images 105.
  • the label associated with the images 105 may include, without limitation, a name and/or stage of the crop infestation 109.
  • the stage of the crop infestation may be one of, without limitation, egg, larvae, nymph, adult, and/or damage to the crop.
  • the identification system 107 may verify each of the one or more images 105 using a pre-trained neural network for determining a usability of each of the one or more images 105.
  • determining the usability of the images 105 may include, without limiting to, checking if the captured image is blurry, distorted, skewed etc., to ensure that the captured image can be used for further analysis. If the captured image is found to be unusable, then the identification system 107 may prompt the user 101 to capture fresh images 105 of the infested crop and/or upload a different image 105 of the infested crop.
  • the pre-trained neural network used for verifying and determining the usability of the images 105 may include, without limitation, at least one of a Convolution Neural Network (CNN), a Recurrent Neural Network (RNN), a Long Short-term Memory (LSTM), a recursive neural network, a graph convolutional network and a sequential neural network.
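  • The disclosure performs this verification with a pre-trained neural network; as a simpler, self-contained illustration of the same on-device filtering idea, a classical variance-of-Laplacian blur check (a stand-in heuristic, not the patented method; the threshold is an assumed, tunable value) could reject obviously blurry captures before any network inference:

```python
import cv2  # OpenCV

def is_usable(image_path: str, blur_threshold: float = 100.0) -> bool:
    """Reject images that are too blurry for further analysis."""
    image = cv2.imread(image_path)
    if image is None:
        return False  # unreadable or corrupt file
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # The variance of the Laplacian is low for blurry images, high for sharp ones.
    sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
    return sharpness >= blur_threshold
```

In the disclosed system, an image failing such a check would trigger a prompt to capture or upload a fresh image.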
  • the identification system 107 may predict an infected region in the verified one or more images 105 according to the option selected by the user 101. As an example, if the option selected by the user 101 is to analyze the crop for presence of a disease, then the identification system may predict the infected region that is likely to be affected by a disease.
  • the neural network is trained with a plurality of infected regions derived from a plurality of training images to identify the infected region in the one or more images 105. As an example, the plurality of training images, comprising the infected regions, are selected from the images of the crop based on the metadata associated with the one or more images 105.
  • the identification system 107 may analyze the infected region using the pre-trained neural network for identifying the crop infestation 109 in the infected region.
  • the identification system 107 may train the neural network with a plurality of crop infestations derived from a plurality of crop infestation images.
  • the plurality of crop infestation images for training may be selected from the images of the crop, based on the metadata comprising the name of the crop infestation.
  • the name and other details relating to the identified crop infestation 109 may be displayed to the user through the UI of the user device 103.
  • the identification system 107 may receive a user input or user rating on the predicted crop infestation 109 and assign a confidence score to the pre-trained neural network based on the user 101 input. As an example, on a scale of 0-10, where 0 is the lowest rating and 10 is the highest rating, if the user has provided a rating of more than 8 to the predicted crop infestation 109, then the pre-trained neural network may be assigned a higher confidence score. Alternatively, if the user rating is a value less than 3, then the pre-trained neural network may be assigned a low confidence score and subjected to further training. In an implementation, the identification system 107 identifies the crop infestation 109 based on a threshold value of a confidence score for the crop infestation. As an example, the threshold value for the crop infestation may be at least 30%.
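  • The disclosure does not fix an exact scoring rule beyond these examples; a minimal sketch consistent with them (a rating above 8 raises confidence, a rating below 3 lowers it and flags retraining, and a 30% threshold gates the reported prediction; the step sizes are assumptions) might be:

```python
def update_confidence(confidence: float, rating: int) -> tuple[float, bool]:
    """Adjust the network's confidence score from a 0-10 user rating.
    Returns (new_confidence, needs_further_training)."""
    if rating > 8:
        return min(1.0, confidence + 0.05), False  # high rating: raise confidence
    if rating < 3:
        return max(0.0, confidence - 0.10), True   # low rating: lower and retrain
    return confidence, False

def report_infestation(score: float, threshold: float = 0.30) -> bool:
    """Identify the infestation only if its confidence score meets the threshold."""
    return score >= threshold
```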
  • the identification system 107 may recommend one or more products 111 for controlling and/or preventing the crop infestation 109 identified in the affected crop.
  • the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like.
  • the identification system 107 may provide other information related to the one or more recommended products such as, without limitation, name of the product, product description, contact details of the sellers or distributors of the product, dosage information of the product, Stock Keeping Units (SKU) information of the product and the like.
  • the one or more products may be recommended based on the metadata associated with the one or more images 105 used for identifying the crop infestation 109.
  • the one or more products may be recommended based on the location of the crop and/or the location of the user 101.
  • the identification system 107 may be also configured to map the one or more products with the identified crop infestation 109, and subsequently, provide a comparative analysis of the price of the one or more recommended products. Also, the identification system 107 may provide information related to a selling price of the one or more products across different markets. Additionally, the identification system 107 may indicate historical prices of the same product along with some graphical representation about percentage of an increase or decrease in the price of the recommended product over a period of time. Also, the identification system 107 may provide an average price information of the recommended product for a predefined number of days.
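  • As an illustration of the price analytics described above (the disclosure names the outputs but not the formulas, so the standard ones are assumed here):

```python
def percent_change(old_price: float, new_price: float) -> float:
    """Percentage increase (+) or decrease (-) in a product's price over a period."""
    return (new_price - old_price) / old_price * 100.0

def average_price(daily_prices: list[float], days: int) -> float:
    """Average selling price over the most recent `days` entries."""
    window = daily_prices[-days:]
    return sum(window) / len(window)

# Example: a product whose price moved from 500 to 540 rose by 8.0%.
assert round(percent_change(500.0, 540.0), 1) == 8.0
```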
  • the identification system 107 may provide details of the one or more sellers selling the recommended products 111 to the user, based on a geo-location of the crop and/or the user 101. In an embodiment, the identification system 107 may authenticate the user 101 and the one or more sellers at the time of registering them into the identification system 107 to establish a common platform for the user 101 and the one or more sellers to directly connect with each other.
  • the identification system 107 may also allow the user 101 to access any information related to previously identified crop infestation 109 and/or information related to the one or more products recommended for controlling such crop infestation 109 from a database associated with the identification system 107. This allows the user 101 to control the crop infestation 109 even in scenarios where the user 101 is unable to capture the images of the crop and/or when the user 101 does not have access to the Internet.
  • the user 101 may share and discuss his/her own analysis and knowledge about various crop infestation 109 with a community of users 101 (for example, the farmers), who have come across similar crop infestation 109, using a database or a live chat window provided on the application.
  • the identification system 107 may facilitate the user 101 for scouting one or more farms managed by the user 101 for determining a field-specific occurrence of the crop infestation 109.
  • the user 101 may register the one or more farms by entering the details related to the one or more farms.
  • the information related to the one or more farms may include, without limitation, a field name, a method of cultivation used in the one or more farms (for example, direct seeded or transplanted), a seed variety (for example, inbred or hybrid), sowing date, harvesting date, field area (for example, manually entering field area or geo-tagging and drawing a boundary of the farm) and the like.
  • the farms may be added to the application. Subsequently, the user 101 may be allowed to select the farms before scouting for any crop infestation 109 across farms.
  • the identification system 107 may also provide a product listing and search functionality to the user 101, using which the user 101 may create a wish list of the recommended products from the scouting prediction or the products page.
  • the identification system 107 may also provide field-specific information related to historical weather, current weather and a future forecast for the one or more farms managed by the user 101.
  • the weather information may include, without limitation, whether the day is a sunny day or a rainy day, and whether special events like a thunderstorm or a snowfall are expected in the day.
  • Such field-specific information helps the user make suitable decisions on various activities to be taken up in the farms.
  • FIG. 2 shows a detailed block diagram of an identification system for automatically identifying a crop infestation in accordance with some embodiments of the present disclosure.
  • the identification system 107 may include a processor 201, an I/O interface 203 and a memory 205.
  • the memory 205 may comprise data 207 and one or more modules 209.
  • the memory 205 may be communicatively coupled to the processor 201.
  • the processor 201 may be configured to perform one or more functions of the identification system 107 for automatically identifying the crop infestation 109 in a real-time environment using one or more modules 209 stored in the memory 205.
  • the I/O interface 203 may be configured for establishing a connection with a user device 103 and other entities like a remote server or a database associated with the identification system 107.
  • the data 207 may include, without limitation, a user selection 211, one or more images 105, metadata 213, crop infestation 109 information and other data 215.
  • the user selection 211 may be an option selected by the user 101 among the plurality of options provided to the user 101.
  • the plurality of options may be provided on a dashboard of the user device 103.
  • the plurality of options provided may comprise information related to the crop infestation 109.
  • the crop infestation 109 may comprise at least one of diseases, weeds, and insects.
  • the one or more images 105 may be the images of the affected crop captured using a user device 103 associated with the user 101 and/or images uploaded from a storage space.
  • the metadata 213 may include, without limitation, at least one of information related to location of the affected crop, resolution of the one or more images, time and date of capturing the one or more images, and a label associated with each of the one or more images 105.
  • the label associated with the images may include, without limitation, a name and/or stage of the crop infestation 109.
  • the stage of the crop infestation may include, without limitation, egg, larvae, nymph, adult, and damage to the crop.
  • the other data 215 may include various temporary data and files generated by the one or more modules 209 while performing various functions of the identification system 107.
  • the data 207 may be processed by the one or more modules 209.
  • the one or more modules 209 may be communicatively coupled to the processor 201 for performing one or more functions of the identification system 107.
  • the one or more modules 209 may include, without limitation, a receiving module 217, verifying module 219, predicting module 221, analyzing module 223, pre-trained neural network model 225 and other modules 227.
  • module refers to an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and memory that execute one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
  • ASIC Application Specific Integrated Circuit
  • the other modules 227 may be used to perform various miscellaneous functionalities of the identification system 107. It will be appreciated that such one or more modules 209 may be represented as a single module or a combination of different modules.
  • the receiving module 217 may be configured to receive a user selection 211 on at least one option from a plurality of options provided to the user 101, wherein each of the plurality of options relates to a type of the crop infestation 109. Further, the receiving module 217 may be configured to receive one or more images 105 of an affected crop and metadata 213 associated with the one or more images 105 of the affected crop. In an embodiment, the verifying module 219 may be configured to verify each of the one or more images 105 using a pre-trained neural network for determining a usability of each of the one or more images 105.
  • the pre-trained neural network may include, without limitation, at least one of a Convolution Neural Network (CNN), a Recurrent Neural Network (RNN), a Long Short-term Memory (LSTM), a recursive neural network, a graph convolutional network and a sequential neural network.
  • the predicting module 221 may be configured to predict an infected region in the verified one or more images 105 according to the option selected by the user 101.
  • the infected region corresponds to the region affected by the crop infestation 109.
  • the analyzing module 223 may be configured to analyze the infected region using the pre-trained neural network model 225 for identifying the crop infestation 109 in the infected region.
  • FIG. 3 shows a flowchart illustrating a method for automatically identifying a crop infestation in accordance with some embodiments of the present disclosure.
  • the method 300 may include one or more blocks illustrating a method for automatically identifying a crop infestation 109 in a real-time environment using an identification system 107 as illustrated in FIG. 1.
  • the method 300 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
  • the order in which the method 300 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
  • the method 300 includes receiving, by an identification system 107, a user selection 211 on at least one option from a plurality of options provided to the user, wherein each of the plurality of options relates to a type of the crop infestation 109.
  • the type of the crop infestation 109 comprises at least one of diseases of crops, weeds across the crops, and insects that damage the crops.
  • the user 101 may select the option from the plurality of options displayed on a dashboard of the application using the user device 103.
  • the method 300 includes receiving, by the identification system 107, one or more images 105 of an affected crop and metadata 213 associated with the one or more images 105 of the affected crop.
  • the user 101 may capture one or more images 105 of the affected crop in real-time using a user device 103, and/or the user 101 may upload the one or more images 105 of the affected crop from a storage space.
  • the identification system 107 activates an on-device camera of the user device 103 and allows the user to capture one or more real-time images 105 of the affected crop.
  • the identification system 107 allows the user to upload one or more images 105 of the crop from a pre-defined local storage.
  • the pre-defined local storage may be a database associated with the user device 103.
  • the metadata associated with the one or more images 105 may include, without limitation, information related to location of the affected crop, resolution of the one or more images, time and date of capturing the one or more images, and a label associated with each of the one or more images.
  • the label associated with each of the one or more images may include, without limitation, name and/or stage of the crop infestation.
  • the stage of the disease or the insect in the crop may include, without limitation, eggs, larvae, nymph, adult, damages and the like.
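  • A hypothetical sketch of how this metadata might be structured in code (field names are illustrative assumptions, not from the disclosure):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImageMetadata:
    crop_location: str                        # location of the affected crop
    resolution: str                           # e.g. "3024x4032"
    captured_at: str                          # time and date of capture
    infestation_name: Optional[str] = None    # label: name of the infestation
    infestation_stage: Optional[str] = None   # label: egg, larvae, nymph, adult, damage
```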
  • the method 300 includes verifying, by the identification system 107, each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images 105.
  • the pre-trained neural network comprises a plurality of layers.
  • the plurality of layers may include, without limitation, convolution layers, maxpool layers, dense layers, fully connected layers and the like.
  • the pre-trained neural network may be modified based on the crop infestation 109 that has to be identified.
  • the neural network may be trained using the plurality of reference images of the crop captured from a real-world environment of the crop.
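  • A minimal PyTorch sketch of a classifier built from the layer types listed above (the architecture, input size and class count are illustrative assumptions; the disclosure does not fix them):

```python
import torch.nn as nn

NUM_CLASSES = 10  # assumed number of infestation classes, for illustration

# Assumes 128x128 RGB input images.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolution layer
    nn.ReLU(),
    nn.MaxPool2d(2),                              # maxpool layer: 128 -> 64
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # convolution layer
    nn.ReLU(),
    nn.MaxPool2d(2),                              # maxpool layer: 64 -> 32
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 128),                 # dense / fully connected layer
    nn.ReLU(),
    nn.Linear(128, NUM_CLASSES),                  # one output per infestation class
)
```

Per the disclosure, such a network would be trained on reference images captured from the real-world environment of the crop and modified based on the crop infestation to be identified.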
  • the method 300 includes predicting, by the identification system 107, an infected region in the one or more images according to the option selected by the user 101.
  • the method includes analyzing, by the identification system 107, the infected region using the pre-trained neural network for identifying the crop infestation 109 in the infected region.
  • the method 300 includes recommending one or more products 111 for controlling the crop infestation 109 identified in the affected crop to the user.
  • the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like.
  • the information related to the one or more recommended products 111 may include, without limitation, a product name, a product description, contact details of the sellers of the products, dosage information of the products, Stock Keeping Units (SKU) information of the products and the like.
  • the one or more products is recommended based on the metadata 213 associated with the one or more images 105 used for identifying the crop infestation 109.
  • the one or more products is recommended based on the location of the crop and/or the user 101.
  • the method 300 includes providing information related to the one or more sellers of the one or more products recommended to the user. Further, the method 300 includes providing field-specific information related to historical weather, current weather and future forecast for the one or more farms managed by the user. Additionally, the method 300 includes facilitating the user for scouting one or more farms managed by the user for determining a field-specific occurrence of the crop infestation.
  • FIG. 4 shows a flowchart for identifying a crop disease and recommending products to a user in accordance with some embodiments of the present disclosure.
  • a user 101 may login to a personalized account created on the application.
  • the user 101 may create his personalized account by signing up or registering with an application.
  • the user 101 may download and install the application on a user device 103 and login into the application.
  • the application may display a plurality of options to the user 101 on a UI associated with the user device 103 to receive a user selection on at least one option from the plurality of options.
  • the plurality of options relates to a type of the crop infestation 109.
  • a camera in the user device 103 may be launched to capture at least one image of the affected crop, as indicated in step 406.
  • the application may receive the user selection 211.
  • the user selection 211 may be at least one option selected from the plurality of options by the user 101.
  • the user 101 may capture the one or more images 105 of the affected crop using the image acquisition device associated with the user device of the user 101.
  • the user 101 may upload the one or more images 105 of the affected crop from a storage space associated with the user device of the user 101.
  • the application may predict the type of the crop infestation 109.
  • the type of the crop infestation 109 comprises at least one of diseases of the crops, weeds across the crops, and insects that damage the crops. That is, after capturing or uploading the one or more images 105 of the affected crop, the pre-trained neural network analyzes the one or more images 105 to identify the crop infestation 109.
  • the application may recommend one or more products 111 to the user to control the identified crop infestation 109.
  • the one or more recommended products 111 may include, without limitation, pesticides, fungicides, biological products, plant growth regulators, micro and macro nutrients, insecticides and the like.
  • the application may recommend one or more sellers or distributors 113 of the one or more products 111.
  • the user 101 may be connected to a computing unit or server associated with the identification system 107 through a network, using the user device 103.
  • the one or more sellers may also connect to the network and the computing unit, using the user device 103.
  • the computing unit may provide details about the recommended products that can be used on the identified crop infestation 109, as well as the corresponding details of the one or more sellers selling the recommended products 111 to the user, based on the geo-location of the crop and the user 101.
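  • A minimal sketch of the geo-location matching step (using the standard haversine great-circle distance; the seller record format and the count of three, echoing the distributor forwarding described later, are illustrative assumptions):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km is the mean Earth radius

def nearest_sellers(user_lat, user_lon, sellers, count=3):
    """Return the `count` sellers closest to the user; each seller is a dict
    with 'lat' and 'lon' keys (an assumed record format)."""
    return sorted(
        sellers,
        key=lambda s: haversine_km(user_lat, user_lon, s["lat"], s["lon"]),
    )[:count]
```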
  • the user 101 may select an option to refer to the historical data after logging into the application and the user may select the image as indicated in step 420.
  • the application may be configured to store the information related to previously identified crop infestation (i.e., historical data) provided to the user 101 on a storage associated with the application, to ensure that any reference or analysis previously made by the user 101 is not lost and can be revisited at any time. As a result, the user 101 may directly contact the sellers to enquire about the one or more recommended products 111.
  • FIG. 5 illustrates a block diagram of an exemplary computer system 500 for implementing embodiments consistent with the present disclosure.
  • the computer system 500 may be the identification system 107 illustrated in FIG. 1, which automatically identifies a crop infestation in a real-time environment.
  • the computer system 500 may include a Central Processing Unit (“CPU” or “processor”) 502.
  • the processor 502 may comprise at least one data processor for executing program components for executing user-or-system generated processes.
  • the processor 502 may include specialized processing units such as integrated system (bus) controllers, memory management control units, floating point units, graphics processing units, digital signal processing units, etc.
  • the processor 502 may be disposed in communication with one or more Input/Output (I/O) devices (510 and 511) via I/O interface 501.
  • the I/O interface 501 may employ communication protocols/methods such as, without limitation, audio, analogue, digital, stereo, IEEE®-1394, serial bus, Universal Serial Bus (USB), infrared, PS/2, BNC, coaxial, component, composite, Digital Visual Interface (DVI), high-definition multimedia interface (HDMI), Radio Frequency (RF) antennas, S-Video, Video Graphics Array (VGA), IEEE® 802.11a/b/g/n/x, Bluetooth, cellular (e.g., Code-Division Multiple Access (CDMA), High-Speed Packet Access (HSPA+), Global System For Mobile Communications (GSM), Long-Term Evolution (LTE) or the like), etc.
  • the computer system 500 may communicate with one or more I/O devices 510 and 511.
  • the processor 502 may be disposed in communication with a communication network 509 via network interface 503.
  • the network interface 503 may communicate with the communication network 509.
  • the network interface 503 may employ connection protocols including, without limitation, direct connect, Ethernet (e.g., twisted pair 10/100/1000 Base T), Transmission Control Protocol/Internet Protocol (TCP/IP), token ring, IEEE® 802.11a/b/g/n/x, etc.
  • the computer system 500 may improve the process of automatically identifying the crop infestation 109 in a real-time environment.
  • the communication network 509 may be connected with the user device 103 associated with the user 101.
  • the communication network 509 may be implemented as one of the several types of networks, such as intranet or Local Area Network (LAN) and such within the organization.
  • the communication network 509 may either be a dedicated network or a shared network, which represents an association of several types of networks that use a variety of protocols, for example, Hypertext Transfer Protocol (HTTP), Transmission Control Protocol/Internet Protocol (TCP/IP), Wireless Application Protocol (WAP), etc., to communicate with each other.
  • the communication network 509 may include a variety of network devices, including routers, bridges, servers, computing devices, storage devices, etc.
  • the processor 502 may be disposed in communication with a memory 505 (e.g., RAM 512, ROM 513, etc. as shown in FIG. 5) via a storage interface 504.
  • the storage interface 504 may connect to memory 505 including, without limitation, memory drives, removable disc drives, etc., employing connection protocols such as Serial Advanced Technology Attachment (SATA), Integrated Drive Electronics (IDE), IEEE-1394, Universal Serial Bus (USB), fiber channel, Small Computer Systems Interface (SCSI), etc.
  • the memory drives may further include a drum, magnetic disc drive, magneto-optical drive, optical drive, Redundant Array of Independent Discs (RAID), solid-state memory devices, solid-state drives, etc.
  • the memory 505 may store a collection of program or database components, including, without limitation, user/application interface 506, an operating system 507, a web browser 508, and the like.
  • computer system 500 may store user/application data 506, such as the data, variables, records, etc. as described in this invention.
  • databases may be implemented as fault-tolerant, scalable, secure databases such as distributed databases.
  • the operating system 507 may facilitate resource management and operation of the computer system 500.
  • Examples of operating systems include, without limitation, APPLE® MACINTOSH® OS X®, UNIX®, UNIX-like system distributions (e.g., BERKELEY SOFTWARE DISTRIBUTION® (BSD), FREEBSD®, NETBSD®, OPENBSD, etc.), LINUX® DISTRIBUTIONS (e.g., RED HAT®, UBUNTU®, KUBUNTU®, etc.), IBM® OS/2®, MICROSOFT® WINDOWS® (XP®, VISTA®/7/8, 10 etc.), APPLE® IOS®, GOOGLE™ ANDROID™, BLACKBERRY® OS, or the like.
  • the user interface 506 may facilitate display, execution, interaction, manipulation, or operation of program components through textual or graphical facilities.
  • the user interface 506 may provide computer interaction interface elements on a display system operatively connected to the computer system 500, such as cursors, icons, check boxes, menus, scrollers, windows, widgets, and the like.
  • in an embodiment, Graphical User Interfaces (GUIs) may be employed, including, without limitation, APPLE® MACINTOSH® operating systems' Aqua®, IBM® OS/2®, MICROSOFT® WINDOWS® (e.g., Aero, Metro, etc.), web interface libraries (e.g., ActiveX®, JAVA®, JAVASCRIPT®, AJAX, HTML, ADOBE® FLASH®, etc.), or the like.
  • the web browser 508 may be a hypertext viewing application. Secure web browsing may be provided using Secure Hypertext Transport Protocol (HTTPS), Secure Sockets Layer (SSL), Transport Layer Security (TLS), and the like.
  • the web browsers 508 may utilize facilities such as AJAX, DHTML, ADOBE® FLASH®, JAVASCRIPT®, JAVA®, Application Programming Interfaces (APIs), and the like.
  • Further, the computer system 500 may implement a mail server stored program component.
  • the mail server may utilize facilities such as ASP, ACTIVEX®, ANSI® C++/C#, MICROSOFT® .NET, CGI SCRIPTS, JAVA®, JAVASCRIPT®, PERL®, PHP, PYTHON®, WEBOBJECTS®, etc.
  • the mail server may utilize communication protocols such as Internet Message Access Protocol (IMAP), Messaging Application Programming Interface (MAPI), MICROSOFT® exchange, Post Office Protocol (POP), Simple Mail Transfer Protocol (SMTP), or the like.
  • the computer system 500 may implement a mail client stored program component.
  • the mail client may be a mail viewing application, such as APPLE® MAIL, MICROSOFT® ENTOURAGE®, MICROSOFT® OUTLOOK®, MOZILLA® THUNDERBIRD®, and the like.
  • a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
  • the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., non-transitory. Examples include Random Access Memory (RAM), Read-Only Memory (ROM), volatile memory, non-volatile memory, hard drives, Compact Disc (CD) ROMs, Digital Video Discs (DVDs), flash drives, disks, and any other known physical storage media.
  • the present disclosure uses on-device filtering techniques to identify and reject distorted/unsuitable images of the crops and prompts the farmers to capture quality images.
  • the analysis of unwanted images is avoided, and the unwanted images are also prevented from entering the server. Consequently, the method of the present disclosure provides a faster analysis and optimal usage of computing resources.
  • the pre-trained neural network used for identifying the crop infestation is trained with a plurality of images captured from the real-world environment of the crops, rather than images captured from a controlled environment. Consequently, the neural network of the present disclosure identifies the crop infestation with utmost precision.
  • the present disclosure can serve a greater number of users at a time. Further, the present disclosure efficiently identifies all the infestations occurring on the crops, and it also recommends suitable products to control the identified crop infestation.
  • the user may select his/her one or more farms before scouting for crop infestation.
  • the present disclosure helps in understanding field-specific occurrence of biotic stresses.
  • the present disclosure creates more awareness and better mapping of the one or more recommended products by using multiple predictions.
  • the present disclosure provides a product listing and search functionality. As a result, the user may see all the products for herbicides, fungicides, insecticides, and BioSolutions.
  • the present disclosure allows the user to search one or more products based on the names of weeds, diseases and insects, or based on the product's name. Consequently, the user may directly search for the best products if they already know the name of their weed, disease or insect.
  • the user may add a product to a wish list from the scouting prediction or the products page. As a result, the user details may be forwarded to the three nearest distributors of the one or more recommended products. Consequently, the present disclosure increases the connectivity between the users and the sellers.
  • the present disclosure enables the users to get the best price for their crop produce by looking into the market prices of the respective countries through open-source dynamic websites.
  • the claimed steps are not routine, conventional, or well-known aspects in the art, as the claimed steps provide the aforesaid solutions to the technical problems existing in the conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself, as the claimed steps provide a technical solution to a technical problem.
  • an embodiment means “one or more (but not all) embodiments of the invention(s)” unless expressly specified otherwise.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • Accounting & Taxation (AREA)
  • Development Economics (AREA)
  • Economics (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Quality & Reliability (AREA)
  • Game Theory and Decision Science (AREA)
  • Human Resources & Organizations (AREA)
  • Tourism & Hospitality (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Agronomy & Crop Science (AREA)
  • Animal Husbandry (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Mining & Mineral Resources (AREA)
  • Primary Health Care (AREA)
  • Operations Research (AREA)
  • Data Mining & Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

Disclosed herein are a method and a system for automatically identifying a crop infestation. The method comprises receiving, by an identification system (107), a user selection (211) on at least one option from a plurality of options provided to the user (101), wherein each of the plurality of options relates to the type of the crop infestation (109). Further, the method comprises receiving one or more images of an affected crop and metadata associated with the one or more images of the affected crop. Furthermore, the method comprises verifying each of the one or more images using a pre-trained neural network for determining a usability of each of the one or more images. Moreover, the method comprises predicting an infected region in the one or more images according to the option selected by the user. Thereafter, the method comprises analyzing the infected region using the pre-trained neural network for identifying the crop infestation in the infected region.
PCT/IB2022/061368 2021-11-24 2022-11-24 Method and system for automatically identifying a crop infestation WO2023095041A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202121054111 2021-11-24
IN202121054111 2021-11-24

Publications (1)

Publication Number Publication Date
WO2023095041A1 (fr)

Family

ID=86538930

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/061368 WO2023095041A1 (fr) Method and system for automatically identifying a crop infestation

Country Status (1)

Country Link
WO (1) WO2023095041A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210256256A1 (en) * 2017-03-02 2021-08-19 Farmwave, Llc Automated diagnosis and treatment of crop infestations
US20190066234A1 (en) * 2017-08-28 2019-02-28 The Climate Corporation Crop disease recognition and yield estimation
US20190228224A1 (en) * 2018-01-23 2019-07-25 X Development Llc Crop type classification in images
US20190362146A1 (en) * 2018-05-24 2019-11-28 Blue River Technology Inc. Semantic Segmentation to Identify and Treat Plants in a Field and Verify the Plant Treatments
US20200273172A1 (en) * 2019-02-27 2020-08-27 International Business Machines Corporation Crop grading via deep learning
WO2020234296A1 (fr) 2019-05-20 2020-11-26 Basf Agro Trademarks Gmbh Method for plantation treatment based on image recognition
WO2021181371A1 (fr) 2020-03-11 2021-09-16 Viewnetic Ltd. Systems and methods for monitoring plants in plant growing areas

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BOUZERDOUM A., HAVSTAD A., BEGHDADI A.: "Image quality assessment using a neural network approach", SIGNAL PROCESSING AND INFORMATION TECHNOLOGY, 2004. PROCEEDINGS OF THE FOURTH IEEE INTERNATIONAL SYMPOSIUM ON ROME, ITALY DEC. 18-21, 2004, PISCATAWAY, NJ, USA,IEEE, 18 December 2004 (2004-12-18) - 21 December 2004 (2004-12-21), pages 330 - 333, XP010800521, ISBN: 978-0-7803-8689-1, DOI: 10.1109/ISSPIT.2004.1433751 *

Similar Documents

Publication Publication Date Title
US11120552B2 (en) Crop grading via deep learning
EP3355201B1 (fr) Procédé et système permettant d'établir une relation entre une pluralité d'éléments d'interface utilisateur
US9858175B1 (en) Method and system for generation a valid set of test configurations for test scenarios
US10586188B2 (en) Method and system for dynamic recommendation of experts for resolving queries
EP3924876A1 (fr) Localisation automatisée non supervisée d'événements sensibles au contexte dans des cultures et calcul de leur étendue
US20190380325A1 (en) Manage and control pests infestation using machine learning in conjunction with automated devices
US20130132327A1 (en) Self configuring knowledge base representation
Corcoran et al. Evaluating new technology for biodiversity monitoring: Are drone surveys biased?
Schwert et al. A comparison of support vector machines and manual change detection for land-cover map updating in Massachusetts, USA
Wang et al. Smartphone application-enabled apple fruit surface temperature monitoring tool for in-field and real-time sunburn susceptibility prediction
US20200409827A1 (en) Method and system for automating generation of test data and associated configuration data for testing
US11178856B2 (en) Cognitive hive architecture
US20190259472A1 (en) Methods and systems of tracking disease carrying arthropods
WO2023095041A1 (fr) Method and system for automatically identifying a crop infestation
WO2021089785A1 (fr) Fonctionnalité de repérage pour émergence
US20240020971A1 (en) System and method for identifying weeds
Sibanda et al. Mobile apps utilising AI for plant disease identification: A systematic review of user reviews
CN113240340B (zh) 基于模糊分类的大豆种植区分析方法、装置、设备和介质
US20220237481A1 (en) Visual recognition to evaluate and predict pollination
US11823366B2 (en) System and method for anomaly detection using images
WO2022180372A1 (fr) Systèmes et procédés destinés à l'agriculture intelligente
Boehm et al. Floral phenology of an Andean bellflower and pollination by buff‐tailed sicklebill hummingbird
US11087091B2 (en) Method and system for providing contextual responses to user interaction
US11651248B2 (en) Farm data annotation and bias detection
Abdulla et al. Deep learning and IoT for Monitoring Tomato Plant.

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22898058

Country of ref document: EP

Kind code of ref document: A1