WO2018101848A1 - Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation - Google Patents

Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation

Info

Publication number
WO2018101848A1
Authority
WO
WIPO (PCT)
Prior art keywords
plant
data
sensor
module
controller
Prior art date
Application number
PCT/PT2016/050027
Other languages
French (fr)
Inventor
Gonçalo DE ABREU SILVÉRIO CABRITA
João Igor PIÇARRA MONTEIRO
Eduardo José DE JESUS ESTEVES
Liliana Raquel SIMÕES MARQUES
Sabrina ALMEIDA DE CARVALHO
Bruno Duarte GOUVEIA
Rui Manuel CASTILLO LEÃO
Original Assignee
Coolfarm S.A.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coolfarm S.A. filed Critical Coolfarm S.A.
Priority to PCT/PT2016/050027 priority Critical patent/WO2018101848A1/en
Publication of WO2018101848A1 publication Critical patent/WO2018101848A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G - HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G 25/00 - Watering gardens, fields, sports grounds or the like
    • A01G 25/16 - Control of watering
    • A01G 25/167 - Control by humidity of the soil itself or of devices simulating soil or of the atmosphere; Soil humidity sensors
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06Q - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 - Administration; Management
    • G06Q 10/04 - Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"

Definitions

  • This application relates to a predictive dynamic cloud based system for environmental sensing and actuation and the respective method of operation.
  • NDVI Normalized Difference Vegetation Index
  • PAR photosynthetically active radiation
  • healthy green vegetation reflects most of the near-infrared light since the energy at these wavelengths, usually longer than 700 nanometers, is not enough to synthesize organic molecules. Absorbing near-infrared light would only overheat the plant and damage its tissues. Plant diseases destroy plant cell structure and cause overheating, so diseased/stressed plants often show lowered near-infrared reflection.
  • NDVI uses the visible and near-infrared bands of the electromagnetic spectrum. The bigger the difference between the near-infrared (NIR) and the red reflectance, the more healthy green vegetation is present.
  • NIR near-infrared
  • NDVI values are represented by a numerical value ranging from -1 to 1. In practice extreme negative values represent water, values around zero represent bare soil and values over 0.6 represent healthy green vegetation.
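The index computation and the interpretation thresholds described above can be sketched as follows; the function names and the exact class boundaries chosen here are illustrative:

```python
def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index for one pixel.

    Reflectance values are expected in [0, 1]; the result lies in [-1, 1].
    """
    if nir + red == 0:
        return 0.0  # undefined ratio; treat as bare-soil-like
    return (nir - red) / (nir + red)

def classify(value: float) -> str:
    """Map an NDVI value to the coarse classes described above."""
    if value < -0.5:
        return "water"
    if value > 0.6:
        return "healthy green vegetation"
    return "bare soil or sparse vegetation"

# Healthy leaf: high NIR reflectance, strong red absorption.
print(classify(ndvi(0.50, 0.08)))  # healthy green vegetation
```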
  • University Sichuan Agricultural et al. (CN 102524024, issued February 16th 2012) developed a computer vision-based system for crop irrigation. Their system uses computer vision to measure plant water stress and thus control the watering cycles. The decision on when to water the plants is made by means of a plant growth model, powered by a genetic neural network.
  • University Sichuan Agricultural et al. (CN 102550374, issued March 18th 2012) developed a similar system that also includes a soil moisture sensor and an air temperature and humidity sensor to complement the computer vision data in the decision process.
  • a predictive dynamic cloud based system for environmental sensing and actuation comprising:
  • each control box comprises a sensor/actuator hardware drivers' module, a controllers' module, an artificial intelligence layer module, and a built-in wired and/or wireless communication module, wherein all modules and respective data are managed and integrated by control software running on the processor unit of said control box;
  • At least one plant sensor module applied on the indoor farm to be monitored, comprising a processor unit with communication capabilities, a camera sensor and a sensor architecture;
  • a remote cloud based system configured to provide data storage, data analysis and an interface between the user and the system, wherein the data stored relates to plant growth models, resource spending data and weather forecast data.
  • control box is configured to execute additional software modules, such as data logging modules and notification modules.
  • the sensor/actuator module is connected to the control box through the communication module by wired or wireless protocols.
  • the controllers' module is configured to control all the environmental variables, sensed by the sensor architecture applied in the farm, by iteratively adjusting the respective controllers' gains, using machine learning techniques to generate a model of the indoor farm to be monitored, predicting its future behavior.
  • the controllers' module comprises a water pH controller, water electrical conductivity controller, water dissolved oxygen controller, water oxidation-reduction potential controller, water temperature controller, watering cycle controller, air temperature controller, humidity controller and CO2 controller.
  • each controller is configured as a neural network, with the respective gains adjusted using a least squares algorithm based on the machine learning predictions.
  • the artificial intelligence layer module is configured to establish the connection between the indoor farm controlled and the remote cloud based system.
  • the artificial intelligence layer module is configured to run a heuristic optimization algorithm to actuate on the controllers' module, wherein said algorithm integrates data from the models stored on the cloud, such as plant growth models, resource spending data, weather forecast data, user input and sensor data.
  • the built-in communication module is configured to establish a wired and/or wireless data communication process over a local or remote network.
  • the camera sensor of the plant sensor module comprises a single color camera composed of a lens with a 460nm and 800nm pass-through filter.
  • the processor unit of the plant sensor is configured to calculate at least plant size, plant growth rate and NDVI index.
  • the remote cloud based system comprises a database storing sensor data and processing capabilities configured to generate plant growth models based on machine learning techniques applied to said sensor data.
  • the remote cloud based system is configured to distribute plant growth models to the connected control boxes.
  • Plant sensor acquires plant and environment data from both the camera sensor and the sensor architecture mounted in the indoor farm to be monitored;
  • Actuation of the actuators module of the control box in accordance with an optimization algorithm, executed on the artificial intelligence layer of the control box, in order to generate reference points for the controllers, which in turn send commands to the actuators installed on the farm, based on the plant models developed.
  • the present application intends to solve the problem of controlling the growth of plants or groups of plants in a controllable indoor environment (from now on referred to as indoor farm), for instance in a greenhouse, applicable to different kinds of cultivation such as, but not limited to, aeroponics, hydroponics, aquaponics, bioponics or even soil, with minimal human interference, being designed for domestic or industrial purposes.
  • What is disclosed is a predictive dynamic cloud based system designed to nurture plants in a controlled environment, controlling in real-time several environmental variables that have a direct impact on plant growth, including feedback from the plant itself that is used in the decision process. Feedback from the plant is acquired using a novel plant sensor, also described in this document.
  • the system combines machine learning techniques and image processing algorithms, aiming at growing crops in the most efficient way. All data is stored in a database on the cloud and processed using machine learning techniques, and can be accessed through an intuitive interface, adaptable to both web and mobile platforms, sustained on the concept of controlling, in a precise way, the environmental variables that contribute to plant growth, which allows them to develop in the most healthy, efficient and effective way.
  • the system herein disclosed is divided into three main parts, the control box and the plant sensor, that are to be installed on the indoor farm, and the remote cloud based system which provides data storage, data analysis, and an easy yet powerful interface between the user and the complete system itself.
  • the plant sensor module consists of a modified computer vision camera paired with a set of algorithms that allow it to segment plants, and extract plant data. Environmental data is also acquired by using a sensor architecture that is mounted in the field, allowing the assessment of different environmental parameters.
  • the plant sensor module connects to the control box using any ethernet based protocol.
  • the control box comprises a processor unit which allows the system's integration within existing legacy systems already in place, as well as allowing the connection to the plant sensor module or other new sensor and actuator technologies.
  • a built-in wired and/or wireless communication module integrated within the processor unit architecture, allowing the establishment of a wired and/or wireless data communication process over a local or remote network, such as the internet, and the carrying out of a remote control and/or monitoring process in the indoor farm.
  • the communication module provides data connections supporting several protocols in parallel.
  • All the intelligent control performed by the control box is ensured by a software architecture able to manage all its different modules, divided into the sensor and actuator hardware drivers' module, the controllers' module and the artificial intelligence (AI) layer module, which manages the entire farm.
  • sensor and actuator hardware drivers are meant to provide hardware abstraction.
  • control box can also run additional software modules, such as data logging modules and notification modules.
  • the controllers' module includes software controllers, which are loaded into the control box processor unit from the cloud, for all the environmental variables that the control box is capable of managing (variables measured by the sensor architecture of the plant sensor installed on the farm, e.g. water pH, air temperature, etc.), being able to cope with the dynamics of time-changing models.
  • Said controller module does not need to be previously tuned by the user, since it tunes itself thanks to its capability to learn the model of the process to be controlled, and in that way it is able to adjust to different process dynamics over time.
  • a machine learning technique that runs on the control box is fed with both the sensor data from the sensor architecture installed on the farm, which provides the output of the process, and the actuator commands, which provide the input of the same process.
  • This information is used to create a mathematical model of said process, allowing the prediction of its future behavior. The desired trajectory of the process is then used to minimize the errors of the controllers, either implicitly or explicitly, by iteratively adjusting the controllers' gains (e.g. the PID gains of a traditional PID controller).
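A minimal sketch of this identify-then-tune loop, using ordinary least squares on a first-order process model rather than the LSSVM described later in this document; the process, the data, and the deadbeat-style tuning rule below are all illustrative:

```python
def fit_first_order(u, y):
    """Fit y[k+1] ~ a*y[k] + b*u[k] by ordinary least squares.

    Solves the 2x2 normal equations directly (no external libraries).
    """
    syy = syu = suu = sy1y = sy1u = 0.0
    for k in range(len(y) - 1):
        syy += y[k] * y[k]
        syu += y[k] * u[k]
        suu += u[k] * u[k]
        sy1y += y[k + 1] * y[k]
        sy1u += y[k + 1] * u[k]
    det = syy * suu - syu * syu
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

# Simulated process: y[k+1] = 0.9*y[k] + 0.2*u[k]
# (e.g. air temperature response to a heater command).
u = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0]
y = [0.0]
for k in range(len(u) - 1):
    y.append(0.9 * y[k] + 0.2 * u[k])

a, b = fit_first_order(u, y)

# Illustrative gain update from the identified model: choosing
# Kp = a / b drives the one-step-ahead prediction to the set point.
Kp = a / b
```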
  • the AI layer module provides the connection between the indoor farm being controlled and the remote cloud based system.
  • This layer optimizes the indoor farm control by making projections of how the plants grow based on the models provided by the cloud based system, and uses that information in conjunction with the user input, resource spending models and weather forecast (when applicable) to estimate the best set points for the indoor farm.
  • the cloud-based system generates mathematical models of how the plants grow in response to the environmental variables.
  • the AI layer uses those models to create several possible plant growth scenarios, based on a set of different environmental variables, constrained to what is achievable inside the farm in study. This is achieved using a heuristic optimization algorithm that runs on the control box.
  • the resulting set of parameters is then forwarded to the controllers which in turn send commands to the actuators installed on the farm.
  • the plant sensor module comprises a lens, a filter, a camera sensor, and a processor with communication capabilities.
  • Light reflected off the plants is captured by the lens through the filter (which in this case only allows the passing of blue, 460nm, and near infrared (NIR), 800nm, light), ending on the camera sensor.
  • the camera sensor creates an image (from now on called the captured image) and sends this image to the processor where it is processed for plant data extraction. All software and algorithms described next run inside the plant sensor processor.
  • NDVI Normalized Difference Vegetation Index
  • a first stage of processing algorithms are applied to the captured image to calculate the NDVI index.
  • the NDVI index as seen by the camera might not be correct, thus a regression technique is used to estimate the real NDVI image.
  • several objects in the image can generate NDVI indexes similar to those of plants.
  • a second stage of image processing is performed to obtain a second image that is used to segment the plants, allowing the system to determine what is in fact a plant, prior to using the NDVI index on the selected areas. This is accomplished by converting the initial image into the HSV color space.
  • a thresholding technique is then used to determine what is plant, and what is not.
  • the resulting image can be easily used for plant segmentation using blob detection.
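HSV conversion aside, the thresholding and blob-detection steps above can be sketched in pure Python (Otsu's method on a single channel, then 4-connected flood fill); the toy image and its values are illustrative:

```python
def otsu_threshold(pixels):
    """Otsu's method on 8-bit intensities: pick the threshold that
    maximizes the between-class variance."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    sum_b = 0.0
    w_b = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_b += hist[t]
        if w_b == 0 or w_b == total:
            continue
        sum_b += t * hist[t]
        m_b = sum_b / w_b                        # background mean
        m_f = (sum_all - sum_b) / (total - w_b)  # foreground mean
        var = w_b * (total - w_b) * (m_b - m_f) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t

def blobs(mask, w, h):
    """4-connected components of a binary mask (iterative flood fill)."""
    seen = [False] * (w * h)
    out = []
    for start in range(w * h):
        if mask[start] and not seen[start]:
            stack, blob = [start], []
            seen[start] = True
            while stack:
                i = stack.pop()
                blob.append(i)
                x, y = i % w, i // w
                for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                    if 0 <= nx < w and 0 <= ny < h:
                        j = ny * w + nx
                        if mask[j] and not seen[j]:
                            seen[j] = True
                            stack.append(j)
            out.append(blob)
    return out

# Toy 6x4 single-channel image: two bright plant regions on a dark background.
w, h = 6, 4
img = [10, 10, 200, 210, 10, 10,
       10, 10, 220, 205, 10, 10,
       10, 10, 10, 10, 190, 10,
       10, 10, 10, 10, 195, 200]
t = otsu_threshold(img)
mask = [p > t for p in img]
plants = blobs(mask, w, h)
```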
  • a tracking algorithm is used to track the plants over time, as they grow.
  • the final stage of the NDVI-based plant sensor is translating the data gathered by the sensor into information that the grower can use.
  • the data extracted by the plant sensor includes, but is not limited to:
  • LAI Leaf Area Index
  • NDVI - Average NDVI index of each plant
  • the cloud based system runs off site and is in charge of storing plant data and using that same data for modelling plant growth. This is accomplished using data not only from one site, but from all sites using the system.
  • a machine learning technique is used to model the plant growth functions. These functions model how the environmental variables result in plant growth and health changes over time.
  • FIG 1 illustrates an overview of the complete proposed system.
  • Both the plant and the environment (3) provide data to the plant sensor and environmental sensors installed in the farm (4) that in turn communicates with the control box (1) .
  • All data (6) acquired by the plant sensor is sent to the plant database (7) on the cloud (5), and can be viewed by the user (10) using the provided interface (9) .
  • this data is also used by the machine learning algorithms to generate plant models (8), which together with the user input (9) provide the necessary information for the control box (1) to determine how to better control the environment using the available set of actuators (2) .
  • Figure 2 illustrates the decision making process inside the AI layer on the control box (12) .
  • Sensor data (13) from both the plant and the environment, user input (14) and the plant models (15), provided by the cloud platform are processed using an optimization algorithm (16) to generate reference points for the controllers (17), which in turn send commands to the actuators (18) installed on the farm.
  • Figure 3 illustrates the plant sensor. It is composed of a lens (19), a filter that allows only certain frequencies to pass (20), a camera sensor, CMOS or CCD (21), and a computer (22) with communication capabilities (23).
  • FIG. 4 illustrates the working process of the plant sensor.
  • a source of light (24) emits radiation that is reflected off the plant (25) and goes through a filter (26) that allows only certain frequencies to pass. This radiation hits the 3-channel camera sensor and results in a (NIR,G,B) image (27), the captured image.
  • This raw image is then processed into an HSV image (28) and an NDVI single channel image (29), the latter using the (NIR-B)/(NIR+B) function.
  • the HSV image (28) goes through a dynamic thresholding technique (30) to calculate a binary mask (32) of what is a plant in the image.
  • the NDVI image (29) goes through a regression algorithm (31) to provide a calibrated, or real reading of the NDVI (33) of the plant. Both images are used to extract the final data that is shown to the user (34), the LAI (Leaf Area Index), growth rate, and NDVI.
  • the control box is an industrial grade computer. It is able to connect to the internet to communicate with the cloud based system. This allows it to store data in and request data from the cloud, including updated plant models, user settings, etc.
  • the control box is equipped with an Ethernet connection and is able to connect to several RS-485 networks supporting several protocols in parallel.
  • Wi-Fi based sensors can be interfaced to the system by means of an access point.
  • Analog or digital sensors that do not have an Ethernet or RS-485 interface can be connected to the system by means of an adapter or I/O module that supports such protocols.
  • the communication protocols supported by the communication module of the control box include, but are not limited to, I2C, SPI, RS-232, USB, RS-485 based protocols, Ethernet/Wi-Fi based protocols, and other wireless protocols by means of a gateway. Not all protocols are exposed for all users.
  • the predictive controllers that run on the control box are able to learn the model of the process that they are controlling, and are therefore able to adjust to different system dynamics over time.
  • Generating a model of the indoor farm is accomplished using an LSSVM (Least Squares Support Vector Machine) with a radial basis function (RBF) kernel.
  • the model generated by the SVM is used to predict the behavior of the indoor farm on the horizon and its desired trajectory is used to minimize the errors of the controllers implicitly by iteratively adjusting the controllers' gains.
  • the controller itself is a neural network (NN) .
  • the control box optimization module uses a genetic algorithm that optimizes the indoor farm by making projections of how the plants grow based on the models provided by the cloud based system. Optimization of the indoor farm operations is performed by making a projection into the future in stages: (1) Generate a set of recipes (sets of control parameters) that is achievable by the indoor farm according to the weather forecast (e.g. if it is too cold outside it might be impossible to raise the temperature inside past a certain value); each controller stores a mathematical model of how each part of the system behaves. (2) Evaluate the fitness of each recipe by applying it to the growth model of the plant (fetched from the cloud). (3) Calculate the amount of resources spent by applying each recipe. (4) The selection criterion is obtained by maximizing plant growth while minimizing resources.
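A minimal sketch of such a recipe-optimizing genetic algorithm; the growth model, the resource-spending model, the parameter bounds, and all function names below are toy stand-ins, not the models actually fetched from the cloud:

```python
import random

random.seed(7)

# A "recipe" is a set of control parameters, here just (air_temp, co2).
BOUNDS = {"air_temp": (15.0, 24.0),   # upper bound limited by e.g. the weather
          "co2": (400.0, 1200.0)}

def growth(recipe):
    """Stand-in for the cloud-provided plant growth model:
    growth peaks at 22 degC and 900 ppm CO2."""
    t, c = recipe["air_temp"], recipe["co2"]
    return -((t - 22.0) ** 2) / 10.0 - ((c - 900.0) ** 2) / 1e5

def cost(recipe):
    """Stand-in resource-spending model: heating and CO2 injection both cost."""
    return 0.05 * recipe["air_temp"] + 0.0005 * recipe["co2"]

def fitness(recipe):
    # Selection criterion: maximize growth while minimizing resources.
    return growth(recipe) - cost(recipe)

def random_recipe():
    return {k: random.uniform(*BOUNDS[k]) for k in BOUNDS}

def mutate(recipe):
    child = dict(recipe)
    k = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[k]
    child[k] = min(hi, max(lo, child[k] + random.gauss(0, (hi - lo) * 0.1)))
    return child

population = [random_recipe() for _ in range(30)]
for _ in range(40):                       # generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]             # truncation selection (elitist)
    population = parents + [mutate(random.choice(parents)) for _ in range(20)]

best = max(population, key=fitness)
```

Because the parents survive each generation, the best fitness never degrades; the result converges towards the temperature/CO2 trade-off between growth and resource cost.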
  • the optimization module is also in charge of translating user input into actual hardware controls, providing an abstraction layer to the hardware inside the farm. This is accomplished by merging several user inputs when necessary, and performing automatic scheduling for tasks such as watering.
  • the plant sensor is built using a single color camera with the IR filter on the lens replaced by a blue filter (460nm and 800nm pass-through filter) .
  • This technique allows the use of a single camera instead of a two-camera solution for calculating the NDVI index, thus lowering the cost.
  • the resulting image does not contain the regular RGB channels; the R channel is swapped for a NIR channel, thus resulting in a (NIR,G,B) image. This image is converted into the HSV color space.
  • the resulting image goes through an Otsu thresholding algorithm for calculating the area inside the image that contains the actual plants.
  • the NDVI index is estimated using the NDVI function.
  • the data extracted by the system includes, but is not limited to:
  • LAI Leaf Area Index
  • the position of each plant is estimated by determining the center of mass of each blob (plant) extracted by a blob extraction algorithm.
  • the plant LAI is estimated by the number of pixels of each blob (plant) .
  • the growth rate is the derivative of the plant LAI over time.
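The three metrics above (center of mass, LAI as a pixel count, growth rate as the discrete derivative of the LAI) can be sketched as follows; the blob representation as flat pixel indices is an assumption:

```python
def centroid(blob, width):
    """Center of mass of a blob given as a list of flat pixel indices."""
    xs = [i % width for i in blob]
    ys = [i // width for i in blob]
    return sum(xs) / len(blob), sum(ys) / len(blob)

def lai(blob):
    """LAI proxy: number of pixels covered by the plant."""
    return len(blob)

def growth_rate(lai_series, dt_days=1.0):
    """Discrete derivative of the LAI over time."""
    return [(b - a) / dt_days for a, b in zip(lai_series, lai_series[1:])]

# A 2x2-pixel plant blob on an 8-pixel-wide image, plus a 4-day LAI history.
blob = [10, 11, 18, 19]                 # pixels (2,1), (3,1), (2,2), (3,2)
print(centroid(blob, 8))                # (2.5, 1.5)
print(growth_rate([40, 46, 55, 61]))    # [6.0, 9.0, 6.0]
```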
  • the NDVI for each pixel is calculated according to the following formula: NDVI = (NIR - B) / (NIR + B).
  • the Extended NDVI for each pixel is calculated using the following formula: ENDVI = ((NIR + G) - 2B) / ((NIR + G) + 2B).
  • the average NDVI and ENDVI values are estimated for each plant by calculating the average value for all pixels that are masked by each blob (plant) .
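A sketch of the per-pixel indices and the per-blob averaging. The NDVI form follows the (NIR-B)/(NIR+B) function used by this single-camera sensor; the ENDVI expression shown is a common formulation for NGB cameras and is an assumption here, as are all names:

```python
def ndvi_pixel(nir, b):
    """NDVI from the single-camera (NIR, G, B) image: (NIR - B) / (NIR + B)."""
    return (nir - b) / (nir + b) if nir + b else 0.0

def endvi_pixel(nir, g, b):
    """Assumed Extended NDVI for NGB cameras:
    ((NIR + G) - 2B) / ((NIR + G) + 2B)."""
    s = nir + g + 2 * b
    return ((nir + g) - 2 * b) / s if s else 0.0

def average_index(image, blob, index):
    """Average an index over the pixels masked by one blob (plant)."""
    values = [index(*image[i]) for i in blob]
    return sum(values) / len(values)

# Tiny (NIR, G, B) image as a flat list of tuples; one 3-pixel plant blob.
image = [(0.1, 0.1, 0.1)] * 6 + [(0.6, 0.3, 0.1), (0.5, 0.2, 0.1), (0.7, 0.3, 0.1)]
blob = [6, 7, 8]
avg_ndvi = average_index(image, blob, lambda nir, g, b: ndvi_pixel(nir, b))
avg_endvi = average_index(image, blob, endvi_pixel)
```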
  • At the core of the cloud based system there is a non-structured database for holding sensor data and running big data and machine learning algorithms. Deep learning algorithms are used to process this data.
  • the deep learning algorithms connect directly to the non-structured database and are able to serve control boxes all over the world, distributing plant growth models.
  • the deep learning framework is used to train a neural network.
  • the deep neural network is a DBN (Deep Belief Network) connected to a feedforward artificial neural network model MLP (Multilayer Perceptron) .
  • MLP Multilayer Perceptron
  • the DBN is at the interface with the environment and receives the input signal. The input signal passes through each layer until it reaches the hidden layer in the MLP. From the MLP's hidden layer the signal reaches the output layer.
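The signal path described above can be sketched as a stack of fully connected layers. The weights below are random placeholders, not the trained DBN/MLP, and the layer sizes (other than the 135-feature input) are illustrative:

```python
import math
import random

random.seed(0)

def layer(inputs, weights, biases):
    """One fully connected layer with a sigmoid activation."""
    return [1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(row, inputs)) + b)))
            for row, b in zip(weights, biases)]

def forward(x, layers):
    """Pass the input signal through each layer in turn."""
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

def random_layer(n_in, n_out):
    """Placeholder weights; a real system would load trained parameters."""
    return ([[random.gauss(0, 1) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

# 135 input features -> stacked "DBN" layers -> MLP hidden layer -> output.
net = [random_layer(135, 64), random_layer(64, 32),   # DBN part (placeholders)
       random_layer(32, 16),                          # MLP hidden layer
       random_layer(16, 1)]                           # output signal
x = [0.5] * 135
y = forward(x, net)
```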
  • the DBN was chosen as it has been used to
  • the input of the control model is a matrix with 135 variables or features. These features are the t-1 through t-15 daily environmental parameters plus the resulting plant growth parameters.
  • the parameters vector contains the following values: water pH, water EC, water DO, water temperature, water volume, air temperature, air CO2 level, solar radiation, and plant health index (estimated from the plant size and plant NDVI index).
  • the choice of a 15-day window is due to the fact that plants have a certain plasticity that varies from plant to plant. Furthermore, several plant diseases can take as long as 15 days to incubate. The time window can be adjusted for different plants. Also, several models can be generated with added parameters for farms equipped with more sensors (e.g. concentration of individual nutrients, NPK + micronutrients).
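A sketch of how the 135-feature input row (15 days x 9 parameters) might be assembled; the parameter keys and the record layout are assumptions for illustration:

```python
PARAMS = ["water_ph", "water_ec", "water_do", "water_temp", "water_volume",
          "air_temp", "air_co2", "solar_radiation", "plant_health_index"]
WINDOW = 15   # days t-1 through t-15; 15 x 9 = 135 features

def feature_vector(daily_records, t):
    """Flatten the last WINDOW days of parameter readings into one
    135-element input row for the control model.

    daily_records: list of dicts, one per day, keyed by PARAMS.
    t: index of the current day; uses days t-1 .. t-WINDOW.
    """
    if t < WINDOW:
        raise ValueError("need at least %d days of history" % WINDOW)
    row = []
    for back in range(1, WINDOW + 1):      # t-1 first, t-15 last
        day = daily_records[t - back]
        row.extend(day[p] for p in PARAMS)
    return row

# Synthetic 20-day history with placeholder readings (value = day index).
history = [{p: float(d) for p in PARAMS} for d in range(20)]
x = feature_vector(history, 16)
print(len(x))   # 135
```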
  • the training data can be denoted by (xk, yk), and the neural network will learn the model Fθ.


Abstract

The present application discloses a predictive dynamic cloud based system designed to nurture plants in a controlled environment, controlling in real-time several environmental variables that have a direct impact on plant growth. The system combines machine learning techniques and image processing algorithms, aiming at growing crops in the most efficient way, providing them what they need, when they need it. All data is managed in a plant database stored on the cloud and processed using machine learning techniques, and can be accessed through an intuitive interface, adaptable to both web and mobile platforms, sustained on the concept of controlling, in a precise way, the environmental variables that contribute to plant growth, which allows them to develop in the most healthy, efficient and effective way.

Description

DESCRIPTION
"PREDICTIVE DYNAMIC CLOUD BASED SYSTEM FOR ENVIRONMENTAL
SENSING AND ACTUATION"
Technical field
This application relates to a predictive dynamic cloud based system for environmental sensing and actuation and the respective method of operation.
Background art
In a context of the breakdown of the dominant food paradigm, with climate change, increasing droughts, soil depletion and disease, water pollution and chemical overloads, consumers and retailers are demanding healthier production methods that are able to meet customer demands while ensuring the health of the ecological systems that sustain a livable world. In the majority of cases crops, irrespective of their type, are monitored and controlled by human intervention, which makes the control of large farms difficult; the proposed system therefore provides, at the same time, an optimization in terms of plant growth and resource management.
The Normalized Difference Vegetation Index (NDVI) is a remote plant sensing tool commonly used to determine whether the target observed contains live green vegetation or not. Generally speaking, healthy green vegetation absorbs a wide range of the visible light spectrum, more specifically in the photosynthetically active radiation (PAR) spectral region, which it uses in the process of photosynthesis. On the other hand, healthy green vegetation reflects most of the near-infrared light, since the energy at these wavelengths, usually longer than 700 nanometers, is not enough to synthesize organic molecules. Absorbing near-infrared light would only overheat the plant and damage its tissues. Plant diseases destroy plant cell structure and cause overheating, so diseased/stressed plants often show lowered near-infrared reflection. The NDVI uses the visible and near-infrared bands of the electromagnetic spectrum. The bigger the difference between the near-infrared (NIR) and the red reflectance, the more healthy green vegetation is present. Theoretically, NDVI values are represented by a numerical value ranging from -1 to 1. In practice, extreme negative values represent water, values around zero represent bare soil, and values over 0.6 represent healthy green vegetation.
Traditional remote sensing systems are based on multispectral imaging cameras. In the specific case of NDVI, a pair of monochromatic cameras, one for NIR light and another for red, is usually used. An NDVI setup is usually expensive and does not present, on its own, a valuable tool for growers around the world. Plant sensing technologies need to be cost effective; otherwise, even though they exist, farmers will not adopt them. Computer vision-based techniques excel due to the fact that they are contactless, non-intrusive, and can usually be used on multiple plants at a time, providing a broader sample.
University Sichuan Agricultural et al. (CN 102524024, issued February 16th 2012) developed a computer vision-based system for crop irrigation. Their system uses computer vision to measure plant water stress and thus control the watering cycles. The decision on when to water the plants is made by means of a plant growth model, powered by a genetic neural network. In another iteration of their work, University Sichuan Agricultural et al. (CN 102550374, issued March 18th 2012) developed a similar system that also includes a soil moisture sensor and an air temperature and humidity sensor to complement the computer vision data in the decision process.
Martin Tommy J. et al. (WO 2010117944, issued April 6th 2009) propose a system for controlling crop irrigation based on several inputs of data, including crop characteristics such as leaf temperature, leaf wetness or leaf thickness. Data is transmitted to a server where it is analyzed in order to achieve optimal irrigation control. The user is also able to receive notifications regarding the irrigation choices made by the system.
Although these solutions incorporate plant data into the decision process, the main focus is on irrigation control (more oriented towards open field farming), lacking the integrated control needed to operate an indoor farm, from fertigation to environmental control. Furthermore, plant growth models should not be limited to how the plant reacts to irrigation, but should also cover other environmental variables such as water pH, air temperature, CO2 concentration, etc.
Summary
Disclosed is a predictive dynamic cloud based system for environmental sensing and actuation comprising:
A control box, wherein each control box comprises a sensor/actuator hardware drivers' module, a controllers' module, an artificial intelligence layer module, and a built-in wired and/or wireless communication module, wherein all modules and respective data are managed and integrated by control software running on the processor unit of said control box;
At least one plant sensor module, applied on the indoor farm to be monitored, comprising a processor unit with communication capabilities, a camera sensor and a sensor architecture;
A remote cloud based system configured to provide data storage, data analysis and an interface between the user and the system, wherein the data stored relates to plant growth models, resource spending data and weather forecast data.
In one embodiment of the system now disclosed the control box is configured to execute additional software modules, such as data logging modules and notification modules.
In one embodiment of the system now disclosed the sensor/actuator module is connected to the control box through the communication module by wired or wireless protocols.
In one embodiment of the system now disclosed the controllers' module is configured to control all the environmental variables, sensed by the sensor architecture applied in the farm, by iteratively adjusting the respective controllers' gains, using machine learning techniques to generate a model of the indoor farm to be monitored, predicting its future behavior.
In one embodiment of the system now disclosed the controllers' module comprises a water pH controller, water electrical conductivity controller, water dissolved oxygen controller, water oxidation-reduction potential controller, water temperature controller, watering cycle controller, air temperature controller, humidity controller and CO2 controller.
Yet in another embodiment of the system now disclosed each controller is configured as a neural network, with the respective gains adjusted using a least squares algorithm based on the machine learning predictions.
In one embodiment of the system now disclosed the artificial intelligence layer module is configured to establish the connection between the indoor farm controlled and the remote cloud based system.
In one embodiment of the system now disclosed the artificial intelligence layer module is configured to run a heuristic optimization algorithm to actuate on the controllers' module, wherein said algorithm integrates data from the models stored on the cloud, such as plant growth models, resource spending data, weather forecast data, user input and sensor data.
In one embodiment of the system now disclosed the built-in communication module is configured to establish a wired and/or wireless data communication process over a local or remote network.
In one embodiment of the system now disclosed the camera sensor of the plant sensor module comprises a single color camera composed of a lens with a 460nm and 800nm pass-through filter.
In one embodiment of the system now disclosed the processor unit of the plant sensor is configured to calculate at least plant size, plant growth rate and NDVI index.
In one embodiment of the system now disclosed the remote cloud based system comprises a database storing sensor data and processing capabilities configured to generate plant growth models based on machine learning techniques applied to said sensor data.
In one embodiment of the system now disclosed the remote cloud based system is configured to distribute plant growth models to the connected control boxes.
Also disclosed is a method of operation of the predictive dynamic cloud based system, characterized by the following steps:
Plant sensor acquires plant and environment data from both the camera sensor and the sensor architecture mounted in the indoor farm to be monitored;
Transmission of the data processed by the plant sensor to the control box and to the plant database on the cloud, where it can be viewed by the user;
Generation of plant models based on the execution on the cloud of machine learning algorithms on the data provided by the plant sensor and based on the user input;
Actuation on the actuators module of the control box in accordance with an optimization algorithm, executed on the artificial intelligence layer of the control box, in order to generate reference points for the controllers, which in turn send commands to the actuators installed on the farm in accordance with the plant models developed.
Disclosure
The present application intends to solve the problem of controlling the growth of a plant or group of plants in a controllable indoor environment (from now on referred to as an indoor farm), for instance in a greenhouse, applicable to different kinds of cultivation such as, but not limited to, aeroponics, hydroponics, aquaponics, bioponics or even soil, with minimal human interference, being designed for domestic or industrial purposes.
What is disclosed is a predictive dynamic cloud based system designed to nurture plants in a controlled environment, controlling in real-time several environmental variables that have a direct impact on plant growth, including feedback from the plant itself that is used in the decision process. Feedback from the plant is acquired using a novel plant sensor, also described in this document.
The system combines machine learning techniques and image processing algorithms, aiming at growing crops in the most efficient way. All data is stored in a database on the cloud and processed using machine learning techniques, and can be accessed through an intuitive interface, adaptable to both web and mobile platforms, sustained on the concept of controlling, in a precise way, the environmental variables that contribute to plant growth, which allows them to develop in the most healthy, efficient and effective way.
Its application makes farming control easier and more intuitive for the user, leading to an optimization of plant growth parameters for more and better crops and less waste of resources such as water, energy, nutrients and human labor.
The system herein disclosed is divided into three main parts: the control box and the plant sensor, which are to be installed on the indoor farm, and the remote cloud based system, which provides data storage, data analysis, and an easy yet powerful interface between the user and the complete system itself. The plant sensor module consists of a modified computer vision camera paired with a set of algorithms that allow it to segment plants and extract plant data. Environmental data is also acquired by using a sensor architecture that is mounted in the field, allowing the assessment of different environmental parameters. The plant sensor module connects to the control box using any Ethernet-based protocol.
With respect to the control box, it comprises a processor unit which allows the system's integration with existing legacy systems already in place, as well as allowing the connection to the plant sensor module or other new sensor and actuator technologies.
This flexibility in integration and connection of the system to the existing modules is ensured by a built-in wired and/or wireless communication module, integrated within the processor unit architecture, allowing it to establish a wired and/or wireless data communication process over a local or remote network, such as the internet, and to carry out a remote control and/or monitoring process in the indoor farm. The communication module provides data connections supporting several protocols in parallel.
All the intelligent control performed by the control box is ensured due to a software architecture which is able to manage all its different modules, being divided into sensor and actuator hardware drivers' module, controllers' module and the artificial intelligence (AI) layer module which manages the entire farm. The sensor and actuator hardware drivers are meant to provide hardware abstraction. Additionally, the control box can also run additional software modules, such as data logging modules and notification modules.
The controllers' module includes software controllers, which are loaded into the control box processor unit from the cloud, for all the environmental variables that the control box is capable of managing (variables measured by the sensor architecture of the plant sensor installed on the farm, e.g. water pH, air temperature, etc.), being able to cope with the dynamics of time-changing models. Said controller module does not need to be previously tuned by the user, since it tunes itself thanks to its capability to learn the model of the process to be controlled, and in that way is able to adjust to different process dynamics over time. In order to generate the model of any given process, a machine learning technique that runs on the control box is fed with both the sensor data from the sensor architecture installed on the farm, which provides the output of the process, and the actuator commands, which provide the input of the same process. This information, over time, is used to create a mathematical model of the said process, allowing the prediction of its future behavior; the desired trajectory of the process is then used to minimize the errors of the controllers, either implicitly or explicitly, by iteratively adjusting the controllers' gains (e.g. the PID gains of a traditional PID controller).
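For illustration only (not part of the claimed subject matter), the learn-then-adjust loop described above can be reduced to its simplest form: fit a first-order process model from logged sensor outputs and actuator commands by ordinary least squares, then compute the command that drives the process toward a set point. The function names and the toy dynamics are assumptions for this sketch; the full system uses richer models and recursive updates.

```python
# Minimal sketch: learn y[t+1] = a*y[t] + b*u[t] from sensor/actuator
# logs by least squares, then derive a one-step-ahead control command.

def fit_first_order_model(y, u):
    """Least-squares estimate of (a, b) from output log y and input log u."""
    n = len(u)
    # Normal equations for the two parameters, solved in closed form.
    syy = sum(y[t] * y[t] for t in range(n))
    suu = sum(u[t] * u[t] for t in range(n))
    syu = sum(y[t] * u[t] for t in range(n))
    sy1y = sum(y[t + 1] * y[t] for t in range(n))
    sy1u = sum(y[t + 1] * u[t] for t in range(n))
    det = syy * suu - syu * syu
    a = (sy1y * suu - sy1u * syu) / det
    b = (sy1u * syy - sy1y * syu) / det
    return a, b

def control_command(a, b, y_now, set_point):
    """Choose u so that the predicted next output a*y + b*u hits the set point."""
    return (set_point - a * y_now) / b

# Synthetic log of a process with true dynamics a = 0.9, b = 0.5
# (e.g. a water pH response to a dosing pump command).
u = [1.0, 0.0, 2.0, 1.0, 0.5, 1.5]
y = [0.0]
for t in range(len(u)):
    y.append(0.9 * y[t] + 0.5 * u[t])

a, b = fit_first_order_model(y, u)
print(round(a, 3), round(b, 3))  # → 0.9 0.5 (recovers the true dynamics)
```

Because the log is generated by the model family being fitted, least squares recovers the exact gains; on real farm data the estimates track the process as its dynamics drift.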
The AI layer module provides the connection between the indoor farm being controlled and the remote cloud based system. This layer optimizes the indoor farm control by making projections of how the plants grow based on the models provided by the cloud based system, and uses that information in conjunction with the user input, resource spending models and weather forecast (when applicable) to estimate the best set points for the indoor farm. The cloud-based system generates mathematical models of how the plants grow in response to the environmental variables. The AI layer uses those models to create several possible plant growth scenarios, based on a set of different environmental variables, constrained to what is achievable inside the farm under study. This is achieved using a heuristic optimization algorithm that runs on the control box. The resulting set of parameters is then forwarded to the controllers, which in turn send commands to the actuators installed on the farm.
The plant sensor module comprises a lens, a filter, a camera sensor, and a processor with communication capabilities. Light reflected on the plants is captured by the lens through the filter (which in this case only allows the passing of blue, 460nm, and near infrared (NIR), 800nm, light), ending on the camera sensor. The camera sensor creates an image (from now on called the captured image) and sends this image to the processor, where it is processed for plant data extraction. All software and algorithms described next run inside the plant sensor processor.
To measure plant health the plant sensor makes use of the Normalized Difference Vegetation Index (NDVI) extracted from the captured images. A first stage of processing algorithms is applied to the captured image to calculate the NDVI index. The NDVI index as seen by the camera might not be correct, thus a regression technique is used to estimate the real NDVI image. However, several objects in the image can generate NDVI indexes similar to those of plants. In order to avoid that, a second stage of image processing is performed to obtain a second image that is used to segment the plants, allowing the system to determine what is in fact a plant, prior to using the NDVI index on the selected areas. This is accomplished by converting the initial image into the HSV color space. A thresholding technique is then used to determine what is plant, and what is not. The resulting image can be easily used for plant segmentation using blob detection. To avoid being unable to segment the plants once they grow into a single mass, a tracking algorithm is used to track the plants over time, as they grow. The final stage of the NDVI-based plant sensor is translating the data gathered by the sensor into information that the grower can use. The data extracted by the plant sensor includes, but is not limited to:
- Location of each plant on the image;
- Leaf Area Index (LAI);
- Derivative of the LAI over time (plant growth rate);
- Average NDVI index of each plant (NDVI).

The resulting data is transmitted to the control box, using the plant sensor computer's communication capabilities (preferably an Ethernet-based communication protocol).
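As an informal example (not the claimed implementation), the extraction steps above can be sketched as follows; a fixed NDVI threshold stands in for the HSV thresholding, and a 4-connected flood fill stands in for the blob detection:

```python
# Sketch: per-pixel NDVI from a (NIR, G, B) image, a threshold mask for
# plant vs. not-plant, and connected components giving per-plant
# location, pixel count (LAI proxy) and average NDVI.

def ndvi(nir, b):
    return (nir - b) / (nir + b) if nir + b else 0.0

def segment_plants(image, mask_threshold=0.2):
    """image: 2-D grid of (nir, g, b) tuples. Returns one dict per plant blob."""
    h, w = len(image), len(image[0])
    ndvi_img = [[ndvi(px[0], px[2]) for px in row] for row in image]
    mask = [[v > mask_threshold for v in row] for row in ndvi_img]
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not seen[sy][sx]:
                stack, pixels = [(sy, sx)], []
                seen[sy][sx] = True
                while stack:  # flood fill one connected component
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                blobs.append({
                    # center of mass of the blob = plant location
                    "location": (sum(p[0] for p in pixels) / len(pixels),
                                 sum(p[1] for p in pixels) / len(pixels)),
                    "lai": len(pixels),  # pixel count as leaf-area proxy
                    "ndvi": sum(ndvi_img[p[0]][p[1]] for p in pixels) / len(pixels),
                })
    return blobs

# Tiny 2x4 test image: plant pixels (NIR=200, B=50, NDVI=0.6),
# background pixels (NIR=50, B=60, NDVI<0).
image = [
    [(200, 0, 50), (50, 0, 60), (200, 0, 50), (200, 0, 50)],
    [(200, 0, 50), (50, 0, 60), (50, 0, 60), (200, 0, 50)],
]
plants = segment_plants(image)
print(len(plants))  # → 2 (two separate plants found)
```

The real sensor additionally tracks each blob over time so that plants remain individually identified after their canopies merge.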
The cloud based system runs off site and is in charge of storing plant data and using that same data for modelling plant growth. This is accomplished using data not only from one site, but from all sites being used by the system together. A machine learning technique is used to model the plant growth functions. These functions model how the environmental variables result in plant growth and health changes over time.
Brief description of drawings
For easier understanding of this application, figures are attached in the annex that represent the preferred forms of implementation which nevertheless are not intended to limit the technique disclosed herein.
Figure 1 illustrates an overview of the complete proposed system. Both the plant and the environment (3) provide data to the plant sensor and environmental sensors installed in the farm (4), which in turn communicate with the control box (1). All data (6) acquired by the plant sensor is sent to the plant database (7) on the cloud (5), and can be viewed by the user (10) using the provided interface (9). In parallel this data is also used by the machine learning algorithms to generate plant models (8), which together with the user input (9) provide the necessary information for the control box (1) to determine how to better control the environment using the available set of actuators (2).

Figure 2 illustrates the decision making process inside the AI layer on the control box (12). Sensor data (13) from both the plant and the environment, user input (14) and the plant models (15), provided by the cloud platform, are processed using an optimization algorithm (16) to generate reference points for the controllers (17), which in turn send commands to the actuators (18) installed on the farm.
Figure 3 illustrates the plant sensor. It is composed of a lens (19), a filter that allows only certain frequencies to pass (20), a camera sensor, CMOS or CCD (21), and a computer (22) with communication capabilities (23).
Figure 4 illustrates the working process of the plant sensor. A source of light (24) emits radiation that is reflected on the plant (25) and goes through a filter (26) that allows only certain frequencies to pass. This radiation hits the 3-channel camera sensor and results in a (NIR,G,B) image (27), the captured image. This raw image is then processed into an HSV image (28) and an NDVI single channel image (29), the latter using the (NIR-B)/(NIR+B) function. The HSV image (28) goes through a dynamic thresholding technique (30) to calculate a binary mask (32) of what is a plant in the image. The NDVI image (29) goes through a regression algorithm (31) to provide a calibrated, or real, reading of the NDVI (33) of the plant. Both images are used to extract the final data that is shown to the user (34): the LAI (Leaf Area Index), growth rate, and NDVI.
Best mode for carrying out the invention
With reference to the figures, certain methods of implementation are now described in more detail. However, they are not intended to limit the scope of this application. The control box is an industrial grade computer. It is able to connect to the internet to talk to the cloud based system. This allows it to store data in and request data from the cloud, including updated plant models, user settings, etc.
The control box is equipped with an Ethernet connection and is able to connect to several RS-485 networks supporting several protocols in parallel. Wi-Fi based sensors can be interfaced to the system by means of an access point. Analog or digital sensors that do not have an Ethernet or RS-485 interface can be connected to the system by means of an adapter or IO module that supports such protocols. The communication protocols supported by the communication module of the control box include, but are not limited to, I2C, SPI, RS-232, USB, RS-485 based protocols, Ethernet/Wi-Fi based protocols and other wireless protocols by means of a gateway. Not all protocols are exposed to all users.
The predictive controllers that run on the control box are able to learn the model of the process that they are controlling, thus being able to adjust to different system dynamics over time. Generating a model of the indoor farm is accomplished using an LSSVM (Least Squares Support Vector Machine) with a radial basis function (RBF) kernel. The model generated by the SVM is used to predict the behavior of the indoor farm over the prediction horizon, and its desired trajectory is used to minimize the errors of the controllers implicitly by iteratively adjusting the controllers' gains. The controller itself is a neural network (NN).
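For illustration, an LSSVM regression with an RBF kernel reduces to solving one linear system in the dual variables; the sketch below uses plain Gaussian elimination and made-up hyperparameters (sigma, gamma) and toy temperature data, and is not the patented implementation.

```python
import math

# LSSVM regression sketch: solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
# where K is the RBF kernel matrix, then predict f(x) = b + sum_i alpha_i K(x, x_i).

def rbf(a, b, sigma=0.5):
    return math.exp(-sum((p - q) ** 2 for p, q in zip(a, b)) / (2 * sigma ** 2))

def solve(A, v):
    """Gauss-Jordan elimination with partial pivoting: solve A w = v."""
    n = len(v)
    M = [row[:] + [v[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

def lssvm_fit(X, y, gamma=100.0, sigma=0.5):
    n = len(X)
    A = [[0.0] + [1.0] * n]  # first row enforces sum(alpha) = 0
    for i in range(n):
        A.append([1.0] + [rbf(X[i], X[j], sigma) + (1.0 / gamma if i == j else 0.0)
                          for j in range(n)])
    w = solve(A, [0.0] + list(y))
    b, alpha = w[0], w[1:]
    return lambda x: b + sum(a * rbf(x, xi, sigma) for a, xi in zip(alpha, X))

# Toy process: steady-state air temperature vs. heater duty cycle.
X = [[0.0], [0.25], [0.5], [0.75], [1.0]]
y = [18.0, 20.0, 22.0, 23.5, 24.5]
model = lssvm_fit(X, y)
```

With a large regularization constant gamma the fit passes close to the training points, and the RBF kernel interpolates smoothly between them, which is what lets the controller query predicted behavior at unseen operating points.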
The control box optimization module uses a genetic algorithm that optimizes the indoor farm by making projections of how the plants grow based on the models provided by the cloud based system. Achieving an optimization of the indoor farm operations is performed by making a projection into the future in stages: (1) Generate a set of recipes (sets of control parameters) that is achievable by the indoor farm according to the weather forecast (e.g. if it is too cold outside it might be impossible to raise the temperature inside past a certain value); each controller stores a mathematical model of how each part of the system behaves. (2) Evaluate the fitness of each recipe by applying it to the growth model of the plant (fetched from the cloud). (3) Calculate the amount of resources spent by applying each recipe. (4) The selection criterion is obtained by maximizing plant growth, while minimizing resources.
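Stages (1)-(4) can be sketched with a tiny elitist evolutionary loop; the growth model and resource costs below are made-up stand-ins for the cloud-provided models, and the simple loop stands in for the full genetic algorithm:

```python
import random

def growth_model(temp, co2):
    # Made-up plant growth response with an optimum near 22 C / 900 ppm CO2.
    return 10.0 - 0.05 * (temp - 22.0) ** 2 - 1e-5 * (co2 - 900.0) ** 2

def resource_cost(temp, co2, outside_temp):
    # Heating above the outdoor temperature and CO2 injection both cost resources.
    return 0.08 * max(0.0, temp - outside_temp) + 0.001 * co2

def fitness(recipe, outside_temp):
    # Stage (4): maximize growth while minimizing resources.
    temp, co2 = recipe
    return growth_model(temp, co2) - resource_cost(temp, co2, outside_temp)

def optimize(outside_temp, max_temp_lift=15.0, generations=200, seed=0):
    rng = random.Random(seed)
    hi_temp = outside_temp + max_temp_lift  # stage (1): weather-feasible bound
    pop = [(rng.uniform(outside_temp, hi_temp), rng.uniform(400.0, 1500.0))
           for _ in range(20)]
    for _ in range(generations):
        # Stages (2)-(3): score every recipe; keep the fittest (elitism).
        pop.sort(key=lambda r: fitness(r, outside_temp), reverse=True)
        parents = pop[:5]
        pop = parents + [
            (min(hi_temp, max(outside_temp, t + rng.gauss(0.0, 0.5))),
             min(1500.0, max(400.0, c + rng.gauss(0.0, 30.0))))
            for t, c in parents for _ in range(3)
        ]
    return max(pop, key=lambda r: fitness(r, outside_temp))

best = optimize(outside_temp=10.0)
```

The returned recipe is then handed to the controllers as their set points; mutation never leaves the weather-feasible temperature band, mirroring the constraint in stage (1).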
The optimization module is also in charge of translating user input into actual hardware controls, providing an abstraction layer to the hardware inside the farm. This is accomplished by merging several user inputs when necessary, and by performing automatic scheduling for tasks such as watering.
The plant sensor is built using a single color camera with the IR filter on the lens replaced by a blue filter (460nm and 800nm pass-through filter). This technique allows the use of a single camera instead of a two-camera solution for calculating the NDVI index, thus lowering the cost. The resulting image does not contain the regular RGB channels; the R channel is swapped for a NIR channel, thus resulting in a (NIR,G,B) image. This image is converted into the HSV color space. The resulting image goes through an Otsu thresholding algorithm for calculating the area inside the image that contains the actual plants. The NDVI index is estimated using the NDVI function. As the camera images might change with the lighting conditions, so will the NDVI value, thus this value needs to be converted into a corrected NDVI value. This is achieved using a machine learning regression technique. In this case a neural network is trained to learn how to map the NDVI values read by the camera into corrected NDVI values by reading the lighting conditions from the image. The data extracted by the system includes, but is not limited to:
- Location of each plant on the image;
- Leaf Area Index (LAI);
- Derivative of the LAI over time (plant growth rate);
- Average NDVI index of each plant (NDVI);
- Average Extended NDVI index of each plant (ENDVI).
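The Otsu thresholding step mentioned above picks the intensity threshold that maximizes the between-class variance of the image histogram. A compact illustrative sketch (the 256-level histogram and the sample pixel values are assumptions):

```python
# Otsu's method on a 1-D intensity histogram: for each candidate
# threshold t, split pixels into background (<= t) and foreground (> t)
# and keep the t that maximizes between-class variance.

def otsu_threshold(pixels, levels=256):
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(levels))
    sum_bg, w_bg, best_t, best_var = 0.0, 0, 0, -1.0
    for t in range(levels):
        w_bg += hist[t]
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal sample: dark background around 30, bright plant area around 200.
pixels = [28, 30, 32, 29, 31] * 10 + [198, 200, 202, 199, 201] * 10
t = otsu_threshold(pixels)  # t separates the dark and bright modes
```

On a well-separated bimodal histogram like this one, the chosen threshold lands at the upper edge of the dark mode, cleanly splitting plant from background.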
The location of each plant is estimated by determining the center of mass of each blob (plant) extracted by a blob extraction algorithm. The plant LAI is estimated by the number of pixels of each blob (plant) . The growth rate is the derivative of the plant LAI over time. The NDVI for each pixel is calculated according to the following formula,
NDVI = (NIR - B) / (NIR + B)
The Extended NDVI for each pixel is calculated using the following formula,
ENDVI = (NIR + G - 2B) / (NIR + G + 2B)
The average NDVI and ENDVI values are estimated for each plant by calculating the average value of all pixels that are masked by each blob (plant).

At the core of the cloud based system there is a non-structured database for holding sensor data and running big data and machine learning algorithms. Deep learning algorithms are used to process this data. The deep learning algorithm connects directly to the non-structured database and is able to serve control boxes all over the world, distributing plant growth models. The deep learning framework is used to train a neural network. The deep neural network is a DBN (Deep Belief Network) connected to a feedforward artificial neural network model, an MLP (Multilayer Perceptron). The DBN is at the interface with the environment and receives the input signal. The input signal passes through each layer until it reaches the hidden layer in the MLP. From the MLP's hidden layer the signal reaches the output layer. The DBN was chosen as it has been used to model time series in the past.
The input of the control model is a matrix with 135 variables or features. These features are the t-1 through t-15 daily environmental parameters plus the resulting plant growth parameters. The parameters vector contains the following values: water pH, water EC, water DO, water temperature, water volume, air temperature, air CO2 level, solar radiation, plant health index (estimated from the plant size and plant NDVI index).
The choice of a 15-day window is due to the fact that plants have a certain plasticity that varies from plant to plant. Furthermore, several plant diseases can take as long as 15 days to incubate. The time window can be adjusted according to different plants. Also, several models can be generated with added parameters for farms equipped with more sensors (e.g. concentration of individual nutrients, NPK + micronutrients).
More formally, one can express the control model in the following way. Let the output vector be given by

y = {plant_growth, plant_health}

and the input vector be given by

x = {p_t-1, p_t-2, ..., p_t-15}

containing 135 elements, where

p_i = {ph_i, ec_i, do_i, water_temp_i, water_i, air_temp_i, co2_i, solar_radiation_i, area_i, health_i}
If one has k training points, the training data can be denoted by (x_k, y_k), and the neural network will learn the model F_θ,

F_θ : x → y

mapping the input vector space x to the output space y = {plant_growth, plant_health}.
Although the preferred embodiments of the present invention have been disclosed for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope of the invention. Therefore, the present invention is not limited to the above-described embodiments, but is defined by the claims which follow, along with their full scope of equivalents.

Claims

1. Predictive dynamic cloud based system for environmental sensing and actuation comprising:
A control box wherein each control box comprises a sensor/actuator hardware drivers' module, a controllers' module, an artificial intelligence layer module, and a built-in wired and/or wireless communication module, wherein all modules and their respective data are managed and integrated by control software running on the processor unit of said control box;
At least one plant sensor module, applied on the indoor farm to be monitored, comprising a processor unit with communication capabilities, a camera sensor and a sensor architecture;
A remote cloud based system configured to provide data storage, data analysis and an interface between the user and the system, wherein the stored data relates to plant growth models, resource spending data and weather forecast data.
2. System according to any of the previous claims wherein the control box is configured to execute additional software modules, such as data logging modules and notification modules.
3. System according to any of the previous claims wherein the sensor/actuator module is connected to the control box through the communication module by wired or wireless protocols.
4. System according to any of the previous claims wherein the controllers' module is configured to control all the environmental variables, sensed by the sensor architecture applied in the farm, by iteratively adjusting the respective controllers' gains, using machine learning techniques to generate a model of the indoor farm to be monitored, predicting its future behavior.
5. System according to claim 4 wherein the controllers' module comprises a water pH controller, water electrical conductivity controller, water dissolved oxygen controller, water oxidation-reduction potential controller, water temperature controller, watering cycle controller, air temperature controller, humidity controller and CO2 controller.
6. System according to claim 4 wherein each controller is configured as a neural network, with the respective gains adjusted using a least squares algorithm based on the predictions of the machine learning techniques.
7. System according to claim 1 wherein the artificial intelligence layer module is configured to establish the connection between the indoor farm controlled and the remote cloud based system.
8. System according to claim 7 wherein the artificial intelligence layer module is configured to run a heuristic optimization algorithm to actuate on the controllers' module, wherein said algorithm integrates data from the models stored on the cloud, such as plant growth models, resource spending data, weather forecast data, user input and sensor data.
9. System according to claim 1 wherein the built-in communication module is configured to establish a wired and/or wireless data communication process over a local or remote network.
10. System according to claim 1 wherein the camera sensor of the plant sensor module comprises a single color camera composed of a lens with a 460nm and 800nm pass-through filter.
11. System according to claim 1 wherein the processor unit of the plant sensor is configured to calculate at least plant size, plant growth rate and NDVI index.
12. System according to claim 1 wherein the remote cloud based system comprises a database storing sensor data and processing capabilities configured to generate plant growth models based on machine learning techniques applied to said sensor data.
13. System according to any of the previous claims wherein the remote cloud based system is configured to distribute plant growth models to the connected control boxes.
14. System according to any of the previous claims wherein the remote cloud based system is configured to provide a user interface adaptable to both web and mobile platforms.
15. Method of operation of the predictive dynamic cloud based system of claims 1-14, characterized by the following steps: Plant sensor acquires plant and environment data from both the camera sensor and the sensor architecture mounted in the indoor farm to be monitored;
Transmission of the data processed by the plant sensor to the control box and to the plant database on the cloud, where it can be viewed by the user;
Generation of plant models based on the execution on the cloud of machine learning algorithms on the data provided by the plant sensor and based on the user input;
Actuation on the actuators module of the control box in accordance with an optimization algorithm, executed on the artificial intelligence layer of the control box, in order to generate reference points for the controllers, which in turn send commands to the actuators installed on the farm in accordance with the plant models developed.
PCT/PT2016/050027 2016-11-29 2016-11-29 Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation WO2018101848A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/PT2016/050027 WO2018101848A1 (en) 2016-11-29 2016-11-29 Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation


Publications (1)

Publication Number Publication Date
WO2018101848A1 true WO2018101848A1 (en) 2018-06-07

Family

ID=57758687

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/PT2016/050027 WO2018101848A1 (en) 2016-11-29 2016-11-29 Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation

Country Status (1)

Country Link
WO (1) WO2018101848A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180307654A1 (en) * 2017-04-13 2018-10-25 Battelle Memorial Institute System and method for generating test vectors
CN108810146A (en) * 2018-06-14 2018-11-13 鹤壁北岩科技有限公司 Culture control system based on distributed cloud platform and control method
CN109087212A (en) * 2018-10-10 2018-12-25 贵州网安科技发展有限公司 Agricultural production greenhouse regulation device based on big data platform
CN109874584A (en) * 2019-03-19 2019-06-14 广州辰轩农业科技有限公司 A kind of fruit tree growing way monitoring system based on deep learning convolutional neural networks
WO2019242075A1 (en) * 2018-06-20 2019-12-26 江苏优泰智能科技有限公司 Internet of things-based intelligent greenhouse monitoring system
CN110631635A (en) * 2019-09-27 2019-12-31 北京科百宏业科技有限公司 Remote automatic monitoring system for agricultural production environment and working method thereof
WO2020014773A1 (en) * 2018-07-16 2020-01-23 Vineland Research And Innovation Centre Automated monitoring and irrigation of plants in a controlled growing environment
WO2020110063A1 (en) * 2018-11-29 2020-06-04 Germishuys Dennis Mark Plant cultivation
WO2020148998A1 (en) * 2019-01-18 2020-07-23 オムロン株式会社 Model integration device, method, and program, and inference, inspection, and control system
US10743483B1 (en) 2019-03-22 2020-08-18 Hoover Pumping Systems Corporation Wireless remote updates to schedules in irrigation controllers
US20210007300A1 (en) * 2018-03-16 2021-01-14 Alinda Chandra Mondal Soil Ecosystem Management and Intelligent Farming Arrangement
CN112351057A (en) * 2019-08-07 2021-02-09 耐驰-仪器制造有限公司 Data acquisition system, system and method for real-time on-line monitoring of industrial manufacturing processes
CN112462648A (en) * 2020-11-13 2021-03-09 东南大学 System for monitoring and predicting building comprehensive environment
IT202000030038A1 (en) * 2020-12-11 2021-03-11 Impattozero S R L Device for monitoring and managing the environmental conditions of an ecosystem
CN112712437A (en) * 2021-01-18 2021-04-27 河南工业大学 Mining area environment change comprehensive evaluation method
CN113687609A (en) * 2021-07-21 2021-11-23 浙江微科机电有限公司 Intelligent monitoring system and monitoring method for Internet of things applied to abnormal environment
CN114431125A (en) * 2022-03-11 2022-05-06 深圳市山月园园艺有限公司 Intelligent irrigation system
CN114996897A (en) * 2022-04-06 2022-09-02 武昌首义学院 Multi-attribute group decision method based on cloud model joint coefficient
CN115361764A (en) * 2022-09-20 2022-11-18 山东浪潮科学研究院有限公司 Intelligent lighting auxiliary method

Citations (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7058197B1 (en) * 1999-11-04 2006-06-06 Board Of Trustees Of The University Of Illinois Multi-variable model for identifying crop response zones in a field
US20070260400A1 (en) * 2006-05-04 2007-11-08 Omry Morag Computerized crop growing management system and method
CN101211178A (en) * 2006-12-25 2008-07-02 上海都市绿色工程有限公司 Multi-temperature area group control greenhouse environment and irrigation control system
Patent Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7058197B1 (en) * 1999-11-04 2006-06-06 Board Of Trustees Of The University Of Illinois Multi-variable model for identifying crop response zones in a field
US20070260400A1 (en) * 2006-05-04 2007-11-08 Omry Morag Computerized crop growing management system and method
CN101211178A (en) * 2006-12-25 2008-07-02 上海都市绿色工程有限公司 Multi-temperature area group control greenhouse environment and irrigation control system
WO2009049361A1 (en) * 2007-10-16 2009-04-23 Aquaspy Group Pty Ltd Water resource management system and method
US20090150000A1 (en) * 2007-12-07 2009-06-11 Mark Stelford System and method of managing substances in a plant root zone
CN101470421A (en) * 2007-12-28 2009-07-01 中国科学院沈阳应用生态研究所 Plant growth room based on artificial intelligence technology and its control system
WO2010117944A1 (en) 2009-04-06 2010-10-14 Martin Tommy J Remote analysis and correction of crop condition
WO2012123877A1 (en) * 2011-03-14 2012-09-20 Idus Controls Ltd. An irrigation control device using an artificial neural network
JP2013051887A (en) * 2011-08-31 2013-03-21 Hitachi East Japan Solutions Ltd Method for managing growth
US20130153673A1 (en) * 2011-12-19 2013-06-20 Saed G. Younis Remotely sensing and adapting irrigation system
US20160026940A1 (en) * 2011-12-30 2016-01-28 Aglytix, Inc. Methods, apparatus and systems for generating, updating and executing a crop-harvesting plan
CN102524024A (en) 2012-02-16 2012-07-04 四川农业大学 Crop irrigation system based on computer vision
CN102524024B (en) * 2012-02-16 2013-04-03 四川农业大学 Crop irrigation system based on computer vision
CN102550374B (en) * 2012-03-18 2013-04-03 四川农业大学 Crop irrigation system combined with computer vision and multi-sensor
CN102550374A (en) 2012-03-18 2012-07-11 四川农业大学 Crop irrigation system combined with computer vision and multi-sensor
CN103823371B (en) * 2014-02-12 2016-06-29 无锡中科智能农业发展有限责任公司 Agriculture Tree Precise Fertilization system and fertilizing method based on neural network model
US20150370935A1 (en) * 2014-06-24 2015-12-24 360 Yield Center, Llc Agronomic systems, methods and apparatuses
US20140358486A1 (en) * 2014-08-19 2014-12-04 Iteris, Inc. Continual crop development profiling using dynamical extended range weather forecasting with routine remotely-sensed validation imagery
US20160202679A1 (en) * 2015-01-08 2016-07-14 International Business Machines Corporation Automated irrigation control system
US9140824B1 (en) * 2015-01-23 2015-09-22 Iteris, Inc. Diagnosis and prediction of in-field dry-down of a mature small grain, coarse grain, or oilseed crop using field-level analysis and forecasting of weather conditions, crop characteristics, and observations and user input of harvest condition states
US20160259309A1 (en) * 2015-03-06 2016-09-08 Telsco Industries, Inc. d/b/a Weathermatic Systems and Methods for Site-Based Irrigation Control
CN105830870A (en) * 2016-03-24 2016-08-10 华北水利水电大学 Remote wireless farmland monitoring system and method

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180307654A1 (en) * 2017-04-13 2018-10-25 Battelle Memorial Institute System and method for generating test vectors
US10789550B2 (en) * 2017-04-13 2020-09-29 Battelle Memorial Institute System and method for generating test vectors
US20210007300A1 (en) * 2018-03-16 2021-01-14 Alinda Chandra Mondal Soil Ecosystem Management and Intelligent Farming Arrangement
CN108810146A (en) * 2018-06-14 2018-11-13 鹤壁北岩科技有限公司 Culture control system based on distributed cloud platform and control method
WO2019242075A1 (en) * 2018-06-20 2019-12-26 江苏优泰智能科技有限公司 Internet of things-based intelligent greenhouse monitoring system
WO2020014773A1 (en) * 2018-07-16 2020-01-23 Vineland Research And Innovation Centre Automated monitoring and irrigation of plants in a controlled growing environment
CN109087212A (en) * 2018-10-10 2018-12-25 贵州网安科技发展有限公司 Agricultural production greenhouse regulation device based on big data platform
WO2020110063A1 (en) * 2018-11-29 2020-06-04 Germishuys Dennis Mark Plant cultivation
WO2020148998A1 (en) * 2019-01-18 2020-07-23 オムロン株式会社 Model integration device, method, and program, and inference, inspection, and control system
JP2020115311A (en) * 2019-01-18 2020-07-30 オムロン株式会社 Model integration device, model integration method, model integration program, inference system, inspection system and control system
JP7036049B2 (en) 2019-01-18 2022-03-15 オムロン株式会社 Model integration device, model integration method, model integration program, inference system, inspection system, and control system
CN109874584A (en) * 2019-03-19 2019-06-14 广州辰轩农业科技有限公司 A kind of fruit tree growing way monitoring system based on deep learning convolutional neural networks
US11160221B2 (en) 2019-03-22 2021-11-02 Hoover Pumping Systems Corporation Efficient updates to schedules in irrigation controllers
US10743483B1 (en) 2019-03-22 2020-08-18 Hoover Pumping Systems Corporation Wireless remote updates to schedules in irrigation controllers
CN112351057A (en) * 2019-08-07 2021-02-09 耐驰-仪器制造有限公司 Data acquisition system, system and method for real-time on-line monitoring of industrial manufacturing processes
CN110631635A (en) * 2019-09-27 2019-12-31 北京科百宏业科技有限公司 Remote automatic monitoring system for agricultural production environment and working method thereof
CN112462648A (en) * 2020-11-13 2021-03-09 东南大学 System for monitoring and predicting building comprehensive environment
IT202000030038A1 (en) * 2020-12-11 2021-03-11 Impattozero S R L Device for monitoring and managing the environmental conditions of an ecosystem
CN112712437A (en) * 2021-01-18 2021-04-27 河南工业大学 Mining area environment change comprehensive evaluation method
CN112712437B (en) * 2021-01-18 2022-08-26 河南工业大学 Mining area environment change comprehensive evaluation method
CN113687609A (en) * 2021-07-21 2021-11-23 浙江微科机电有限公司 Intelligent monitoring system and monitoring method for Internet of things applied to abnormal environment
CN114431125A (en) * 2022-03-11 2022-05-06 深圳市山月园园艺有限公司 Intelligent irrigation system
CN114996897A (en) * 2022-04-06 2022-09-02 武昌首义学院 Multi-attribute group decision method based on cloud model joint coefficient
CN114996897B (en) * 2022-04-06 2023-03-07 武昌首义学院 Multi-attribute group ship anti-settling capacity evaluation decision method based on cloud model joint coefficient
CN115361764A (en) * 2022-09-20 2022-11-18 山东浪潮科学研究院有限公司 Intelligent lighting auxiliary method

Similar Documents

Publication Publication Date Title
WO2018101848A1 (en) Predictive dynamic cloud based system for environmental sensing and actuation and respective method of operation
Boursianis et al. Internet of things (IoT) and agricultural unmanned aerial vehicles (UAVs) in smart farming: A comprehensive review
Pothuganti et al. IoT and deep learning based smart greenhouse disease prediction
Yang et al. IoT-based framework for smart agriculture
WO2019222860A1 (en) System, method and/or computer readable medium for growing plants in an autonomous green house
Verma et al. An internet of things (IoT) architecture for smart agriculture
CN111008733B (en) Crop growth control method and system
KR20220071405A (en) Agricultural support system and method using big data of smart farm
CN117036088A (en) Data acquisition and analysis method for identifying growth situation of greening plants by AI
CN112465109A (en) Green house controlling means based on cloud limit is in coordination
Suneja et al. Cloud-based tomato plant growth and health monitoring system using IOT
Musa et al. An intelligent plant disease detection system for smart hydroponic using convolutional neural network
Costa et al. Greenhouses within the Agricultura 4.0 interface
Ali et al. A high performance-oriented AI-enabled IoT-based pest detection system using sound analytics in large agricultural field
Joo et al. Growth analysis system for IT-based plant factory
Jain et al. Ubiquitous sensor based intelligent system for net houses
Premkumar et al. Functional framework for edge-based agricultural system
Ahmad et al. Smart Intelligent Precision Agriculture
Indira et al. Eco-friendly green cloud structure with internet of things for astute agriculture
Saha et al. ML-based smart farming using LSTM
Devi et al. IoT based root rot detection system
Roopashree et al. Smart Farming with IoT: A Case Study
WO2021067847A1 (en) Agricultural platforms
Sethuraman et al. Smart agriculture applications using Internet of Things
Rani et al. Harnessing the future: cutting-edge technologies for plant disease control

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16823363

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 01/10/2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16823363

Country of ref document: EP

Kind code of ref document: A1