WO2021067847A1 - Agricultural platforms - Google Patents

Agricultural platforms

Info

Publication number
WO2021067847A1
WO2021067847A1 (PCT/US2020/054120, US2020054120W)
Authority
WO
WIPO (PCT)
Prior art keywords
platform
agricultural
data
agricultural product
sensors
Prior art date
Application number
PCT/US2020/054120
Other languages
English (en)
Inventor
Molly STANEK
Jacob HARTNELL
James HAUK
Jay MONTONI
Original Assignee
Sensei Ag Holdings, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sensei Ag Holdings, Inc. filed Critical Sensei Ag Holdings, Inc.
Publication of WO2021067847A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/04 Forecasting or optimisation specially adapted for administrative or management purposes, e.g. linear programming or "cutting stock problem"
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063 Operations research, analysis or management
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/02 Agriculture; Fishing; Forestry; Mining

Definitions

  • Agriculture management is an important function that oversees all aspects of running farms and other growing facilities that produce agricultural products. Agriculture management also enables farmers and landowners to address profitability, fertility, and conservation. These types of management functions are essential to a successful farm business and to ensuring a sufficient supply of nutrient-rich food for a population of food consumers.
  • a platform for agriculture management comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
  • the agricultural product comprises an animal-based product.
  • the agricultural product comprises a plant-based product.
  • the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • the at least five agricultural products comprise different agricultural products.
  • the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • the array of sensors comprise at least 5 different types of sensors.
  • the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • the location comprises two or more locations. In some cases, the location is remote.
  • the location is within the agricultural facility.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • the data comprises data collected from previous grows of the agricultural product.
  • a platform for agriculture management comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
  • the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the array of sensors comprise at least 5 different types of sensors. In some cases, the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof. In some cases, the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof. In some cases, the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • the processor directs a change in at least three operating parameters based on receipt of the data.
  • the location comprises two or more locations. In some cases, the location is remote. In some cases, the location is within the agricultural facility.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. In some cases, the data comprises data collected from previous grows of the agricultural product.
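As a non-limiting illustration of the sensor-processor-actuator loop described in the bullets above, the following Python sketch shows how a processor might pass sensor readings to a trained model and direct actuator changes. All class and function names (SensorReading, Actuator, control_step, toy_model) are hypothetical placeholders, and the toy model is a stand-in for a real trained algorithm, not the claimed implementation.

```python
# Illustrative sketch only: hypothetical names, not the claimed implementation.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SensorReading:
    sensor_id: str
    kind: str          # e.g. "temperature", "humidity", "pH"
    value: float

@dataclass
class Actuator:
    actuator_id: str
    parameter: str     # operating parameter it controls, e.g. "temperature"
    setpoint: float

    def apply(self, new_setpoint: float) -> None:
        # In a real facility this would drive a vent, heater, pump, etc.
        print(f"{self.actuator_id}: {self.parameter} {self.setpoint} -> {new_setpoint}")
        self.setpoint = new_setpoint

def control_step(readings: List[SensorReading],
                 actuators: Dict[str, Actuator],
                 trained_model: Callable[[List[SensorReading]], Dict[str, float]]) -> None:
    """One pass of the loop: sensor data in, operating-parameter changes out."""
    changes = trained_model(readings)          # e.g. {"temperature": 72.0}
    for actuator in actuators.values():
        if actuator.parameter in changes:
            actuator.apply(changes[actuator.parameter])

# Toy "trained algorithm": nudge temperature toward a yield-maximizing setpoint.
def toy_model(readings: List[SensorReading]) -> Dict[str, float]:
    temps = [r.value for r in readings if r.kind == "temperature"]
    avg = sum(temps) / len(temps)
    return {"temperature": avg + (72.0 - avg) * 0.5}

control_step(
    [SensorReading("s1", "temperature", 65.0), SensorReading("s2", "temperature", 67.0)],
    {"vent-1": Actuator("vent-1", "temperature", 65.0)},
    toy_model,
)
```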
  • a platform for agriculture management comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
  • the platform further comprises a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
  • the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
  • the plurality of agricultural facilities is at least 5.
  • the plurality of agricultural facilities are located in different geographical locations.
  • the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
  • the user is a food consumer.
  • the user is a business entity that sells the agricultural product to a consumer.
  • the location comprises two or more locations.
  • the location is remote. In some cases, the location is within the agricultural facility.
  • the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof. In some cases, the plurality of sensors comprise at least 5 different types of sensors.
  • the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • the agricultural product comprises an animal-based product. In some cases, the agricultural product comprises a plant-based product.
  • the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof. In some cases, the data comprises data collected from previous grows of the agricultural product.
  • a platform for agriculture management comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
  • the plurality of discrete user interfaces is at least three.
  • a second user interface is configured for an agricultural grower.
  • a second user interface is configured for an agricultural manager.
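A minimal sketch of the discrete, role-specific user interfaces described above (food consumer, agricultural grower, agricultural manager), assuming a simple permission map; the role and action names are illustrative, not defined by the application.

```python
# Sketch of role-scoped interfaces; role names and permissions are illustrative assumptions.
ROLE_PERMISSIONS = {
    "food_consumer":        {"view_provenance", "view_nutrition", "view_food_safety"},
    "agricultural_grower":  {"view_sensors", "input_data", "manage_tasks"},
    "agricultural_manager": {"view_sensors", "input_data", "view_reports", "configure_recipes"},
}

def can(role: str, action: str) -> bool:
    return action in ROLE_PERMISSIONS.get(role, set())

assert can("food_consumer", "view_provenance")
assert not can("food_consumer", "input_data")   # data entry reserved for grower/manager interfaces
```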
  • FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein.
  • a platform that includes a processor and user interface for management of production of an agricultural product.
  • a platform as described herein includes a plurality of sensors, such as a plurality of individual sensors and/or one or more arrays of sensors.
  • a platform as described herein includes a plurality of actuators, such as a plurality of individual actuators and/or one or more arrays of actuators.
  • a processor may receive data from at least one sensor and may direct a change in an operating parameter of at least one actuator.
  • a platform as described herein includes a database that comprises a data set. The data set may comprise data from a plurality of agricultural facilities.
  • the processor may include a trained algorithm.
  • the trained algorithm may be trained with a data set.
  • the trained algorithm may be trained to compare data input to the processor to produce a result.
  • the result may be prompted by a request from a user.
  • the platforms, systems, media, and methods described herein include a cloud computing environment.
  • a cloud computing environment comprises one or more computing processors.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C” and “A, B, and/or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • the term “about” may mean the referenced numeric indication plus or minus 15% of that referenced numeric indication.
  • the term “user” may mean a food consumer, an agricultural grower, an agricultural manager, a business entity in the food consumer industry or any combination thereof.
  • a user may be a person that produces, or assists in, at least one aspect of producing the agricultural product.
  • a user may be a farmer, a planter, a breeder, a stockman, an agriculturist, an agronomist, a rancher, a producer, a cropper, a harvester, a gardener, an orchardist, a horticulturist, a hydroponist, a pomologist, a viticulturist, or any combination thereof.
  • a user may be a person in the farm business or agriculture business.
  • a user may be an agricultural manager that oversees an aspect of the business.
  • a user may be a CEO, CSO, or CFO of an agriculture facility.
  • a user may be a person that purchases an agricultural product from a farmer or a food producer.
  • a user may be a person that sells the agricultural product to a consumer.
  • a user may be a consumer, a person who eats the agricultural product or who buys the agricultural product.
  • an agricultural product may mean a product produced for consumption by a person or animal.
  • An agricultural product may include a plant or portion thereof, an animal or portion thereof, an animal product, or any combination thereof.
  • An agricultural product may include one or more crops, a food, a nutrient, a consumable, a livestock, a plant, an animal, an animal product (such as dairy, milk, eggs, cheese), a plant product, or any combination thereof.
  • an agricultural facility may mean a facility for producing one or more types of an agricultural product.
  • An agricultural facility may include a rural or urban facility or both.
  • An agricultural facility may include an outdoor or indoor facility or both.
  • An agricultural facility may include multiple different geographical locations or a singular geographical location.
  • An agricultural facility may include a farm, a dairy farm, a livestock farm, a crop farm, a commercial farm, a fish farm, a meat farm, a poultry farm, a greenhouse, an orchard, a hydroponic farm, an urban farm, or any combination thereof.
  • An agricultural facility may utilize natural growing elements such as sunlight, soil, and weather conditions of a geographical outdoor location.
  • An agricultural facility may utilize artificial elements such as artificial light, artificial soil, artificial heat, or combinations thereof.
  • An agricultural facility may utilize direct sun, indirect sun (such as from solar panels), or artificial light to grow crops.
  • a farming practice may mean a practice performed by one or more farms.
  • a farming practice may include growing an agricultural product under organic food standards or not.
  • a farming practice may include growing an agricultural product under non-GMO standards or not.
  • a farming practice may include growing an agricultural product under hormone-free conditions or not.
  • a farming practice may include growing an agricultural product under antibiotic-free conditions or not.
  • a farming practice may include growing an agricultural product under environmentally sustainable practices or not.
  • a farming practice may include growing an agricultural product under a reduced carbon footprint standard or not.
  • a farming practice may include growing an agricultural product under fair-trade standards or not.
  • a farming practice may include growing an agricultural product under a particular level of animal welfare standards or not.
  • a farming practice may include growing an agricultural product under farm raised conditions or raised in the wild.
  • a farming practice may include growing an agricultural product under in-door conditions, open access conditions, or free range conditions.
  • a farming practice may include growing an animal-based product on a grass-fed diet or not.
  • a farming practice may include any of the foregoing examples or any combination thereof.
  • the term “recipe” may mean a collection of parameters used in planting, growing, and/or maintaining an agricultural product.
  • parameters that might be found in a recipe include geographical location of the agricultural product, elevation of the agricultural product, environmental temperature, environmental humidity, environmental air quality, frequency of watering, amount of watering, frequency of application of fertilizer, amount of fertilizer used, frequency of pesticide applied, amount of pesticide used, soil composition, soil pH, agricultural product seed characteristics, and timing with respect to any actions carried out with respect to an agricultural product and/or seed of an agricultural product.
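A hedged sketch of how the recipe parameters listed above could be encoded as a structured record; the field names and values are illustrative assumptions, not a defined recipe schema.

```python
# Hypothetical encoding of a "recipe" using the kinds of parameters listed above.
from dataclasses import dataclass, asdict

@dataclass
class Recipe:
    crop: str
    location: str
    elevation_m: float
    temperature_f: float
    humidity_pct: float
    watering_per_day: int
    water_liters: float
    fertilizer_per_week: int
    soil_ph: float

tomato = Recipe("tomato", "greenhouse-A", 120.0, 75.0, 60.0, 3, 1.5, 2, 6.4)
print(asdict(tomato))   # plain dict, ready to be stored or serialized
```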
  • the term “nutritional profile” may mean an agricultural product having one or more nutritional attributes.
  • One or more nutritional attributes of an agricultural product may be quantified and communicated to a user, such as an agricultural manager, agricultural grower, food consumer, or any combination thereof.
  • the platforms as described herein may execute a recipe to grow an agricultural product, the recipe having been optimized to maximize one or more nutritional attributes.
  • a nutritional profile of an agricultural product may be compared across one or more farms growing the agricultural product.
  • a nutritional attribute may include a taste or flavor, a color, a texture, a ripeness, a freshness, a vitamin content or amount, a mineral content or amount, a fat content or amount, a sugar content or amount, a carbohydrate content or amount, a pesticide content or amount, an anti-oxidant content or amount, or any combination thereof.
  • a “software” as used herein comprises computer readable and executable instructions that may be executed by a computer processor.
  • a “software module” comprises computer readable and executable instructions and may, in some embodiments described herein, make up a portion of software or may, in some embodiments, be a stand-alone item.
  • software and/or a software module comprises a file, a section of code, a programming object, a programming structure, or combinations thereof.
  • a software module comprises a plurality of files, a plurality of sections of code, a plurality of programming objects, a plurality of programming structures, or combinations thereof.
  • the one or more software modules comprise, by way of non-limiting examples, a web application, a mobile application, and a standalone application.
  • software modules are in one computer program or application. In other embodiments, software modules are in more than one computer program or application. In some embodiments, software modules are hosted on one machine. In other embodiments, software modules are hosted on more than one machine. In further embodiments, software modules are hosted on cloud computing platforms. In some embodiments, software modules are hosted on one or more machines in one location. In other embodiments, software modules are hosted on one or more machines in more than one location.
  • a sensor may collect data.
  • a sensor may collect a single type of data.
  • a sensor may collect more than one type of data.
  • a data type that one or more sensors may collect may include: an image (such as an image of an agricultural product or portion thereof or an image within the agricultural facility), a temperature, a humidity, a pH, a light level, a light spectrum, an air flow, an air circulation level, an air composition, an air exchange rate, a fertilizer content, a fertilizer concentration, a nutrient content, a nutrient concentration or any combination thereof.
  • a sensor may collect data related to a climate or microclimate within an agricultural facility.
  • a sensor may collect data on a disease type or disease level.
  • a sensor may collect data on an agricultural product yield, size of product, amount of product, rate of growth, or any combination thereof.
  • a sensor may collect data on amount of resources utilized or rate of resources utilized, such as water, fertilizer, soil, nutrients, sun light, heat, or any combination thereof.
  • a sensor may collect data automatically. Automated data collection may include continuous collection, discrete collection (such as when a given threshold is reached), incremental collection (such as collection at timed intervals), or any combination thereof.
  • a sensor may collect data when a user (grower or farm manager) prompts the sensor.
  • a sensor may be a user (such as a grower).
  • a user may provide a sensory assessment of an agricultural product and may input the data into the processor, such as a user-based estimate of product number.
  • a sensor may be a camera, such as a digital camera.
  • a camera may be a camera on a smart phone (such as an iPhone) or on a smart watch (such as an iWatch).
  • a sensor may be a pH sensor, a temperature sensor, a light sensor, a humidity sensor, an air sensor, a turbidity sensor, a chemical sensor, or any combination thereof.
  • An array of sensors may include nxn sensors, such as 1x1 sensors, 2x2 sensors, 3x3 sensors or more.
  • An array of sensors may include nxm sensors, such as 1x3 sensors, 2x6 sensors, 3x9 sensors or more.
  • An array of sensors may include a plurality of sensors.
  • a sensor in an array of sensors may be individually addressable by a user or by the processor. For example, a subset of sensors may collect data based on a given parameter - such as time or temperature.
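A brief sketch, under assumed names, of individually addressing a subset of sensors in a grid based on a parameter such as temperature.

```python
# Sketch of an individually addressable 3x3 sensor grid (hypothetical helper names).
grid = {(row, col): {"id": f"s-{row}{col}", "temp_f": 68.0 + row + col}
        for row in range(3) for col in range(3)}

def select(grid, predicate):
    """Address only the subset of sensors matching a parameter, e.g. a temperature threshold."""
    return [cell["id"] for cell in grid.values() if predicate(cell)]

hot_sensors = select(grid, lambda c: c["temp_f"] > 70.0)
print(hot_sensors)
```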
  • Adjusting an actuator may adjust one or more operating parameters of the agricultural facility. For example, adjusting a vent may adjust the temperature at one or more locations in the agricultural facility.
  • An array of sensors in communication with an array of actuators may facilitate adjustment of the temperature in discrete locations of the agricultural facility that are insufficiently heated compared to the remaining locations that are sufficiently heated.
  • An adjustment in an actuator may be determined by the processor and based on data received from one or more sensors, users, or a combination thereof.
  • An actuator may include a vent, a shutter, a louver, a sprayer, a pump, a valve, a mixer, a heater, a fan, a light, a humidifier, a dehumidifier, an air exchanger, a gas pump, a water source, a wind source, a food source, a fertilizer source, a pest control source, an evaporative cooler, a gas generator (such as a CO2 generator) or any combination thereof.
  • An array of actuators may include nxn actuators, such as 1x1 actuators, 2x2 actuators, 3x3 actuators or more.
  • An array of actuators may include nxm actuators, such as 1x3 actuators, 2x6 actuators, 3x9 actuators or more.
  • An array of actuators may include a plurality of actuators.
  • An actuator in an array of actuators may be individually addressable by a user or by the processor. For example, a subset of actuators may be actuated based on a given parameter - such as time or temperature.
  • An operating parameter may include at least in part an ingredient or raw material needed to grow an agricultural product.
  • An operating parameter may include a seed composition to grow a particular type of agricultural product, a water amount to provide to the agricultural product, a light amount or light spectrum to provide to the agricultural product, a soil composition to provide to the agricultural product, or any combination thereof.
  • an operating parameter may include a feed type, feed amount, feed frequency, water type, water amount, water frequency, or any combination thereof for optimal growth, health, or nutrition of an animal product.
  • An operating parameter may include an amount or type of raw ingredient.
  • An operating parameter may include an environmental condition to provide a nutritionally optimized product, a maximized product yield, a shortened growth cycle to achieve the agricultural product, a product yield that reduces an amount of raw ingredients needed or any combination thereof.
  • An operating parameter may include an acceptable range, such as a growing temperature from about 65 degrees Fahrenheit to about 90 degrees Fahrenheit.
  • An operating parameter may include a suggested starting value, such as a growing temperature of about 75 degrees Fahrenheit, a value that may be updated or modified during the course of a grow cycle.
  • An operating parameter may be modified by a user or by a control system.
  • An operating parameter may be modified based at least in part from data received from one or more sensors.
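A minimal sketch of an operating parameter that carries an acceptable range and a suggested starting value and clamps later modifications into that range; the names and values are illustrative assumptions.

```python
# Sketch: an operating parameter with an acceptable range and a suggested starting value.
from dataclasses import dataclass

@dataclass
class OperatingParameter:
    name: str
    low: float
    high: float
    value: float

    def update(self, proposed: float) -> float:
        """Clamp a user- or sensor-driven modification into the acceptable range."""
        self.value = min(max(proposed, self.low), self.high)
        return self.value

grow_temp = OperatingParameter("growing temperature (F)", 65.0, 90.0, 75.0)
grow_temp.update(95.0)   # out-of-range request is clamped to 90.0
print(grow_temp.value)
```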
  • a platform may utilize a processor comprising a trained algorithm.
  • the trained algorithm may be trained on one or more agricultural products.
  • the trained algorithm may be trained to identify features in the data received from the one or more sensors and, based on the data received, predict outputs such as yield. Further, the trained algorithm may be trained to identify features in the data received from the one or more sensors and suggest changes in operating parameters of one or more actuators to enhance or optimize an output, such as yield production.
  • a trained algorithm may be trained to maximize yield production, minimize disease, minimize weed growth, or optimize agricultural nutritional content, animal welfare, water conservancy, soil conservancy, or any combination thereof.
  • a maximized yield production may be the most agricultural product produced in the smallest square footage of agricultural facility.
  • a maximized yield production may be the most agricultural product produced for the least amount of seed, water, or resource input.
  • a Farm Management System may be a complete software solution for managing an agricultural facility, such as a farm.
  • the platform may integrate task management, enterprise resource planning (ERP) solutions, crop planning, computer vision, supply chain management, as well as sensors and automation into a singular powerful platform: the one application an agricultural manager may need to manage a data-driven 21st century farm.
  • ERP enterprise resource planning
  • Collecting multiple functionalities into one platform may allow a user to find correlations amongst the data and take action or deliver useful insights with machine learning. This approach may give growers the feedback they need to maximize output and minimize inputs, all while optimizing for growing environmental conditions, nutritional value, yields, food safety, or any combination thereof.
  • a user may be able to collect, maintain, and compare data from multiple agricultural facilities at similar or different geographical locations. This ability can deliver the end consumer more insight into how one agricultural facility compares to another agricultural facility in practices, sustainability, food safety, nutrition, or any combination thereof.
  • the summary covers modules that may comprise the farm management system individually and also how they can be brought together to enable new functionality.
  • a series of sensors may be distributed at carefully chosen locations throughout the agricultural facility, collecting ongoing data on temperature, humidity, fertilizer content, fertilizer concentration, nutrient content, nutrient concentration, pH, light levels, light spectrums, air flow, air circulation, or any combination thereof.
  • Sensor arrays may open visibility into patterns in environmental variables such as temperature. For example, given a greenhouse with a wet wall for cooling at one end of it, there are differences in both temperature and humidity from the near side to the far side. Sensors placed in a grid system can identify, monitor and measure the various microclimates in the growing environments. This monitoring of microclimates in real-time can be input into a processor of the platform and the processor can direct an adjustment in an operating parameter of one or more actuators to correct an undesirable microclimate in a specific location in the agricultural facility. Real-time monitoring and adjustments to correct for undesirable microclimates can maximize agricultural product yield.
  • Data collected from one or more sensors may be processed in real-time, triggering actuators (such as farming equipment) to make changes in the growing environment; e.g., modifications to temperature, adding nutrients to fertigation supply, powering off lights, or any combination thereof.
  • triggering actuators such as farming equipment
  • actuator arrays may consist of vents, heaters, fans, lights, humidifiers, dehumidifiers, other devices, or any combination thereof that can act on the environment or a subset of the environment (e.g. microclimate of a sub-location of the agricultural facility).
  • Processors with built-in machine learning or Artificial Intelligence (AI) can utilize data from one or more sensors of a sensor array and learn best how to use one or more actuators of an array of actuators to achieve an outcome in a specific environment. For example, if the desired temperature is 72 degrees Fahrenheit, a network of fans and directional vents can be configured to correctly circulate the air in an even distribution pattern.
  • Camera arrays can capture data from a portion of a growing area or an entire growing area.
  • the primary use of imaging technology may be to correlate images with outcomes for growing plants, but it can also be used to monitor for pests, both insect and animal. If there is a pest detected, a user can know exactly which plants within the agricultural facility might be affected. This feature may be important for food safety assurances, research purposes, improving smart recipes, or any combination thereof.
  • Camera arrays may provide multiple perspectives of an agricultural product (such as different angles of a growing plant), data that can be utilized to determine agricultural product size (e.g. fruit size), leaf orientation, color, stage of disease, plant/animal stress, or any combination thereof.
  • camera-based biofeedback may be incorporated so that the processor can adjust the operating parameters (such as environmental conditions) based on data (such as leaf orientation, color, reproductive status, plant health or any combination thereof).
  • Two-dimensional camera technology may be incorporated in a variety of spectrums. Three-dimensional camera technology may unlock live modeling functionality and additional correlation possibilities.
  • ERP Enterprise Resource Planning
  • the farm management system may integrate a subset of ERP functionality with the focus being capturing data and utilizing that data to run an agricultural facility with maximum efficiency.
  • Features include but are not limited to: sales management, order management, customer relationship management, task management, standard operating procedures, user training, inventory, cost analysis, planning or any combination thereof.
  • the platform may be tailored specifically to the needs of growers and domain expertise may be incorporated into the platform.
  • the platform may comprise features allowing growers to collect information relevant for food safety compliance, to generate reports for certification, or a combination thereof.
  • Features of the platform may include:
  • IPM Integrated Pest Management
  • Nutrition certification: May provide data on agricultural product provenance to show that a batch of food meets a certain level of nutrition based on laboratory analysis of batch samples.
  • Sustainability certification: May provide insight to a food consumer into how their food was grown based on data collected by the platform.
  • Data collected that may be provided to a food consumer may include the amount of water used, the amount of energy used, the amount of soil square footage used, other resources used, metadata of how an agricultural product may be grown (for example, open field vs. soil greenhouse vs. hydroponic greenhouse, etc.), or any combination thereof.
  • Supply chain integration: May provide integration with supply chain data using REST and GraphQL APIs. This may be important for food safety and for efficiently executing a food recall.
  • the platform may provide integration with other ERP products such as NetSuite and Oracle ERP.
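A hedged sketch of pushing batch data to a supply-chain system over REST using the third-party requests library; the endpoint URL, bearer token, and payload fields are invented for illustration and are not a documented API of the platform or of NetSuite/Oracle ERP.

```python
# Sketch of pushing a harvested batch to a supply-chain system over REST.
# The endpoint, token, and payload fields are hypothetical placeholders.
import requests

payload = {
    "batch_id": "2020-10-02-TOM-17",
    "product": "tomato",
    "facility": "greenhouse-A",
    "harvest_date": "2020-10-02",
}
resp = requests.post(
    "https://supply-chain.example.com/api/batches",   # placeholder endpoint
    json=payload,
    headers={"Authorization": "Bearer <JWT token>"},
    timeout=10,
)
resp.raise_for_status()
```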
  • REST Representational State Transfer
  • Farm management, growers, and farm workers constitute distinct user groups.
  • the platform may be customized to each user group’s needs but built from a shared centralized data source. Managers and growers may each have access to a powerful customized dashboard and admin interface giving them a complete view of everything happening on the farm. Farm workers may have access to task management tools, barcode scanners, and applications for workstations such as harvesting, seeding, and packaging as well as access to Standard Operating Procedure documentation.
  • the platform may provide a plurality of discrete user interfaces.
  • the platform may provide separate user interfaces for agricultural managers, agricultural growers, and food consumers.
  • the platform may provide at least 2, 3, 4, 5, 6, 7, 8, 9, 10, 15, 20, 30, 40, or more discrete user interfaces.
  • the platform may provide at least 50 discrete user interfaces.
  • the platform may provide at least 500 discrete user interfaces.
  • the platform may provide at least 5,000 discrete user interfaces.
  • the platform may provide at least 50,000 discrete user interfaces.
  • the platform may provide at least 500,000 discrete user interfaces.
  • a discrete user interface may limit the data a user can access, whether a user can input data into the platform, and what type of request a user can enter into the interface, or any combination thereof. For example, data entry may be reserved for agricultural grower or manager interfaces.
  • agricultural growers using their mobile devices to access an application to input data may represent a food safety risk.
  • work stations can be set up ahead of time by following a sterilization standard operating procedure: for example, a harvesting station with a tablet computer and a smart scale can be sterilized and used with gloves. Smart watches can be sterilized before each use and used with sterile gloves.
  • Voice recognition systems can capture agricultural growers' logs, and Bluetooth-enabled scanners can scan QR codes or RFID chips.
  • Plant Recipes may be a format for encoding information about an organism's environmental requirements to achieve a desired phenotype, as well as any instructions or protocols needed to achieve a certain outcome.
  • the information in a recipe can be serialized in JSON, XML, or JSON-LD.
  • JSON Web Token or “JWT Token” is a JSON-based open standard (RFC 7519) for creating access tokens that assert some number of claims and may include user information including encrypted user information.
  • a Plant Recipe may be interpreted by a reactive module that parses a stream of event data (sensors, actuators, etc.) according to rules defined in the Plant Recipe.
  • This reactive module may emit its own stream of event data including messages, alerts, and new actions (such as adjusting an operating parameter by changing a setting on an actuator - turning on a heater or pump).
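A minimal sketch of the reactive idea above: sensor events are checked against rules carried in a Plant Recipe and actions or alerts are emitted; the rule format and action names are assumptions made for illustration.

```python
# Sketch of a reactive rule check: events in, recipe-driven actions out.
recipe_rules = [
    {"kind": "temperature", "max": 90.0, "action": "turn_on_fan"},
    {"kind": "temperature", "min": 65.0, "action": "turn_on_heater"},
    {"kind": "soil_moisture", "min": 0.25, "action": "turn_on_pump"},
]

def react(event, rules):
    """Return the actions a single sensor event triggers under the recipe."""
    actions = []
    for rule in rules:
        if rule["kind"] != event["kind"]:
            continue
        if "max" in rule and event["value"] > rule["max"]:
            actions.append(rule["action"])
        if "min" in rule and event["value"] < rule["min"]:
            actions.append(rule["action"])
    return actions

print(react({"kind": "temperature", "value": 93.0}, recipe_rules))  # ['turn_on_fan']
```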
  • Sensor data from successful grows can be used to create new recipes, in essence recording environmental variables for playback later. For example, a particular harvest of tomatoes may have higher nutrient content than another harvest of tomatoes. By taking sensor data from the environment for the duration of the grow, it may be encoded into the recipe format to recreate that grow in a different controlled environment or the next grow. With greater data inflow, recipes may become increasingly targeted and intelligent through the development of machine learning algorithms.
  • a smart recipe may be provided by a user.
  • a smart recipe may be at least in part an output provided by a trained algorithm from a previous grow cycle.
  • a smart recipe may be a combination of an initial recipe provided by a user or database that may be additionally modified or updated by a trained algorithm, such as updated during a grow cycle.
  • a smart recipe may be updated based on a subtype of agricultural product that may be grown.
  • a smart recipe may be updated based on a geographical location that the agricultural product may be grown.
  • a smart recipe may be updated based on a user feedback, such as a request from a food consumer for an agricultural product having a particular set of nutritional elements.
  • An agricultural product may be evaluated or ranked against a second agricultural product for one or more nutritional elements.
  • a nutrition element may include a presence of a mineral, an amount of a mineral, a presence of a vitamin, an amount of a vitamin, an amount of calories, an amount of sugar, an amount of salt, an amount of fat, a type of fat, or any combination thereof.
  • a trained algorithm may receive data from an array of sensors, determine one or more nutritional elements of the agricultural product, and direct a modification of one or more operating parameters of one or more actuators, a modification to a recipe, or a modification to a raw material to optimize the nutritional element or a panel of nutritional elements as compared to a control agricultural product.
  • machine learning models can be created to help growers improve their operation.
  • the purpose of the FMS may be to collect structured data that can be used to power machine learning services such as:
  • Crop planning: Sensor data and a database of plant recipes may be used to determine what plants would grow well in a particular growing environment and to give a user recommendations on how to optimize any environment for a given cultivar.
  • sales data may drive recommendations on what crops should be grown in a given geographical location.
  • Pest detection, predictions, and mitigations: Using data collected from the Integrated Pest Management portion of the FMS ERP alongside historical data, the platform can not only recognize individual pests from images but may also predict pest occurrence based on growing seasons, environmental conditions, other factors, or any combination thereof.
  • Biofeedback may also be incorporated via camera image-based collection, so that the processor can direct adjustment of one or more environmental conditions based on the biofeedback - leaf orientation, color, reproductive status, plant health, or any combination thereof.
  • the platform may provide a singular software solution to run a profitable, data-driven modern agricultural facility.
  • a user may have access to control of monitoring, automation and farm operations which ultimately may empower the user to run a more profitable and effective farm.
  • plant recipes such as smart recipes
  • SOPs proven Standard Operating Procedures
  • the platform may be implemented in farming systems of many different types (plant factory, greenhouse, hydroponic, aeroponic).
  • One of the key assets of the platform may be the data stream and database.
  • the platform may collect data (including location, types of grow systems already in place, sales and orders, market data, historical data for a particular environment, or any combination thereof) and the platform structures the data to deliver useful insights and accountability.
  • API and export features disclosed herein may allow for access to all this structured data allowing users (such as scientists) to use virtually any data analysis tools. Importantly, this may allow the user to link rich datasets to their publications increasing research transparency and ultimately allowing for better peer review and science.
  • the platform may allow these users to collect structured data from experiments and provide insight into the state of past, present, and future experiments.
  • the platform uses one or more sensors in an n x n array (or grid) or alternative geometric configuration to collect data at n discrete locations over a portion of an agricultural facility, which in some embodiments is digitized using pickup electronics and in some embodiments is connected to a computer for recording and displaying this data. It should be understood, however, that the platform is suitable for measuring data associated with any type of agricultural product.
  • a sensor is configured to sense data associated with, for example, a plant, an animal, an environment, a climate, a parameter of a location associated with an agricultural facility, or any combination thereof.
  • the platform comprises a mobile base unit that may be movable and that houses one or more sensors.
  • the platform comprises a mobile base unit, one or more sensors, and one or more actuators.
  • a mobile base unit includes wheels or a track upon which the mobile base unit is moved on a surface.
  • a trained algorithm may provide an output.
  • the output may comprise a yield prediction of at least one agricultural product, a disease prediction of at least one agricultural product, a weed detection in a grow cycle of at least one agricultural product, a crop quality of a grow cycle of at least one agricultural product, a species recognition of at least one agricultural product, an animal welfare rating of at least one agricultural product that is an animal-based product, a water management result of a grow of at least one agricultural product, a livestock production of at least one animal-based product, a soil management of a grow of at least one agricultural product, or any combination thereof.
  • a platform may comprise a machine learning module, such as a trained algorithm.
  • a machine learning module may be trained on one or more training data sets.
  • a machine learning module may be trained on at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, 50,000 data sets or more.
  • a machine learning module may be trained on from about 50 to about 200 data sets.
  • a machine learning module may be trained on from about 50 to about 1,000 data sets.
  • a machine learning module may be trained on from about 1,000 to about 5,000 data sets.
  • a machine learning module may be trained on from about 5 to about 500 data sets.
  • a machine learning module may generate a training data set from data acquired or extracted from a sensor or user.
  • a machine learning module may be validated with one or more validation data sets.
  • a validation data set may be independent from a training data set.
  • a training data set may comprise data provided by a sensor, data provided by a user, or any combination thereof.
  • a training data set may be stored in a database of the platform.
  • a training data set may be uploaded to the machine learning module from an external source.
  • a training data set may be generated from data acquired from a grow cycle.
  • a training data set may be updated continuously or periodically.
  • a training data set may comprise data from at least about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, or more different grows.
  • a training data set may comprise data from about 50 to about 200 different grows.
  • a training data set may comprise data from about 50 to about 1,000 different grows.
  • a training data set may comprise data from about 1,000 to about 5,000 different grows.
  • a training data set may comprise data from about 5 to about 500 different grows.
  • the sensed parameter(s) herein are received as an input by a processor to output a correlation.
  • the correlation herein is received as an input to a machine learning algorithm configured to output guidance or instruction for future grows.
  • the systems, methods, and media described herein may use machine learning algorithms for training prediction models and/or making predictions for a grow.
  • Machine learning algorithms herein may learn from and make predictions on data, such as data obtained from a sensor or user.
  • Data may be any input, intermediate output, previous outputs, or training information, or otherwise any information provided to or by the algorithms.
  • a machine learning algorithm may use a supervised learning approach.
  • the algorithm can generate a function or model from training data.
  • the training data can be labeled.
  • the training data may include metadata associated therewith.
  • Each training example of the training data may be a pair consisting of at least an input object and a desired output value.
  • a supervised learning algorithm may require a user to determine one or more control parameters. These parameters can be adjusted by optimizing performance on a subset, for example a validation set, of the training data. After parameter adjustment and learning, the performance of the resulting function/model can be measured on a test set that may be separate from the training set. Regression methods can be used in supervised learning approaches.
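A hedged sketch of that supervised workflow using scikit-learn (an assumed tool, not named in the disclosure): a control parameter is tuned on a validation subset and the chosen regression model is then measured once on a held-out test set, with synthetic grow-like data.

```python
# Sketch: tune a control parameter on a validation set, measure on a separate test set.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.uniform(60, 95, size=(300, 2))                        # e.g. temperature and humidity
y = 0.5 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 1, 300)     # e.g. yield (synthetic)

X_trainval, X_test, y_trainval, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X_trainval, y_trainval, test_size=0.25, random_state=0)

best_alpha, best_score = None, -np.inf
for alpha in (0.01, 0.1, 1.0, 10.0):                          # control parameter adjusted on validation data
    score = Ridge(alpha=alpha).fit(X_train, y_train).score(X_val, y_val)
    if score > best_score:
        best_alpha, best_score = alpha, score

final = Ridge(alpha=best_alpha).fit(X_trainval, y_trainval)
print(best_alpha, final.score(X_test, y_test))                # performance measured on the test set
```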
  • a machine learning algorithm may use an unsupervised learning approach.
  • the algorithm may generate a function/model to describe hidden structures from unlabeled data (e.g., a classification or categorization that cannot be directly observed or computed). Since the examples given to the learner are unlabeled, there is no evaluation of the accuracy of the structure that is output by the relevant algorithm.
  • Approaches to unsupervised learning include: clustering, anomaly detection, and neural networks.
  • a machine learning algorithm may use a semi-supervised learning approach.
  • Semi-supervised learning can combine both labeled and unlabeled data to generate an appropriate function or classifier.
  • a machine learning algorithm may use a reinforcement learning approach.
  • the algorithm can learn a policy of how to act given an observation of the world. Every action may have some impact in the environment, and the environment can provide feedback that guides the learning algorithm.
  • a machine learning algorithm may use a transduction approach.
  • Transduction can be similar to supervised learning, but does not explicitly construct a function. Instead, it tries to predict new outputs based on training inputs, training outputs, and new inputs.
  • a machine learning algorithm may use a “learning to learn” approach. In learning to learn, the algorithm can learn its own inductive bias based on previous experience.
  • a machine learning algorithm is applied to grow data to generate a prediction model.
  • a machine learning algorithm or model may be trained periodically.
  • a machine learning algorithm or model may be trained non-periodically.
  • a machine learning algorithm may include learning a function or a model.
  • the mathematical expression of the function or model may or may not be directly computable or observable.
  • For example, a model may take the form Y = C0 + C1*x1 + C2*x2, which has two predictor variables, x1 and x2, and coefficients or parameters C0, C1, and C2.
  • The predicted variable in this example is Y. After the parameters of the model are learned, values can be entered for each predictor variable in the model to generate a result for the dependent or predicted variable (e.g., Y).
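A short worked example of the model form above: once C0, C1, and C2 are learned, predictor values are plugged in to produce the predicted variable Y; the coefficient values below are made up for illustration.

```python
# Worked example of Y = C0 + C1*x1 + C2*x2 with illustrative learned coefficients.
C0, C1, C2 = 10.0, 0.8, -0.3       # learned intercept and coefficients (invented values)
x1, x2 = 75.0, 40.0                # e.g. growing temperature and humidity

Y = C0 + C1 * x1 + C2 * x2
print(Y)                           # predicted output, e.g. a yield estimate
```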
  • a machine learning algorithm comprises a supervised or unsupervised learning method such as, for example, support vector machine (SVM), random forests, gradient boosting, logistic regression, decision trees, clustering algorithms, hierarchical clustering, K-means clustering, or principal component analysis.
  • Machine learning algorithms may include linear regression models, logistical regression models, linear discriminate analysis, classification or regression trees, naive Bayes, K-nearest neighbor, learning vector quantization (LVQ), support vector machines (SVM), bagging and random forest, boosting and Adaboost machines, or any combination thereof.
  • Data input into a machine learning algorithm may include data obtained from an individual, data obtained from a practitioner, or a combination thereof.
  • Data input into a machine learning algorithm may include data extracted from a sensor, from a user, or a combination thereof.
  • Data input into a machine learning algorithm may include a product yield, an environmental condition, a pest resilience, a nutrient profile, a farming practice used, or any combination thereof.
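An illustrative sketch, assuming scikit-learn and synthetic data, of training a classifier on grow-level inputs of the kind listed above (temperature, a pest-resilience score, a nutrient index) to predict whether a grow meets a yield target.

```python
# Sketch only: scikit-learn and the synthetic features/labels are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X = rng.uniform([60, 0, 0], [95, 10, 100], size=(200, 3))     # temp_f, pest_score, nutrient_index
y = (X[:, 0] > 70) & (X[:, 0] < 88) & (X[:, 2] > 40)          # synthetic "met yield target" label

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict([[75.0, 2.0, 60.0]]))                       # prediction for a new grow
```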
  • Data obtained from one or more grows can be analyzed using feature selection techniques including filter techniques which may assess the relevance of one or more features by looking at the intrinsic properties of the data, wrapper methods which may embed a model hypothesis within a feature subset search, and embedded techniques in which a search for an optimal set of features may be built into a machine learning algorithm.
  • a machine learning algorithm may identify a set of parameters that may provide an optimized grow.
  • a machine learning algorithm may be trained with a training set of samples.
  • the training set of samples may comprise data collected from a grow, from different grows, or from a plurality of grows.
  • a training set of samples may comprise data from a database.
  • a training set of samples may include 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, or more samples.
  • a training set of samples may comprise a single data type.
  • a training set of samples may include different data types.
  • a training set of samples may comprise a plurality of data types.
  • a training set of samples may comprise at least three data types.
  • a training set of samples may include data obtained from about: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, or more grows.
  • a training set of samples may include data from a single grow.
  • a training set of samples may include data from different grows.
  • a training set of samples may include data from a plurality of grows.
  • Iterative rounds of training may occur to arrive at a set of features to classify data.
  • Different data types may be ranked differently by the machine learning algorithm.
  • One data type may be ranked higher than a second data type.
  • Weighting or ranking of data types may denote significance of the data type.
  • a higher weighted data type may provide an increased accuracy, sensitivity, or specificity of the classification or prediction of the machine learning algorithm.
  • an input parameter of growing temperature may significantly increase crop yield, more than any other input parameter. In this case, growing temperature may be weighted more heavily than other input parameters in increasing crop yield.
  • the weighting or ranking of features may vary from grow to grow.
  • the weighting or ranking of features may not vary from grow to grow.
  • a machine learning algorithm may be tested with a testing set of samples.
  • the testing set of samples may be different from the training set of samples. At least one sample of the testing set of samples may be different from the training set of samples.
  • the testing set of samples may comprise data collected from before a grow, during a grow, after a grow, from different grows, or from a plurality of grows.
  • a testing set of samples may comprise data from a database.
  • a training set of samples may include different data types - such as one or more input parameters and one or more output parameters.
  • a testing set of samples may include 1, 2, 3, 4, or more samples.
  • a testing set of samples may comprise a data type.
  • a testing set of samples may include different data types.
  • a testing set of samples may comprise a plurality of data types.
  • a testing set of samples may comprise at least three data types.
  • a testing set of samples may include data obtained from 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 20, 30, 40, 50, 60, 70, 80, 90, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1,000, 5,000, 10,000, 20,000, or more grows.
  • a testing set of samples may include data from a single grow.
  • a testing set of samples may include data from different grows.
  • a testing set of samples may include data from a plurality of grows.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% accuracy.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% sensitivity.
  • a machine learning algorithm may classify or predict an outcome with at least about: 80%, 85%, 90%, 95%, 96%, 97%, 98%, 99% specificity.
  • a machine learning algorithm may classify with 90% accuracy that an agricultural yield will not succumb to pest infestation.
  • a machine learning algorithm may classify a grow as having at least 90% likelihood of producing an agricultural product with superior nutritional profile as compared to a control.
  • a machine learning algorithm may predict at least 95% likelihood of an agricultural yield under a range of growing temperatures.
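A small sketch of how accuracy, sensitivity, and specificity could be computed for a classifier's predictions on a testing set; the labels below are toy values used only to show the arithmetic.

```python
# Toy confusion-matrix arithmetic for accuracy, sensitivity, and specificity.
y_true = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0, 1, 0]

tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy    = (tp + tn) / len(y_true)
sensitivity = tp / (tp + fn)          # true positive rate
specificity = tn / (tn + fp)          # true negative rate
print(accuracy, sensitivity, specificity)
```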
  • An independent sample may be independent from the training set of samples, the testing set of samples or both.
  • the independent sample may be input into the machine learning algorithm for classification.
  • An independent sample may not have been previously classified by the machine learning algorithm.
  • a classifier may be employed to determine or to predict a set of growing conditions to be executed during a grow.
  • a classifier may provide real-time feedback and guided adjustments of the one or more growing conditions - such as during a grow.
  • One or more growing conditions may be adjusted real-time during a grow.
  • a machine learning algorithm may identify an ‘ideal’ or ‘optimized’ input parameter for each grow.
  • An ‘ideal’ or ‘optimized’ input parameter may remain constant or may change over time.
  • An ‘ideal’ or ‘optimized’ input parameter may be specific or unique for each grow or agricultural product.
  • Feedback from a machine learning algorithm may be continuous such as feedback during a grow, episodic such as at the end of a grow, or roll-back such as cumulative changes over several different grows, or any combination thereof. Feedback from a machine learning algorithm may result in one or more changes in a recipe or operating parameter.
  • a trained algorithm (such as a machine learning software module) as described herein is configured to undergo at least one training phase wherein the trained algorithm is trained to carry out one or more tasks including data extraction, data analysis, and generation of an output or result, such as a recipe for growing an agricultural product with maximal yield or with maximal nutritional benefit.
  • the agricultural platform comprises a training module that trains the trained algorithm.
  • the training module is configured to provide training data to the trained algorithm, said training data comprising, for example, a data set from an agricultural facility or a data set from a previous grow.
  • a trained algorithm is trained using a data set and a target in a manner that might be described as supervised learning.
  • the data set is conventionally divided into a training set, a test set, and, in some cases, a validation set.
  • a target is specified that contains the correct classification of each input value in the data set. For example, a data set from one type of agricultural product is repeatedly presented to the trained algorithm, and for each sample presented during training, the output generated by the trained algorithm is compared with the desired target. The difference between the target and the generated output is calculated, and the trained algorithm is modified to cause the output to more closely approximate the desired target value, such as a maximized yield of the agricultural product.
  • a back-propagation algorithm is utilized to cause the output to more closely approximate the desired target value.
  • After sufficient training iterations, the trained algorithm output will closely match the desired target for each sample in the input training set.
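A toy numeric sketch of the compare-and-adjust training loop described above, using plain gradient descent on a one-parameter model as a stand-in for back-propagation in larger networks; the data and model are invented for illustration.

```python
# Outputs are compared to targets and the parameter is adjusted to shrink the difference.
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=200)                 # normalized sensor input (illustrative)
target = 2.0 * x + rng.normal(0.0, 0.05, size=200)  # desired output for each training sample

w, lr = 0.0, 0.1
for _ in range(500):
    output = w * x                      # model output for the training set
    error = output - target             # difference between output and desired target
    w -= lr * (2 * error * x).mean()    # adjust the parameter so output approaches the target

print(w)   # close to the underlying 2.0
```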
  • when the trained algorithm is presented with new input data not used during training, it may generate an output classification value indicating which of the categories the new sample is most likely to fall into.
  • the trained algorithm is said to be able to “generalize” from its training to new, previously unseen input samples. This feature of a trained algorithm allows it to be used to classify almost any input data which has a mathematically formulatable relationship to the category to which it should be assigned.
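
As a minimal illustration of the supervised workflow described above (dividing a data set into training and test sets, fitting toward a target such as yield, and checking generalization on samples not seen in training), the following Python sketch uses synthetic data and assumed feature names; it is not the disclosed training procedure itself.

```python
# Minimal supervised-learning sketch, assuming synthetic grow data and
# illustrative feature names (water, light, temperature).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# columns: water (L/day), light (hours/day), mean temperature (deg C)
X = rng.uniform([1.0, 8.0, 15.0], [5.0, 16.0, 30.0], size=(200, 3))
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] - 0.1 * (X[:, 2] - 22) ** 2 + rng.normal(0, 0.5, 200)

# divide the data set into a training set and a test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# "generalization": evaluate on grows the model never saw during training
print("held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 2))
```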
  • the trained algorithm utilizes an individual learning model.
  • An individual learning model is based on the trained algorithm having trained on data from a single individual and thus, a trained algorithm that utilizes an individual learning model is configured to be used on a single individual on whose data it was trained.
  • the trained algorithm utilizes a global training model.
  • a global training model is based on the trained algorithm having trained on data from multiple individuals and thus, a trained algorithm that utilizes a global training model is configured to be used on multiple individuals.
  • the trained algorithm utilizes a simulated training model.
  • a simulated training model is based on the trained algorithm having trained on a data set obtained from a grow of an agricultural product.
  • a trained algorithm that utilizes a simulated training model is configured to be used on multiple grows of an agricultural product.
  • Unsupervised learning is used, in some embodiments, to train a trained algorithm to use input data such as, for example, agricultural product data and output, for example, a maximized yield or disease detection.
  • Unsupervised learning in some embodiments, includes feature extraction which is performed by the trained algorithm on the input data. Extracted features may be used for visualization, for classification, for subsequent supervised training, and more generally for representing the input for subsequent storage or analysis. In some cases, each training case may consist of a plurality of agricultural product data.
  • Trained algorithms that are commonly used for unsupervised training include k-means clustering, mixtures of multinomial distributions, affinity propagation, discrete factor analysis, hidden Markov models, Boltzmann machines, restricted Boltzmann machines, autoencoders, convolutional autoencoders, recurrent neural network autoencoders, and long short-term memory autoencoders. While there are many unsupervised learning models, they all have in common that, for training, they require a training set consisting of a data set of grows of an agricultural product, without associated labels.
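
A minimal sketch of one such unsupervised technique, k-means clustering of unlabeled grow records, follows; the two features and the synthetic data are illustrative assumptions.

```python
# Minimal unsupervised-learning sketch: k-means clustering of unlabeled grows.
# Features and data are illustrative assumptions, not platform measurements.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# unlabeled grows described by [daily water (L), mean temperature (deg C)]
grows = np.vstack([
    rng.normal([2.0, 18.0], 0.3, size=(40, 2)),   # cooler, drier grows
    rng.normal([4.0, 26.0], 0.3, size=(40, 2)),   # warmer, wetter grows
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(grows)
print("cluster sizes:", np.bincount(kmeans.labels_))
```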
  • a trained algorithm may include a training phase and a prediction phase.
  • during the training phase, the machine learning algorithm is typically provided with training data.
  • types of data inputted into a trained algorithm for the purposes of training include an agricultural product yield, an amount and type of raw materials, an environmental condition during a grow, a length of grow, a nutritional profile of the agricultural product, a soil composition, or any combination thereof.
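
One possible way to assemble the training inputs listed above into a single numeric training record is sketched below; the field names and values are illustrative assumptions rather than a prescribed schema.

```python
# Minimal sketch: assembling one training record from the inputs listed above.
# Field names and values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class GrowRecord:
    yield_kg: float                 # agricultural product yield (the target)
    fertilizer_kg: float            # amount of a raw material
    mean_temp_c: float              # environmental condition during the grow
    grow_days: int                  # length of the grow
    vitamin_c_mg_per_100g: float    # one element of the nutritional profile
    soil_ph: float                  # one element of the soil composition

    def features(self) -> list[float]:
        """Flatten the record into the feature vector given to the algorithm."""
        return [self.fertilizer_kg, self.mean_temp_c, float(self.grow_days), self.soil_ph]

record = GrowRecord(120.5, 8.0, 22.3, 75, 48.0, 6.4)
print(record.features(), "-> target yield:", record.yield_kg)
```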
  • Data that is inputted into the trained algorithm is used, in some embodiments, to construct a hypothesis function to determine the presence of an abnormality.
  • a trained algorithm is configured to determine if the outcome of the hypothesis function was achieved and based on that analysis make a determination with respect to the data upon which the hypothesis function was constructed.
  • the outcome tends to either reinforce or contradict the hypothesis function with respect to the data upon which the hypothesis function was constructed.
  • the machine learning algorithm will either adopt, adjust, or abandon the hypothesis function with respect to the data upon which the hypothesis function was constructed.
  • the machine learning algorithm described herein dynamically learns through the training phase what characteristics of an input (e.g., data) are most predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
  • a trained algorithm is provided with data on which to train so that it, for example, is able to determine the most salient features of a received agricultural product data to operate on.
  • the trained algorithms described herein train as to how to analyze the agricultural product data, rather than analyzing the agricultural product data using pre-defined instructions.
  • the trained algorithms described herein dynamically learn through training what characteristics of an input signal are most predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
  • the trained algorithm is trained by repeatedly presenting the trained algorithm with agricultural product data spanning a range of successful and non-successful grows.
  • the trained algorithm may be presented with data from grows having high yield and data from grows producing no product.
  • the trained algorithm may be presented with data from grows having a high carbon footprint and data from grows having a minimized carbon footprint.
  • a trained algorithm may receive heterogeneous data conveying the range and variability of data that the trained algorithm may encounter in a future grow.
  • Agricultural product data may be generated by computer simulation.
  • training begins when the trained algorithm is given agricultural product data and asked to optimize a crop yield, minimize pest infestation or carbon footprint of a product, maximize profit or nutritional value of a product, or any combination thereof.
  • the predicted output is then compared to the true data that corresponds to the agricultural product data.
  • An optimization technique, such as gradient descent with backpropagation, is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the probability predicted by the trained algorithm and the optimized result. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
  • An optimization technique is used to update the weights in each layer of the trained algorithm so as to produce closer agreement between the data predicted by the trained algorithm, and the true data. This process is repeated with new agricultural product data until the accuracy of the network has reached the desired level.
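
A minimal sketch of such an update loop is given below: a single linear layer stands in for the trained algorithm, and its weights are repeatedly adjusted by gradient descent until the predictions agree closely with known targets. All data are synthetic and illustrative.

```python
# Minimal gradient-descent update loop, assuming a single linear layer as a
# stand-in for the trained algorithm and synthetic, illustrative data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(100, 3))        # three grow parameters, rescaled to [0, 1]
true_w = np.array([3.0, -1.0, 2.0])
y = X @ true_w + rng.normal(0.0, 0.01, 100)     # "true" yields with a little noise

w = np.zeros(3)                                  # initial weights of the stand-in model
learning_rate = 0.1
for step in range(2000):
    pred = X @ w
    grad = 2.0 * X.T @ (pred - y) / len(y)       # gradient of the mean squared error
    w -= learning_rate * grad                    # nudge predictions toward the targets
    if np.mean((pred - y) ** 2) < 1e-3:          # stop once agreement is close enough
        break

print("learned weights:", np.round(w, 2))        # approaches [3.0, -1.0, 2.0]
```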
  • a machine learning algorithm may be trained using a large database of measurements and/or any features or metrics computed from the above-described data, together with the corresponding ground-truth values.
  • the training phase constructs a transformation function that is predictive in optimizing a crop yield, minimizing pest infestation or carbon footprint of a product, maximizing profit or nutritional value of a product, or any combination thereof.
  • the machine learning algorithm dynamically learns through training what characteristics or features of an input signal are most predictive in optimizing the features of an agricultural product, such as nutritional profile.
  • a prediction phase uses the constructed and optimized transformation function from the training phase to predict the optimization of the grow and product yield.
  • the trained algorithm may be used to maximize, for example, the agricultural product yield on which the system was trained using the prediction phase.
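
The following minimal sketch separates the two phases described above: a simple quadratic "transformation function" is fit to past grows in a training phase, and the prediction phase then scores candidate growing temperatures for an independent grow and selects the highest predicted yield. The model form and all figures are illustrative assumptions.

```python
# Minimal training-phase / prediction-phase sketch with an assumed quadratic
# yield model; all values are illustrative, not platform data.
import numpy as np

# Training phase (stand-in): fit a quadratic yield model to past grows.
temps = np.array([16.0, 18.0, 20.0, 22.0, 24.0, 26.0, 28.0])
yields = np.array([60.0, 72.0, 82.0, 88.0, 86.0, 78.0, 64.0])
coeffs = np.polyfit(temps, yields, deg=2)          # learned "transformation function"

# Prediction phase: evaluate candidate temperatures for a new, independent grow.
candidates = np.linspace(15.0, 30.0, 151)
predicted = np.polyval(coeffs, candidates)
best = candidates[np.argmax(predicted)]
print(f"predicted optimal temperature: {best:.1f} C, "
      f"predicted yield: {predicted.max():.1f} kg")
```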
  • the system can predict the optimized product yield in an independent grow cycle.

Data Filtering
  • data that is received by a machine learning algorithm software module from a sensor as an input may comprise agricultural product data that has been filtered and/or modified.
  • filtering comprises removal of noise or artifacts from sensed data, such as noise perturbations or temperature fluctuations from a grower entering or exiting a facility.
  • Artifacts or noise may comprise, for example, ambient signals that are sensed together with data sensed in an agricultural facility.
  • sensed agricultural product data is filtered prior to and/or after transmission of said data to a processor.
  • Filtering of sensed agricultural product data may, for example, comprise the removal of ambient signal noise from a sensed agricultural product data.
  • Signal noise may comprise, for example, ambient agricultural product data generated by electronic devices, electrical grids, or other devices.
  • sensed agricultural product data is converted to another form of data or signal which then undergoes a signal filtering process.
  • a device or system includes a processor including software that is configured to convert sensed agricultural product data to another form of data or signal.
  • the process of converting sensed agricultural product data to another form of data or signal typically comprises an encoding process, wherein a first form of data is converted into a second form of data or signal. Once filtered, the filtered data may be transmitted to a machine learning algorithm for analysis.
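
A minimal sketch of such a filtering step is shown below: a rolling median suppresses a short spike (for example, from a grower entering the facility) in a sensed temperature series before the data are passed on for analysis. The readings are illustrative assumptions.

```python
# Minimal filtering sketch: a rolling median removes a short spike from a
# sensed temperature series. Readings are illustrative assumptions.
import numpy as np

def rolling_median(values, window=5):
    """Replace each reading with the median of its surrounding window."""
    padded = np.pad(values, window // 2, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(len(values))])

readings = np.array([21.0, 21.1, 21.0, 24.5, 21.2, 21.1, 21.0, 20.9])  # one spike
print(rolling_median(readings))   # the spike at index 3 is suppressed
```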
  • FIG. 1 shows an embodiment of a system such as used in an agricultural platform as described herein, comprising a digital processing device 101.
  • the digital processing device 101 includes a software application configured for agriculture management. Alternatively or in combination, the digital processing device 101 is configured to generate a trained algorithm (e.g., machine learning algorithm) such as by training the algorithm with a training data set.
  • the digital processing device 101 may include a central processing unit (CPU, also “processor” and “computer processor” herein) 105, which can be a single core or multi-core processor, or a plurality of processors for parallel processing.
  • the digital processing device 101 also includes either memory or a memory location 110 (e.g., random-access memory, read-only memory, flash memory), an electronic storage unit 115 (e.g., hard disk), a communication interface 120 (e.g., network adapter, network interface) for communicating with one or more other systems, and peripheral devices, such as cache.
  • the peripheral devices can include storage device(s) or storage medium 165 which communicate with the rest of the device via a storage interface 170.
  • the memory 110, storage unit 115, interface 120 and peripheral devices are configured to communicate with the CPU 105 through a communication bus 125, such as a motherboard.
  • the digital processing device 101 can be operatively coupled to a computer network (“network”) 130 with the aid of the communication interface 120.
  • the network 130 can comprise the Internet and/or a local area network (LAN).
  • the network 130 can be a telecommunication and/or data network.
  • the digital processing device 101 includes input device(s) 145 to receive information from a user, the input device(s) in communication with other elements of the device via an input interface 150.
  • the input device(s) includes a remote device such as a smartphone or tablet that is configured to communicate remotely with the digital processing device 101.
  • a user may use a smartphone application to access sensor data, current actuator instructions, the smart recipe, or other information stored on the digital processing device 101.
  • the digital processing device 101 can include output device(s) 155 that communicate with other elements of the device via an output interface 160.
  • the CPU 105 is configured to execute machine-readable instructions embodied in a software application or module.
  • the instructions may be stored in a memory location, such as the memory 110.
  • the memory 110 may include various components (e.g., machine readable media) including, but not limited to, a random access memory component (e.g., RAM, such as a static RAM "SRAM" or a dynamic RAM "DRAM"), or a read-only component (e.g., ROM).
  • the memory 110 can also include a basic input/output system (BIOS), including basic routines that help to transfer information between elements within the digital processing device, such as during device start-up.
  • the storage unit 115 can be configured to store files, such as sensor data, smart recipe(s), etc.
  • the storage unit 115 can also be used to store an operating system, application programs, and the like.
  • storage unit 115 may be removably interfaced with the digital processing device (e.g., via an external port connector (not shown)) and/or via a storage unit interface.
  • Software may reside, completely or partially, within a computer-readable storage medium within or outside of the storage unit 115.
  • software such as the software application and/or module(s) may reside, completely or partially, within processor(s) 105.
  • Information and data can be displayed to a user through a display 135.
  • the display is connected to the bus 125 via an interface 140, and transport of data between the display and other elements of the device 101 can be controlled via the interface 140.
  • Methods as described herein can be implemented by way of machine (e.g., computer processor) executable code stored on an electronic storage location of the digital processing device 101, such as, for example, on the memory 110 or electronic storage unit 115.
  • the machine executable or machine readable code can be provided in the form of a software application or software module.
  • the code can be executed by the processor 105.
  • the code can be retrieved from the storage unit 115 and stored on the memory 110 for ready access by the processor 105.
  • the electronic storage unit 115 can be precluded, and machine-executable instructions are stored on memory 110.
  • one or more remote devices 102 are configured to communicate with and/or receive instructions from the digital processing device 101, and may comprise any sensor, actuator, or camera as described herein.
  • the remote device 102 is a temperature sensor that is configured to gather temperature data and send the data to the digital processing device 101 for analysis according to a smart recipe.
  • the sensor can provide information such as sensor data, type of data, sensor ID, sensor location, metadata, or other data.
  • the remote device 102 is an actuator configured to perform one or more actions based on instructions received from the digital processing device 101.
  • the remote device is a camera configured to provide a camera feed or imaging data to the digital processing device 101. The camera may be configured to receive and respond to instructions to perform an action such as, for example, turning on/off, rotating or moving, and/or zooming in or out.
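
The arrangement described above for FIG. 1 might be exercised, in outline, as in the following Python sketch: a remote sensor reports a reading with its metadata, the digital processing device compares the reading against a value taken from a recipe, and an actuator receives an instruction. The class names, recipe value, and heater actuator are illustrative assumptions.

```python
# Minimal sketch of the FIG. 1 arrangement, assuming illustrative class names,
# a recipe temperature value, and a heater actuator.
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    kind: str
    location: str
    value: float

class HeaterActuator:
    def apply(self, instruction: str) -> None:
        print(f"actuator: {instruction}")

class DigitalProcessingDevice:
    def __init__(self, target_temp_c: float, actuator: HeaterActuator):
        self.target_temp_c = target_temp_c      # value taken from the smart recipe
        self.actuator = actuator

    def receive(self, reading: SensorReading) -> None:
        # compare the sensed value against the recipe and direct the actuator
        if reading.kind == "temperature" and reading.value < self.target_temp_c:
            self.actuator.apply(f"increase heating in {reading.location}")

device = DigitalProcessingDevice(target_temp_c=22.0, actuator=HeaterActuator())
device.receive(SensorReading("t-01", "temperature", "greenhouse bay 2", 19.5))
```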
  • Embodiment 1 A platform for agriculture management, the platform comprising: a trained algorithm that provides a smart recipe for growing an agricultural product, wherein the trained algorithm is configured to (i) receive data at least in part from an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility and (ii) direct at least one actuator of an array of actuators positioned at a location in the agricultural facility to adjust one or more operating parameters according to the smart recipe, wherein the trained algorithm is trained on at least five agricultural products and wherein the trained algorithm optimizes the smart recipe to maximize agricultural product yield.
  • Embodiment 2 The platform of embodiment 1, wherein the agricultural product comprises an animal-based product.
  • Embodiment 3 The platform of any one of embodiments 1-2, wherein the agricultural product comprises a plant-based product.
  • Embodiment 4 The platform of any one of embodiments 1-3, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 5 The platform of any one of embodiments 1-4, wherein the at least five agricultural products comprise different agricultural products.
  • Embodiment 6 The platform of any one of embodiments 1-5, wherein the array of sensors comprise a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 7 The platform of any one of embodiments 1-6, wherein the array of sensors comprise at least 5 different types of sensors.
  • Embodiment 8 The platform of any one of embodiments 1-7, wherein the one or more operating parameters comprise a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • Embodiment 9 The platform of any one of embodiments 1-8, wherein the location comprises two or more locations.
  • Embodiment 10 The platform of any one of embodiments 1-9, wherein the location is remote.
  • Embodiment 11 The platform of any one of embodiments 1-10, wherein the location is within the agricultural facility.
  • Embodiment 12 The platform of any one of embodiments 1-11, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 13 The platform of any one of embodiments 1-12, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 14 A platform for agriculture management, the platform comprising: (a) an array of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) an array of actuators, wherein an actuator is positioned at a location related to the agricultural facility; and (c) a processor configured to receive data at least in part from a sensor of the array of sensors and upon receipt of the data direct a change in an operating parameter of at least one actuator of the array of actuators, wherein the change in the operating parameter is calculated by a trained algorithm configured for maximizing agricultural product yield.
  • Embodiment 15 The platform of embodiment 14, wherein the array of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 16 The platform of any one of embodiments 14-15, wherein the array of sensors comprise at least 5 different types of sensors.
  • Embodiment 17 The platform of any one of embodiments 14-16, wherein the array of actuators comprises a water source, a light source, a nutrient source, a wind source, a temperature source, or any combination thereof.
  • Embodiment 18 The platform of any one of embodiments 14-17, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 19 The platform of any one of embodiments 14-18, wherein the operating parameter comprises a water amount provided to the agricultural product, a light amount provided to the agricultural product, a soil composition provided to the agricultural product, or any combination thereof.
  • Embodiment 20 The platform of any one of embodiments 14-19, wherein the processor directs a change in at least three operating parameters based on receipt of the data.
  • Embodiment 21 The platform of any one of embodiments 14-20, wherein the location comprises two or more locations.
  • Embodiment 22 The platform of any one of embodiments 14-21, wherein the location is remote.
  • Embodiment 23 The platform of any one of embodiments 14-22, wherein the location is within the agricultural facility.
  • Embodiment 24 The platform of any one of embodiments 14-23, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 25 The platform of any one of embodiments 14-24, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 26 A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a processor configured to receive data at least in part from at least one sensor of the plurality of sensors; and (c) a user interface configured to receive a request for an agricultural product from a user, and upon receipt of the request, the processor is configured to provide a result to the user based on the data, wherein the result comprises a nutritional result of the agricultural product, a food safety result of the agricultural product, a provenance of the agricultural product, or any combination thereof.
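
A minimal sketch of the request flow recited in the embodiment above is given below: a user interface forwards a food consumer's request, and the processor returns only the requested fields (nutritional result, food safety result, provenance) from stored sensor-derived records. The product name, fields, and values are illustrative assumptions.

```python
# Minimal request-handling sketch, assuming illustrative product records and
# field names; not the disclosed platform's data model.
product_records = {
    "cherry tomatoes, lot 42": {
        "nutritional": {"vitamin_c_mg_per_100g": 23.0, "sugar_g_per_100g": 3.9},
        "food_safety": "no pesticide residue detected",
        "provenance": "hydroponic farm, greenhouse 3, harvested 2020-09-14",
    }
}

def handle_request(product: str, fields: list[str]) -> dict:
    """Return only the requested fields for the named agricultural product."""
    record = product_records.get(product, {})
    return {field: record.get(field, "not available") for field in fields}

print(handle_request("cherry tomatoes, lot 42", ["nutritional", "provenance"]))
```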
  • Embodiment 27 The platform of embodiment 26, further comprising a database, wherein the database comprises a data set received from a plurality of agricultural facilities.
  • Embodiment 28 The platform of any one of embodiments 26-27, wherein the processor comprises a trained algorithm trained to compare the data set from the database to the data received from the at least one sensor to produce the result.
  • Embodiment 29 The platform of any one of embodiments 26-28, wherein the plurality of agricultural facilities is at least 5.
  • Embodiment 30 The platform of any one of embodiments 26-29, wherein the plurality of agricultural facilities are located in different geographical locations.
  • Embodiment 31 The platform of any one of embodiments 26-30, wherein the request comprises a provenance of the agricultural product, a farming practice of the agricultural product, a nutritional panel of the agricultural product, a food safety of the agricultural product, or any combination thereof.
  • Embodiment 32 The platform of any one of embodiments 26-31, wherein the user is a food consumer.
  • Embodiment 33 The platform of any one of embodiments 26-32, wherein the user is a business entity that sells the agricultural product to a consumer.
  • Embodiment 34 The platform of any one of embodiments 26-33, wherein the location comprises two or more locations.
  • Embodiment 35 The platform of any one of embodiments 26-34, wherein the location is remote.
  • Embodiment 36 The platform of any one of embodiments 26-35, wherein the location is within the agricultural facility.
  • Embodiment 37 The platform of any one of embodiments 26-36, wherein the plurality of sensors comprises a camera, a temperature sensor, a light sensor, a pH sensor, or any combination thereof.
  • Embodiment 38 The platform of any one of embodiments 26-37, wherein the plurality of sensors comprise at least 5 different types of sensors.
  • Embodiment 39 The platform of any one of embodiments 26-38, wherein the agricultural facility is a fish farm, a dairy farm, a livestock farm, a crop farm, an orchard, an indoor farm, a hydroponic farm or any combination thereof.
  • Embodiment 40 The platform of any one of embodiments 26-39, wherein the agricultural product comprises an animal-based product.
  • Embodiment 41 The platform of any one of embodiments 26-40, wherein the agricultural product comprises a plant-based product.
  • Embodiment 42 The platform of any one of embodiments 26-41, wherein the data comprises an input from an agricultural grower, an agricultural manager, or a combination thereof.
  • Embodiment 43 The platform of any one of embodiments 26-42, wherein the data comprises data collected from previous grows of the agricultural product.
  • Embodiment 44 A platform for agriculture management, the platform comprising: (a) a plurality of sensors, wherein a sensor is positioned at a location related to an agricultural facility; (b) a plurality of actuators, wherein an actuator is positioned at a location related to the agricultural facility; (c) a processor configured to receive data from at least one sensor of the plurality of sensors; and (d) a plurality of discrete user interfaces, wherein a user interface of the plurality is configured to (i) receive data from a user, (ii) receive a request from a user, (iii) provide a result to a user, or (iv) any combination thereof, wherein a first user interface of the plurality is configured for a food consumer.
  • Embodiment 45 The platform of embodiment 44, wherein the plurality of discrete user interfaces is at least three.
  • Embodiment 46 The platform of any one of embodiments 44-45, wherein a second user interface is configured for an agricultural grower.
  • Embodiment 47 The platform of any one of embodiments 44-46, wherein a second user interface is configured for an agricultural manager.
  • a food consumer values sourcing animal-based products from farms that operate with high animal welfare standards.
  • the food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more animal welfare standards reported by one or more farms that supply the animal-based product that the food consumer is considering purchasing. Based on the food consumer's review of the animal welfare standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an animal-based product.
  • a food consumer values sourcing agricultural-based products from farms that operate with high water conservation and soil preservation standards. The food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more water conservation or soil preservation standards reported by one or more farms that supply the agricultural-based product that the food consumer is considering purchasing. Based on the food consumer's review of the water conservation or soil preservation standards reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural-based product.
  • a food consumer values sourcing food products that contain high nutritional content.
  • the food consumer will access an agricultural management platform via the food consumer portal (such as via an application on a personal electronic device) to review one or more nutritional profiles (comprising one or more nutrition elements) of an agricultural product reported by one or more farms that supply the agricultural product that the food consumer is considering purchasing. Based on the food consumer's review of the nutritional profiles reported and displayed in the food consumer portal, the food consumer will make a decision to purchase an agricultural product.
  • Example 4 A processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe. During the grow cycle, a drought begins in the location of the grow, significantly reducing the amount of rainfall received by the agricultural product. Data related to the significant reduction of rainfall will be provided to the processor from one or more sensors. The processor will then direct a change in one or more actuators during the grow cycle to increase the amount of water provided to the agricultural product.
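
A minimal sketch of the adjustment in Example 4 is shown below: when sensed rainfall falls short of what the recipe expects, the processor directs irrigation actuators to supply the shortfall. The recipe value and sensor figures are illustrative assumptions.

```python
# Minimal drought-response sketch, assuming an illustrative weekly rainfall
# target from the recipe and an illustrative sensed rainfall figure.
def irrigation_adjustment(expected_rain_mm: float, sensed_rain_mm: float) -> float:
    """Extra irrigation (mm of water) to supply for the current grow period."""
    shortfall = expected_rain_mm - sensed_rain_mm
    return max(0.0, shortfall)

recipe_expected_rain_mm = 25.0       # per week, from the recipe
sensed_rain_mm = 4.0                 # reported by rainfall sensors during the drought
extra = irrigation_adjustment(recipe_expected_rain_mm, sensed_rain_mm)
print(f"direct irrigation actuators to supply an extra {extra:.0f} mm this week")
```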
  • a processor of an agriculture management platform initiates a grow of an agricultural product based on a recipe.
  • a moisture-based pest infestation develops in a particular sub-location of the grow, significantly reducing the amount of product yield in that sub-location.
  • Data related to the significant reduction of product yield in that sub-location will be provided to the processor from one or more sensors.
  • the processor will then direct a change in one or more actuators within the sub-location (a subset of actuators in the array) during the grow cycle to reduce the moisture content within the sub-location to eradicate or reduce damage by the moisture-based pest infestation to the agricultural product.
  • Moisture data collected from the sub-location during the grow cycle will be incorporated into a trained algorithm of the processor to inform future grows of the agricultural product or to minimize risk of moisture-based pest infestations in future grows of the agricultural product.
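
One simple way the newly collected moisture data could be folded back into a trained algorithm to inform future grows is sketched below; the stored records, the single moisture feature, and the refit step are illustrative assumptions.

```python
# Minimal retraining sketch: appending new sub-location moisture data to a
# historical set and refitting an infestation-risk model. All values are
# illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

# historical records: [mean moisture (%)], label 1 = pest infestation occurred
X_hist = np.array([[55.0], [60.0], [72.0], [80.0], [58.0], [85.0]])
y_hist = np.array([0, 0, 1, 1, 0, 1])

# new data collected from the affected sub-location during this grow cycle
X_new, y_new = np.array([[78.0]]), np.array([1])

model = LogisticRegression().fit(np.vstack([X_hist, X_new]),
                                 np.concatenate([y_hist, y_new]))
risk = model.predict_proba([[70.0]])[0, 1]
print(f"estimated infestation risk at 70% moisture: {risk:.0%}")
```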
  • An agricultural management platform will have three distinct user portals.
  • a first user portal will be configured for a food consumer.
  • the first user portal for the food consumer will permit access to nutritional information of the agricultural product, a geographical location of a grow, a farming practice (such as organic grow, non-GMO grow, hormone-free grow, antibiotic-free grow, animal welfare standards, wild or farm raised, caged or open access or free range) of an agricultural product, or any combination thereof.
  • the first user portal will permit a food consumer to provide a feedback to a farm or to another food consumer, a rating of an agricultural product, a question to a farm or to another food consumer, or any combination thereof.
  • a second user portal will be configured for an agricultural grower.
  • the second user portal will permit the agricultural grower to input, review or modify one or more operating parameters, outputs, data, recipes, or any combination thereof.
  • the third user portal will be configured for an agricultural manager.
  • the third user portal will permit the agricultural manager to input, review, or modify one or more operating parameters, outputs, data, recipes, or any combination thereof.
  • the agricultural manager will communicate with the agricultural grower via the individual portals by providing feedback, comments, questions or any combination thereof.
  • a food consumer, an agricultural manager, or agricultural grower will communicate with each other via the user portals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • Development Economics (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Data Mining & Analysis (AREA)
  • Game Theory and Decision Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Marketing (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Tourism & Hospitality (AREA)
  • General Business, Economics & Management (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)
  • Greenhouses (AREA)
  • Cultivation Of Plants (AREA)

Abstract

Disclosed herein are platforms, methods, software, systems, and devices for the management of agricultural products. In some embodiments, a platform as described herein comprises a plurality of sensors, such as an array of sensors. In some embodiments, a platform as described herein comprises a plurality of actuators, such as an array of actuators. A processor may receive data from at least one sensor and may direct a change in an operating parameter of at least one actuator. In some embodiments, a platform as described herein comprises a database that comprises a data set, such as a data set comprising data from a plurality of agricultural facilities.
PCT/US2020/054120 2019-10-03 2020-10-02 Plateformes agricoles WO2021067847A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962910346P 2019-10-03 2019-10-03
US62/910,346 2019-10-03

Publications (1)

Publication Number Publication Date
WO2021067847A1 true WO2021067847A1 (fr) 2021-04-08

Family

ID=75338598

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/054120 WO2021067847A1 (fr) 2019-10-03 2020-10-02 Plateformes agricoles

Country Status (1)

Country Link
WO (1) WO2021067847A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060282274A1 (en) * 2005-06-10 2006-12-14 Bennett Michael S Monitoring and managing farms
US20180263171A1 (en) * 2014-04-21 2018-09-20 The Climate Corporation Generating an agriculture prescription
US20170023193A1 (en) * 2015-05-18 2017-01-26 Biological Innovation & Optimization Systems, LLC Grow Light Embodying Power Delivery and Data Communications Features
US20180262571A1 (en) * 2016-03-04 2018-09-13 Sabrina Akhtar Integrated IoT (Internet of Things) System Solution for Smart Agriculture Management
US20190133026A1 (en) * 2016-04-04 2019-05-09 Freight Farms, Inc. Modular Farm Control and Monitoring System

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220156603A1 (en) * 2020-11-17 2022-05-19 International Business Machines Corporation Discovering farming practices

Similar Documents

Publication Publication Date Title
Dharmaraj et al. Artificial intelligence (AI) in agriculture
US20220075344A1 (en) A method of finding a target environment suitable for growth of a plant variety
Khandelwal et al. Artificial intelligence in agriculture: An emerging era of research
CN111476149A (zh) 一种植物培育控制方法和系统
Mishra et al. Artificial intelligence and machine learning in agriculture: Transforming farming systems
CN117036088A (zh) 一种ai识别绿化植物生长态势的数据采集分析方法
Dadios et al. Automation and control for adaptive management system of urban agriculture using computational intelligence
WO2021067847A1 (fr) Plateformes agricoles
Araneta et al. Controlled Environment for Spinach Cultured Plant with Health Analysis using Machine Learning
Pal et al. A survey on IoT-based smart agriculture to reduce vegetable and fruit waste
Saha et al. ML-based smart farming using LSTM
Jackson et al. Robust Ensemble Machine Learning for Precision Agriculture
Baburao et al. Review of Machine Learning Model Applications in Precision Agriculture
Pabitha et al. A digital footprint in enhancing agricultural practices with improved production using machine learning
Rai et al. Application of Machine Learning in Agriculture with Some Examples
Kuppusamy et al. Machine Learning-Enabled Internet of Things Solution for Smart Agriculture Operations
Guragain et al. A low-cost centralized IoT ecosystem for enhancing oyster mushroom cultivation
Mongia et al. Impact of Assistive Technologies in Addressing Challenges in Indoor Farming: A Review
Rajendiran et al. Smart Aeroponic Farming System: Using IoT with LCGM-Boost Regression Model for Monitoring and Predicting Lettuce Crop Yield.
Abdulghani et al. Cyber-Physical System Based Data Mining and Processing Toward Autonomous Agricultural Systems
Singh et al. A Multiple Linear Regression Model for Crop Production using Machine Learning and Neural Network
Guragain et al. Journal of Agriculture and Food Research
Venkatraman et al. Industrial 5.0 Aquaponics System Using Machine Learning Techniques
Azizi Application of Artificial Intelligence (AI) In-Farm
Tageldin Integration of machine learning and predictive analytics in agriculture to optimize plant disease detection and treatment in Egypt

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20870948

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20870948

Country of ref document: EP

Kind code of ref document: A1