WO2024121815A1 - Comprehensive agriculture technology system - Google Patents

Comprehensive agriculture technology system

Info

Publication number
WO2024121815A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processor
network
sensor
computerized system
Application number
PCT/IB2023/062424
Other languages
French (fr)
Inventor
Yeung Man Teddy LO
Laurent Rene Raymond COLLOT
Original Assignee
Neos Ventures Investment Limited
Application filed by Neos Ventures Investment Limited
Publication of WO2024121815A1


Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01G HORTICULTURE; CULTIVATION OF VEGETABLES, FLOWERS, RICE, FRUIT, VINES, HOPS OR SEAWEED; FORESTRY; WATERING
    • A01G9/00 Cultivation in receptacles, forcing-frames or greenhouses; Edging for beds, lawn or the like
    • A01G9/24 Devices or systems for heating, ventilating, regulating temperature, illuminating, or watering, in greenhouses, forcing-frames, or the like
    • A01G9/249 Lighting means

Definitions

  • aspects of the invention generally relate to intelligent farming systems. More specifically, embodiments of the invention relate to a comprehensive system for plant health monitoring and dynamic control of the plant environment in response to the monitoring in agriculture environments.
  • Embodiments of the invention improve existing approaches by providing a comprehensive intelligent farming system for data-driven management of agriculture endeavors.
  • Embodiments of the invention include three basic areas of improvement: data collection, optimization, and environment modification.
  • Data collection or monitoring in a horticultural or agricultural environment may be performed by a variety of sensors measuring a variety of growth characteristics. Some growth characteristics may include soil or other growth medium properties, light and general radiation characteristics of the growth environment, and other attributes of the plant and environment.
  • Optimization may include an assessment of the data collected by the various sensors in the growth environment against known optimal growth characteristics. Further, data collection and growth monitoring could lead to the development of optimal growth conditions for applications in agriculture endeavors.
  • artificial intelligence (AI) and machine learning (ML) techniques may be applied to the collected data to determine or modify various growth environment characteristics based on growth outcomes for optimal growth environment conditions.
  • Environment modification may include analyzing the collected data to change one or more characteristics of the growth environment such as lighting characteristics, temperature, humidity, growth medium composition, and other elements.
  • the system for data collection, optimization, and environment modification in horticultural and agricultural applications may include several subsystems in implementation.
  • the system may include a sensor subsystem including various sensors and docking support structures for mounting the sensors in the growth environment, an equipment control subsystem for modifying characteristics of the growth environment, a remote management subsystem for remotely monitoring characteristics of the growth environment, an analytics and machine learning subsystem for analyzing the sensor data and determining optimal parameters for the equipment control subsystem, and a digital communications network for data transfer between the subsystems, a cloud data storage system, and other remote systems.
  • the equipment control subsystem may control a lighting system and other agriculture equipment such as an irrigation system, a fertilization system, etc., and may communicate with frontend and/or backend computing devices of the remote management and analytics and machine learning subsystems via a plurality of network switches for various functions within the system.
  • the interaction of the various subsystems may collect and analyze the growth environment data and implement optimal growth conditions via statistical, Al, and/or ML modeling.
  • the sensor subsystem may include a hyperspectral sensor device communicatively attached to a docking support structure within a lighting system.
  • the hyperspectral sensor device and/or its docking support structure may include a network connectivity component to form at least one network node for data collection of spectral bands produced by the hyperspectral sensor.
  • a distributed network of processors connects the various subsystems via the digital communications network, and the processors execute instructions to analyze the data collected by the sensor devices.
  • the instructions include statistical analysis, a set of machine learning (ML) and/or artificial intelligence (AI) algorithms based on received datasets, aggregated additional data from the sensor devices, and historical data to control the plurality of switches and the various other agriculture equipment via the equipment control subsystem (i.e., the lighting system, the irrigation system, the fertilization system, etc.).
  • further aspects of the invention enable the sensors to provide coverage for the entire growth environment while sending sensed data to the distributed network of processors for further processing.
  • the sensed data includes a variety of types of data that enable the frontend and/or backend computing devices of the remote management and analytics and machine learning subsystems to identify raw data.
  • the frontend and/or backend computing devices may process collected data to produce quantitative results to achieve desired or optimal growth conditions in real-time or substantially real-time, such as growth of the vegetation, metabolism progress, stress indicators, or the like.
  • FIG. 1 is a diagram illustrating an agriculture technology system for plant health management, according to one embodiment.
  • FIG. 2 is a diagram illustrating one embodiment of a hyperspectral sensor that may be employed within the agriculture technology system.
  • FIG. 3 is a diagram illustrating a traditional Fourier Transform interferometer according to one embodiment.
  • FIG. 4 is a diagram illustrating hyperspectral sensors co-located with lighting fixtures according to one embodiment.
  • FIG. 5 is a diagram illustrating a lighting fixture according to one embodiment.
  • FIG. 6 is a diagram illustrating a machine learning (ML) architecture that may be included in an analytics and machine learning subsystem of the comprehensive agriculture technology system according to one embodiment.
  • FIG. 7 is a diagram illustrating a computerized system for plant health management according to one embodiment.
  • FIG. 8 is a diagram illustrating a high-level block diagram of a computing environment for the comprehensive agriculture technology system and various computing devices of the system and processor-executable instructions according to one embodiment.
  • FIG. 1 illustrates one embodiment of an agricultural technology system 100 for plant health management.
  • the agriculture technology system 100 may include a number of subsystems (e.g., 102, 104, 106, 108, etc.) that each provide many points of innovation.
  • the system 100 may include a sensor subsystem 102, an equipment control subsystem 104, a remote management system 106, and an analytics system 108.
  • the system 100 may include many other types of subsystems including a supply chain subsystem for maintaining various materiel for operation of the system 100, and other subsystems.
  • the system 100 may be configured as an internet of things (IoT).
  • the system 100 may include interrelated computing devices, mechanical and digital machines, objects, user interfaces, and even people that are provided with unique identifiers and the ability to transfer data over a network (e.g., network 110) without requiring human-to-human or human-to-computer interaction.
  • the system 100 may automate tasks within a growth environment 114 to optimize plant growth.
  • the system as an IoT may also collect and analyze data to make better decisions, provide real-time updates on the status of agricultural products in the growth environment 114, and reduce costs by automating tasks and improving efficiency.
  • the system 100 may employ RFID tags or other wireless sensors that are communicatively connected to the network 110 and attached to objects (e.g., sensors 102a, equipment 104a, etc.) to collect data about them.
  • the system 100 may also employ cloud computing technology to store and access data and applications over the internet.
  • the system 100 may store data collected by IoT devices and run applications that analyze the data, as described herein.
  • the sensor subsystem 102 may include several sensor suites or sensor devices 102a that measure various characteristics of a growth environment 114 such as indoor 116, outdoor 118, and plant and soil 120 environmental variables. The sensor subsystem 102 may also measure plant and soil characteristics.
  • the sensor suites or sensor devices 102a include one or more of a temperature sensor, a humidity sensor, a volumetric water sensor, a leaf wetness sensor, an electrical conductivity (EC) sensor, a soil water potential sensor, a light level sensor, and a carbon dioxide sensor.
  • the sensor suites or devices 102a may include both hardware and software components.
  • the sensor suites or devices 102a may include hardware components such as a processor, memory, physical docking elements to provide electrical power to the sensors and to facilitate collection and communication of sensor data 102b among the various elements and subsystems internal to the system 100 or external to the system 100 via network components.
  • a computer network 110 may interconnect all subsystems 102, 104, 106, 108 of the system 100 to facilitate internal and external communications with the system 100.
  • the network 110 includes a wireless mesh configuration communicatively connected to a central server.
  • the network 110 may be configured as a wireless mesh network based on the network components of the plurality of the sensor devices of sensor subsystem 102 or other components of the system 100.
  • the network 110 may also include processor-executable instructions for a software stack to organize, arrange and process the sensor data 102b from the sensors 102a of the sensor subsystem 102.
  • the software stack may include instructions to perform optimization and all other functions associated with the growth environment 114.
  • Each sensor 102a may include a hardware and/or software networking component for connecting the sensor 102a to the system 100 generally and the network 110 in particular.
  • the networking component may include a wireless chip or antenna configured to communicate sensor data 102b wirelessly in accordance with one or more network protocols.
  • the networking component may communicate data 102b via Bluetooth Low Energy (BLE), Wi-Fi® (802.11 standard), Bluetooth®, cellular communication, or near field communication (NFC) protocols.
  • the sensor suites or devices 102a may include software components for operability and compatibility with other internal or external components of the system 100.
  • Software components of the sensors 102a may also incorporate artificial intelligence (AI) or machine learning (ML) processing capabilities that may identify, adjust, and learn what is sensed.
  • the sensor suites or devices 102a may provide a standard dataset or data protocol to arrange the collected or sensed sensor data 102b when the sensor suites or devices 102a communicate with other devices in the current system 100.
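  • As an illustrative sketch only, the following Python fragment shows one way such a standard sensor dataset could be arranged for transmission over the network 110; the field names and the JSON encoding are assumptions for illustration rather than a protocol defined by this disclosure.

```python
# Minimal sketch of a hypothetical standard sensor dataset; field names are
# illustrative assumptions, not the protocol defined by the disclosure.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SensorReading:
    sensor_id: str        # unique identifier of the sensor device (e.g., 102a)
    sensor_type: str      # "temperature", "humidity", "soil_ec", ...
    value: float          # measured quantity
    unit: str             # engineering unit of the measurement
    timestamp: float      # seconds since the epoch

def to_payload(reading: SensorReading) -> str:
    """Serialize a reading into a JSON payload for transmission over the network."""
    return json.dumps(asdict(reading))

if __name__ == "__main__":
    reading = SensorReading("sensor-017", "soil_ec", 1.8, "dS/m", time.time())
    print(to_payload(reading))
```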
  • Artificial intelligence (AI) or machine learning (ML) processing capabilities may also partially or wholly reside in and operate from the analytics and machine learning subsystem 108, as further described below.
  • the sensor suites or devices 102a may also incorporate remote data that is sensed or provided from a plant scientist or biosystem engineer via the network 110 and a remote management subsystem 106.
  • the sensor suites or devices 102a may include a hyperspectral sensor 112.
  • Hyperspectral sensors are devices that can capture and analyze the spectrum of light emitted or reflected by an object. They are used in a variety of applications, including remote sensing, environmental monitoring, and industrial process control. Hyperspectral sensors typically work by using a spectrometer to divide the light into its component wavelengths. The sensor then measures the intensity of each wavelength, which can be used to create a spectrum of the light. Hyperspectral sensors can provide a wealth of information about an object, as each wavelength in the spectrum can correspond to a different physical property of the object. For example, the wavelength of light that is absorbed by a particular material can be used to identify that material. Hyperspectral sensors are becoming increasingly common, as they offer a number of advantages over traditional imaging sensors. They can provide more detailed information about an object, and they can be used to identify objects that are difficult to see with the naked eye.
  • the hyperspectral sensor 112 may provide data for remote sensing, environment monitoring, agricultural process control, and other data.
  • the hyperspectral sensor may provide data from a growth environment 114 related to temperature, humidity, water, leaf wetness, soil electrical conductivity, soil water potential, light level, and carbon dioxide.
  • the hyperspectral sensor 112 may remotely sense the composition of any aspect of the growth environment 114 such as the growth medium (e.g., soil or, in hydroponic applications, water), the plant, and the air surrounding the plant.
  • the sensors 102a and/or the hyperspectral sensor 112 may further include components for sensing incident sunlight measurements for tracking exposure of lighting on a given plant, a given area, or a given cultivated space, as well as measuring or calculating a three dimensional (3D) map for tracking plant height, plant width, and measuring intensity of light.
  • This information can be used to understand a plant’s growth environment 114, how that environment changes over time, and identify potential environmental hazards.
  • Environmental monitoring applications may monitor the growth environment 114 for pollutants, plant and/or soil composition, and the like.
  • the hyperspectral sensor 112 can be used to monitor the quality of agricultural products during growth.
  • the interferometric information gathered by the hyperspectral sensor 112 may also allow the system 100 to accurately locate all sensors 102a within the growth environment. Thus, all sensors 102a may be precisely mapped in conjunction with optical images within the growth environment.
  • the hyperspectral sensor may also monitor the condition of equipment 104a.
  • the hyperspectral sensor 112 may also work in tandem with other sensors 102a of the system.
  • sensors 102a may be powered by lighting fixtures that are also used for plant health, and communicate with the other subsystems 104, 106, 108 via the same network 110.
  • the hyperspectral sensor 112 is further described in relation to FIG. 2, below.
  • the sensor suites or devices 102a may be communicably connected to the equipment control subsystem 104 via the sensor networking components and/or the network 110.
  • the equipment control subsystem may include one or more pieces of equipment or devices 104a that physically alter the growth environment 114 and, as a result, the indoor 116, outdoor 118, and plant and soil 120 environmental variables.
  • the equipment 104a may include one or more of shade canopies, dosing or irrigation controls and systems, lighting systems, passive and/or active heating and cooling devices, harvesting equipment, spraying equipment (i.e., for fertilizer, insect and pest control, herbicide, fungicide, etc.), locking mechanisms and other security, foggers, humidifiers, etc.
  • lighting equipment may include high power lighting fixtures, such as light emitting diode (LED) lighting devices that are configured to perform for extended periods, as further described in relation to FIG. 3, below.
  • the equipment control subsystem 104 may be controlled remotely by the remote management subsystem 106 via the network 110. Control of the equipment 104a may rely on input from at least the sensor subsystem 102 and the analytics and machine learning subsystem 108, also via the network 110.
  • the equipment control subsystem 104 is configured to receive sensor data 102b via the network 110 from the sensor subsystem 102 and, in response to the sensor data 102b, modify one or more operating parameters 104b of the equipment control subsystem 104. Modification of one or more operating parameters 104b of the equipment control subsystem 104 may also result from input by the analytics and machine learning subsystem 108.
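  • A minimal sketch of this idea follows, assuming hypothetical parameter names and thresholds; it is illustrative only and not the actual control logic of the equipment control subsystem 104.

```python
# Illustrative sketch only: a simple rule that maps incoming sensor data 102b to a
# change in operating parameters 104b. Thresholds and parameter names are assumed.
def adjust_operating_parameters(sensor_data: dict, parameters: dict) -> dict:
    """Return updated operating parameters based on current sensor readings."""
    updated = dict(parameters)
    # If the canopy is too warm, reduce lighting intensity and increase ventilation.
    if sensor_data.get("air_temperature_c", 0.0) > updated.get("temperature_setpoint_c", 26.0):
        updated["light_intensity_pct"] = max(updated.get("light_intensity_pct", 100) - 10, 40)
        updated["ventilation_pct"] = min(updated.get("ventilation_pct", 30) + 20, 100)
    # If the growth medium is drying out, trigger an irrigation cycle.
    if sensor_data.get("soil_water_potential_kpa", 0.0) < -60.0:
        updated["irrigation_minutes"] = updated.get("irrigation_minutes", 0) + 5
    return updated

new_params = adjust_operating_parameters(
    {"air_temperature_c": 29.5, "soil_water_potential_kpa": -75.0},
    {"temperature_setpoint_c": 26.0, "light_intensity_pct": 100, "ventilation_pct": 30},
)
print(new_params)
```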
  • the remote management subsystem 106 may include a variety of management components 106a and data trackers 106b.
  • the data trackers 106b may collect data that is internal to the system 100 and/or data that is external to the system 100.
  • data that is internal to the system 100 may include the sensor data 102b, operating parameters 104b of the equipment control subsystem 104, and/or artificial intelligence (AI) data 108b of the analytics and machine learning subsystem 108.
  • direct control of the growth environment 114 may be facilitated by the data trackers 106b in cooperation with the one or more management components 106a in communication with the equipment control subsystem 104.
  • the management components 106a may include a smart phone application, a web portal, a messaging service (e.g., short message service or SMS), a scheduling application, and labor schedules, etc.
  • the data trackers 106b may include one or more of historic data tracking, energy tracking, plant growth tracking, supplier data, packaging data, and crop pricing data. Using input from the sensor subsystem 102, observation and analysis of the data trackers 106b by human interaction and/or the analytics and machine learning subsystem 108, may permit modification of the operating parameters 104b and may facilitate direct control of the growth environment 114 via the network 110.
  • the analytics and machine learning subsystem 108 may include artificial intelligence (AI) modules 108a to analyze data that is internal to the system 100 and/or data that is external to the system 100.
  • the AI modules 108a may include one or more processors that are communicatively connected to a computer memory storing processor-executable instructions for the AI modules 108a.
  • the AI modules 108a may include a predictive analytics module and an automated control module.
  • a predictive analytics module may include instructions to process historic data (e.g., 106b) and/or sensor data 102b to predict future outcomes for the sensor data 102b.
  • Instructions of the predictive analytics module may include instructions for statistical modeling, data mining techniques, and machine learning algorithms to identify patterns in the sensor data 102b and/or historic data (e.g., 106b) and make predictions about future trends for the sensor data 102b.
  • a predictive analytics module may also be used to assess the risk of certain events to the growth environment 114, such as modification of one or more operating parameters 104b of the equipment control subsystem 104.
  • the system 100 may also employ a predictive analytics module to optimize one or more of energy tracking, plant growth, supplier data, packaging data, and crop pricing.
  • a predictive analytics module may include further instructions for making decisions about inventory levels and production schedules of the system 100.
  • An automated control module may include instructions using machine learning algorithms and machine learning models to optimize operating parameters 104b of the equipment control subsystem 104.
  • an automated control module may train a machine learning model using historic data (e.g., 106b) and then use the model to make predictions about how the system 100 will behave in the future. The model can then be used to control the system 100 accordingly.
  • the machine learning model may include linear regression, a decision tree, a random forest, or a neural network.
  • the linear regression machine learning model may include processor-executable instructions for a mathematical model that predicts a continuous value based on a set of input variables.
  • the decision tree machine learning model may include processor-executable instructions for making decisions to optimize the system 100 based on a set of rules.
  • the random forest machine learning model may include processor-executable instructions for combining predictions of multiple decision trees.
  • the neural network machine learning model may include processor-executable instructions to create interconnected nodes, or neurons, that can learn to recognize patterns in historic data (e.g., 106b), as described in relation to FIG. 6, below.
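  • The following sketch illustrates the general pattern of training one such model (a random forest, standing in for any of the listed model types) on flattened historic data and using it to score candidate operating parameters; the feature columns and the scikit-learn usage are illustrative assumptions, not the trained models of this disclosure.

```python
# A minimal sketch, assuming historic data 106b has been flattened into numeric
# feature rows (sensor readings + operating parameters) and a growth-outcome score.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: [temperature, humidity, light_hours, irrigation_l] -> yield score
X_hist = np.array([
    [24.0, 60.0, 16.0, 2.0],
    [27.0, 55.0, 18.0, 2.5],
    [22.0, 70.0, 14.0, 1.5],
    [26.0, 65.0, 17.0, 2.2],
])
y_hist = np.array([0.71, 0.84, 0.58, 0.90])

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_hist, y_hist)

# Score a few candidate operating-parameter settings and keep the most promising one.
candidates = np.array([[25.0, 62.0, 16.5, 2.1], [28.0, 50.0, 18.0, 2.6]])
best = candidates[np.argmax(model.predict(candidates))]
print("suggested operating parameters:", best)
```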
  • FIG. 2 is an illustration of one embodiment of a hyperspectral sensor 200 that may be employed within the agriculture technology system 100.
  • the hyperspectral sensor 200 may include one or more processors and processor- readable memories storing processor-executable instructions to accomplish the results of a typical hyperspectral sensor.
  • aspects of the hyperspectral sensor 200 may include a computer-controlled unstable Fabry-Perot (FP) cavity and a Fourier Transform spectra extraction method.
  • the hyperspectral sensor 200 may include processor-executable instructions to divide light reflected or absorbed from a subject (i.e., a plant within the growth environment 114 of FIG. 1) into its component wavelengths.
  • the sensor 200 may also include instructions to measure the intensity of each wavelength, and create a spectrum of the light.
  • the sensor 200 may also include processor-executable instructions to match a measured spectrum to different physical properties of the plant (e.g., plant metabolism, soil and air elements, etc.).
  • FIG. 3 illustrates a traditional Fourier Transform interferometer 300.
  • a Fourier Transform interferometer 300 may include a light source 302, a beam splitter or half-silvered mirror 304, a detector 306, a first mirror 308a, and a second mirror 308b.
  • the light source 302 emits light that hits the beam splitter 304.
  • the beam splitter 304 is partially reflective, so part of the light is transmitted through to the first mirror 308a, while some light is reflected in the direction of the second mirror 308b. Both beams recombine at the beam splitter 304 to produce an interference pattern 310 incident on the detector 306.
  • Translation of one of the mirrors (e.g., the first mirror 308a) along the axis A allows a variable delay in the travel time of the light to be included in one of the beams.
  • the light beams interfere, allowing the temporal coherence of the light to be measured at each different time delay setting along the axis A, effectively converting the time domain into a spatial coordinate.
  • the interference pattern 310 may be translated to an interferogram 312 that may be used to reconstruct a spectrum 314 of the light emitted from the light source 302. For example, by making measurements of the signal at many discrete positions of the movable mirror (i.e., the first mirror 308a), the spectrum 314 can be reconstructed using a Fourier transform 316 of the temporal coherence of the light.
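  • A minimal numerical sketch of that Fourier-transform step follows, using a synthetic interferogram; the step size and spectral lines are assumptions chosen only to show how a spectrum can be recovered from samples taken at discrete mirror positions.

```python
# Minimal sketch of the Fourier-transform step: the interferogram sampled at
# discrete mirror positions is transformed to recover the spectrum. The signal
# here is synthetic; real interferograms come from the detector 306.
import numpy as np

n_samples = 2048
step_cm = 1e-5                        # assumed optical path difference per mirror step, in cm
opd = np.arange(n_samples) * step_cm  # optical path difference axis

# Synthetic interferogram for two spectral lines (wavenumbers in cm^-1).
wavenumbers_true = [5000.0, 7500.0]
interferogram = sum(np.cos(2 * np.pi * k * opd) for k in wavenumbers_true)

# Fourier transform of the interferogram yields the spectrum versus wavenumber.
spectrum = np.abs(np.fft.rfft(interferogram))
wavenumber_axis = np.fft.rfftfreq(n_samples, d=step_cm)   # in cm^-1

peaks = wavenumber_axis[np.argsort(spectrum)[-2:]]
print("recovered spectral lines (cm^-1):", np.sort(peaks))
```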
  • the hyperspectral sensor 200 may also include a light source 202, a beam splitter 204, a detector 206, a first mirror 208a, and a second mirror 208b.
  • the detector 206 may include a complementary metal-oxide-semiconductor (CMOS) camera and an FP interferometer.
  • the hyperspectral sensor 200 may also include one or more processors 210 that are communicatively connected to one or more processor-readable memories 212 that store processor-executable instructions 214.
  • the hyperspectral sensor 200 is at least partially communicatively connected to one or more processors and processor-readable memories including processor-executable instructions.
  • the two mirrors 208a and 208b in the FP configuration may be disposed in a housing (not shown).
  • the mirrors 208a, 208b may be relatively or generally flat (e.g., without or devoid of significant curvature or surface curvature).
  • a stepper motor 218 may move the mirrors 208a, 208b.
  • the sensor 200 may include a cascade of motion reduction devices (e.g., 222) in order to achieve nanometer resolution.
  • One or more capacitive sensors may detect or sense information from the subject (e.g., plants or vegetation).
  • the sensor 200 may be configured as a servo loop.
  • the hyperspectral sensor 200 may include processor-executable instructions for a feedback loop to control the mirrors’ 208a, 208b position, velocity, and/or acceleration.
  • processor-executable instructions may compare the desired output to the actual output (i.e., mirror positions) and then adjust the input to the stepper motor 218 to reduce the error.
  • Processor-executable instructions may also fine-tune or align the mirrors 208a, 208b, calibrate the motor 218, and control the motion control devices 222 in real-time or substantially in real-time.
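  • As an illustrative sketch of the feedback idea, and assuming a hypothetical proportional gain and capacitive position readings in nanometres, the loop below repeatedly compares a desired mirror position with a measured position and commands a correction until the error is small.

```python
# Illustrative sketch of the feedback idea: compare the desired mirror position with
# the measured position and command the stepper/reduction stage to reduce the error.
# The gain, step size, and sensor/motor interfaces are assumptions for this sketch.
def servo_step(target_nm: float, measured_nm: float, gain: float = 0.5) -> float:
    """Return a correction (in nanometres) to apply to the stepper/reduction stage."""
    error = target_nm - measured_nm
    return gain * error

position = 120.0          # measured mirror position from a capacitive sensor, nm
target = 100.0            # desired mirror position, nm
for _ in range(10):
    position += servo_step(target, position)   # apply the commanded correction
print(f"settled position: {position:.2f} nm")
```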
  • Processor-executable instructions may also perform one or more of the operations described in relation to the interferometer 300 of FIG. 3.
  • the processor-executable instructions of the hyperspectral sensor 200 may analyze the captured data. For example, the instructions may analyze the interferometer response (e.g., interference pattern 310 of FIG. 3), or the way that the sensor 200 changes the light it receives from the object (e.g., the plant, soil, or other measured conditions within the growth environment 114 of FIG. 1).
  • the processor-executable instructions of the sensor 200 may coarsely align the sensor 200 by fitting the appearance of a laser spot in a field of view of the sensor 200. Coarse alignment may be complete when measurement of loss of resolution or clarity in the interference pattern 210 is below a threshold compared to a raw camera image of the same subject.
  • Fine alignment of the sensor 200 may include processor-executable instructions to mathematically model a cavity within the sensor 200.
  • the cavity includes a space between two mirrors (e.g., mirrors 208a and 208b) that forms a slight wedge.
  • the sensor 200 may be configured to control displacements between the mirrors, the beam splitter, and/or the subject to the nanometer level based on multi-stage mechanical reduction hardware components or processor-executable instructions.
  • the hyperspectral sensor 200 may also include processor-executable instructions to apply a spectroscopic technique (e.g., Fourier transform spectroscopy) to capture or sense a spectrum that is particularly suited for observation of various conditions within the growth environment.
  • a spectroscopic technique e.g., Fourier transform spectroscopy
  • the sensor 200 may be configured to capture a 400 to 1000 nanometer (nm) band of light spectrum that is particularly suited to observation of plant health characteristics. For example, anthocyanin, carotene, chlorophyll A, chlorophyll B, starch, and protein content of the subject plant are detectable within this band and may be captured by the hyperspectral sensor 200.
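  • For illustration, the sketch below computes a common chlorophyll-sensitive index from reflectance sampled within the 400 to 1000 nm band; the sampling grid, the synthetic reflectance values, and the NDVI-style formula are assumptions and not index definitions from this disclosure.

```python
# Illustrative only: the disclosure does not specify index formulas. This sketch computes
# a common chlorophyll-sensitive index (NDVI-style) from reflectance sampled within the
# 400-1000 nm band captured by the sensor 200.
import numpy as np

wavelengths_nm = np.arange(400, 1001, 5)                  # hypothetical spectral sampling
reflectance = np.random.default_rng(0).uniform(0.05, 0.6, wavelengths_nm.size)

def band(center_nm: float) -> float:
    """Reflectance at the sample closest to the requested wavelength."""
    return float(reflectance[np.argmin(np.abs(wavelengths_nm - center_nm))])

red, nir = band(670), band(800)
ndvi = (nir - red) / (nir + red)                          # higher values suggest more chlorophyll
print(f"R(670)={red:.3f}  R(800)={nir:.3f}  NDVI={ndvi:.3f}")
```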
  • Specifications of the sensor 200 may include:
  • a sensor 402 may be co-located with lighting fixtures 404, such as horticulture lighting fixtures of the equipment control subsystem 104 (FIG. 1), giving an exceptional vantage point for identifying and observing the metabolism of vegetation or plants 406 within the growth environment 114.
  • the sensor 402 may connect to the lighting fixture 404 via a dock, such as a universal dock.
  • the dock (not shown) may connect on one side to the lighting fixture 404 to receive, control or manage power or other settings of the lighting fixture.
  • the dock may also connect to the sensor 402.
  • the dock may provide power and data specification or protocol to the hyperspectral sensor 402 so that data exchanges within the system 100 may be accomplished.
  • Multiple hyperspectral sensors 402 may be deployed within the growth environment 114.
  • One or more shading elements 408 may confine the measurement of the sensors, the light cast by the lighting fixtures 404, and other environmental influences within the growth environment 114 to the one or more plants 406 that are directly beneath or under the respective shading element 408 and/or within the light cast by a respective lighting fixture 404.
  • the analytics and machine learning subsystem 108 may receive and process data from each of the sensors of the sensor subsystem 102, including the hyperspectral sensor 112, to provide needed metabolism data of the plants to the remote management subsystem 106. Based on the metabolism and other data, the lighting fixture 404 may be further updated, adjusted, or configured.
  • the hyperspectral sensor 402 may be a stand-alone sensor device as shown in FIG. 2 or an add-on or an integrated part of a lighting fixture or fixtures 500 as shown in FIG. 5. As integrated with the lighting fixture 500, the hyperspectral sensor 200 may be oriented or configured from a vantage point of the lighting fixture 500 above the plants within the growth environment 114. In one example, a housing 502 of the hyperspectral sensor 200 (FIG. 2) may be integrated with a lighting fixture 500 including lighting elements 504. In another example, the sensor 200 may include a power source to energize components thereof. For example, the power source may include a battery. In another embodiment, the connection between the sensor 200 and the lighting fixture 500 may transmit electrical power from the fixture 500 to the sensor 200.
  • the analytics and machine learning subsystem 108 may include a machine learning (ML) architecture 600 that may be used with the system 100 in accordance with the current disclosure.
  • the analytics and machine learning subsystem 108 of the system 100 may include instructions for execution on one or more processors that implement the ML architecture 600.
  • the ML architecture 600 may include an input layer 602, a hidden layer 604, and an output layer 606.
  • the input layer 602 may include inputs 608A, 608B, etc., coupled to other subsystems of the system 100 (e.g., the sensor subsystem 102, the equipment control subsystem 104, the remote management subsystem 106, etc.) and represent those inputs that are observed from actual system data, such as sensor data 102b and operating parameters 104b.
  • the hidden layer 604 may include weighted nodes 610 that have been trained for the sensor data 102b and operating parameters 104b being observed. Each node 610 of the hidden layer 604 may receive the sum of all inputs 608A, 608B, etc., multiplied by a corresponding weight.
  • the output layer 606 may present various outcomes 612 based on the input values 608A, 608B, etc., and the weighting of the hidden layer 604.
  • the machine learning architecture 600 may be trained to analyze a likely outcome for a given set of inputs based on thousands or even millions of observations of previous agricultural product growth cycles. For example, the architecture 600 may be trained to determine optimal lighting conditions associated with a plant (e.g., 406).
  • a dataset of inputs may be applied and the weights of the hidden layer nodes 610 may be adjusted for the known outcome (e.g., an optimal plant characteristic) associated with that dataset. As more datasets are applied, the weighting accuracy may improve so that the outcome prediction is constantly refined to a more accurate result.
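  • A minimal sketch of the forward pass through such an architecture follows; the input values, layer sizes, and random weights are placeholders, and training would adjust the weights against known outcomes as described above.

```python
# Minimal sketch of the forward pass described above: inputs 608 are weighted and summed
# at each hidden node 610, then combined into an outcome 612. Weights and data are random
# placeholders; training would adjust them against known outcomes.
import numpy as np

rng = np.random.default_rng(1)
inputs = np.array([24.5, 0.62, 16.0])          # e.g., temperature, humidity, light hours
w_hidden = rng.normal(size=(3, 4))             # input layer 602 -> hidden layer 604 (4 nodes 610)
w_output = rng.normal(size=(4, 1))             # hidden layer 604 -> output layer 606

hidden = np.tanh(inputs @ w_hidden)            # each node receives the weighted sum of all inputs
outcome = hidden @ w_output                    # outcome 612, e.g., a predicted plant-health score
print("predicted outcome:", float(outcome[0]))
```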
  • the data trackers 106b including historic plant characteristics for optimized plant growth may provide datasets for initial training and ongoing refining of the machine learning architecture 600.
  • Additional training of the machine learning architecture 600 may include an artificial intelligence engine (AI engine) 614 providing additional values to one or more controllable inputs 616 so that outcomes may be observed for particular changes to the sensor data 102b.
  • the values selected may represent different data types such as a frequency of optimal ranges of particular sensor data 102b, a frequency with which particular optimal growth conditions occur within the growth environment 114, a time for achieving optimal growth conditions, and other alternative data presented at various points in the growth process, and may be generated at random or by a pseudo-random process.
  • the impact may be measured and fed back into the machine learning architecture 600 weighting to allow capture of the impact of a proposed change to the agricultural product growth process in order to optimize growth conditions.
  • the impact of various different data at different points in the growth cycle may be used to predict an outcome for a given set of observed values at the inputs layer 602.
  • data from the hidden layer may be fed to the artificial intelligence engine 614 to generate values for controllable input(s) 616 to optimize the sensor data 102b and the operating parameters 104b.
  • data from the output layer may be fed back into the artificial intelligence engine 614 so that the artificial intelligence engine 614 may, in some embodiments, iterate with different data to determine, via the trained machine learning architecture 600, whether the sensor data 102b and the operating parameter data 104b are accurate, and other determinations.
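  • The sketch below illustrates that iteration loop in simplified form: a stand-in predictor takes the place of the trained architecture 600, pseudo-random candidate values for a single controllable input 616 are scored, and the best setting is retained; all values and the predictor itself are assumptions.

```python
# Sketch of the exploration loop: the AI engine 614 proposes values for a controllable
# input 616, observes the predicted outcome, and keeps the best setting. The predictor
# below is a placeholder standing in for the trained architecture 600.
import random

def predicted_outcome(light_hours: float) -> float:
    """Placeholder for the trained model: peak outcome assumed near 16 light hours."""
    return 1.0 - (light_hours - 16.0) ** 2 / 100.0

random.seed(0)
best_setting, best_score = None, float("-inf")
for _ in range(50):
    candidate = random.uniform(10.0, 22.0)       # pseudo-random value for the controllable input
    score = predicted_outcome(candidate)
    if score > best_score:
        best_setting, best_score = candidate, score
print(f"suggested light hours: {best_setting:.2f} (predicted score {best_score:.3f})")
```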
  • the system 700 may include a distributed data storage 702 for storing data accessible by a processor 706.
  • the distributed data storage 702 may include computer data memory storage in distributed areas or data farms.
  • the processor 706 may include multi-core processors in distributed areas connected by a network 704 (or, e.g., network 110).
  • the network 704 may include private or secure networks, as well as public networks.
  • the processor 706 may be configured to execute computer-executable instructions for managing plant health, as described herein.
  • a plurality of sensors 708 may comprise one or more antennas (not shown).
  • a plurality of light sources 710 (or, e.g., lighting elements 504) may be disposed in a growth environment 114.
  • the plurality of light sources 710 may comprise one or more lamps for providing energy in one or more wavelengths as a function of vegetation in the growth environment 114.
  • the system 700 may include the plurality of light sources 710 having a network connectivity component for connecting the plurality of light sources 710 to other devices (e.g., sensors 102a, hyperspectral sensor 112, 200, 402, equipment 104a).
  • the plurality of light sources 710 may further connect to the network 704 via the network connectivity component.
  • a plurality of switches 714 may connect with the plurality of sensors 708, the plurality of light sources 710, and the processor 706 via the network 704.
  • the plurality of switches 714 may be configured to energize the plurality of light sources 710.
  • At least one network node 716 may be coupled to the one or more network antennas (not shown), wherein the at least one network node 716 may be configured to be connected to the network 704.
  • the processor 706 may be configured to receive, at the at least one network node 716, a plurality of datasets from the plurality of sensors 708 disposed at the growth environment 114.
  • the plurality of datasets may be transmitted under a data protocol within the network 704.
  • the processor 706 may aggregate additional data to the plurality of the datasets, wherein the additional data comprises data external to the growth environment 114.
  • the processor 706 may also execute a set of ML or Al algorithms, as described in relation to FIG. 6, based on the received plurality of datasets and the aggregated additional data to control at least one of the following: the plurality of sensors 708, the plurality of light sources 710, the plurality of switches 714 and the at least one network node 716.
  • the processor 706 may generate a plant health management data matrix 750 as a result of executing the set of ML or AI algorithms.
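  • A high-level sketch of that processing flow follows; the dataset keys, the external site data, and the stand-in model are assumptions used only to show the shape of the pipeline from received datasets to a plant health management data matrix.

```python
# High-level sketch of the processing flow attributed to processor 706: receive sensor
# datasets at a network node, aggregate external data, run the models, and assemble a
# plant health management data matrix. All structures and keys here are assumptions.
import numpy as np

def receive_datasets() -> list[dict]:
    # Stand-in for datasets arriving at network node 716 under the network data protocol.
    return [{"sensor_id": "s1", "temperature_c": 24.1, "soil_ec": 1.7},
            {"sensor_id": "s2", "temperature_c": 25.3, "soil_ec": 1.9}]

def aggregate_external(datasets: list[dict]) -> list[dict]:
    # Add data external to the growth environment, e.g., site geolocation and elevation.
    site = {"latitude": 22.3, "longitude": 114.2, "elevation_m": 35.0}
    return [{**d, **site} for d in datasets]

def run_models(rows: list[dict]) -> np.ndarray:
    # Placeholder for the ML/AI algorithms of FIG. 6: produce feature/score rows per sensor.
    return np.array([[r["temperature_c"], r["soil_ec"], r["soil_ec"] / r["temperature_c"]]
                     for r in rows])

matrix = run_models(aggregate_external(receive_datasets()))   # plant health management data matrix
print(matrix)
```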
  • the processor 706 may further be configured to localize the plurality of sensors 708 as a function of the one or more antennas.
  • the plurality of datasets may include at least one or more of the following: lighting related data, soil data of soil used in the cultivated site, air condition data of the cultivated site, moisture data of the cultivated site, weather data, temperature data of the cultivated site, plants of the cultivated site, and image data of the cultivated site or technical images collected remotely, satellite data, drone data or aerial data.
  • the lighting related data comprises a type of a lamp, a lighting spectrum of the lamp, a power usage of the lamp, and a size of the lamp.
  • the aggregated additional data may comprise at least one or more of the following: geolocation information of the cultivated site, and elevation data of the cultivated site.
  • the network connectivity component of the plurality of light sources 710 comprises a wireless network component.
  • the plurality of light sources 710 comprises light emitting diode (LED) lamps or any high efficiency light source.
  • the one or more wavelengths comprise a grow light spectrum, and the grow light spectrum comprises wavelengths in one of the following ranges: 260 to 380 nanometers (nm), or 380 to 740 nm.
  • the processor 706 may be configured to dynamically adjust light distribution patterns for coverage of the cultivated area. Furthermore, the system 700 may include the processor 706 to be configured to process or communicate images in selected spectral bands produced by the plurality of sensors via the at least one network node 716.
  • FIG. 8 is a high-level block diagram of an example computing environment 900 for the system 100, 700, and processor-executable instructions as described herein.
  • the computing device 900 may include a server (e.g., the central server and/or various servers within the system 100, 700), a mobile computing device, a cellular phone, a tablet computer, a Wi-Fi-enabled device or other personal computing device capable of wireless or wired communication, a thin client, or other known type of computing device.
  • the various servers may be designed and built to specifically execute certain tasks.
  • the system 100, 700 may receive a large amount of data in a short period of time, meaning the system 100, 700 may contain special, high-speed input/output circuits to handle the large amount of data.
  • the system 100, 700 may execute processor-intensive machine learning algorithms, and thus the system 100, 700 may have increased processing power that is specially adapted to quickly execute the machine learning algorithms.
  • the computing device 901 includes a processor 902 that is coupled to an interconnection bus.
  • the processor 902 includes a register set or register space 904, which is depicted in FIG. 8 as being entirely on-chip, but which could alternatively be located entirely or partially off-chip and directly coupled to the processor 902 via dedicated electrical connections and/or via the interconnection bus.
  • the processor 902 may be any suitable processor, processing unit or microprocessor.
  • the computing device 901 may be a multi-processor device and, thus, may include one or more additional processors that are identical or similar to the processor 902 and that are communicatively coupled to the interconnection bus.
  • the processor 902 of FIG. 8 is coupled to a chipset 906, which includes a memory controller 908 and a peripheral input/output (I/O) controller 910.
  • a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 906.
  • the memory controller 908 performs functions that enable the processor 902 (or processors if there are multiple processors) to access a system memory 912 and a mass storage memory 914, that may include either or both of an in-memory cache (e.g., a cache within the memory 912) or an on-disk cache (e.g., a cache within the mass storage memory 914).
  • the system memory 912 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc.
  • the mass storage memory 914 may include any desired type of mass storage device.
  • the computing device 901 may be used to implement a module 916 (e.g., the various modules as herein described).
  • the mass storage memory 914 may include a hard disk drive, an optical drive, a tape storage device, a solid-state memory (e.g., a flash memory, a RAM memory, etc.), a magnetic memory (e.g., a hard drive), or any other memory suitable for mass storage.
  • module, block, function, operation, procedure, routine, step, and method refer to tangible computer program logic or tangible computer executable instructions that provide the specified functionality to the computing device 901 and the systems and methods described herein.
  • a module, block, function, operation, procedure, routine, step, and method can be implemented in hardware, firmware, and/or software.
  • program modules and routines are stored in mass storage memory 914, loaded into system memory 912, and executed by a processor 902 or can be provided from computer program products that are stored in tangible computer-readable storage mediums (e.g. RAM, hard disk, optical/magnetic media, etc.).
  • the peripheral I/O controller 910 performs functions that enable the processor 902 to communicate with a peripheral input/output (I/O) device 924, a network interface 926, and a local network transceiver 928 (via the network interface 926), via a peripheral I/O bus.
  • the I/O device 924 may be any desired type of I/O device such as, for example, a keyboard, a display (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT) display, etc.), a navigation device (e.g., a mouse, a trackball, a capacitive touch pad, a joystick, etc.), etc.
  • the I/O device 924 may be used with the module 916, etc., to receive data from the transceiver 928, send the data to the components of the system 100, 700 and perform any operations related to the methods as described herein.
  • the local network transceiver 928 may include support for a Wi-Fi network, Bluetooth, Infrared, cellular, or other wireless data transmission protocols.
  • one element may simultaneously support each of the various wireless protocols employed by the computing device 901.
  • a software-defined radio may be able to support multiple protocols via downloadable instructions.
  • the computing device 901 may periodically poll for visible wireless network transmitters (both cellular and local network).
  • the network interface 926 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 wireless interface device, a DSL modem, a cable modem, a cellular modem, etc., that enables the system 100, 700 to communicate with another computer system having at least the elements described in relation to the system 100, 700.
  • the computing environment 900 may also implement the module 916 on a remote computing device 930.
  • the remote computing device 930 may communicate with the computing device 901 over an Ethernet link 932.
  • the module 916 may be retrieved by the computing device 901 from a cloud computing server 934 via the Internet 936. When using the cloud computing server 934, the retrieved module 916 may be programmatically linked with the computing device 901.
  • the module 916 may be a collection of various software platforms including artificial intelligence software and document creation software or may also be a Java® applet executing within a Java® Virtual Machine (JVM) environment resident in the computing device 901 or the remote computing device 930.
  • the module 916 may also be a “plug-in” adapted to execute in a web-browser located on the computing devices 901 and 930.
  • the module 916 may communicate with back end components 938 via the Internet 936.
  • the system 100, 700 may include, but is not limited to, any combination of a LAN, a MAN, a WAN, a mobile network, a wired or wireless network, a private network, or a virtual private network.
  • while only one remote computing device 930 is illustrated in FIG. 8 to simplify and clarify the description, it is understood that any number of client computers are supported and can be in communication within the computing environment 900.
  • Modules may constitute either software modules (e.g., code or instructions embodied on a machine-readable medium or in a transmission signal, wherein the code is executed by a processor) or hardware modules.
  • a hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
  • one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations described herein.
  • a hardware module may be implemented mechanically or electronically.
  • a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
  • a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
  • the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
  • “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times.
  • Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access.
  • one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
  • a resource e.g., a collection of information
  • processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
  • the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
  • the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
  • the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
  • the performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines.
  • the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Environmental Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

A system for plant health management comprising a distributed data storage, a network, a processor, sensors, and light sources, wherein the light sources include a network connectivity component for connecting the light sources to other devices, and the light sources further connect to the network via the network connectivity component. The system also includes switches connecting with the sensors, the light sources, and the processor via the network, and at least one network node coupled to one or more network antennas. Also, the processor is configured to receive, at the at least one network node, datasets from the sensors disposed at the cultivated site. As a function of the received datasets, the processor aggregates additional data to the datasets and executes a set of machine learning (ML) algorithms. The system further generates a plant health management data matrix.

Description

COMPREHENSIVE AGRICULTURE TECHNOLOGY SYSTEM
Related Applications
[0001] This application claims the benefit of U.S. Provisional Application No. 63/431,360, filed December 9, 2022, entitled “COMPREHENSIVE HORTICULTURE TECHNOLOGY SYSTEM”, and U.S. Provisional Application No. 63/521,569, filed June 16, 2023, entitled “COMPREHENSIVE HORTICULTURE TECHNOLOGY SYSTEM”, both of which are hereby incorporated herein by reference in their entirety.
Field of the Invention
[0002] Aspects of the invention generally relate to intelligent farming systems. More specifically, embodiments of the invention relate to a comprehensive system for plant health monitoring and dynamic control of the plant environment in response to the monitoring in agriculture environments.
Background
[0003] Agriculture technology has evolved to incorporate many aspects of technology. While areas such as chemical technology, farming implements, and weather prediction have seen vast improvement and application to farming, electronic applications and data-driven processes have been difficult to implement on a wide scale. Despite different areas of focus within the science of agriculture including horticulture, olericulture, pomology or fruticulture, floriculture, or the like, most of the focus is on using existing technologies, such as fertilizers, lighting systems, and irrigation systems. Further, communication or network devices used therein are often off-the-shelf purchases, so the management of the overall system has various shortcomings.
Summary of the Invention
[0004] Aspects of the invention improve existing approaches by providing a comprehensive intelligent farming system for data-driven management of agriculture endeavors. Embodiments of the invention include three basic areas of improvement: data collection, optimization, and environment modification. Data collection or monitoring in a horticultural or agricultural environment may be performed by a variety of sensors measuring a variety of growth characteristics. Some growth characteristics may include soil or other growth medium properties, light and general radiation characteristics of the growth environment, and other attributes of the plant and environment. Optimization may include an assessment of the data collected by the various sensors in the growth environment against known optimal growth characteristics. Further, data collection and growth monitoring could lead to the development of optimal growth conditions for applications in agriculture endeavors. In some embodiments, artificial intelligence (AI) and machine learning (ML) techniques may be applied to the collected data to determine or modify various growth environment characteristics based on growth outcomes for optimal growth environment conditions. Environment modification may include analyzing the collected data to change one or more characteristics of the growth environment such as lighting characteristics, temperature, humidity, growth medium composition, and other elements.
[0005] The system for data collection, optimization, and environment modification in horticultural and agricultural applications may include several subsystems in implementation. In some embodiments, the system may include a sensor subsystem including various sensors and docking support structures for mounting the sensors in the growth environment, an equipment control subsystem for modifying characteristics of the growth environment, a remote management subsystem for remotely monitoring characteristics of the growth environment, an analytics and machine learning subsystem for analyzing the sensor data and determining optimal parameters for the equipment control subsystem, and a digital communications network for data transfer between the subsystems, a cloud data storage system, and other remote systems. The equipment control subsystem may control a lighting system and other agriculture equipment such as an irrigation system, a fertilization system, etc., and may communicate with frontend and/or backend computing devices of the remote management and analytics and machine learning subsystems via a plurality of network switches for various functions within the system. For example, the interaction of the various subsystems may collect and analyze the growth environment data and implement optimal growth conditions via statistical, Al, and/or ML modeling.
[0006] The sensor subsystem may include a hyperspectral sensor device communicatively attached to a docking support structure within a lighting system. The hyperspectral sensor device and/or its docking support structure may include a network connectivity component to form at least one network node for data collection of spectral bands produced by the hyperspectral sensor. In yet another embodiment, a distributed network of processors connects the various subsystems via the digital communications network, and the processors execute instructions to analyze the data collected by the sensor devices. In some embodiments, the instructions include statistical analysis and a set of machine learning (ML) and/or artificial intelligence (Al) algorithms based on received datasets, aggregated additional data from the sensor devices, and historical data to control the plurality of switches and the various other agriculture equipment via the equipment control subsystem (i.e., the lighting system, the irrigation system, the fertilization system, etc.).
[0007] Further aspects of the invention enable the sensors to provide coverage for the entire growth environment while sending sensed data to the distributed network of processors for further processing. In one embodiment, the sensed data includes a variety of types of data that enable the frontend and/or backend computing devices of the remote management and analytics and machine learning subsystems to identify raw data. In another embodiment, the frontend and/or backend computing devices may process collected data to produce quantitative results to achieve desired or optimal growth conditions in real-time or substantially real-time, such as growth of the vegetation, metabolism progress, stress indicators, or the like.
Brief description of the drawings
[0008] Persons of ordinary skill in the art may appreciate that elements in the figures are illustrated for simplicity and clarity so not all connections and options have been shown. For example, common but well-understood elements that are useful or necessary in a commercially feasible embodiment may often not be depicted in order to facilitate a less obstructed view of these various embodiments of the present disclosure. It may be further appreciated that certain actions and/or steps may be described or depicted in a particular order of occurrence while those skilled in the art may understand that such specificity with respect to sequence is not actually required. It may also be understood that the terms and expressions used herein may be defined with respect to their corresponding respective areas of inquiry and study except where specific meanings have otherwise been set forth herein.
[0009] FIG. 1 is a diagram illustrating an agriculture technology system for plant health management, according to one embodiment.
[0010] FIG. 2 is a diagram illustrating one embodiment of a hyperspectral sensor that may be employed within the agriculture technology system.
[0011] FIG. 3 is a diagram illustrating a traditional Fourier Transform interferometer according to one embodiment.
[0012] FIG. 4 is a diagram illustrating hyperspectral sensors co-located with lighting fixtures according to one embodiment.
[0013] FIG. 5 is a diagram illustrating a lighting fixture according to one embodiment.
[0014] FIG. 6 is a diagram illustrating a machine learning (ML) architecture that may be included in an analytics and machine learning subsystem of the comprehensive agriculture technology system according to one embodiment.
[0015] FIG. 7 is a diagram illustrating a computerized system for plant health management according to one embodiment.
[0016] FIG. 8 is a high-level block diagram of a computing environment for the comprehensive agriculture technology system, various computing devices of the system, and processor-executable instructions according to one embodiment.
Detailed Description
[0017] Embodiments may now be described more fully with reference to the accompanying drawings, which form a part hereof, and which show, by way of illustration, specific exemplary embodiments which may be practiced. These illustrations and exemplary embodiments are presented with the understanding that the present disclosure is an exemplification of the principles of one or more embodiments and is not intended to limit any one of the embodiments illustrated. Embodiments may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of embodiments to those skilled in the art. Among other things, the present invention may be embodied as methods, systems, computer readable media, apparatuses, or devices. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. The following detailed description should, therefore, not be taken in a limiting sense. [0018] FIG. 1 illustrates one embodiment of an agriculture technology system 100 for plant health management. In one example, the agriculture technology system 100 may include a number of subsystems (e.g., 102, 104, 106, 108, etc.) that each provide many points of innovation. For example, the system 100 may include a sensor subsystem 102, an equipment control subsystem 104, a remote management system 106, and an analytics system 108. Of course, the system 100 may include many other types of subsystems, including a supply chain subsystem for maintaining various materiel for operation of the system 100, and other subsystems.
[0019] In some embodiments, the system 100 may be configured as an Internet of Things (IoT). For example, the system 100 may include interrelated computing devices, mechanical and digital machines, objects, user interfaces, and even people that are provided with unique identifiers and the ability to transfer data over a network (e.g., network 110) without requiring human-to-human or human-to-computer interaction. Configured as an IoT, the system 100 may automate tasks within a growth environment 114 to optimize plant growth. The system as an IoT may also collect and analyze data to make better decisions, provide real-time updates on the status of agricultural products in the growth environment 114, and reduce costs by automating tasks and improving efficiency. In some embodiments, the system 100 may employ RFID tags or other wireless sensors that are communicatively connected to the network 110 and attached to objects (e.g., sensors 102a, equipment 104a, etc.) to collect data about them. The system 100 may also employ cloud computing technology to store and access data and applications over the internet. The system 100 may store data collected by IoT devices and run applications that analyze the data, as described herein.
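By way of illustration only, and not as part of the disclosed system, the following Python sketch shows one way such an IoT-style data flow could look: sensor nodes carrying unique identifiers push readings onto an in-process queue that stands in for the network 110. The node names, payload fields, and the queue itself are assumptions made purely for illustration.

# Minimal sketch of an IoT-style data flow: nodes with unique identifiers
# publish JSON readings onto a shared queue standing in for the network 110.
import json
import queue
import time
import uuid

network_110 = queue.Queue()  # stand-in for the digital communications network

class SensorNode:
    def __init__(self, kind):
        self.node_id = str(uuid.uuid4())  # unique identifier for the device
        self.kind = kind

    def publish(self, value, unit):
        payload = {
            "node_id": self.node_id,
            "kind": self.kind,
            "value": value,
            "unit": unit,
            "timestamp": time.time(),
        }
        # Transfer data without human-to-human or human-to-computer interaction.
        network_110.put(json.dumps(payload))

temperature_node = SensorNode("temperature")
humidity_node = SensorNode("humidity")
temperature_node.publish(24.3, "degC")
humidity_node.publish(61.0, "%RH")

while not network_110.empty():
    print(json.loads(network_110.get()))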
[0020] The sensor subsystem 102 may include several sensor suites or sensor devices 102a that measure various characteristics of a growth environment 114 such as indoor 116, outdoor 118, and plant and soil 120 environmental variables. The sensor subsystem 102 may also measure plant and soil characteristics. In some embodiments, the sensor suites or sensor devices 102a include one or more of a temperature sensor, a humidity sensor, a volumetric water sensor, a leaf wetness sensor, an electrical conductivity (EC) sensor, a soil water potential sensor, a light level sensor, and a carbon dioxide sensor. [0021] In some embodiments, the sensor suites or devices 102a may include both hardware and software components. For example, the sensor suites or devices 102a may include hardware components such as a processor, memory, physical docking elements to provide electrical power to the sensors and to facilitate collection and communication of sensor data 102b among the various elements and subsystems internal to the system 100 or external to the system 100 via network components.
[0022] For example, in some embodiments, a computer network 110 may interconnect all subsystems 102, 104, 106, 108 of the system 100 to facilitate internal and external communications with the system 100. In some embodiments, the network 110 includes a wireless mesh configuration communicatively connected to a central server. The network 110 may be configured as a wireless mesh network based on the network components of the plurality of the sensor devices of sensor subsystem 102 or other components of the system 100. The network 110 may also include processor-executable instructions for a software stack to organize, arrange and process the sensor data 102b from the sensors 102a of the sensor subsystem 102. The software stack may include instructions to perform optimization and all other functions associated with the growth environment 114.
[0023] Each sensor 102a may include a hardware and/or software networking component for connecting the sensor 102a to the system 100 generally and the network 110 in particular. In one embodiment, the networking component may include a wireless chip or antenna configured to communicate sensor data 102b wirelessly in accordance with one or more network protocols. For example, the networking component may communicate data 102b via Bluetooth Low Energy (BLE), Wi-Fi® (802.11 standard), Bluetooth, cellular communication, or near field communication (NFC) protocols. Other protocols may be used without departing from the scope or motivation of the invention.
[0024] The sensor suites or devices 102a may include software components for operability and compatibility with other internal or external components of the system 100. Software components of the sensors 102a may also incorporate artificial intelligence (Al) or machine learning (ML) processing capabilities that may identify, adjust, and learn what is sensed. For example, the sensor suites or devices 102a may provide a standard dataset or data protocol to arrange the collected or sensed sensor data 102b when the sensor suites or devices 102a communicate with other devices in the current system 100. Artificial intelligence (Al) or machine learning (ML) processing capabilities may also partially or wholly reside in and operate from the analytics and machine learning subsystem 108, as further described below. In another example, the sensor suites or devices 102a may also incorporate remote data that is sensed or provided from a plant scientist or biosystem engineer via the network 110 and a remote management subsystem 106.
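As a non-limiting illustration of the "standard dataset or data protocol" mentioned above, the following Python sketch arranges a single sensor reading into a fixed record before it is exchanged over the network 110. Every field name and value here is an assumption for illustration only, not a definition from the disclosure.

# One possible shape for a standard sensor record used to arrange sensor data 102b.
from dataclasses import dataclass, asdict
import json

@dataclass
class SensorRecord:
    sensor_id: str      # which sensor 102a produced the reading
    sensor_type: str    # e.g. "soil_moisture", "leaf_wetness", "co2"
    value: float        # the measured quantity
    unit: str           # engineering unit of the value
    timestamp_s: float  # seconds since epoch when the reading was taken
    zone: str           # location of the sensor within the growth environment 114

record = SensorRecord(
    sensor_id="soil-07",
    sensor_type="soil_moisture",
    value=0.32,
    unit="m3/m3",
    timestamp_s=1_700_000_000.0,
    zone="bench-A",
)

# Serialize to JSON so any subsystem (104, 106, 108) can parse the same record.
print(json.dumps(asdict(record), indent=2))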
[0025] In some embodiments, the sensor suites or devices 102a may include a hyperspectral sensor 112. Hyperspectral sensors are devices that can capture and analyze the spectrum of light emitted or reflected by an object. They are used in a variety of applications, including remote sensing, environmental monitoring, and industrial process control. Hyperspectral sensors typically work by using a spectrometer to divide the light into its component wavelengths. The sensor then measures the intensity of each wavelength, which can be used to create a spectrum of the light. Hyperspectral sensors can provide a wealth of information about an object, as each wavelength in the spectrum can correspond to a different physical property of the object. For example, the wavelength of light that is absorbed by a particular material can be used to identify that material. Hyperspectral sensors are becoming increasingly common, as they offer a number of advantages over traditional imaging sensors. They can provide more detailed information about an object, and they can be used to identify objects that are difficult to see with the naked eye.
[0026] Within the system 100, the hyperspectral sensor 112 may provide data for remote sensing, environment monitoring, agricultural process control, and other purposes. In some embodiments, the hyperspectral sensor may provide data from a growth environment 114 related to temperature, humidity, water, leaf wetness, soil electrical conductivity, soil water potential, light level, and carbon dioxide. In remote applications, the hyperspectral sensor 112 may remotely sense the composition of any aspect of the growth environment 114 such as the growth medium (e.g., soil or, in hydroponic applications, water), the plant, and the air surrounding the plant. The sensors 102a and/or the hyperspectral sensor 112 may further include components for taking incident sunlight measurements for tracking exposure of lighting on a given plant, a given area, or a given cultivated space, as well as measuring or calculating a three dimensional (3D) map for tracking plant height, plant width, and measuring intensity of light. This information can be used to understand a plant's growth environment 114, how that environment changes over time, and to identify potential environmental hazards. Environmental monitoring applications may monitor the growth environment 114 for pollutants, plant and/or soil composition, and the like. For agricultural process control, the hyperspectral sensor 112 can be used to monitor the quality of agricultural products during growth.
[0027] The interferometric information gathered by the hyperspectral sensor 112 may also allow the system 100 to accurately locate all sensors 102a within the growth environment. Thus, all sensors 102a may be precisely mapped in conjunction with optical images within the growth environment. The hyperspectral sensor may also monitor the condition of equipment 104a.
[0028] The hyperspectral sensor 112 may also work in tandem with other sensors 102a of the system. For example, sensors 102a may be powered by lighting fixtures that are also used for plant health, and communicate with the other subsystems 104, 106, 108 via the same network 110. The hyperspectral sensor 112 is further described in relation to FIG. 2, below.
[0029] The sensor suites or devices 102a may be communicably connected to the equipment control subsystem 104 via the sensor networking components and/or the network 110. The equipment control subsystem may include one or more pieces of equipment or devices 104a that physically alter the growth environment 114 and, as a result, the indoor 116, outdoor 118, and plant and soil 120 environmental variables. In some embodiments, the equipment 104a may include one or more of shade canopies, dosing or irrigation controls and systems, lighting systems, passive and/or active heating and cooling devices, harvesting equipment, spraying equipment (i.e., for fertilizer, insect and pest control, herbicide, fungicide, etc.), locking mechanisms and other security, foggers, humidifiers, etc. For example, lighting equipment may include high power lighting fixtures, such as light emitting diode (LED) lighting devices that are configured to perform for extended periods, as further described in relation to FIG. 5, below.
[0030] The equipment control subsystem 104 may be controlled remotely by the remote management subsystem 106 via the network 110. Control of the equipment 104a may rely on input from at least the sensor subsystem 102 and the analytics and machine learning subsystem 108, also via the network 110. In some embodiments, the equipment control subsystem 104 is configured to receive sensor data 102b via the network 110 from the sensor subsystem 102 and, in response to the sensor data 102b, modify one or more operating parameters 104b of the equipment control subsystem 104. Modification of one or more operating parameters 104b of the equipment control subsystem 104 may also result from input by the analytics and machine learning subsystem 108.
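The following Python sketch illustrates, under assumptions, the feedback just described: sensor data 102b arrives and the equipment control subsystem 104 adjusts operating parameters 104b in response. The thresholds and parameter names are hypothetical and are not taken from the disclosure.

# Minimal sketch of sensor-driven modification of operating parameters 104b.
operating_parameters_104b = {
    "irrigation_valve_open": False,
    "led_dimming_percent": 80,
}

def update_parameters(sensor_data_102b, params):
    # If soil moisture drops below an assumed set point, open the irrigation valve.
    if sensor_data_102b.get("soil_moisture") is not None:
        params["irrigation_valve_open"] = sensor_data_102b["soil_moisture"] < 0.25
    # If ambient light is already high, dim the LED fixtures to save energy.
    if sensor_data_102b.get("light_level_lux", 0) > 20_000:
        params["led_dimming_percent"] = 40
    return params

reading = {"soil_moisture": 0.21, "light_level_lux": 25_000}
print(update_parameters(reading, operating_parameters_104b))
# -> {'irrigation_valve_open': True, 'led_dimming_percent': 40}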
[0031] The remote management subsystem 106 may include a variety of management components 106a and data trackers 106b. In some embodiments, the data trackers 106b may collect data that is internal to the system 100 and/or data that is external to the system 100. For example, data that is internal to the system 100 may include the sensor data 102b, operating parameters 104b of the equipment control subsystem 104, and/or artificial intelligence (Al) data 108b of the analytics and machine learning subsystem 108. In some embodiments, direct control of the growth environment 114 may be facilitated by the data trackers 106b in cooperation with the one or more management components 106a in communication with the equipment control subsystem 104. For example, the management components 106a may include a smart phone application, a web portal, a messaging service (e.g., short message service or SMS), a scheduling application, and labor schedules, etc. The data trackers 106b may include one or more of historic data tracking, energy tracking, plant growth tracking, supplier data, packaging data, and crop pricing data. Using input from the sensor subsystem 102, observation and analysis of the data trackers 106b by human interaction and/or the analytics and machine learning subsystem 108 may permit modification of the operating parameters 104b and may facilitate direct control of the growth environment 114 via the network 110.
[0032] The analytics and machine learning subsystem 108 may include artificial intelligence (Al) modules 108a to analyze data that is internal to the system 100 and/or data that is external to the system 100. For example, the Al modules 108a may include one or more processors that are communicatively connected to a computer memory storing processor-executable instructions for the Al modules 108a. In some embodiments, the Al modules 108a may include a predictive analytics module and an automated control module. [0033] A predictive analytics module may include instructions to process historic data (e.g., 106b) and/or sensor data 102b to predict future outcomes for the sensor data 102b. Instructions of the predictive analytics module may include instructions for statistical modeling, data mining techniques, and machine learning algorithms to identify patterns in the sensor data 102b and/or historic data (e.g., 106b) and make predictions about future trends for the sensor data 102b. A predictive analytics module may also be used to assess the risk of certain events to the growth environment 114, such as modification of one or more operating parameters 104b of the equipment control subsystem 104. The system 100 may also employ a predictive analytics module to optimize one or more of energy tracking, plant growth, supplier data, packaging data, and crop pricing. A predictive analytics module may include further instructions for making decisions about inventory levels and production schedules of the system 100.
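As one hedged illustration of the predictive-analytics idea, the Python sketch below fits a statistical model on lagged historic sensor readings (in the spirit of historic data 106b) and predicts the next value of a sensor stream. The lag-feature layout and the synthetic data are assumptions, not the disclosed method.

# Sketch: predict the next sensor reading from its own recent history.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Synthetic historic sensor data, e.g. a slowly varying temperature trace.
history = 22.0 + 2.0 * np.sin(np.linspace(0, 12, 200)) + rng.normal(0, 0.1, 200)

lags = 3
# Each row of X holds the previous `lags` readings; y is the reading that followed.
X = np.column_stack([history[i:len(history) - lags + i] for i in range(lags)])
y = history[lags:]

model = LinearRegression().fit(X, y)
next_value = model.predict(history[-lags:].reshape(1, -1))
print(f"predicted next reading: {next_value[0]:.2f}")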
[0034] An automated control module may include instructions using machine learning algorithms and machine learning models to optimize operating parameters 104b of the equipment control subsystem 104. In some embodiments, an automated control module may train a machine learning model using historic data (e.g., 106b) and then use the model to make predictions about how the system 100 will behave in the future. The model can then be used to control the system 100 accordingly. In some embodiments, the machine learning model may include linear regression, a decision tree, a random forest, or a neural network. The linear regression machine learning model may include processor executable instructions for a mathematical model that predicts a continuous value based on a set of input variables. The decision tree machine learning model may include processor executable instructions for making decisions to optimize the system 100 based on a set of rules. The random forest machine learning model may include processor executable instructions for combining predictions of multiple decision trees. The neural network machine learning model may include processor executable instructions to create interconnected nodes, or neurons, that can learn to recognize patterns in historic data (e.g., 106b), as described in relation to FIG. 6, below.
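The following Python sketch illustrates, under stated assumptions, the automated-control idea above: a random forest is trained on synthetic historic data relating candidate operating parameters to a growth outcome, and the model is then used to score candidate parameter settings. The feature choices (light hours, irrigation volume) and the outcome function are invented for illustration only.

# Sketch: train a random forest on historic data and score candidate parameters.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
# Columns: light hours per day, irrigation litres per day (assumed features).
X_hist = rng.uniform([10, 1.0], [18, 4.0], size=(300, 2))
# Assumed outcome: a growth index peaking near 16 h light and 2.5 L water.
y_hist = (-(X_hist[:, 0] - 16) ** 2
          - 4 * (X_hist[:, 1] - 2.5) ** 2
          + rng.normal(0, 0.2, 300))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_hist, y_hist)

candidates = np.array([[12, 2.0], [16, 2.5], [18, 4.0]])
scores = model.predict(candidates)
best = candidates[np.argmax(scores)]
print(f"best candidate parameters: {best} (predicted growth index {scores.max():.2f})")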
[0035] FIG. 2 is an illustration of one embodiment of a hyperspectral sensor 200 that may be employed within the agriculture technology system 100. The hyperspectral sensor 200 may include one or more processors and processor-readable memories storing processor-executable instructions to accomplish the results of a typical hyperspectral sensor. In one example, as shown in FIG. 2, aspects of the hyperspectral sensor 200 may include a computer-controlled unstable Fabry-Perot (FP) cavity and a Fourier Transform spectra extraction method. The hyperspectral sensor 200 may include processor-executable instructions to divide light reflected or absorbed from a subject (i.e., a plant within the growth environment 114 of FIG. 1) into its component wavelengths. The sensor 200 may also include instructions to measure the intensity of each wavelength and create a spectrum of the light. The sensor 200 may also include processor-executable instructions to match a measured spectrum to different physical properties of the plant (e.g., plant metabolism, soil and air elements, etc.).
[0036] For example, FIG. 3 illustrates a traditional Fourier Transform interferometer 300. Aspects of the example hyperspectral sensor 200 apply the same concept but with a different configuration. A Fourier Transform interferometer 300 may include a light source 302, a beam splitter or half-silvered mirror 304, a detector 306, a first mirror 308a, and a second mirror 308b. In operation, the light source 302 emits light that hits the beam splitter 304. The beam splitter 304 is partially reflective, so part of the light is transmitted through to the first mirror 308a, while some light is reflected in the direction of the second mirror 308b. Both beams recombine at the beam splitter 304 to produce an interference pattern 310 incident on the detector 306. Translation of one of the mirrors (e.g., the first mirror 308a) along the axis A allows a variable delay in the travel time of the light to be included in one of the beams. The light beams interfere, allowing the temporal coherence of the light to be measured at each different time delay setting along the axis A, effectively converting the time domain into a spatial coordinate. The interference pattern 310 may be translated to an interferogram 312 that may be used to reconstruct a spectrum 314 of the light emitted from the light source 302. For example, by making measurements of the signal at many discrete positions of the movable mirror (i.e., the first mirror 308a), the spectrum 314 can be reconstructed using a Fourier transform 316 of the temporal coherence of the light.
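The Fourier-transform step can be sketched in a few lines of Python, assuming a synthetic light source and sampling grid (the actual sensor 200 uses an FP cavity rather than the Michelson-style arrangement simulated here). A simulated interferogram, intensity versus optical path difference, is converted to a spectrum with a discrete Fourier transform.

# Sketch: reconstruct a spectrum from a simulated interferogram via FFT.
import numpy as np

# Optical path difference samples (metres); the 50 nm step is an assumption.
opd = np.arange(0, 2048) * 50e-9
wavelengths = np.array([640e-9, 512e-9])  # two assumed spectral lines

# Interferogram: one cosine per spectral line, I(delta) ~ sum cos(2*pi*delta/lambda).
interferogram = sum(np.cos(2 * np.pi * opd / wl) for wl in wavelengths)

# Spectrum: Fourier transform of the interferogram over optical path difference.
spectrum = np.abs(np.fft.rfft(interferogram))
wavenumber = np.fft.rfftfreq(len(opd), d=50e-9)          # cycles per metre
recovered = 1.0 / wavenumber[np.argsort(spectrum)[-2:]]  # two strongest peaks

print(np.sort(recovered) * 1e9)  # approximately [512, 640] nanometres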
[0037] Returning to FIG. 2, the hyperspectral sensor may also include a light source 202, a beam splitter 204, a detector 206, a first mirror 208a, and a second mirror 208b. The detector 206 may include a complementary metal-oxide-semiconductor (CMOS) camera and an FP interferometer. In some embodiments, the hyperspectral sensor 200 may also include one or more processors 210 that are communicatively connected to one or more processor-readable memories 212 that store processor-executable instructions 214. In further embodiments, the hyperspectral sensor 200 is at least partially communicatively connected to one or more processors and processor-readable memories including processor-executable instructions. The two mirrors 208a and 208b in the FP configuration may be disposed in a housing (not shown). The mirrors 208a, 208b may be relatively or generally flat (e.g., without, devoid, or absent significant curvature or surface curvature). A stepper motor 218 may move the mirrors 208a, 208b. The sensor 200 may include a cascade of motion reduction devices (e.g., 222) in order to achieve nanometer resolution. One or more capacitive sensors (not shown) may detect or sense information from the subject (e.g., plants or vegetation). In one aspect, the sensor 200 may be configured as a servo loop. For example, the hyperspectral sensor 200 may include processor-executable instructions for a feedback loop to control the position, velocity, and/or acceleration of the mirrors 208a, 208b. In some embodiments, processor-executable instructions may compare the desired output to the actual output (i.e., mirror positions) and then adjust the input to the stepper motor 218 to reduce the error. Processor-executable instructions may also fine-tune or align the mirrors 208a, 208b, calibrate the motor 218, and control the motion reduction devices 222 in real-time or substantially in real-time. Processor-executable instructions may also perform one or more of the operations described in relation to the interferometer 300 of FIG. 3.
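A minimal sketch of such a servo loop is shown below in Python, assuming an illustrative proportional gain, step size, and measurement-noise model; none of these values are taken from the disclosure, and the capacitive position sensor is simulated.

# Sketch: proportional feedback driving a stepper motor toward a target position.
import random

STEP_NM = 5.0  # assumed effective step after the motion-reduction cascade
GAIN = 0.6     # assumed proportional gain of the feedback loop

def read_position(actual_nm):
    # Stand-in for the capacitive position sensor, with a little measurement noise.
    return actual_nm + random.gauss(0.0, 1.0)

def servo_to(target_nm, actual_nm=0.0, iterations=40):
    for _ in range(iterations):
        error_nm = target_nm - read_position(actual_nm)   # compare desired to actual
        steps = round(GAIN * error_nm / STEP_NM)          # convert error to motor steps
        actual_nm += steps * STEP_NM                      # motor moves in discrete steps
    return actual_nm

print(f"settled at {servo_to(1234.0):.1f} nm (target 1234.0 nm)")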
[0038] The processor-executable instructions of the hyperspectral sensor 200 may analyze the captured data. For example, the instructions may analyze the interferometer response (e.g., interference pattern 310 of FIG. 3), or the way that the sensor 200 changes the light it receives from the object (e.g., the plant, soil, or other measured conditions within the growth environment 114 of FIG. 1). The processor-executable instructions of the sensor 200 may coarsely align the sensor 200 by fitting the appearance of a laser spot in a field of view of the sensor 200. Coarse alignment may be complete when measurement of loss of resolution or clarity in the interference pattern 210 is below a threshold compared to a raw camera image of the same subject. Fine alignment of the sensor 200 may include processor-executable instructions to mathematically model a cavity within the sensor 200. In some embodiments, the cavity includes a space between two mirrors (e.g., mirrors 208a and 208b) that forms a slight wedge. In one example, the sensor 200 may be configured to control displacements between the mirrors, the beam splitter, and/or the subject to the nanometer level based on multi-stage mechanical reduction hardware components or processor-executable instructions.
[0039] The hyperspectral sensor 200 may also include processor-executable instructions to apply a spectroscopic technique (e.g., Fourier transform spectroscopy) to capture or sense a spectrum that is particularly suited for observation of various conditions within the growth environment. In some embodiments, the sensor 200 may be configured to capture a 400 to 1000 nanometer (nm) band of light spectrum that is particularly suited to observation of plant health characteristics. For example, anthocyanin, carotene, chlorophyll A, chlorophyll B, starch, and protein content of the subject plant are detectable within this band and may be captured by the hyperspectral sensor 200.
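As a simple hedged illustration of working with the 400 to 1000 nm band, the Python sketch below slices a synthetic spectrum to that range and samples it near a few wavelengths of interest. The synthetic reflectance curve and the chosen wavelengths are assumptions for illustration only and are not tied to any particular pigment.

# Sketch: select the 400-1000 nm portion of a spectrum and sample it.
import numpy as np

wavelength_nm = np.linspace(350, 1100, 751)             # 1 nm grid
reflectance = 0.4 + 0.2 * np.sin(wavelength_nm / 60.0)  # synthetic spectrum

band = (wavelength_nm >= 400) & (wavelength_nm <= 1000)  # the 400-1000 nm band
band_wl, band_refl = wavelength_nm[band], reflectance[band]

def sample_at(target_nm):
    # Nearest-neighbour lookup of reflectance at a wavelength of interest.
    idx = np.argmin(np.abs(band_wl - target_nm))
    return band_refl[idx]

for wl in (550.0, 680.0, 900.0):  # illustrative wavelengths
    print(f"reflectance near {wl:.0f} nm: {sample_at(wl):.3f}")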
[0040] Specifications of the sensor 200 may include the parameters set forth in the specification table of the original publication (reproduced there as figure images).
[0041] With reference to FIG. 4, a sensor 402 (e.g., a hyperspectral sensor) may be co-located with lighting fixtures 404, such as horticulture lighting fixtures of the equipment control subsystem 104 (FIG. 1), giving an exceptional vantage point for identifying and observing the metabolism of vegetation or plants 406 within the growth environment 114. In one aspect, the sensor 402 may connect to the lighting fixture 404 via a dock, such as a universal dock. In one aspect, the dock (not shown) may connect on one side to the lighting fixture 404 to receive, control or manage power or other settings of the lighting fixture. The dock may also connect to the sensor 402. The dock may provide power and data specification or protocol to the hyperspectral sensor 402 so that data exchanges within the system 100 may be accomplished. Multiple hyperspectral sensors 402 may be deployed within the growth environment 114. One or more shading elements 408 may confine the measurement of the sensors, the light cast by the lighting fixtures 404, and other environmental influences within the growth environment 114 to the one or more plants 406 that are directly beneath or under the respective shading element 408 and/or within the light cast by a respective lighting fixture 404. Within the system 100, the analytics and machine learning subsystem 108 may receive and process data from each of the sensors of the sensor subsystem 102, including the hyperspectral sensor 112, to provide needed metabolism data of the plants to the remote management subsystem 106. Based on the metabolism and other data, the lighting fixture 404 may be further updated, adjusted, or configured.
[0042] The hyperspectral sensor 402 may be a stand-alone sensor device as shown in FIG. 2 or an add-on or an integrated part of a lighting fixture or fixtures 500 as shown in FIG. 5. As integrated with the lighting fixture 500, the hyperspectral sensor 200 may be oriented or configured from a vantage point of the lighting fixture 500 above the plants within the growth environment 114. In one example, a housing 502 of the hyperspectral sensor 200 (FIG. 2) may be integrated with a lighting fixture 500 including lighting elements 504. In another example, the sensor 200 may include a power source to energize components thereof. For example, the power source may include a battery. In another embodiment, the connection between the sensor 200 and the lighting fixture 500 may transmit electrical power from the fixture 500 to the sensor 200.
[0043] With reference to FIG. 6, the analytics and machine learning subsystem 108 may include a machine learning (ML) architecture 600 that may be used with the system 100 in accordance with the current disclosure. In some embodiments, the analytics and machine learning subsystem 108 of the system 100 may include instructions for execution on one or more processors that implement the ML architecture 600. The ML architecture 600 may include an input layer 602, a hidden layer 604, and an output layer 606. The input layer 602 may include inputs 608A, 608B, etc., coupled to other subsystems of the system 100 (e.g., the sensor subsystem 102, the equipment control subsystem 104, the remote management subsystem 106, etc.) and represent those inputs that are observed from actual system data, such as sensor data 102b and operating parameters 104b.
[0044] The hidden layer 604 may include weighted nodes 610 that have been trained for the sensor data 102b and operating parameters 104b being observed. Each node 610 of the hidden layer 604 may receive the sum of all inputs 608A, 608B, etc., multiplied by a corresponding weight. The output layer 606 may present various outcomes 612 based on the input values 608A, 608B, etc., and the weighting of the hidden layer 604. Just as a machine learning system for a self-driving car may be trained to determine hazard avoidance actions based on received visual input, the machine learning architecture 600 may be trained to analyze a likely outcome for a given set of inputs based on thousands or even millions of observations of previous agricultural product growth cycles. For example, the architecture 600 may be trained to determine optimal lighting conditions associated with a plant (e.g., 406).
[0045] During training of the machine learning architecture 600, a dataset of inputs may be applied and the weights of the hidden layer nodes 610 may be adjusted for the known outcome (e.g., an optimal plant characteristic) associated with that dataset. As more datasets are applied, the weighting accuracy may improve so that the outcome prediction is constantly refined to a more accurate result. In this case, the data trackers 106b including historic plant characteristics for optimized plant growth may provide datasets for initial training and ongoing refining of the machine learning architecture 600. [0046] Additional training of the machine learning architecture 600 may include an artificial intelligence engine (Al engine) 614 providing additional values to one or more controllable inputs 616 so that outcomes may be observed for particular changes to the sensor data 102b. The values selected may represent different data types such as a frequency of optimal ranges of particular sensor data 102b, a frequency with which particular optimal growth conditions occur within the growth environment 114, a time for achieving optimal growth conditions, and other alternative data presented at various points in the growth process, and may be generated at random or by a pseudo-random process. By adding controlled variables to the sensor data 102b and equipment operating parameters 104b, over time, the impact may be measured and fed back into the machine learning architecture 600 weighting to allow capture of the impact of a proposed change to the agricultural product growth process in order to optimize growth conditions. Over time, the impact of various different data at different points in the growth cycle may be used to predict an outcome for a given set of observed values at the input layer 602.
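The input/hidden/output structure described for the architecture 600 can be illustrated with the small Python sketch below, which trains a fully connected network on synthetic observations of sensor-like inputs and a growth-outcome target. The layer size, feature ordering, and synthetic target function are assumptions and do not represent the disclosed training data.

# Sketch: a small input/hidden/output network trained on synthetic observations.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Inputs 608A, 608B, ...: e.g. temperature, humidity, light level (assumed order).
X = rng.uniform([18, 40, 200], [30, 90, 800], size=(500, 3))
# Assumed outcome 612: peaks near 24 degC, 65 %RH, 600 umol/m2/s.
y = (-(X[:, 0] - 24) ** 2 / 10
     - (X[:, 1] - 65) ** 2 / 200
     - (X[:, 2] - 600) ** 2 / 20000
     + rng.normal(0, 0.1, 500))

# One hidden layer of weighted nodes 610; weights are adjusted during training.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0)
model.fit(X, y)

print("predicted outcome at [24, 65, 600]:", model.predict([[24.0, 65.0, 600.0]])[0])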
[0047] After training of the machine learning architecture 600 is completed, data from the hidden layer may be fed to the artificial intelligence engine 614 to generate values for controllable input(s) 616 to optimize the sensor data 102b and the operating parameters 104b. Similarly, data from the output layer may be fed back into the artificial intelligence engine 614 so that the artificial intelligence engine 614 may, in some embodiments, iterate with different data to determine, via the trained machine learning architecture 600, whether the sensor data 102b and the operating parameter data 104b are accurate, and to make other determinations.
[0048] With reference to FIGs. 1 and 7, a diagram illustrates a computerized system 700 for plant health management according to one embodiment. For example, the system 700 may include a distributed data storage 702 for storing data accessible by a processor 706. In one example, the distributed data storage 702 may include computer data memory storage in distributed areas or data farms. In another embodiment, the processor 706 may include multi-core processors in distributed areas connected by a network 704 (or, e.g., network 110). In one embodiment, the network 704 may include private or secure networks, as well as public networks. In one embodiment, the processor 706 may be configured to execute computer-executable instructions for managing plant health, as described herein. In a further embodiment, a plurality of sensors 708 (or, e.g., sensors 102a, hyperspectral sensor 112, 200, 402) may comprise one or more antennas (not shown). Further, a plurality of light sources 710 (or, e.g., lighting elements 504) may be disposed in a growth environment 114. In one example, the plurality of light sources 710 may comprise one or more lamps for providing energy in one or more wavelengths as a function of vegetation in the growth environment 114.
[0049] The system 700 may include the plurality of light sources 710 having a network connectivity component for connecting the plurality of light sources 710 to other devices (e.g., sensors 102a, hyperspectral sensor 112, 200, 402, equipment 104a). In one aspect, the plurality of light sources 710 may further connect to the network 704 via the network connectivity component. In one aspect, a plurality of switches 714 may connect with the plurality of sensors 708, the plurality of light sources 710, and the processor 706 via the network 704. In a further aspect, the plurality of switches 714 may be configured to energize the plurality of light sources 710. At least one network node 716 may be coupled to the one or more network antennas (not shown), wherein the at least one network node 716 may be configured to be connected to the network 704.
[0050] The processor 706 may be configured to receive, at the at least one network node 716, a plurality of datasets from the plurality of sensors 708 disposed at the growth environment 114. In one aspect, the plurality of datasets may be transmitted under a data protocol within the network 704. As a function of the received plurality of datasets, the processor 706 may aggregate additional data to the plurality of the datasets, wherein the additional data comprises data external to the growth environment 114.
[0051] The processor 706 may also execute a set of ML or Al algorithms, as described in relation to FIG. 6, based on the received plurality of datasets and the aggregated additional data to control at least one of the following: the plurality of sensors 708, the plurality of light sources 710, the plurality of switches 714, and the at least one network node 716. In view of the above, the processor 706 may generate a plant health management data matrix 750 as a result of executing these algorithms.
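The aggregation step and the resulting data matrix can be illustrated with the Python sketch below: sensor datasets received at the network node are joined with data external to the growth environment (here, assumed weather values) and arranged into a simple tabular matrix. The column names, join key, derived column, and values are assumptions for illustration only.

# Sketch: aggregate external data onto sensor datasets into a simple data matrix.
import pandas as pd

sensor_datasets = pd.DataFrame({
    "zone": ["A", "B", "C"],
    "soil_moisture": [0.31, 0.24, 0.28],
    "canopy_temp_c": [23.5, 25.1, 24.0],
})

external_data = pd.DataFrame({
    "zone": ["A", "B", "C"],
    "outdoor_temp_c": [14.2, 14.2, 14.2],
    "forecast_rain_mm": [0.0, 0.0, 0.0],
})

# Join the additional (external) data onto the received sensor datasets.
matrix_750 = sensor_datasets.merge(external_data, on="zone")

# A derived column standing in for a model output within the data matrix.
matrix_750["needs_irrigation"] = matrix_750["soil_moisture"] < 0.25
print(matrix_750)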
[0052] In another embodiment, the processor 706 may further be configured to localize the plurality of sensors 708 as a function of the one or more antennas. For example, the plurality of datasets may include at least one or more of the following: lighting related data, soil data of soil used in the cultivated site, air condition data of the cultivated site, moisture data of the cultivated site, weather data, temperature data of the cultivated site, plants of the cultivated site, and image data of the cultivated site or technical images collected remotely, satellite data, drone data or aerial data.
[0053] In yet another embodiment, the lighting related data comprises a type of a lamp, a lighting spectrum of the lamp, a power usage of the lamp, and a size of the lamp. Further, the aggregated additional data may comprise at least one or more of the following: geolocation information of the cultivated site, and elevation data of the cultivated site. It is to be understood that the network connectivity component of the plurality of light sources 710 comprises a wireless network component. In one aspect, the plurality of light sources 710 comprises light emitting diode (LED) lamps or any high efficiency light source. In yet a further embodiment, the one or more wavelengths comprise a grow light spectrum, and the grow light spectrum comprises wavelengths in one of the following ranges: wavelengths from 260 to 380 nanometers (nm), or wavelengths from 380 to 740 nm.
[0054] In yet another embodiment, the processor 706 may be configured to dynamically adjust light distribution patterns for coverage of the cultivated area. Furthermore, the system 700 may include the processor 706 to be configured to process or communicate images in selected spectral bands produced by the plurality of sensors via the at least one network node 716.
[0055] FIG. 8 is a high-level block diagram of an example computing environment 900 for the system 100, 700, and processor-executable instructions as described herein. The computing device 901 may include a server (e.g., the central server and/or various servers within the system 100, 700), a mobile computing device, a cellular phone, a tablet computer, a Wi-Fi-enabled device or other personal computing device capable of wireless or wired communication, a thin client, or other known type of computing device.
[0056] Logically, the various servers may be designed and built to specifically execute certain tasks. For example, the system 100, 700 may receive a large amount of data in a short period of time, meaning the system 100, 700 may contain special, high-speed input/output circuits to handle the large amount of data. Similarly, the system 100, 700 may execute processor-intensive machine learning algorithms, and thus the system 100, 700 may have increased processing power that is specially adapted to quickly execute the machine learning algorithms.
[0057] As will be recognized by one skilled in the art, in light of the disclosure and teachings herein, other types of computing devices can be used that have different architectures. Processor systems similar or identical to the example systems and methods described herein may be used to implement and execute the example systems and processor-executable instructions described herein. Although the example system 100, 700 is described below as including a plurality of peripherals, interfaces, chips, memories, etc., one or more of those elements may be omitted from other example processor systems used to implement and execute the example systems and methods. Also, other components may be added.
[0058] As shown in FIG. 8, the computing device 901 includes a processor 902 that is coupled to an interconnection bus. The processor 902 includes a register set or register space 904, which is depicted in Fig. 8 as being entirely on-chip, but which could alternatively be located entirely or partially off-chip and directly coupled to the processor 902 via dedicated electrical connections and/or via the interconnection bus. The processor 902 may be any suitable processor, processing unit or microprocessor. Although not shown in FIG. 8, the computing device 901 may be a multi-processor device and, thus, may include one or more additional processors that are identical or similar to the processor 902 and that are communicatively coupled to the interconnection bus.
[0059] The processor 902 of FIG. 8 is coupled to a chipset 906, which includes a memory controller 908 and a peripheral input/output (I/O) controller 910. As is well known, a chipset typically provides I/O and memory management functions as well as a plurality of general purpose and/or special purpose registers, timers, etc. that are accessible or used by one or more processors coupled to the chipset 906. The memory controller 908 performs functions that enable the processor 902 (or processors if there are multiple processors) to access a system memory 912 and a mass storage memory 914, that may include either or both of an in-memory cache (e.g., a cache within the memory 912) or an on-disk cache (e.g., a cache within the mass storage memory 914).
[0060] The system memory 912 may include any desired type of volatile and/or non-volatile memory such as, for example, static random access memory (SRAM), dynamic random access memory (DRAM), flash memory, read-only memory (ROM), etc. The mass storage memory 914 may include any desired type of mass storage device. For example, the computing device 901 may be used to implement a module 916 (e.g., the various modules as herein described). The mass storage memory 914 may include a hard disk drive, an optical drive, a tape storage device, a solid-state memory (e.g., a flash memory, a RAM memory, etc.), a magnetic memory (e.g., a hard drive), or any other memory suitable for mass storage. As used herein, the terms module, block, function, operation, procedure, routine, step, and method refer to tangible computer program logic or tangible computer executable instructions that provide the specified functionality to the computing device 901 , the systems and methods described herein. Thus, a module, block, function, operation, procedure, routine, step, and method can be implemented in hardware, firmware, and/or software. In one embodiment, program modules and routines are stored in mass storage memory 914, loaded into system memory 912, and executed by a processor 902 or can be provided from computer program products that are stored in tangible computer-readable storage mediums (e.g. RAM, hard disk, optical/magnetic media, etc.).
[0061] The peripheral I/O controller 910 performs functions that enable the processor 902 to communicate with a peripheral input/output (I/O) device 924, a network interface 926, and a local network transceiver 928 (via the network interface 926), via a peripheral I/O bus. The I/O device 924 may be any desired type of I/O device such as, for example, a keyboard, a display (e.g., a liquid crystal display (LCD), a cathode ray tube (CRT) display, etc.), a navigation device (e.g., a mouse, a trackball, a capacitive touch pad, a joystick, etc.), etc. The I/O device 924 may be used with the module 916, etc., to receive data from the transceiver 928, send the data to the components of the system 100, 700, and perform any operations related to the methods as described herein. The local network transceiver 928 may include support for a Wi-Fi network, Bluetooth, Infrared, cellular, or other wireless data transmission protocols. In other embodiments, one element may simultaneously support each of the various wireless protocols employed by the computing device 901. For example, a software-defined radio may be able to support multiple protocols via downloadable instructions. In operation, the computing device 901 may be able to poll for visible wireless network transmitters (both cellular and local network) on a periodic basis. Such polling may be possible even while normal wireless traffic is being supported on the computing device 901. The network interface 926 may be, for example, an Ethernet device, an asynchronous transfer mode (ATM) device, an 802.11 wireless interface device, a DSL modem, a cable modem, a cellular modem, etc., that enables the system 100, 700 to communicate with another computer system having at least the elements described in relation to the system 100, 700.
[0062] While the memory controller 908 and the I/O controller 910 are depicted in FIG. 8 as separate functional blocks within the chipset 906, the functions performed by these blocks may be integrated within a single integrated circuit or may be implemented using two or more separate integrated circuits. The computing environment 900 may also implement the module 916 on a remote computing device 930. The remote computing device 930 may communicate with the computing device 901 over an Ethernet link 932. In some embodiments, the module 916 may be retrieved by the computing device 901 from a cloud computing server 934 via the Internet 936. When using the cloud computing server 934, the retrieved module 916 may be programmatically linked with the computing device 901. The module 916 may be a collection of various software platforms including artificial intelligence software and document creation software or may also be a Java® applet executing within a Java® Virtual Machine (JVM) environment resident in the computing device 901 or the remote computing device 930. The module 916 may also be a “plug-in” adapted to execute in a web-browser located on the computing devices 901 and 930. In some embodiments, the module 916 may communicate with back end components 938 via the Internet 936.
[0063] The system 100, 700 may include but is not limited to any combination of a LAN, a MAN, a WAN, a mobile, a wired or wireless network, a private network, or a virtual private network. Moreover, while only one remote computing device 930 is illustrated in FIG. 8 to simplify and clarify the description, it is understood that any number of client computers are supported and can be in communication within the computing environment 900.
[0064] Additionally, certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code or instructions embodied on a machine-readable medium or in a transmission signal, wherein the code is executed by a processor) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
[0065] In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
[0066] Accordingly, the term “hardware module” should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. As used herein, “hardware-implemented module” refers to a hardware module. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time. [0067] Hardware modules can provide information to, and receive information from, other hardware modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware modules. In embodiments in which multiple hardware modules are configured or instantiated at different times, communications between such hardware modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware modules have access. For example, one hardware module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
[0068] The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
[0069] Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment or as a server farm), while in other embodiments the processors may be distributed across a number of locations.
[0070] The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
[0071] The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
[0072] Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
[0073] Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile
memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information. [0074] As used herein any reference to “some embodiments” or “an embodiment” or “teaching” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in some embodiments” or “teachings” in various places in the specification are not necessarily all referring to the same embodiment. [0075] Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other. The embodiments are not limited in this context. [0076] Further, the figures depict preferred embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein. [0077] Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for the systems and methods described herein through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the systems and methods disclosed herein without departing from the spirit and scope defined in any appended claims. What is claimed is:
1. A computerized system for plant health management comprising:
a distributed data storage for storing data accessible by a processor;
a network connecting the processor and the distributed data storage; wherein the processor is configured to execute computer-executable instructions for managing plant health;
a plurality of sensors, wherein the plurality of sensors comprises one or more antennas;
a plurality of light sources disposed in a cultivated site, wherein the plurality of light sources comprises one or more lamps for providing energy in one or more wavelengths as a function of vegetation in the cultivated site; wherein the plurality of light sources includes a network connectivity component for connecting to the plurality of light sources; wherein the plurality of light sources further connects to the network via the network connectivity component;
a plurality of switches connecting with the plurality of sensors, the plurality of light sources, and the processor via the network, wherein the plurality of switches is configured to energize the plurality of light sources;
at least one network node coupled to the one or more network antennas, wherein the at least one network node is configured to be connected to the network;
wherein the processor is configured to:
receiving at the at least one network node a plurality of datasets from the plurality of sensors disposed at the cultivated site, wherein the plurality of datasets is transmitted under a data protocol within the network;
as a function of the received plurality of datasets, aggregating additional data to the plurality of the datasets, wherein the additional data comprises data external to the cultivated site;
executing a set of machine learning (ML) algorithms based on the received plurality of datasets and the aggregated additional data to control at least one of the following: the plurality of sensors, the plurality of light sources, the plurality of switches and the at least one network node; and
generating a plant health management data matrix as a result of executing.
2. The computerized system of claim 1, wherein the processor is further configured to localize the plurality of sensors as a function of the one or more antennas.
3. The computerized system of claim 1, wherein the plurality of datasets comprises at least one or more of the following: lighting related data, soil data of soil used in the cultivated site, air condition data of the cultivated site, moisture data of the cultivated site, weather data, temperature data of the cultivated site, data of plants of the cultivated site, image data of the cultivated site or technical images collected remotely, satellite data, drone data, and aerial data.
4. The computerized system of claim 3, wherein the lighting related data comprises a type of a lamp, a lighting spectrum of the lamp, a power usage of the lamp, and a size of the lamp.
5. The computerized system of claim 1, wherein the aggregated additional data comprises at least one or more of the following: geolocation information of the cultivated site, and elevation data of the cultivated site.
6. The computerized system of claim 1, wherein the network connectivity component of the plurality of light sources comprises a wireless network component.
7. The computerized system of claim 1, wherein the plurality of light sources comprises light-emitting diode (LED) lamps or any high-efficiency light source.
8. The computerized system of claim 1, wherein the one or more wavelengths comprise a grow light spectrum.
9. The computerized system of claim 8, wherein the grow light spectrum comprises wavelengths in one of the following ranges: wavelengths from 260 to 380 nanometers (nm); and wavelengths from 380 to 740 nm.
10. The computerized system of claim 8, wherein the processor is configured to dynamically adjust light distribution patterns for coverage of the cultivated site.
11. The computerized system of claim 1, wherein the processor is configured to process or communicate images in selected spectral bands produced by the plurality of sensors via the at least one network node.
12. A computerized system for plant health management comprising: a plurality of sensors disposed within a growth environment, wherein the plurality of sensors includes a hyperspectral sensor, a sensor processor, and a sensor memory, and each of the plurality of sensors includes an antenna, and wherein the sensor memory stores instructions for execution by the sensor processor for collecting present sensor data corresponding to one or more of growth environment conditions and plant characteristics; an analytics and machine learning subsystem including an analytics and machine learning processor and an analytics and machine learning memory, wherein the analytics and machine learning memory stores instructions for execution by the analytics and machine learning processor for: predicting future sensor data corresponding to the growth environment based on one or more of the present sensor data, optimal growth environment conditions, and plant characteristics, and determining optimal parameters for a plurality of equipment of the growth environment based on the predicted future sensor data; and an equipment control subsystem communicably coupled to the plurality of equipment, the equipment control subsystem including an equipment control processor and an equipment control memory, the equipment control memory storing instructions for execution by the equipment control processor for: receiving the optimal parameters for the plurality of equipment, and physically altering one or more of the plurality of equipment based on the optimal parameters.
13. The computerized system of claim 12, further comprising a network including a network processor and a network memory storing instructions for execution by the network processor for interconnecting at least the plurality of sensors, the analytics and machine learning subsystem, and the equipment control subsystem.
14. The computerized system of claim 13, wherein the network includes a data protocol for the present sensor data corresponding to one or more of growth environment conditions and plant characteristics, the future sensor data corresponding to the growth environment based on one or more of the present sensor data, optimal growth environment conditions, and plant characteristics, and the optimal parameters for the plurality of equipment of the growth environment based on the predicted future sensor data.
15. The computerized system of claim 12, wherein the present sensor data and the predicted future sensor data include one or more of temperature, humidity, water, leaf wetness, soil electrical conductivity, soil water potential, light level, and carbon dioxide of the growth environment conditions and the plant characteristics.
16. The computerized system of claim 12, wherein the sensor memory includes further instructions for execution by the sensor processor for monitoring one or more conditions of the plurality of equipment.
17. The computerized system of claim 12, wherein the plurality of equipment includes one or more of a shade canopy, a dosing system, an irrigation system, a lighting system, a passive and/or active heating and cooling device, harvesting equipment, spraying equipment, a locking mechanism, a fogger, and a humidifier.
18. The computerized system of claim 12, wherein the sensor memory stores further instructions for execution by the sensor processor to gather interferometric information by the hyperspectral sensor, the interferometric information corresponding to the one or more growth environment conditions and plant characteristics, and to accurately locate the plurality of sensors within the growth environment.
19. The computerized system of claim 12, further comprising an automated control module including an automated control module memory and an automated control module processor, wherein the automated control module memory stores processor-executable instructions for: training a machine learning model using historic data, and predicting the future sensor data corresponding to the growth environment based on both the present sensor data and the machine learning model.
20. The computerized system of claim 12, wherein the instructions for collecting present sensor data corresponding to one or more of growth environment conditions and plant characteristics include a Fourier Transform.
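To make the processor workflow recited in claim 1 easier to follow — receive sensor datasets at a network node, aggregate data external to the cultivated site, execute ML algorithms to control the sensors, light sources, switches, or node, and generate a plant health management data matrix — the following minimal Python sketch traces those steps. Every name (SensorReading, aggregate_external_data, run_model, build_data_matrix) and the trivial threshold rule standing in for the ML algorithms are illustrative assumptions, not part of the claimed system.

```python
# Illustrative sketch only: a simplified stand-in for the claimed processor
# workflow (receive datasets -> aggregate external data -> run a model ->
# emit control actions and a plant health management data matrix).
# All class/function names and the trivial "model" are hypothetical.
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class SensorReading:
    sensor_id: str
    metric: str        # e.g. "soil_moisture", "air_temperature", "light_level"
    value: float

def _mean(values: List[float]):
    return sum(values) / len(values) if values else None

def aggregate_external_data(site_id: str) -> Dict[str, float]:
    # Placeholder for data external to the cultivated site
    # (e.g. geolocation, elevation, regional weather feeds).
    return {"elevation_m": 35.0, "forecast_temp_c": 22.5}

def run_model(readings: List[SensorReading], external: Dict[str, float]) -> Dict[str, str]:
    # Stand-in for the set of ML algorithms: here, a trivial rule deciding
    # whether the switches should energize the light sources.
    avg_light = _mean([r.value for r in readings if r.metric == "light_level"])
    low_light = avg_light is not None and avg_light < 400.0
    return {"switch_bank_1": "on" if low_light else "off"}

def build_data_matrix(readings, external, actions):
    # "Plant health management data matrix": one row per sensor reading,
    # annotated with external data and the resulting control decision.
    return [
        [r.sensor_id, r.metric, r.value, external["forecast_temp_c"], actions["switch_bank_1"]]
        for r in readings
    ]

if __name__ == "__main__":
    datasets = [SensorReading("s1", "light_level", 310.0),
                SensorReading("s2", "soil_moisture", 0.41)]
    external = aggregate_external_data(site_id="site-A")
    actions = run_model(datasets, external)
    print(actions)
    print(build_data_matrix(datasets, external, actions))
```

In a deployed system the rule-based stand-in would be replaced by whatever trained models the analytics subsystem supplies; the point of the sketch is only the data flow from received datasets, through aggregation, to control actions and the resulting matrix.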
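Claims 12 and 19 describe an analytics and machine learning subsystem that predicts future sensor data from present readings using a model trained on historic data, and an equipment control subsystem that alters equipment according to derived optimal parameters. The sketch below uses a plain least-squares trend as a stand-in for whatever model the system might employ; the function names, the target temperature, and the setpoint rule are assumptions for illustration only.

```python
# Illustrative sketch: predict a future sensor value from historic data and
# turn the prediction into an equipment setpoint. The linear-trend "model"
# and all names are hypothetical stand-ins for the claimed subsystems.
import numpy as np

def train_trend_model(historic_values: np.ndarray):
    # Fit value ~ a * t + b over the historic window (ordinary least squares).
    t = np.arange(len(historic_values))
    a, b = np.polyfit(t, historic_values, deg=1)
    return a, b

def predict_future(model, steps_ahead: int, history_len: int) -> float:
    a, b = model
    return float(a * (history_len - 1 + steps_ahead) + b)

def optimal_heater_setpoint(predicted_temp_c: float, target_c: float = 24.0) -> float:
    # Equipment-control stand-in: nudge the setpoint opposite to the
    # predicted deviation from the target growth temperature.
    return target_c + 0.5 * (target_c - predicted_temp_c)

if __name__ == "__main__":
    history = np.array([21.0, 21.4, 21.9, 22.3, 22.8])  # recent temperature readings
    model = train_trend_model(history)
    predicted = predict_future(model, steps_ahead=3, history_len=len(history))
    print("predicted temperature:", round(predicted, 2))
    print("heater setpoint:", round(optimal_heater_setpoint(predicted), 2))
```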
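Claim 18 recites gathering interferometric information with a hyperspectral sensor, and claim 20 recites a Fourier transform within the sensor-data collection instructions. One common way those pieces fit together — offered here purely as an illustrative assumption, not as the claimed implementation — is Fourier-transform spectroscopy, where an FFT of an interferogram recovers a spectrum. The synthetic signal and all parameters below are made up.

```python
# Illustrative assumption: recover a spectrum from a synthetic interferogram
# with an FFT, in the spirit of Fourier-transform spectroscopy.
import numpy as np

n_samples = 1024
path_difference = np.linspace(0.0, 1.0, n_samples)  # arbitrary units

# Synthetic interferogram: two spectral lines plus a little noise.
interferogram = (np.cos(2 * np.pi * 80 * path_difference)
                 + 0.5 * np.cos(2 * np.pi * 120 * path_difference)
                 + 0.05 * np.random.default_rng(0).standard_normal(n_samples))

# The FFT of the interferogram approximates the underlying spectrum.
spectrum = np.abs(np.fft.rfft(interferogram))
frequencies = np.fft.rfftfreq(n_samples, d=path_difference[1] - path_difference[0])

# The strongest components sit near the synthetic line positions (about 80 and 120).
top = frequencies[np.argsort(spectrum)[-2:]]
print("dominant spectral components:", sorted(round(f, 1) for f in top))
```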
PCT/IB2023/062424 2022-12-09 2023-12-08 Comprehensive agriculture technology system WO2024121815A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202263431360P 2022-12-09 2022-12-09
US63/431,360 2022-12-09
US202363521569P 2023-06-16 2023-06-16
US63/521,569 2023-06-16

Publications (1)

Publication Number Publication Date
WO2024121815A1 true WO2024121815A1 (en) 2024-06-13

Family

ID=89474298

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/062424 WO2024121815A1 (en) 2022-12-09 2023-12-08 Comprehensive agriculture technology system

Country Status (1)

Country Link
WO (1) WO2024121815A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019049048A1 (en) * 2017-09-08 2019-03-14 9337-4791 Quebec, Inc. System and method for controlling a growth environment of a crop
US20190259108A1 (en) * 2018-02-20 2019-08-22 Osram Gmbh Controlled Agricultural Systems and Methods of Managing Agricultural Systems
WO2019222860A1 (en) * 2018-05-25 2019-11-28 Greenearth Automation Inc. System, method and/or computer readable medium for growing plants in an autonomous green house
WO2019237200A1 (en) * 2018-06-12 2019-12-19 Paige Growth Technologies Inc. Precision agriculture system and related methods
WO2022164963A1 (en) * 2021-01-28 2022-08-04 Heliponix, Llc System for monitoring enclosed growing environment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23836584

Country of ref document: EP

Kind code of ref document: A1