WO2022091092A1 - System and method for indoor crop management - Google Patents
- Publication number
- WO2022091092A1 (PCT/IL2021/051273)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- fruit
- vegetable
- sensor
- airborne vehicle
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01D—HARVESTING; MOWING
- A01D46/00—Picking of fruits, vegetables, hops, or the like; Devices for shaking trees or shrubs
- A01D46/30—Robotic devices for individually picking crops
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D1/00—Dropping, ejecting, releasing or receiving articles, liquids, or the like, in flight
- B64D1/16—Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting
- B64D1/18—Dropping or releasing powdered, liquid, or gaseous matter, e.g. for fire-fighting by spraying, e.g. insecticides
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64D—EQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
- B64D1/00—Dropping, ejecting, releasing or receiving articles, liquids, or the like, in flight
- B64D1/22—Taking-up articles from earth's surface
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U20/00—Constructional aspects of UAVs
- B64U20/80—Arrangement of on-board electronics, e.g. avionics systems or wiring
- B64U20/87—Mounting of imaging devices, e.g. mounting of gimbals
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/40—UAVs specially adapted for particular uses or applications for agriculture or forestry operations
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/70—UAVs specially adapted for particular uses or applications for use inside enclosed spaces, e.g. in buildings or in vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2201/00—UAVs characterised by their flight controls
- B64U2201/10—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
- B64U2201/104—UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
Definitions
- the present invention relates to the field of automated agriculture.
- the present invention pertains particularly to means and methods for indoor crop management and more specifically to automated aerial vehicle for indoor harvesting and management of agricultural produce.
- Greenhouse growers are the most interested in integrating harvesting robotics into their operations, with 34% of growers reporting that they are considering it. A key reason is that vegetable growers maintain a steadier workforce than other field crops, with 40% of vegetable farms having permanent employees. It is emphasized that greenhouse conditions differ from those of orchards and other plantations, for example in physical conditions and plant types; therefore, unique adaptations of automated management systems are required in order to operate and harvest in a greenhouse. In addition to reduction of labor costs, quality of harvest is another reason that vegetable growers are interested in robotics.
- US Patent 10492374 describes a method for acquiring sensor data associated with a plant growing in a field, and analyzing the sensor data to extract one or more phenotypic traits associated with the plant to determine information on the state of the plant. It is mentioned that the sensor data may be acquired using a human-operated vehicle, an unmanned aerial vehicle (UAV), or an unmanned ground vehicle (UGV).
- US Patent 10555460 describes a system for harvesting produce from a tree, specifically a coconut tree, by a drone capable of hovering and having a video camera gathering visual data of movement.
- PCT publications WO2018033922, WO2018033923, WO2018033925 and WO2018033926 relate to an autonomous unmanned aircraft vehicle (UAV) for management, mapping and harvesting or diluting fruit trees in an orchard.
- suitable trees for the described technology are trees whose fruits are relatively large and visible, such as avocado, mango, and grapefruit. It is stated that such large fruits are connected to the branch through a thin and visible stipe.
- the UAVs further comprising a protruding, netted cage adapted for pushing branches and leaves.
- a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
- the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
- harvesting element is a fruit or vegetable harvesting element configured to detach the fruit or vegetable from the fruit or vegetable-bearing crop.
- the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
- the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
- the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
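The quality parameters enumerated above can be grouped into a single record for downstream processing. The following is a minimal, illustrative sketch only; the field names and the TSS/acidity helper are assumptions, not structures defined in the patent.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record grouping a subset of the listed quality parameters;
# field names are illustrative, not taken from the patent text.
@dataclass
class ProduceQuality:
    size_mm: Optional[float] = None
    color_hue: Optional[float] = None
    brix_pct: Optional[float] = None      # BRIX / Total Soluble Solids (%)
    firmness_sh: Optional[float] = None   # firmness (sh)
    dry_weight_pct: Optional[float] = None
    acidity: Optional[float] = None
    defects: list = field(default_factory=list)  # quality/condition defects

    def tss_acidity_ratio(self) -> Optional[float]:
        # TSS / Acidity is one of the listed quality indicators
        if self.brix_pct is not None and self.acidity:
            return self.brix_pct / self.acidity
        return None
```

A record like this could be populated per fruit from the data acquisition module and passed to the computing device for a harvest decision.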
- the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
- the at least one image acquisition element is a multispectral camera.
- the at least one image acquisition element is selected from the group consisting of a Hyperspectral camera, an Infra-Red (IR) camera, a RGB camera, visible light frequency range camera, near infrared (NIR) frequency range camera, monochrome, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range and any combination thereof.
- the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not image data.
- the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
- the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
- the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
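The controller functions listed above can be sketched as a simple command dispatcher. This is a hedged illustration only; the command vocabulary and the vehicle's actuation interface are assumptions, not an API disclosed in the patent.

```python
# Stand-in for the autonomous airborne vehicle's actuation interface
# (names are hypothetical, for illustration only).
class VehicleStub:
    def __init__(self):
        self.log = []
    def move_to(self, waypoint):
        self.log.append(("move", waypoint))
    def harvester_on(self):
        self.log.append("harvest_on")
    def harvester_off(self):
        self.log.append("harvest_off")

class Controller:
    """Dispatches instructions (e.g. from the computing device) to the vehicle."""
    def __init__(self, vehicle):
        self.vehicle = vehicle

    def execute(self, instruction: dict) -> str:
        cmd = instruction.get("command")
        if cmd == "move":
            self.vehicle.move_to(instruction["waypoint"])
            return "moving"
        if cmd == "harvest_on":
            self.vehicle.harvester_on()
            return "harvesting"
        if cmd == "harvest_off":
            self.vehicle.harvester_off()
            return "idle"
        raise ValueError(f"unknown command: {cmd}")
```

In a fleet setting, the same dispatcher could forward instructions to peer vehicles as well.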
- the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
- fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable.
- exemplary crops include cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber, and any other snack fruit or vegetable.
- It is a further object of the present invention to disclose a method for indoor management of growing agricultural produce comprising: (a) providing the system as defined in any of the above; (b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; (c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; (d) collecting data related to the growing agricultural produce by the data acquisition module; (e) transmitting the collected data to the computer; (f) processing the collected data and providing instructions to the controller based on the collected data; and, (g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
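Steps (c) through (g) of the method above can be sketched as a control loop. This is a minimal sketch under assumed component interfaces (vehicle, acquisition module, computer, controller); none of these names come from the patent itself.

```python
# Hypothetical control loop mirroring method steps (c)-(g); all component
# interfaces are assumptions, for illustration only.
def manage_crops(vehicle, acquisition, computer, controller, targets):
    for target in targets:
        vehicle.fly_to(target)                 # (c) direct the vehicle to the produce
        data = acquisition.collect(target)     # (d) collect produce-related data
        computer.receive(data)                 # (e) transmit collected data
        instructions = computer.process(data)  # (f) process data, derive instructions
        for instruction in instructions:       # (g) execute via the controller
            controller.execute(instruction)
```

Each iteration handles one crop location; a scouting vehicle could run only steps (c)-(e) while a harvesting vehicle runs the full loop.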
- the indoor management of growing agricultural produce comprises harvesting fruits or vegetables from the fruit or vegetable-bearing crops by the at least one autonomous airborne vehicle.
- said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.
- the agricultural produce comprises at least one of fruit-bearing crops, vegetable crops and flowering crops.
- the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
- harvesting element is a fruit or vegetable harvesting element configured to detach the fruit or vegetable from the fruit or vegetable-bearing crop.
- harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
- the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
- the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
- the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
- the at least one image acquisition element is selected from the group consisting of a Hyperspectral camera, an Infra-Red (IR) camera, a RGB camera, visible light frequency range camera, near infrared (NIR) frequency range camera, monochrome, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range and any combination thereof.
- the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not image data.
- the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
- the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
- the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
- the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
- an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a transducer; and, optionally, (e) a controller; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- said at least one image acquisition element comprises a multi-function camera, such as a hyperspectral, IR and/or RGB camera.
- a navigation unit such as a Global Positioning System (GPS).
- at least one sensor selected from the group consisting of optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
- It is a further object of the present invention to disclose a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
- a computerized system including at least one of a machine learning system, a deep learning system, an optical flow technique, a computer vision technique, a convolutional neural network (CNN), a recurrent neural network (RNN), or a machine learning dataflow library, wherein said system analyzes the acquired image data, comprising autonomously predicting one or more phenotypic traits of said fruit or vegetable-bearing agricultural produce.
- said machine learning process comprises computing by the at least one neural network, a tag of at least one desired category for the at least one fruit-bearing crop, wherein the tag of at least one classification category is computed at least according to weights of the at least one neural network, wherein the at least one neural network is trained according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one imaging sensor, wherein each respective training image of the plurality of training images is associated with said tag of at least one desired category of at least one fruit or vegetable-bearing crop depicted in the respective training image; and generating according to the tag of at least one classification category, instructions for execution by the controller.
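The tag computation described above (a category score derived from network weights applied to image-derived features) can be illustrated with a deliberately tiny linear "network". This is a toy sketch only; the categories, features and weights are invented for illustration, and a real system would use a trained CNN rather than hand-set weights.

```python
# Toy illustration of computing a classification tag "according to weights":
# each category's weights score a feature vector, and the tag of the
# highest-scoring category is returned. All values are invented.
def compute_tag(features, weights):
    # weights: {tag: [w_0 ... w_n]}, features: [x_0 ... x_n]
    scores = {
        tag: sum(w * x for w, x in zip(ws, features))
        for tag, ws in weights.items()
    }
    return max(scores, key=scores.get)

# Hypothetical weights for a two-feature vector: (redness, greenness),
# each normalized to [0, 1].
weights = {"ripe": [2.0, -1.0], "unripe": [-2.0, 1.0]}
tag = compute_tag([0.9, 0.1], weights)  # a predominantly red fruit
```

The resulting tag would then drive instruction generation for the controller, e.g. harvest "ripe" fruit and skip "unripe" ones.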
- the state of the fruit or vegetable comprises parameters including fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
- FIG. 1 is a schematic illustration of a scouting autonomous airborne vehicle as an embodiment of the system of the present invention.
- FIG. 2 is a schematic illustration of a harvesting autonomous airborne vehicle as an embodiment of the system of the present invention.
- FIG. 3 is a schematic illustration of an embodiment comprising a computerized managing unit that may be implemented in the system of the present invention, in accordance with certain aspects of the present invention.
- FIG. 4 is a schematic illustration of exemplified embodiments of the system for indoor management of growing agricultural produce of the present invention.
- FIG. 5 is a schematic illustration depicting an exemplified robotic arm according to some embodiments of the autonomous airborne vehicle of the present invention.
- the present invention provides a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce, said enclosed space preferably having outer walls and a ceiling; (b) a data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
- greenhouse conditions are different from those of orchards (e.g. physical conditions and plant types); therefore, unique adaptations of the drone system are required in order to operate and harvest in a greenhouse.
- the present invention provides autonomous or semi-autonomous airborne robots (drones), using computer-vision-based technologies, for monitoring and harvesting fruits and vegetables in greenhouses.
- one or more drones operate fully or partially autonomously in the greenhouse (e.g. plastic, incubator, glass or other material and/or mesh houses) and detect and harvest the fruits or vegetables, e.g. when ready for harvesting.
- the drone of the present invention is configured with:
- a camera that allows autonomous movement in the growth area and image capturing of the fruit or vegetable for computer vision-based decision making on whether the fruit is due for picking;
- the solution provided by the current invention enables efficient and rapid harvesting for different growth habits, in short or long season, passive (PGH) and/or acclimated (AGH) growth houses.
- the herein provided solution encompasses computer-vision based airborne robots (drones) for monitoring and harvesting fruits and vegetables in greenhouses.
- the provided technological solution enables harvesting the fruit or vegetable at the right time with the right cost.
- a further advantage of using the current invention is that by automatically and autonomously harvesting fruits and vegetables at their optimal time, improved fruit quality is obtained, in addition to consistency at the consumer level.
- the inventors of the present invention provide an autonomous airborne vehicle and a system comprising such an airborne vehicle designed and adjusted to an indoor agriculture environment (e.g. greenhouse, plastic or glass house and net house) and to harvesting different crops of various varieties, shapes, sizes and fruit types grown in such habitats (e.g. tomato, pepper, cucumber, strawberry, blueberry, raspberry).
- fruits and vegetables grown in indoor agricultural environment such as greenhouses include cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber and any snack fruit and/or vegetable including vegetables such as tomato, sweet pepper and cucumber and fruits such as apple, peach and pineapple.
- the airborne robot of the present invention is capable of picking/harvesting fruits/vegetables only when they are ready for picking, at their optimal maturation time and quality. This can be done by detecting means (sensors, e.g. image sensors) detecting the color and BRIX (herein also referred to as sugar content of a fruit or vegetable, or Total Soluble Solids, or TSS), and thus the fruit/vegetable quality/ripeness.
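The "ready for picking" decision from sensed color and BRIX can be sketched as a simple threshold test. The thresholds below are illustrative placeholders, not values disclosed in the patent; a real system would calibrate them per crop and variety.

```python
# Hedged sketch of a pick/no-pick decision from sensed color and BRIX (TSS).
# Thresholds are invented placeholders, e.g. for a red-ripening tomato.
RIPE_HUE_RANGE = (0.0, 30.0)   # example: red hues, in degrees on the HSV wheel
MIN_BRIX_PCT = 8.0             # example minimum sugar content (TSS, %)

def ready_for_picking(hue_deg: float, brix_pct: float) -> bool:
    """True when both the color and the sugar content indicate ripeness."""
    lo, hi = RIPE_HUE_RANGE
    return lo <= hue_deg <= hi and brix_pct >= MIN_BRIX_PCT
```

A scouting drone could apply such a test per detected fruit and queue only passing fruits for the harvesting drone.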
- system of the present invention comprises a fleet of more than one autonomous airborne vehicle or unmanned aerial vehicle (UAV) or drone, having the functions described herein.
- the terms "autonomous airborne vehicle", "unmanned aerial vehicle", "UAV", "uncrewed aerial vehicle", "drone", "airborne robot", "flying unmanned aircraft", "harvesting robot" and the like are used herein interchangeably and refer to an aircraft without a human pilot and a type of flying unmanned vehicle in any shape and size as needed and defined herein.
- the term "indoor" or "indoor management" generally refers to management of any process in which plants are grown inside a closed environment, or enclosed space, such as a greenhouse or grow room.
- controlled-environment agriculture aims to optimize plant growth and allows regulation of all aspects of the growing environment, including light, temperature, and humidity, to produce crops of a consistent quality, all year round.
- indoor or enclosed spaces within the scope of the present invention include greenhouse growing, vertical farms, and some rooftop farms.
- growing techniques and tools within controlled environments may incorporate artificial lighting systems and soilless farming techniques such as hydroponics, aquaponics, and aeroponics.
- a main aspect of the present invention is providing a system comprising at least one autonomous airborne vehicle for management and harvesting of fruits or vegetable bearing crops within a greenhouse.
- Greenhouse refers to a structure or enclosed space, preferably having outer walls and a ceiling, that houses growing plants.
- Greenhouses can vary significantly in terms of size, design, and structure. They range from simple frames covered with plastic to warehouse-sized, fully sealed buildings with walls of glass or PVC.
- Greenhouses provide a suitable environment for the intensive production of various crops. They are designed to protect crops from diseases and pests and to regulate solar radiation, temperature, humidity and carbon dioxide levels in the aerial environment.
- the function of greenhouses is to provide an insulated environment that can protect the agricultural produce and extend the growing season by sheltering plants as they grow.
- the term greenhouse encompasses a plastic or glass house, and a net or mesh house.
- a greenhouse may refer to a passive growth house (PGH) and/or to an acclimated growth house (AGH).
- Passive greenhouses (PGH) rely on passive methods of environmental control, an approach that does not involve the ongoing use of energy to regulate conditions and crop management such as heating, cooling, air circulation and lighting.
- Acclimated greenhouses (AGH) encompass methods of environmental control that involve the ongoing use of energy to regulate conditions and crop management such as heating, cooling, air circulation and lighting.
- acclimated growth houses or methods refer to active environmental control using an external energy source to power heating, cooling, venting, supplemental lighting, and climate control systems. This includes active control mechanisms that encompass complementary tools that allow growers to more precisely and predictably create desired conditions. Such methods may herein be defined to encompass automated and/or computer-directed or computer-based systems for heating, cooling, air circulation and supplemental lighting.
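The active (AGH) control loop described above can be illustrated by a minimal thermostat-style decision rule. This is a sketch under assumed setpoints; the temperature band below is invented for illustration and says nothing about the patent's actual control scheme.

```python
# Illustrative sketch of an active climate-control decision for an AGH:
# choose heating, cooling, or no action from the measured air temperature.
# Setpoints are hypothetical placeholders.
def climate_action(temp_c: float, low: float = 18.0, high: float = 26.0) -> str:
    if temp_c < low:
        return "heat"   # below the comfort band: power the heaters
    if temp_c > high:
        return "cool"   # above the band: power cooling/venting
    return "hold"       # inside the band: no energy input needed
```

Analogous rules could govern humidity, CO2 and supplemental lighting in a computer-directed system.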
- the term greenhouse refers to an enclosed structure for growing high value crops such as fruits and vegetables, e.g. pepper, tomato and cucumber and different berries (for example, strawberry, blueberry, raspberry, blackberry, red currant, white currant and blackcurrant).
- the term "container” as used herein refers to a collection bin or bin for collecting fruits and/or vegetables.
- the container may be any standard collection bin as currently used in the field or any other container that can be used to collect fruits and/or vegetables.
- a “container” also includes a portable container, for example with wheels that can be dispersed in several locations in a greenhouse.
- the container may be operably engaged with the one or more autonomous airborne vehicles of the system of the present invention.
- controller refers, without limitation, to any hardware device or a software program, or a combination of the two that manages or directs the flow of data between two or more entities.
- a controller can be thought of as something or someone that interfaces between two systems and manages communications between them.
- the at least one controller is operably engaged with at least one autonomous airborne vehicle.
- the system, e.g. using the data acquisition module engaged with the at least one autonomous airborne vehicle and the processor and controller parts, units or modules, is configured to manage, monitor and harvest agricultural produce within an enclosed growing house such as a greenhouse.
- the system of the present invention may comprise a control or managing center or station or computing managing unit, or server wherein said server or control center is configured to monitor and direct system performance.
- the term "sensor” as used herein generally refers to a device, module, machine, or system capable of detecting or measuring a property or changes in its environment and sending the information or data (e.g. optical or image or other data) to other electronics, frequently a computer processor.
- Non-limiting examples of sensor types within the scope of the present invention include electric current, electric potential, magnetic and/or radio sensors, weight/quantity sensors, optical, light, imaging and/or photon sensors, pressure sensors, thermal, heat and/or temperature sensors, position/location sensors, chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
- imaging sensor or "image sensor" or "image acquisition sensor" as used herein refers to a sensor that detects and conveys information used to make an image. Without wishing to be bound by theory, an imaging sensor converts the variable attenuation of light waves, passed through or reflected off objects, into signals that convey the information.
- the waves can be light or other electromagnetic radiation.
- Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, and others.
- Exemplary imaging sensors within the scope of the present invention include: RGB (red, green, blue) frequency spectrum, multispectral, hyperspectral, visible light frequency range, near infrared (NIR) frequency range, infrared (IR) frequency range, monochrome, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range, a reflectometer and combinations of the aforementioned.
- "image data" herein means photographic or trace objects that represent the underlying pixel data of an area of an image element, which is created, collected and stored using image-constructor devices.
- Image data attributes include for example, image resolution, data-point size and spectral bands.
- "computer" stands for, but is not limited to, a machine or device that performs processes, calculations and operations based on instructions provided by a software or hardware program.
- computer also means in the context of the present invention a control unit or controller. It is designed to process and execute applications and provides a variety of solutions by combining integrated hardware and software components.
- the computer of the invention is configured to extract a predetermined set of feature vectors from the image data of the agricultural produce; to compute characteristics of the agricultural produce, e.g. fruits and/or vegetables based on the set of feature vectors, attributes or parameters; to generate output and to transmit the output to the controller unit.
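The extract-features / compute-characteristics / generate-output pipeline described above can be sketched as follows. This is an illustrative, hedged toy example, not the disclosed implementation: the function names, the mean-RGB feature vector, the red-dominance ripeness proxy and the 0.5 threshold are all assumptions made for the sketch.

```python
# Hypothetical sketch of the computer's role: extract a feature vector from
# image data, compute produce characteristics, and generate controller output.
# All names and thresholds here are illustrative assumptions.

def extract_features(pixels):
    """Reduce raw (R, G, B) pixel data to a small feature vector (mean R, G, B)."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return (r, g, b)

def compute_characteristics(features):
    """Map the feature vector to produce characteristics, e.g. a ripeness score."""
    r, g, b = features
    ripeness = r / (r + g + b)          # red dominance as a crude ripeness proxy
    return {"ripeness": round(ripeness, 2)}

def generate_output(characteristics, threshold=0.5):
    """Build the message transmitted to the controller unit."""
    ready = characteristics["ripeness"] >= threshold
    return {"action": "harvest" if ready else "skip", **characteristics}

# A mostly-red image patch should be flagged for harvesting.
patch = [(200, 40, 30)] * 10 + [(180, 60, 40)] * 5
out = generate_output(compute_characteristics(extract_features(patch)))
```

In a real system each stage would of course operate on camera frames and calibrated quality metrics rather than a flat pixel list.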
- the present invention provides a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
- the agricultural produce comprises at least one of fruit-bearing crops, vegetable- bearing crops and flowering crops.
- the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
- the harvesting element is a fruit or vegetable harvesting element configured to detach the fruit or vegetable from the fruit or vegetable bearing crop.
- the at least one harvesting element is a robotic arm.
- the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
- the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
- system as defined in any of the above, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).
- the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.
- the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.
- the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.
- the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
- the at least one image acquisition element is selected from the group consisting of a hyperspectral camera, an infra-red (IR) camera, an RGB camera, a visible light frequency range camera, a near infrared (NIR) frequency range camera, a monochrome camera, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), a UV frequency range camera and any combination thereof.
- the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not image data.
- the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
- the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
- the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
- the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicles and/or one or more harvesting autonomous airborne vehicles.
- the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.
- the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable.
- the fruit or vegetable is selected from cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber and any snack fruit or vegetable.
- said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.
- said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with other one or more autonomous airborne vehicles within said enclosed space.
- the present invention provides a method for indoor management of growing agricultural produce comprising: (a) providing the system as defined in any of the above; (b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; (c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; (d) collecting data related to the growing agricultural produce by the data acquisition module; (e) transmitting the collected data to the computer; (f) processing the collected data and providing instructions to the controller based on the collected data; and, (g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
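Method steps (c)-(g) above amount to a sense-process-act loop. The following is a minimal control-flow sketch under stated assumptions: the plant records, the `is_ready` predicate and the instruction tuples are hypothetical stand-ins for the vehicle, data acquisition module, computer and controller.

```python
# Minimal sketch of method steps (c)-(g): direct the vehicle, collect data,
# transmit it for processing, and have the controller execute instructions.
# All data shapes and names here are illustrative assumptions.

def manage_crops(plants, is_ready):
    """Visit each plant, collect data, and build the controller's instruction list."""
    instructions = []
    for plant in plants:                                   # (c) direct vehicle to produce
        data = {"id": plant["id"], "ripeness": plant["ripeness"]}   # (d) collect data
        # (e) + (f): transmit to the computer, which turns data into instructions
        if is_ready(data):
            instructions.append(("harvest", data["id"]))
        else:
            instructions.append(("revisit", data["id"]))
    return instructions                                    # (g) executed by the controller

plants = [{"id": 1, "ripeness": 0.9}, {"id": 2, "ripeness": 0.3}]
actions = manage_crops(plants, lambda d: d["ripeness"] > 0.5)
```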
- the present invention provides an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a transducer; and, optionally, (e) a controller; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- It is also within the scope of the present invention to provide a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
- said assessing the state of the growing agricultural produce comprises using a computerized system including at least one of a machine learning system, a deep learning system, an optical flow technique, a computer vision technique, a convolutional neural network (CNN), a recurrent neural network (RNN), or a machine learning dataflow library, and wherein said system analyzes the acquired image data, comprising autonomously predicting one or more phenotypic traits of said fruit or vegetable-bearing agricultural produce.
- said training process comprises steps of (a) capturing images of the fruit or vegetable-bearing crop using an imaging sensor; (b) classifying images into desired categories by applying a tag associated with parameters or attributes indicative of the state of the fruit or vegetable-bearing crop extracted from the image data; and (c) applying a computer vision algorithm to determine a set of feature vectors associated with each desired category.
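The three training steps above can be illustrated with a small, dependency-free sketch: (a) captured images arrive as pixel arrays, (b) each is tagged with a category, and (c) a feature vector is associated with each category (here, the centroid of simple mean-color features). The function names, the mean-color features and the centroid choice are illustrative assumptions, not the patented algorithm.

```python
# Hedged sketch of training-dataset preparation: tag images, then compute a
# per-category feature vector. Names and feature choice are assumptions.

def color_features(pixels):
    """Mean (R, G, B) of a list of pixel tuples — a toy feature vector."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

def build_category_features(tagged_images):
    """tagged_images: list of (pixels, tag). Returns {tag: centroid feature vector}."""
    groups = {}
    for pixels, tag in tagged_images:                 # step (b): tag association
        groups.setdefault(tag, []).append(color_features(pixels))
    centroids = {}
    for tag, feats in groups.items():                 # step (c): per-category vector
        k = len(feats)
        centroids[tag] = tuple(sum(f[i] for f in feats) / k for i in range(3))
    return centroids

dataset = [
    ([(220, 30, 30)] * 4, "ripe"),      # step (a): captured image patches
    ([(200, 50, 40)] * 4, "ripe"),
    ([(60, 180, 50)] * 4, "unripe"),
]
centroids = build_category_features(dataset)
```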
- said machine learning process comprises computing by the at least one neural network, a tag of at least one desired category for the at least one fruit-bearing crop, wherein the tag of at least one classification category is computed at least according to weights of the at least one neural network, wherein the at least one neural network is trained according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one imaging sensor, wherein each respective training image of the plurality of training images is associated with said tag of at least one desired category of at least one fruit or vegetable-bearing crop depicted in the respective training image; and generating according to the tag of at least one classification category, instructions for execution by the controller.
- the state of the fruit or vegetable comprises parameters including fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.
- the present invention may be a system, a method, and/or a computer program product.
- the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
- the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
- the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
- a non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, and any suitable combination of the foregoing.
- a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
- Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
- These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
- These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
- the method of the present invention comprises steps of applying a machine learning process with the computer implemented trained algorithm to determine the status (e.g. ripeness, readiness for harvesting and other parameters) of fruits and vegetables grown in greenhouses.
- the algorithm is implemented with a machine learning process using a neural network with the processed data.
- training in the context of machine learning implemented within the system of the present invention refers to the process of creating a machine learning algorithm. Training involves the use of a deep-learning framework and training dataset. A source of training data can be used to train machine learning models for a variety of use cases, in the context of the present invention it is used with the system and method for indoor management of growing agricultural produce.
- the neural network may compute a classification category, and/or the embedding, and/or perform clustering, for detecting, identifying the status of and harvesting fruit(s) or vegetable(s) grown in a greenhouse using one or more autonomous airborne vehicles.
- the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks herein disclosed.
- the architecture of the neural network(s) may be implemented, for example, as convolutional, pooling, nonlinearity, locally-connected, fully-connected layers, and/or combinations of the aforementioned.
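The layer types named above (convolutional, pooling, nonlinearity) can be illustrated with a minimal, dependency-free sketch on a tiny single-channel "image". The kernel values, image contents and sizes are arbitrary assumptions made for illustration; a practical network would stack many learned layers.

```python
# Toy demonstration of a 2D convolution, a ReLU nonlinearity and 2x2 max
# pooling — the layer families listed above — in plain Python.

def conv2d(img, kernel):
    """Valid (no-padding) 2D cross-correlation of img with kernel."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img) - kh + 1, len(img[0]) - kw + 1
    return [[sum(img[i + a][j + b] * kernel[a][b]
                 for a in range(kh) for b in range(kw))
             for j in range(w)] for i in range(h)]

def relu(img):
    """Elementwise nonlinearity: clamp negatives to zero."""
    return [[max(0, v) for v in row] for row in img]

def maxpool2x2(img):
    """Downsample by taking the max of each non-overlapping 2x2 window."""
    return [[max(img[i][j], img[i][j + 1], img[i + 1][j], img[i + 1][j + 1])
             for j in range(0, len(img[0]) - 1, 2)]
            for i in range(0, len(img) - 1, 2)]

image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
edge_kernel = [[-1, 1]]            # responds to a dark-to-bright vertical edge
feature_map = maxpool2x2(relu(conv2d(image, edge_kernel)))
```

The pooled feature map is high exactly where the dark-to-bright edge sits, which is the essence of how stacked convolutional layers build up detectors for fruit edges, colors and shapes.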
- the tagging or classifying or detection of the fruits or vegetables in the images may be manually or semi-manually entered by a user (e.g., via the GUI, for example, selected from a list of available phenotypic characteristic targets or parameters), obtained as predefined values stored in a data storage device, and/or automatically computed.
- feature vector refers hereinafter in the context of machine learning to an individual measurable property or characteristic or parameter or attribute of a phenotype being observed e.g., detected by a sensor. It is herein apparent that choosing an informative, discriminating and independent feature is a crucial step for effective algorithms in pattern recognition, computer vision, machine learning, classification and regression. Algorithms using classification from a feature vector include nearest neighbor classification, neural networks, and statistical techniques.
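Of the classification algorithms named above, nearest-neighbor classification is the simplest to sketch. The two-component feature vectors (here a made-up red ratio and firmness score) and their labels are illustrative assumptions, not data from the disclosure.

```python
# Minimal 1-nearest-neighbor classification over feature vectors, one of the
# algorithm families named above. Training vectors and labels are made up.

def nearest_neighbor(sample, labelled):
    """Return the label of the training feature vector closest to `sample`."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labelled, key=lambda pair: sq_dist(sample, pair[0]))[1]

training = [
    ((0.9, 0.2), "ripe"),      # (red ratio, firmness) — hypothetical features
    ((0.8, 0.3), "ripe"),
    ((0.2, 0.9), "unripe"),
]
label = nearest_neighbor((0.85, 0.25), training)
```

As the passage notes, the discriminative power of such a classifier rests entirely on choosing informative, independent features.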
- a feature is an information which is relevant for solving the computational task related to a certain application.
- Features may be specific structures in the image such as points, edges or objects.
- Features may also be the result of a general neighborhood operation or feature detection applied to the image.
- when features are defined in terms of local neighborhood operations applied to an image, a procedure commonly referred to as feature extraction is executed.
- an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a transducer; and, optionally, (e) a controller.
- the aforementioned autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- an autonomous airborne vehicle comprises (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor.
- the autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
- the present invention provides a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
- Fig. 1 depicts a schematic illustration of a scouting autonomous airborne vehicle 100 as an embodiment of the system of the present invention.
- the scouting autonomous airborne vehicle 100 is configured to maneuver in the greenhouse, perform greenhouse mapping, and identify and mark fruit and/or vegetables ready to be picked by the at least one harvesting drone in the fleet.
- such a scouting autonomous airborne vehicle 100 comprises a main body 10 of the drone, which is adapted in size and shape to indoor conditions, specifically, to greenhouse conditions.
- a drone may be equipped with (physically engaged with the main body 10) at least one image acquisition element, e.g. camera 20.
- Non-limiting examples of a camera or an image acquisition element within the scope of the present invention include camera types of 2D, 3D, multifunctional, hyperspectral, IR, RGB, navigation camera, fruit/vegetable camera and/or stereoscopic camera.
- the drone 100 may be further equipped with at least one module selected from: at least one navigation module, i.e. a GPS.
- the scouting device is configured to (1) mapping the enclosed space, e.g. greenhouse, (2) detecting and identifying the fruit/vegetable status, for example ripeness, BRIX content, readiness for picking, and (3) communication with one or more drones in the fleet to mark and/or communicate data regarding specific fruits/vegetables, such as readiness for picking, to one or more harvesting drones.
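The scouting device's three roles — (1) mapping, (2) status detection, and (3) marking/communicating pick-ready fruit to harvesters — can be sketched as a single pass over observations. The coordinates, fruit identifiers, BRIX threshold and message shape are all hypothetical assumptions for illustration.

```python
# Illustrative sketch of the scouting drone's workflow: map fruit positions,
# assess readiness (here by a toy BRIX threshold), and emit messages for
# harvesting drones. All names and values are assumptions.

def scout(observations, brix_threshold=7.0):
    """observations: list of (position, fruit_id, brix).
    Returns the crop map and the messages sent to harvesting drones."""
    crop_map = {}
    messages = []
    for position, fruit_id, brix in observations:
        crop_map[fruit_id] = position                      # (1) mapping
        ready = brix >= brix_threshold                     # (2) status detection
        if ready:                                          # (3) mark for harvesters
            messages.append({"fruit": fruit_id, "at": position, "pick": True})
    return crop_map, messages

obs = [((2, 5), "t1", 8.2), ((2, 7), "t2", 5.1)]
crop_map, messages = scout(obs)
```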
- the scouting autonomous airborne vehicle is capable of operating in day and night conditions.
- the harvesting drone comprises the drone's main body 11, which is adapted in size, shape and structure/function relations to operating indoors, for example in greenhouses.
- the autonomous airborne vehicle 200 is integrated with at least one navigation module, i.e. GPS 30, at least one antenna module 40 for receiving or transmitting electromagnetic/ radio waves, at least one Bluetooth device 50 for wirelessly communicating data and at least one communication device 60 for communication with other one or more drones (e.g. through radio or cellular communication).
- a harvesting or operating drone is further equipped with at least one harvesting element, e.g. the robotic arm 75, to detach the fruit or vegetable from the fruit or vegetable bearing crop.
- the robotic arm 75 may be in a configuration of a fixed long arm that can protrude through leaves and branches.
- the arm may be equipped with an on-drone camera that enables closed-loop feedback in addition to its harvesting functionality.
- such a robotic arm 75 enables the detection of ripe fruits, followed by access thereto with the arm, and in addition assists in controlling the movement of the drone.
- the harvesting drone 200 and/or robotic arm 75 comprises a harvesting element which may be a suction, a cutter, a grasper, a rotary blade and any combination thereof.
- the harvesting element can be a robotic arm, which extends from the drone body 11; its length needs to be long enough to protrude through the branches and access the fruit or vegetable without injuring the crop plant, e.g. branches, and without unbalancing the drone.
- the robotic arm 75 may be a flexible or adjustable arm with at least 2 degrees of freedom, which enables fruit or vegetable access without moving the drone.
- the robotic arm is a rigid or fixed arm.
- the robotic harvesting arm 75 can be installed on top, bottom, or side of the drone, or any combination thereof.
- the harvesting arm comprises a fruit or vegetable grabbing/gripping/detaching/pulling mechanism, e.g. forceps, clamps, a guillotine mechanism, robotic fingers, or a vacuum pump that pulls the fruit/vegetable away from the plant.
- the harvesting arm further comprises a fruit's cutting unit (e.g. secateurs, saw, scissors, shears, and laser) for assisting in the removal of the fruit or vegetable off the crop plant.
- the harvesting autonomous airborne vehicle 200 further comprises a collection container 80. After disconnecting or removing the fruit or vegetable from the fruit or vegetable-bearing plant, the drone can either take the fruit/vegetable to a collection point or to a collection unit or element or box or container integrated with the drone (as shown in Fig.
- the harvesting drone delivers the harvested fruit(s) or vegetable(s) to a collection container 80 mounted with the drone.
- the UAV performs fruit or vegetable quality analysis and delivers or carries the harvested fruit or vegetable to the appropriate container, according to the fruit's/vegetable's quality; this enables quality assessment and sorting of the fruits/vegetables during their picking process in real time, according to predetermined criteria or parameters.
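Real-time sorting during picking, as described above, reduces to assessing each harvested item against predetermined criteria and routing it to a matching container. In the sketch below the BRIX thresholds, the defect flag and the container names are hypothetical assumptions.

```python
# Sketch of quality-based sorting at picking time: route each fruit to a
# container chosen from its measured quality. Thresholds are assumptions.

def assign_container(brix, defect):
    """Route a fruit to a container based on measured quality."""
    if defect:
        return "reject"
    return "premium" if brix >= 8.0 else "standard"

harvest = [
    {"id": 1, "brix": 9.1, "defect": False},
    {"id": 2, "brix": 6.5, "defect": False},
    {"id": 3, "brix": 8.4, "defect": True},
]
containers = {}
for fruit in harvest:
    bin_name = assign_container(fruit["brix"], fruit["defect"])
    containers.setdefault(bin_name, []).append(fruit["id"])
```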
- the harvesting autonomous airborne vehicle 200 is further mounted with at least one sensor, for example, in this embodiment, one or more weight/quantity sensors 90 and/or one or more plant & surrounding distance or mapping sensors 95.
- sensors within the scope of the current invention include, in a non-limiting manner, an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor, a refractometer, a pH meter, an acidometer, a penetrometer, a durometer, a temperature meter, a weight sensor, a quantity sensor, a pressure sensor, a plant and/or surrounding detection sensor, a plant and/or surrounding distance sensor, a device or sensor detecting odors and/or flavors, or any combination thereof.
- the harvesting drone 200 is configured to pick the fruit or vegetable according to the data detected by the scouting drone 100 and processed into an output comprising fruits or vegetables marked as ready for picking or harvesting.
- the one or more fruits or vegetables harvested by the harvesting drone 200 are delivered to the collection container 80.
- the harvesting autonomous airborne vehicle 200 is adapted to working in rows of planted agricultural produce, in defined areas in an enclosed space or structure (e.g. greenhouse).
- the harvesting autonomous airborne vehicle 200 is equipped with means 60 for communicating with the scouting drone or any other drone in the fleet.
- the harvesting autonomous airborne vehicle 200 is equipped with harvesting means or elements 75 such as a robotic arm for picking fruit(s) or vegetable(s) and optionally with at least one fruit or vegetable collection container 80.
- the harvesting autonomous airborne vehicle 200 has the ability of 3D identification in space (e.g. 3D camera or sensor) and communication between other members of the fleet of drones.
- Fig. 3 schematically illustrating an embodiment that may be implemented in the system of the present invention, in accordance with certain aspects of the present invention.
- This embodiment schematically depicts a computerized managing or central system or central control and management unit 300.
- a computing device comprising a server on-premises (server on-prem) manager main body 12 is mounted with at least one drone battery charging platform 25, a GPS 30, at least one antenna module 40 for receiving or transmitting electromagnetic/ radio waves, at least one Bluetooth device 50 and at least one drones communication device 60.
- the central managing unit 300 is configured to control and monitor the conditions in the greenhouse and/or performance of the one or more autonomous airborne vehicles in the greenhouse. Such a unit 300 may be positioned on site or remotely. According to further aspects, the central control and management unit 300 may comprise at least one On-premises managing software and/or at least one off-premises or remote or on-cloud managing software.
- central unit 300 is configured to have the ability to manage and interface a central managing unit with the fleet of automated drones.
- the managing unit 300 may comprise a user interface for the grower, manager or operator in the greenhouse, means for communication with an on-site or cloud-based software, and means for managing statistical data on the condition of the agricultural produce in the greenhouse at any given moment and cross-referencing them with data on greenhouse parameters such as temperature, irrigation, fertilization and others.
- the computerized managing unit 300 may enable feedback mode of action in the greenhouse.
- the unit 300 encompasses a controller module or function.
- a harvester drone comprising main body 11 and means 60 for communicating with other one or more drones, is in communication with a scouting drone within the fleet.
- the scouting drone comprises main body 10 and means 60 for communicating with other one or more drones.
- the harvesting drone is in further communication with the main server on-site managing unit 12, which is in communication with user interface 65.
- a main server on-cloud manager 13 is in communication with the harvesting drone (comprising main body 11 and means 60 for communicating with other one or more drones), the scouting drone (comprising main body 10 and means 60 for communicating with other one or more drones), with the main server on-site managing unit 12 and with the user interface 65.
- the robotic arm 500 comprises elements including the robotic arm main body 13, harvesting element 85, for example at least one suction, cutter and/or grasper, and weight/quantity sensor(s) 90 for evaluating harvested fruit/vegetable weight or quantity, or any other preselected sensor.
- a robotic arm engaged with the autonomous airborne vehicle of the present invention is used for picking fruits/vegetable ready for picking within the greenhouse.
- the robotic arm is configured for harvesting by various methods, such as engulfing a fruit while cutting/tearing it (e.g. the stipe) from the plant; by a loop or two loops that hold the fruit/vegetable and pull it; or by a vacuum pump.
- Particular harvesting methods are exemplified below: fruit/vegetable gripping, with pulling and rotating movements in parallel - the fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container.
- the fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container.
- the autonomous airborne vehicle is equipped with a vacuum pipe, picking the fruit/vegetable with the assistance of a vacuum action.
- the fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container.
- a vacuum pump may be used to pull a fruit/vegetable from a plant.
- the harvesting arm comprises a fruit/vegetable grabbing/gripping mechanism (e.g. forceps, clamps, or robotic fingers).
- the fruit/vegetable pulling is done with a vacuum pump that pulls the fruit/vegetable away from the fruit- or vegetable-bearing plant.
- the drone harvesting arm has a palm/gripping mechanism that can hold a fruit/vegetable.
- the palm may comprise a few fingers, a flexible cap, or a vacuum mechanism.
- the robotic arm activity and configuration may include:
- a robotic arm with several degrees of freedom that allow gentle movement within the plant branches, without harming or damaging the plant or fruits/vegetables of other plants.
- the robotic arm is built from a lightweight material that allows the autonomous airborne vehicle or drone to carry it at a minimum energy cost.
- the robotic arm is configured to transfer the fruit/vegetable using gravity forces to a collection container located within the greenhouse.
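The pick cycle described by the bullets above — grip, detach by pulling and rotating, then release into a container by gravity — can be sketched as an ordered actuation sequence. The `Arm` class and its method names are hypothetical stand-ins, not the actual arm 500 interface.

```python
class Arm:
    """Minimal stand-in for the robotic arm 500; records each actuation step."""
    def __init__(self):
        self.log = []

    def move_to(self, position):
        self.log.append("move")

    def grip(self):
        self.log.append("grip")

    def pull_and_rotate(self):
        # Parallel pulling and rotating detaches the fruit at the stipe.
        self.log.append("pull_rotate")

    def release(self):
        # Gravity drops the fruit into the collection container.
        self.log.append("release")

def harvest(arm, fruit_position, container_position):
    """One pick cycle following the gripping method described above (a sketch)."""
    arm.move_to(fruit_position)
    arm.grip()
    arm.pull_and_rotate()
    arm.move_to(container_position)
    arm.release()
    return arm.log

steps = harvest(Arm(), (1.0, 2.0, 1.5), (0.0, 0.0, 0.3))
```

The fixed step order matters: detachment must complete before the arm moves toward the container, so that the several degrees of freedom can be used for a gentle exit path through the branches.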
- the autonomous airborne vehicle or drone may have various configurations, including:
- the autonomous airborne vehicle is mounted with a multispectral camera equipped with AI/machine learning capabilities or systems, which allows three-dimensional movement in a complicated environment and detection of ripe fruit/vegetables for harvesting.
- the autonomous airborne vehicle of the present invention is configured to automatically manage agricultural produce collected in a container (based on weight or quantity parameters) and provide instructions on transferring the collection to a central container.
- the autonomous airborne vehicle is configured to manage the battery status and switch it to charging mode on a suitable surface automatically.
- the autonomous airborne vehicle is capable of managing communication with a central control and monitoring system in order to automatically report data, work status and faults in the system.
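The self-management behaviors above — unloading the container when full and switching to charging mode when the battery runs low — amount to a simple mode-selection policy. The following sketch illustrates one such policy; the threshold values are illustrative assumptions, not values from the disclosure.

```python
def next_mode(battery_pct, container_weight_kg,
              low_battery=20.0, container_full_kg=1.5):
    """Decide the vehicle's next mode from battery and payload state (a sketch).

    Battery safety takes priority: a low battery always sends the vehicle
    to a charging surface, even with a full container.
    """
    if battery_pct < low_battery:
        return "charge"      # land on a suitable surface and switch to charging mode
    if container_weight_kg >= container_full_kg:
        return "unload"      # transfer the collection to the central container
    return "harvest"         # otherwise continue normal operation
```

A real controller would evaluate this policy continuously and report each mode change to the central control and monitoring system.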
- the system of the present invention and/or the autonomous airborne vehicle of the present invention comprises computer vision-based detection capabilities.
- the computer vision-based detection system is configured to identify the position of the autonomous airborne vehicle within the greenhouse space, e.g. longitudinal axis (specific row), width axis (specific plant in a row) and height axis (across a single plant, e.g. specific fruit/vegetable within a plant).
- the computer vision-based detection system is configured to monitor specific marker geographic locations in a work area using X, Y, Z coordinate positions.
- the computer vision-based detection system enables the autonomous airborne vehicle to function in day and night conditions by using an infrared camera adjusted for dark conditions.
- the computer vision-based detection system is configured to photograph the plant at the level of a group of plants (more than one plant), single plant, a cluster within one plant, or at the level of a single fruit/vegetable.
- the system of the present invention is capable of identifying, by computer vision-based technology, a fruit/vegetable as an object in space.
- the computer vision-based detection system of the present invention, combined with AI/machine learning algorithms, is configured to infer, extrapolate or predict the condition or status of a fruit/vegetable from a single image or video, and to cross-reference or compare it with other sample images or data examined and classified by a professional breeder or cultivator.
- the computer vision-based autonomous airborne vehicle detection system of the present invention is configured to predict yield (e.g. fruits or vegetables) expected from a certain agricultural growing area throughout all stages of harvesting of the agricultural produce.
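The ripeness inference described above relies on trained AI/machine learning models; as a much cruder stand-in, a color-coverage heuristic over fruit pixels can illustrate the decision being made. The function names, the RGB thresholds and the 80% coverage cutoff below are all illustrative assumptions, not the disclosed classifier.

```python
def ripeness_score(pixels):
    """Fraction of fruit pixels whose red channel dominates - a crude
    color-coverage proxy for the ML classifier described above.
    `pixels` is a list of (R, G, B) tuples for one segmented fruit."""
    ripe = sum(1 for r, g, b in pixels
               if r > 150 and r > 1.4 * g and r > 1.4 * b)
    return ripe / len(pixels)

def is_ready(pixels, threshold=0.8):
    """Flag a fruit for harvesting once red coverage passes the cutoff."""
    return ripeness_score(pixels) >= threshold

# A mostly-red fruit is flagged; a green one is not.
mostly_red = [(200, 50, 40)] * 9 + [(60, 160, 60)]
still_green = [(60, 160, 60)] * 10
```

Aggregating such per-fruit scores across rows and plants is what would feed the yield prediction mentioned above; a production system would instead use multispectral data and a trained model, as the disclosure describes.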
Abstract
The present invention discloses a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller. Methods for indoor management of growing agricultural produce are further disclosed.
Description
SYSTEM AND METHOD FOR INDOOR CROP MANAGEMENT
FIELD OF THE INVENTION
[0001] The present invention relates to the field of automated agriculture. The present invention pertains particularly to means and methods for indoor crop management and more specifically to automated aerial vehicle for indoor harvesting and management of agricultural produce.
BACKGROUND OF THE INVENTION
[0002] The use of greenhouses to grow vegetables is expanding to meet the increasing demand for supply and high-quality produce year-round. The vegetable greenhouse production sector is highly labor intensive. Therefore, labor shortage is a major challenge for farmers.
[0003] Labor costs for fruit harvesting in greenhouses comprise 30% to 60% of total greenhouse production costs. With rising labor costs and difficulties in finding and retaining qualified workers, growers are seeking technological solutions to reduce costs and improve production efficiency.
[0004] Greenhouse growers are among the most interested in integrating harvesting robotics into their operations, with 34% of growers reporting that they are considering it. A key reason is that vegetable growers maintain a steadier workforce than other field crops, with 40% of vegetable farms having permanent employees. It is emphasized that greenhouse conditions differ from those of orchards and other plantations, for example in physical conditions and plant types; therefore, unique adaptations of automated management systems are required in order to operate and harvest in greenhouses. In addition to reduction of labor costs, quality of harvest is another reason that vegetable growers are interested in robotics.
[0005] The importance of automation, even partial automation, is becoming highly significant. The world population is expected to reach 9 billion by 2050, and agricultural production is expected to roughly double to meet future demand. This need has led farmers to turn to precision agriculture, automation and robotics as solutions for the future.
[0006] Field and row crops such as potatoes, wheat and corn are often gathered mechanically. But many delicate leafy greens and soft fruits and vegetables have remained largely resistant to robotic picking. Any scalable solution has to be cost-efficient in order to be implemented. The technology requires complex motion planning, computer vision and object manipulation in the field, which will solve the problem of picking high-value, delicate fruits such as tomato, pepper, cucumber and potentially more.
[0007] One of the solutions attempting to automate agriculture management is the use of drones or unmanned aerial vehicles (UAVs). Several examples of agricultural drones used in fields and plantations such as orchards are described in the art.
[0008] US Patent 10492374 describes a method for acquiring sensor data associated with a plant growing in a field, and analyzing the sensor data to extract one or more phenotypic traits associated with the plant to determine information on the state of the plant. It is mentioned that the sensor data may be acquired using a human-operated vehicle, an unmanned aerial vehicle (UAV), or an unmanned ground vehicle (UGV).
[0009] US Patent 10555460 describes a system for harvesting produce from a tree, specifically a coconut tree, by a drone capable of hovering and having a video camera gathering visual data of movement.
[0010] PCT publications WO2018033922, WO2018033923, WO2018033925 and WO2018033926 relate to an autonomous unmanned aircraft vehicle (UAV) for management, mapping and harvesting or diluting fruit trees in an orchard. These publications teach that suitable trees for the described technology are trees whose fruits are relatively large and visible, such as avocado, mango, and grapefruit. It is stated that such large fruits are connected to the branch through a thin and visible stipe. The UAVs further comprise a protruding, netted cage adapted for pushing branches and leaves.
[0011] In view of the above, there is still a long felt and unmet need for efficient and cost-effective automated systems to perform management and harvesting tasks in greenhouses to improve crop yield and productivity.
SUMMARY OF THE INVENTION
[0012] Accordingly, it is a principal object of the present invention to overcome disadvantages of prior art. This is accomplished in one embodiment by a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an
enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
[0013] It is a further object of the present invention to disclose the system as defined above, wherein the enclosed space has outer walls and a ceiling.
[0014] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.
[0015] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the agricultural produce comprises at least one of fruit-bearing crops, vegetable-bearing crops and flowering crops.
[0016] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
[0017] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the harvesting element is a fruit or vegetable harvesting element configured to detach fruit or vegetable from the fruit or vegetable bearing crop.
[0018] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the at least one harvesting element is a robotic arm.
[0019] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
[0020] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
[0021] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).
[0022] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.
[0023] It is a further object of the present invention to disclose the system as defined in any of the above, wherein data related to crop performance is selected from fruit or vegetable quality, flowering stage, crop size, color of leaves, crop diseases or a combination thereof.
[0024] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
[0025] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.
[0026] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
[0027] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the at least one image acquisition element is a multispectral camera.
[0028] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the at least one image acquisition element is selected from the group consisting of a Hyperspectral camera, an Infra-Red (IR) camera, a RGB camera, visible light frequency range camera, near infrared (NIR) frequency range camera, monochrome, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range and any combination thereof.
[0029] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not an image data.
[0030] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[0031] It is a further object of the present invention to disclose the system as defined in any of the above, further comprising a transducer wherein said transducer is configured to transmit the data collected by the data acquisition module to the computing device.
[0032] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
[0033] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the controller is configured to fulfill at least one
of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
[0034] It is a further object of the present invention to disclose the system as defined in any of the above, comprising more than one autonomous airborne vehicle.
[0035] It is a further object of the present invention to disclose the system as defined in any of the above, comprising a fleet of autonomous airborne vehicles.
[0036] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
[0037] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.
[0038] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable, for example cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, and baby cucumber.
[0039] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the operator is a human operator.
[0040] It is a further object of the present invention to disclose the system as defined in any of the above, wherein the operator is an automated operator.
[0041] It is a further object of the present invention to disclose the system as defined in any of the above, further comprising at least one container operably engaged with the one or more autonomous airborne vehicles.
[0042] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.
[0043] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with one or more other autonomous airborne vehicles within said enclosed space.
[0044] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said system further comprises at least one on-premises managing software and/or at least one off-premises or remote or on-cloud managing software.
[0045] It is a further object of the present invention to disclose the system as defined in any of the above, wherein said system further comprises a user interface such as a graphical user interface (GUI).
[0046] It is a further object of the present invention to disclose a method for indoor management of growing agricultural produce comprising: (a) providing the system as defined in any of the above; (b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; (c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; (d) collecting data related to the growing agricultural produce by the data acquisition module; (e) transmitting the collected data to the computer; (f) processing the collected data and providing instructions to the controller based on the collected data; and, (g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
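The method steps (d) through (g) above can be sketched as a data pipeline: collected readings are processed into instructions, which the controller then executes. The `process`/`execute` names, the per-plant ripeness scores and the 0.8 cutoff are illustrative assumptions, not part of the claimed method.

```python
def process(readings, ripe_at=0.8):
    """Step (f), computing device: turn collected readings into instructions.
    `readings` maps a plant identifier to a ripeness score in [0, 1]."""
    return [("harvest", plant_id)
            for plant_id, score in readings.items()
            if score >= ripe_at]

def execute(instructions):
    """Step (g), controller: a stand-in that logs each executed instruction."""
    return [f"harvesting plant {plant_id}" for action, plant_id in instructions]

# Steps (d)-(e) would deliver readings like these from the data acquisition module.
log = execute(process({"P1": 0.9, "P2": 0.4, "P3": 0.85}))
```

In the claimed system, step (e) transmits the readings to the computing device and step (g) is carried out by the controller on board the autonomous airborne vehicle; the sketch only shows the decision flow between them.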
[0047] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the growing agricultural produce comprises fruit or vegetable-bearing crops.
[0048] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the indoor management of growing agricultural produce comprises harvesting fruits or vegetables from the fruit or vegetable-bearing crops by the at least one autonomous airborne vehicle.
[0049] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.
[0050] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the agricultural produce comprises at least one of fruit-bearing crops, vegetable crops and flowering crops.
[0051] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
[0052] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the harvesting element is a fruit or vegetable harvesting element configured to detach fruit or vegetable from the fruit or vegetable bearing crop.
[0053] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
[0054] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the at least one harvesting element is a robotic arm.
[0055] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
[0056] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).
[0057] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.
[0058] It is a further object of the present invention to disclose the method as defined in any of the above, wherein data related to crop performance is selected from fruit or vegetable quality, flowering stage, crop size, color of leaves, crop diseases or a combination thereof.
[0059] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
[0060] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.
[0061] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
[0062] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the at least one image acquisition element is a multispectral camera.
[0063] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the at least one image acquisition element is selected from the group consisting of a Hyperspectral camera, an Infra-Red (IR) camera, a RGB camera, visible light frequency range camera, near infrared (NIR) frequency range camera, monochrome, specific light wavelengths (e.g. LED and/or
laser and/or halogen and/or xenon and/or fluorescent), UV frequency range and any combination thereof.
[0064] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not an image data.
[0065] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[0066] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system further comprising a transducer wherein said transducer is configured to transmit the data collected by the data acquisition module to the computing device.
[0067] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
[0068] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
[0069] It is a further object of the present invention to disclose the method as defined in any of the above, comprising more than one autonomous airborne vehicle.
[0070] It is a further object of the present invention to disclose the method as defined in any of the above, comprising a fleet of autonomous airborne vehicles.
[0071] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
[0072] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.
[0073] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable.
[0074] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the operator is a human operator.
[0075] It is a further object of the present invention to disclose the method as defined in any of the above, wherein the operator is an automated operator.
[0076] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system further comprising at least one container operably engaged with the one or more autonomous airborne vehicles.
[0077] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.
[0078] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with one or more other autonomous airborne vehicles within said enclosed space.
[0079] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system further comprises at least one on-premises managing software and/or at least one off-premises or remote or on-cloud managing software.
[0080] It is a further object of the present invention to disclose the method as defined in any of the above, wherein said system further comprises a user interface such as a graphical user interface (GUI).
[0081] It is a further object of the present invention to disclose an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a transducer; and, optionally, (e) a controller; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[0082] It is a further object of the present invention to disclose an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[0083] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, wherein said data acquisition module comprises at least one image acquisition element.
[0084] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, wherein said at least one image acquisition element comprises a multi-function camera such as a hyperspectral, IR and/or RGB camera.
[0085] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, further comprising a navigation unit such as a Global Positioning System (GPS).
[0086] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, further comprising a device for communication with one or more other autonomous airborne vehicles.
[0087] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, further comprising a fruit and/or vegetable collection container.
[0088] It is a further object of the present invention to disclose the autonomous airborne vehicle as defined in any of the above, further comprising at least one sensor selected from the group consisting of optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[0089] It is a further object of the present invention to disclose a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
[0090] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said assessing the state of the growing agricultural produce comprises using a computerized system including at least one of a machine learning system, a deep learning system, an optical flow technique, a computer vision technique, a convolutional neural network (CNN), a recurrent neural network (RNN), or a machine learning dataflow library, and wherein said system analyzes the acquired image data, comprising autonomously predicting one or more phenotypic traits of said fruit or vegetable-bearing agricultural produce.
[0091] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said assessing the state of the growing agricultural produce comprises using a trained neural network.
[0092] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said step of processing comprises steps of computing said image data using a computer implemented algorithm trained to generate output based on the image data.
[0093] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said computer implemented algorithm is trained to generate output based on predetermined feature vectors or attributes extracted from the image data.
[0094] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said method comprises steps of implementing with said algorithm a training process according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one image acquisition element, wherein each respective training image of the plurality of training images is associated with the state of said fruit or vegetable-bearing crop depicted in the respective training image.
[0095] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said training process comprises steps of (a) capturing images of the fruit or vegetable-bearing crop using an imaging sensor; (b) classifying images into desired categories by applying a tag associated with parameters or attributes indicative of the state of the fruit or vegetable-bearing crop extracted from the image data; and (c) applying a computer vision algorithm to determine a set of feature vectors associated with each desired category.
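Training steps (a)-(c) above can be sketched as follows. This is a minimal illustration only: the category names, the toy feature extractor, and all function names are hypothetical and not taken from the invention, which does not specify a particular feature set.

```python
# Illustrative sketch of the three training steps: capture, tag,
# derive per-category feature vectors. Names are hypothetical.
import numpy as np

RIPENESS_CLASSES = ["unripe", "ripe", "overripe"]  # example tag categories

def extract_features(image: np.ndarray) -> np.ndarray:
    """Toy computer-vision feature vector: mean colour per channel
    plus overall brightness (a stand-in for real descriptors)."""
    mean_rgb = image.reshape(-1, 3).mean(axis=0)
    brightness = image.mean()
    return np.append(mean_rgb, brightness)

def build_training_set(images, tags):
    """Steps (a)-(c): pair each captured image with its tag and
    group the resulting feature vectors by category."""
    features = np.stack([extract_features(img) for img in images])
    by_category = {
        c: features[[i for i, t in enumerate(tags) if t == c]]
        for c in RIPENESS_CLASSES
    }
    return features, by_category

# Example usage with synthetic "captured images"
rng = np.random.default_rng(0)
imgs = [rng.random((8, 8, 3)) for _ in range(6)]
tags = ["ripe", "unripe", "ripe", "overripe", "unripe", "ripe"]
X, cats = build_training_set(imgs, tags)
```

In a real system the feature extraction would typically be learned (e.g. by a CNN) rather than hand-crafted, but the dataset structure — tagged images grouped by desired category — is the same.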
[0096] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, comprising steps of applying a machine learning process with the computer implemented trained algorithm to determine the state of the fruit or vegetable-bearing crop.
[0097] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein said machine learning process comprises computing by the at least one neural network, a tag of at least one desired category for the at least one fruit-bearing crop, wherein the tag of at least one classification category is computed at least according to weights of the at least one neural network, wherein the at least one neural network is trained according to a training
dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one imaging sensor, wherein each respective training image of the plurality of training images is associated with said tag of at least one desired category of at least one fruit or vegetable-bearing crop depicted in the respective training image; and generating according to the tag of at least one classification category, instructions for execution by the controller.
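The inference side of this process — computing a tag from network weights and mapping it to a controller instruction — can be sketched as below. The single-layer "network", class names, and instruction strings are all assumptions for illustration; the invention does not fix a particular architecture or instruction vocabulary.

```python
# Hedged sketch: compute a tag according to network weights, then
# generate a controller instruction from that tag. All names are
# hypothetical.
import numpy as np

CLASSES = ["not_ready", "ready_to_harvest"]
INSTRUCTIONS = {"not_ready": "continue_scouting",
                "ready_to_harvest": "dispatch_harvester"}

def classify(feature_vec: np.ndarray, weights: np.ndarray,
             bias: np.ndarray) -> str:
    """Compute the tag according to the network's weights (here a
    single linear layer followed by argmax)."""
    logits = weights @ feature_vec + bias
    return CLASSES[int(np.argmax(logits))]

def instruction_for(tag: str) -> str:
    """Generate, according to the tag, an instruction for the controller."""
    return INSTRUCTIONS[tag]

# Example: a feature vector for which the second logit dominates
w = np.array([[0.1, 0.0], [1.0, 1.0]])
b = np.zeros(2)
tag = classify(np.array([1.0, 1.0]), w, b)
cmd = instruction_for(tag)
```

A trained CNN would replace the linear layer, but the contract is identical: weights map image-derived features to a classification tag, and the tag determines the controller's instruction.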
[0098] It is a further object of the present invention to disclose the computer implemented method as defined in any of the above, wherein the state of the fruit or vegetable comprises parameters including fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
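One natural way to carry the state parameters listed above through the system is as a record type. The field subset, units, and example values below are illustrative only; the invention names many more parameters than are shown here.

```python
# Illustrative record for a few of the listed state parameters.
# Field names and values are assumptions, not specified by the source.
from dataclasses import dataclass

@dataclass
class FruitState:
    size_mm: float
    color_hue: float
    brix_pct: float        # BRIX (%), i.e. total soluble solids
    firmness_sh: float     # Firmness (sh)
    acidity: float

    def tss_acid_ratio(self) -> float:
        """TSS / Acidity, one of the listed ripeness indicators."""
        return self.brix_pct / self.acidity

s = FruitState(size_mm=24.0, color_hue=10.0, brix_pct=9.0,
               firmness_sh=55.0, acidity=0.45)
ratio = s.tss_acid_ratio()
```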
[0099] It is a further object of the present invention to disclose a computer implemented algorithm comprising code to perform the steps of the method as defined in any of the above.
BRIEF DESCRIPTION OF THE DRAWINGS
[00100] For a better understanding of the invention and to show how the same may be carried into effect, reference will now be made, purely by way of example, to the accompanying drawings in which like numerals designate corresponding elements or sections throughout.
[00101] With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of the preferred embodiments of the present invention only, and are presented in the cause of providing what is believed to be the most useful and readily understood description of the principles and conceptual aspects of the invention. In this regard, no attempt is made to show structural details of the invention in more detail than is necessary for a fundamental understanding of the invention, the description taken with the drawings making apparent to those skilled in the art how the several forms of the invention may be embodied in practice. In the accompanying drawings:
[00102] Fig. 1 is a schematic illustration of a scouting autonomous airborne vehicle as an embodiment of the system of the present invention;
[00103] Fig. 2 is a schematic illustration of a harvesting autonomous airborne vehicle as an embodiment of the system of the present invention;
[00104] Fig. 3 is a schematic illustration of an embodiment comprising a computerized managing unit that may be implemented in the system of the present invention, in accordance with certain aspects of the present invention;
[00105] Fig. 4 is a schematic illustration of exemplified embodiments of the system for indoor management of growing agricultural produce of the present invention; and
[00106] Fig. 5 is a schematic illustration depicting an exemplified robotic arm according to some embodiments of the autonomous airborne vehicle of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[00107] Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of the components set forth in the following description or illustrated in the drawings. The invention is capable of other embodiments or of being practiced or carried out in various ways. Also, it is to be understood that the phraseology and terminology employed herein is for the purpose of description and should not be regarded as limiting.
[00108] The present invention provides a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce, said enclosed space preferably having outer walls and a ceiling; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (d) an operator in communication with the at least one controller.
[00109] It is noted that greenhouse conditions differ from those of orchards (e.g. in physical conditions and plant types); therefore, unique adaptations of the drone system are required in order to operate and harvest in a greenhouse.
[00110] In one embodiment, the present invention provides autonomous or semi-autonomous airborne robots (drones), using computer-vision-based technologies, for monitoring and harvesting fruits and vegetables in greenhouses.
[00111] According to aspects of the present invention, one or more drones operate fully or partially autonomously in the greenhouse (e.g. a plastic, incubator, glass or other material and/or mesh house) and detect and harvest the fruits or vegetables, e.g. when ready for harvesting.
[00112] In some embodiments, the drone of the present invention is configured with:
[00113] - a camera that allows autonomous movement in the growth area and image capturing of the fruit or vegetable for a computer-vision-based decision on whether the fruit is due for picking; and
[00114] - a mechanical arm that enables picking the fruit or vegetable and transporting it to a storage container.
[00115] The solution provided by the current invention enables efficient and rapid harvesting for different growth habits, in short or long seasons, and in passive (PGH) and/or acclimated (AGH) growth houses.
[00116] The herein provided solution encompasses computer-vision-based airborne robots (drones) for monitoring and harvesting fruits and vegetables in greenhouses. By using the present invention, the following is achieved:
[00117] According to one aspect of the present invention, the provided technological solution enables harvesting the fruit or vegetable at the right time and at the right cost.
[00118] According to a further aspect of the present invention, labor cost for harvesting fruits and vegetables in greenhouses is reduced and marketable production is improved. By harvesting the fruits and/or vegetables at the right time, improved grower profitability is achieved.
[00119] According to some further aspects of the present invention, waste is significantly reduced due to optimal harvesting time by the system and method of the present invention, which is further capable of picking the best fruit without harming it.
[00120] A further advantage of using the current invention is that by automatically and autonomously harvesting fruits and vegetables at their optimal time, improved fruit quality is obtained, in addition to consistency at the consumer level.
[00121] The inventors of the present invention provide an autonomous airborne vehicle and a system comprising such an airborne vehicle designed and adjusted to an indoor agriculture environment (e.g. greenhouse, plastic or glass house and net house) and to harvesting different crops of various varieties, shapes, sizes and fruit types grown in such habitats (e.g. tomato, pepper, cucumber, strawberry, blueberry, raspberry). Specific examples of fruits and vegetables grown in indoor agricultural environments such as greenhouses include cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber and any snack fruit and/or vegetable, including vegetables such as tomato, sweet pepper and cucumber and fruits such as apple, peach and pineapple.
[00122] By using computer vision systems combined with machine learning algorithms communicably engaged with the airborne robot, the airborne robot of the present invention is capable of picking/harvesting fruits/vegetables only when they are ready for picking, at their optimal maturation time and quality. This can be done by detecting means (sensors, e.g. image sensors) detecting the color and BRIX (herein also referred to as the sugar content of a fruit or vegetable, Total Soluble Solids or TSS) and thus the fruit/vegetable quality/ripeness.
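The "pick only when ready" decision described above can be reduced to a simple rule over sensor readings. The thresholds and parameter names below are assumptions chosen for illustration; actual values would depend on the crop and would typically come from the trained model rather than fixed constants.

```python
# Minimal sketch of a ripeness gate from colour coverage and BRIX.
# Threshold values are illustrative, not taken from the invention.
def is_ready_to_pick(red_fraction: float, brix_pct: float,
                     min_red: float = 0.8, min_brix: float = 8.0) -> bool:
    """Fruit is deemed ready when colour coverage and sugar content
    (BRIX / TSS) both exceed their thresholds."""
    return red_fraction >= min_red and brix_pct >= min_brix

ready = is_ready_to_pick(red_fraction=0.92, brix_pct=9.1)
not_yet = is_ready_to_pick(red_fraction=0.55, brix_pct=9.1)
```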
[00123] It is within the scope that the system of the present invention comprises a fleet of more than one autonomous airborne vehicle or unmanned aerial vehicle (UAV) or drone, having the following functions:
[00124] A. One or more explorer or scout-type drones;
[00125] B. One or more operating, e.g. harvesting-type, drones;
[00126] C. Management or control unit or station or module.
[00127] Each of the above functions is performed by a member or part of the fleet, designated and configured to execute the specific task, capability or role.
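The division of roles A-C above can be sketched as a small coordination loop: scouts report candidate fruit to the management module, which tasks available harvesters. Class names, method names, and the first-come assignment policy are all hypothetical.

```python
# Sketch of fleet roles A-C: scouts report, the management station
# (role C) assigns harvester drones (role B). Names are hypothetical.
class ManagementStation:
    """Role C: receives scout reports and tasks harvester drones."""
    def __init__(self, harvesters):
        self.harvesters = list(harvesters)  # idle role-B drones
        self.tasks = []                     # (drone, location) assignments

    def report(self, location, ready: bool):
        """Called by a scout drone (role A) for each inspected fruit."""
        if ready and self.harvesters:
            drone = self.harvesters.pop(0)
            self.tasks.append((drone, location))

station = ManagementStation(harvesters=["harvester-1", "harvester-2"])
station.report(location=(3, 7), ready=True)   # assigned
station.report(location=(4, 1), ready=False)  # ignored: not ready
```

A deployed fleet would add wireless messaging, task queues for busy harvesters, and collision-aware routing, but the report-then-assign structure is the same.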
[00128] As used herein, the terms "comprises", "comprising", "includes", "including", "having" and their conjugates mean "including but not limited to".
[00129] As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a compound" or "at least one compound" may include a plurality of compounds, including mixtures thereof.
[00130] The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Similarly, the use of “based at least in part on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based at least in part on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
[00131] The term "about" as used herein denotes ± 25% of the defined amount or measure or value.
[00132] The terms "autonomous airborne vehicle", "unmanned aerial vehicle", "UAV", "uncrewed aerial vehicle", "drone", "airborne robot", "flying unmanned aircraft", "harvesting robot" and the like are used herein interchangeably and refer to an aircraft without a human pilot and a type of flying unmanned vehicle in any shape and size as needed and defined herein.
[00133] As used herein, the term "indoor" or "indoor management" generally refers to management of any process in which plants are grown inside a closed environment, or enclosed space such as a greenhouse or grow room. In such a closed environment a controlled-environment agriculture is performed that aims to optimize plant growth and allows regulation of all aspects of the growing environment, including light, temperature, and humidity, to produce crops of a consistent quality, all year round. Indoor or enclosed spaces within the scope of the present invention include greenhouse growing, vertical farms, and some rooftop farms. In further aspects of the present invention, growing techniques and tools within controlled environments may incorporate artificial lighting systems and soilless farming techniques such as hydroponics, aquaponics, and aeroponics.
[00134] A main aspect of the present invention is providing a system comprising at least one autonomous airborne vehicle for management and harvesting of fruits or vegetable bearing crops within a greenhouse.
[00135] The term "greenhouse" as used herein refers to an outdoor structure or enclosed space, preferably having outer walls and a celling that houses growing plants. Greenhouses can vary significantly in terms of size, design, and structure. They range from simple frames covered with plastic to warehouse-sized, fully sealed buildings with walls of glass or PVC. Greenhouses provide a suitable environment for the intensive production of various crops. They are designed to protect from diseases, pests, solar radiation, temperature, humidity and carbon dioxide levels in the aerial environment. Thus, the function of greenhouses is to provide an insulated environment that can protect the agricultural produce and extend the growing season by sheltering plants as they grow. In some embodiments of the present invention, the term greenhouse encompass a plastic or glass house, and a net or mesh house.
[00136] It is further within the scope of the present invention that a greenhouse may refer to a passive growth house (PGH) and/or to an acclimated growth house (AGH). Passive greenhouses (PGH) rely on passive methods of environmental control, an approach that does not involve the ongoing use of energy to regulate conditions and crop management functions such as heating, cooling, air circulation and lighting. Acclimated greenhouses (AGH) encompass methods of environmental control that involve the ongoing use of energy to regulate conditions and crop management functions such as heating, cooling, air circulation and lighting.
[00137] According to further embodiments of the present invention, acclimated growth houses or methods refer to active environmental control using an external energy source to power heating, cooling, venting, supplemental lighting, and climate control systems. This includes active control mechanisms that encompass complementary tools that allow growers to more precisely and predictably create desired conditions. Such methods may herein be defined to encompass automated and/or computer-directed or computer-based systems for heating, cooling, air circulation and supplemental lighting.
[00138] In further aspects of the invention, the term greenhouse refers to an enclosed structure for growing high-value crops such as fruits and vegetables, e.g. pepper, tomato and cucumber and different berries (for example, strawberry, blueberry, raspberry, blackberries, red currants, white currants and blackcurrants).
[00139] The term "container" as used herein refers to a collection bin or bin for collecting fruits and/or vegetables. The container may be any standard collection bin as currently used in the field or any other container that can be used to collect fruits and/or vegetables. Notably, a “container” also includes a portable container, for example with wheels that can be dispersed in several locations in a greenhouse. Alternatively, the container may be operably engaged with the one or more autonomous airborne vehicles of the system of the present invention.
[00140] The term "controller" refers, without limitation, to any hardware device or a software program, or a combination of the two that manages or directs the flow of data between two or more entities. In a general sense, a controller can be thought of as something or someone that interfaces between two systems and manages communications between them. According one embodiment of the present invention, the at least one controller is operably engaged with at least one autonomous airborne vehicle.
[00141] It is within the scope of the present invention that the system, e.g. using the data acquisition module engaged with the at least one autonomous airborne vehicle, processor and controller parts, units or modules, is configured to manage, monitor and harvest agricultural produce within an enclosed growing house such as a greenhouse.
[00142] The system of the present invention may comprise a control or managing center or station or computing managing unit, or server, wherein said server or control center is configured to monitor and direct system performance.
[00143] The term "sensor" as used herein generally refers to a device, module, machine, or system capable of detecting or measuring a property or changes in its environment and sending the information or data (e.g. optical or image or other data) to other electronics, frequently a computer processor. Non limiting examples of sensor types within the scope of the present invention include, but are not limited to electric current, electric potential, magnetic and/or radio sensors, weight/ quantity sensors,
optical, light, imaging and/or photon sensors, pressure sensors, thermal, heat and/or temperature sensors, position/location sensors, chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[00144] The term "imaging sensor" or "image sensor" or "image acquisition sensor" as used herein refers to a sensor that detects and conveys information used to make an image. Without wishing to be bound by theory, an imaging sensor conveys the variable attenuation of light waves, passed through or reflect off objects, into signals, that convey the information. The waves can be light or other electromagnetic radiation. Image sensors are used in electronic imaging devices of both analog and digital types, which include digital cameras, camera modules, camera phones, and others.
[00145] Exemplary imaging sensors within the scope of the present invention include: RGB (red, green, blue) frequency spectrum, multispectral, hyperspectral, visible light frequency range, near infrared (NIR) frequency range, infrared (IR) frequency range, monochrome, specific light wavelengths (e.g. LED or laser and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range, a reflectometer and combinations of the aforementioned.
[00146] The term "image data" herein means a photographic or trace objects that represent the underlying pixel data of an area of an image element, which is created, collected and stored using image constructor devices. Image data attributes include for example, image resolution, data-point size and spectral bands. In the context of the present invention the term "computer" stands for but no limited to a machine or device that performs processes, calculations and operations based on instructions provided by a software or hardware program. The term computer also means in the context of the present invention a control unit or controller. It is designed to process and execute applications and provides a variety of solutions by combining integrated hardware and software components. The computer of the invention is configured to extract a predetermined set of feature vectors from the image data of the agricultural produce; to compute characteristics of the agricultural produce, e.g. fruits and/or vegetables based on the set of feature vectors, attributes or parameters; to generate output and to transmit the output to the controller unit.
[00147] In one embodiment, the present invention provides a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
[00148] It is further within the scope of the present invention to provide the system as defined above, wherein the enclosed space has outer walls and a ceiling.
[00149] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.
[00150] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the agricultural produce comprises at least one of fruit-bearing crops, vegetable-bearing crops and flowering crops.
[00151] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.
[00152] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the harvesting element is a fruit or vegetable harvesting element configured to detach the fruit or vegetable from the fruit or vegetable-bearing crop.
[00153] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the at least one harvesting element is a robotic arm.
[00154] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the harvesting element is selected from the group consisting of a suction element, a cutter, a grasper and any combination thereof.
[00155] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.
[00156] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).
[00157] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.
[00158] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein data related to crop performance is selected from fruit or vegetable quality, flowering stage, crop size, color of leaves, crop diseases or a combination thereof.
[00159] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetable per crop, yield, health, state of the plant, performance, or any combination thereof.
[00160] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.
[00161] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the data acquisition module comprises
at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.
[00162] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the at least one image acquisition element is a multispectral camera.
[00163] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the at least one image acquisition element is selected from the group consisting of a Hyperspectral camera, an Infra-Red (IR) camera, a RGB camera, visible light frequency range camera, near infrared (NIR) frequency range camera, monochrome, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), UV frequency range and any combination thereof.
[00164] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not an image data.
[00165] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[00166] It is further within the scope of the present invention to provide the system as defined in any of the above, further comprising a transducer wherein said transducer is configured to transmit the data collected by the data acquisition module to the computing device.
[00167] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the computing device comprises a memory and a processor and is configured to process the data collected by the data
acquisition module and to transmit instructions to the controller based on the processed data.
[00168] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
[00169] It is further within the scope of the present invention to provide the system as defined in any of the above, comprising more than one autonomous airborne vehicle.
[00170] It is further within the scope of the present invention to provide the system as defined in any of the above, comprising a fleet of autonomous airborne vehicles.
[00171] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
[00172] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.
[00173] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable, e.g. cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber and any other snack fruit or vegetable.
[00174] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the operator is a human operator.
[00175] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein the operator is an automated operator.
[00176] It is further within the scope of the present invention to provide the system as defined in any of the above, further comprising at least one container operably engaged with the one or more autonomous airborne vehicles.
[00177] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.
[00178] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with other one or more autonomous airborne vehicles within said enclosed space.
[00179] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said system further comprises at least one on-premises managing software and/or at least one off-premises, remote or on-cloud managing software.
[00180] It is further within the scope of the present invention to provide the system as defined in any of the above, wherein said system further comprises a user interface such as a graphical user interface (GUI).
[00181] According to a further aspect, the present invention provides a method for indoor management of growing agricultural produce comprising: (a) providing the system as defined in any of the above; (b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; (c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; (d) collecting data related to the growing agricultural produce by the data acquisition module; (e) transmitting the collected data to the computer; (f) processing the collected data and providing instructions to the controller based on the collected data; and, (g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
[00182] According to a further embodiment, the present invention provides an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a
transducer; and, optionally, (e) a controller; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[00183] It is further within the scope to provide an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[00184] It is also within the scope of the present invention to provide a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
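As an informal sketch of steps (b) through (e), the fragment below maps extracted parameters to a crop state and then to a controller instruction. The parameter names (`ripeness`, `brix`) and the thresholds are illustrative assumptions, not values specified in the disclosure:

```python
from dataclasses import dataclass


@dataclass
class CropObservation:
    """One crop observation extracted from image data (step (c))."""
    crop_id: str
    ripeness: float  # assumed scale: 0.0 (unripe) .. 1.0 (fully ripe)
    brix: float      # assumed sugar content in degrees Brix


def assess_state(obs: CropObservation,
                 ripeness_threshold: float = 0.8,
                 brix_threshold: float = 6.0) -> str:
    """Assess the crop state from the extracted parameters."""
    if obs.ripeness >= ripeness_threshold and obs.brix >= brix_threshold:
        return "ready_for_harvest"
    return "keep_growing"


def controller_instruction(state: str, crop_id: str) -> dict:
    """Step (d): translate the assessed state into a controller command."""
    action = "harvest" if state == "ready_for_harvest" else "monitor"
    return {"crop_id": crop_id, "action": action}


obs = CropObservation("row3-plant17", ripeness=0.9, brix=7.2)
print(controller_instruction(assess_state(obs), obs.crop_id))
```

The thresholds would in practice come from the trained models described later in this section rather than fixed constants.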
[00185] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said assessing the state of the growing agricultural produce comprises using a computerized system including at least one of a machine learning system, a deep learning system, an optical flow technique, a computer vision technique, a convolutional neural network (CNN), a recurrent neural network (RNN), or a machine learning dataflow library, and wherein said system analyzes the acquired image data, comprising autonomously predicting one or more phenotypic traits of said fruit or vegetable-bearing agricultural produce.
[00186] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said assessing the state of the growing agricultural produce comprises using a trained neural network.
[00187] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said step of processing comprises steps of computing said image data using a computer implemented algorithm trained to generate output based on the image data.
[00188] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said computer implemented algorithm is trained to generate output based on predetermined feature vectors or attributes extracted from the image data.
[00189] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said method comprises steps of implementing with said algorithm a training process according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one image acquisition element, wherein each respective training image of the plurality of training images is associated with the state of said fruit or vegetable-bearing crop depicted in the respective training image.
[00190] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said training process comprises steps of (a) capturing images of the fruit or vegetable-bearing crop using an imaging sensor; (b) classifying images into desired categories by applying a tag associated with parameters or attributes indicative of the state of the fruit or vegetable-bearing crop extracted from the image data; and (c) applying a computer vision algorithm to determine a set of feature vectors associated with each desired category.
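The three training steps above can be sketched as follows, under the simplifying assumption that each captured image has already been reduced to a small numeric feature vector (e.g. mean color channels). The vectors, tags, and the per-category centroid used as the representative feature are illustrative choices, not the disclosed implementation:

```python
def centroid(vectors):
    """Step (c): compute one representative feature vector per category."""
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n
                 for i in range(len(vectors[0])))


# Steps (a)+(b): images captured by the imaging sensor, each tagged with a
# category indicative of the crop state (synthetic example data).
tagged_images = [
    ((0.9, 0.2, 0.1), "ripe"),    # reddish fruit
    ((0.8, 0.3, 0.1), "ripe"),
    ((0.2, 0.8, 0.1), "unripe"),  # greenish fruit
    ((0.3, 0.7, 0.2), "unripe"),
]

# Group the tagged feature vectors by category.
by_category = {}
for vec, tag in tagged_images:
    by_category.setdefault(tag, []).append(vec)

# One feature vector per desired category.
category_features = {tag: centroid(vecs) for tag, vecs in by_category.items()}
print(category_features)
```

A real pipeline would extract far richer features (or learn them end-to-end with a CNN); the centroid merely stands in for "a set of feature vectors associated with each desired category".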
[00191] It is further within the scope to disclose the computer implemented method as defined in any of the above, comprising steps of applying a machine learning process with the computer implemented trained algorithm to determine the state of the fruit or vegetable-bearing crop.
[00192] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein said machine learning process comprises computing by the at least one neural network, a tag of at least one desired category for the at least one fruit-bearing crop, wherein the tag of at least one classification category is computed at least according to weights of the at least one neural network, wherein the at least one neural network is trained according to a training
dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one imaging sensor, wherein each respective training image of the plurality of training images is associated with said tag of at least one desired category of at least one fruit or vegetable-bearing crop depicted in the respective training image; and generating according to the tag of at least one classification category, instructions for execution by the controller.
[00193] It is further within the scope to disclose the computer implemented method as defined in any of the above, wherein the state of the fruit or vegetable comprises parameters including fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.
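The state parameters enumerated above can be carried as a simple structured record, for example when transmitted by the transducer to the computing device. A minimal sketch follows; the field names, units and values are illustrative assumptions covering only a subset of the listed parameters:

```python
from dataclasses import dataclass, asdict


@dataclass
class FruitState:
    """A small, illustrative subset of the state parameters in [00193]."""
    size_mm: float      # fruit or vegetable size
    color: str          # fruit or vegetable color
    brix_pct: float     # BRIX (%)
    firmness_sh: float  # Firmness (sh)
    ripeness: str       # fruit or vegetable ripeness


state = FruitState(size_mm=24.0, color="red", brix_pct=7.5,
                   firmness_sh=62.0, ripeness="ripe")
# `asdict` yields a plain dict, convenient for serialization and logging.
print(asdict(state))
```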
[00194] It is according to a further embodiment to provide a computer implemented algorithm comprising code to perform the steps of the method as defined in any of the above.
[00195] The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
[00196] The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital
versatile disk (DVD), a memory stick, a floppy disk, and any suitable combination of the foregoing.
[00197] A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
[00198] Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
[00199] Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
[00200] These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
[00201] According to certain aspects, the method of the present invention comprises steps of applying a machine learning process with the computer implemented
trained algorithm to determine the status (e.g. ripeness, readiness for harvesting and other parameters) of fruits and vegetables grown in greenhouses. Thus it is within the scope of the present invention that the algorithm (or computer readable program) is implemented with a machine learning process using a neural network with the processed data.
[00202] The term training in the context of machine learning implemented within the system of the present invention refers to the process of creating a machine learning algorithm. Training involves the use of a deep-learning framework and a training dataset. A source of training data can be used to train machine learning models for a variety of use cases; in the context of the present invention it is used with the system and method for indoor management of growing agricultural produce.
[00203] The neural network may compute a classification category, and/or the embedding, and/or perform clustering, for detecting and identifying the status of, and harvesting, fruit(s) or vegetable(s) grown in a greenhouse using one or more autonomous airborne vehicles.
[00204] The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks herein disclosed.
[00205] The architecture of the neural network(s) may be implemented, for example, as convolutional, pooling, nonlinearity, locally-connected, fully-connected layers, and/or combinations of the aforementioned.
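As a toy illustration of the layer types named above, the following pure-Python sketch chains a small convolution, a ReLU nonlinearity and 2×2 max-pooling over a synthetic single-channel image. A production system would use a deep learning framework; the image values and the 2×2 difference kernel are arbitrary examples:

```python
def conv2d(image, kernel):
    """Valid (no-padding) 2D convolution of a nested-list image."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out


def relu(fmap):
    """Elementwise nonlinearity: clamp negatives to zero."""
    return [[max(0.0, v) for v in row] for row in fmap]


def max_pool2x2(fmap):
    """Non-overlapping 2x2 max pooling."""
    return [[max(fmap[i][j], fmap[i][j + 1],
                 fmap[i + 1][j], fmap[i + 1][j + 1])
             for j in range(0, len(fmap[0]) - 1, 2)]
            for i in range(0, len(fmap) - 1, 2)]


image = [[1, 2, 0, 1, 3],
         [0, 1, 2, 3, 1],
         [1, 0, 1, 2, 0],
         [2, 1, 0, 1, 1],
         [0, 2, 1, 0, 2]]
edge_kernel = [[1, 0], [0, -1]]  # tiny diagonal-difference kernel
features = max_pool2x2(relu(conv2d(image, edge_kernel)))
print(features)  # a 2x2 feature map
```

Fully connected and locally connected layers would follow the pooled feature maps in a complete classifier.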
[00206] It is noted that the tagging or classifying or detection of the fruits or vegetables in the images may be manually or semi-manually entered by a user (e.g., via the GUI, for example, selected from a list of available phenotypic characteristic targets or parameters), obtained as predefined values stored in a data storage device, and/or automatically computed.
[00207] The term "feature vector" refers hereinafter in the context of machine learning to an individual measurable property or characteristic or parameter or attribute
of a phenotype being observed e.g., detected by a sensor. It is herein apparent that choosing an informative, discriminating and independent feature is a crucial step for effective algorithms in pattern recognition, computer vision, machine learning, classification and regression. Algorithms using classification from a feature vector include nearest neighbor classification, neural networks, and statistical techniques.
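As a hedged sketch of the nearest neighbor classification named above, a query feature vector can be labeled by its closest labeled reference vector. The two-dimensional vectors and labels below are synthetic examples, not data from the disclosure:

```python
import math


def nearest_neighbor(query, labeled_vectors):
    """Return the label of the reference vector closest to `query`
    in Euclidean distance (1-nearest-neighbor classification)."""
    return min(labeled_vectors,
               key=lambda item: math.dist(query, item[0]))[1]


# Hypothetical training set: (feature_vector, label) pairs.
training = [((0.9, 0.1), "ripe"), ((0.1, 0.9), "unripe")]
print(nearest_neighbor((0.8, 0.2), training))
```

Here the features are informative and discriminating by construction; in practice the feature-selection step discussed above determines whether such a simple classifier works at all.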
[00208] In computer vision and image processing, a feature is a piece of information relevant for solving the computational task related to a certain application. Features may be specific structures in the image such as points, edges or objects. Features may also be the result of a general neighborhood operation or feature detection applied to the image. When features are defined in terms of local neighborhood operations applied to an image, a procedure commonly referred to as feature extraction is executed.
[00209] Although embodiments of the disclosure are not limited in this regard, discussions utilizing terms such as, for example, "processing", "computing", "communicating", "training", "capturing", "executing", "calculating", "feeding", "determining", "establishing", "analyzing", "checking", "tagging", "classifying", "transmitting", "exerting" or the like, may refer to operation(s) and/or process(es) of a computer, a computing platform, a computing system, a computer implemented algorithm or other electronic computing device, that manipulates and/or transforms data represented as physical (e.g., electronic) quantities within the computer's registers and/or memories into other data similarly represented as physical quantities within the computer's registers and/or memories or other information non-transitory storage medium (e.g., a memory) that may store instructions to perform operations and/or processes.
[00210] According to one embodiment, the present invention provides a system for indoor management of growing agricultural produce comprising: (a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; (b) data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; (c) at least one controller operably engaged with the at least one autonomous airborne vehicle; (d) a computing device, in communication with the data acquisition module
and the at least one controller; and, optionally, (e) an operator in communication with the at least one controller.
[00211] According to a further embodiment, the present invention provides a method for indoor management of growing agricultural produce comprising: (a) providing the system as defined in any of the above; (b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; (c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; (d) collecting data related to the growing agricultural produce by the data acquisition module; (e) transmitting the collected data to the computer; (f) processing the collected data and providing instructions to the controller based on the collected data; and, (g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
[00212] It is further within the scope of the present invention to provide an autonomous airborne vehicle comprising: (a) a power source; (b) a data acquisition module; (c) a computing system comprising a memory and a processor; (d) a transducer; and, optionally, (e) a controller. The aforementioned autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[00213] In another embodiment of the present invention, an autonomous airborne vehicle is provided. The autonomous airborne vehicle comprises (a) a power source; (b) a data acquisition module; (c) a transducer; (d) a fruit or vegetable harvesting element; and, optionally, (e) a controller; and (f) a computing system comprising a memory and a processor. In certain aspects of the present invention, the autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.
[00214] According to a further embodiment, the present invention provides a computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: (a) providing the system as defined in any of the above; (b) collecting image data by the data acquisition module; (c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; (d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, (e) managing the growing fruit-bearing crops.
[00215] Reference is now made to Fig. 1, depicting a schematic illustration of a scouting autonomous airborne vehicle 100 as an embodiment of the system of the present invention. The scouting autonomous airborne vehicle 100 is configured to maneuver in the greenhouse, perform greenhouse mapping, and identify and mark fruits and/or vegetables ready to be picked by the at least one harvesting drone in the fleet.
[00216] According to some embodiments, such a scouting autonomous airborne vehicle 100 comprises a main body 10 of the drone, which is adapted in size and shape to indoor conditions, specifically to greenhouse conditions. Such a drone may be equipped with (physically engaged with the main body 10) at least one image acquisition element, e.g. camera 20. Non-limiting examples of a camera or an image acquisition element within the scope of the present invention include 2D, 3D, multifunctional, hyperspectral, IR, RGB, navigation, fruit/vegetable and/or stereoscopic cameras. The drone 100 may be further equipped with at least one module selected from: at least one navigation module, i.e. GPS 30, at least one antenna module 40 for receiving or transmitting electromagnetic/radio waves, at least one Bluetooth device 50 for wirelessly communicating data, at least one communication device 60 for communication with one or more other drones (e.g. through radio or cellular communication) and at least one computing device 70 comprising a memory and preferably a processor. In this embodiment, the scouting device is configured to (1) map the enclosed space, e.g. greenhouse, (2) detect and identify the fruit/vegetable status, for example ripeness, BRIX content and readiness for picking, and (3) communicate with one or more drones in the fleet to mark and/or communicate data regarding specific fruits/vegetables, such as readiness for picking, to one or more harvesting drones. The scouting autonomous airborne vehicle is capable of operating in both day and night conditions.
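An illustrative sketch of step (3) follows: the message a scouting vehicle might broadcast to harvesting drones to mark a fruit as ready for picking. The field names, units and the JSON encoding are assumptions for illustration, not part of the disclosure:

```python
import json


def mark_for_picking(fruit_id, location, ripeness, brix):
    """Encode a 'ready for picking' mark as a JSON message that a
    harvesting drone in the fleet could consume."""
    msg = {
        "type": "ready_for_picking",
        "fruit_id": fruit_id,
        "location_m": location,  # assumed (x, y, z) in the greenhouse map
        "ripeness": ripeness,
        "brix": brix,
    }
    return json.dumps(msg)


payload = mark_for_picking("f-0042", (12.5, 3.0, 1.6), 0.92, 7.1)
print(payload)
```

The actual transport (Bluetooth device 50, communication device 60, radio or cellular) is abstracted away here; only the message content is sketched.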
[00217] Reference is now made to Fig. 2 schematically illustrating a harvesting autonomous airborne vehicle 200 as an embodiment of the system of the present invention. In this embodiment, the harvesting drone comprises the drone's main body 11, which is adapted in size, shape and structure/function relations to operating indoors,
for example in greenhouses. In aspects of the present invention the autonomous airborne vehicle 200 is integrated with at least one navigation module, i.e. GPS 30, at least one antenna module 40 for receiving or transmitting electromagnetic/radio waves, at least one Bluetooth device 50 for wirelessly communicating data and at least one communication device 60 for communication with one or more other drones (e.g. through radio or cellular communication). Such a harvesting or operating drone is further equipped with at least one harvesting element, e.g. robotic arm 75, to detach the fruit or vegetable from the fruit or vegetable-bearing crop. The robotic arm 75 may be in a configuration of a fixed long arm that can protrude through leaves and branches. In one embodiment the arm may be equipped with an on-drone camera that enables closed-loop feedback in addition to the harvester functionality. Such a robotic arm 75 enables the detection of ripe fruits, followed by access thereto with the arm, in addition to controlling the movement of the drone. According to some embodiments, the harvesting drone 200 and/or robotic arm 75 comprises a harvesting element which may be a suction element, a cutter, a grasper, a rotary blade, or any combination thereof.
[00218] It is within the scope of the present invention that the harvesting element can be a robotic arm which extends from the drone body 11; its length needs to be long enough to reach through the branches and access the fruit or vegetable without injuring the crop plant, e.g. branches, and without unbalancing the drone. The robotic arm 75 may be a flexible or adjustable arm with at least 2 degrees of freedom, which enables fruit or vegetable access without moving the drone. In specific embodiments, the robotic arm is a rigid or fixed arm.
[00219] The robotic harvesting arm 75 can be installed on top, bottom, or side of the drone, or any combination thereof.
[00220] In certain embodiments of the harvesting device 200 of the invention, the harvesting arm comprises a fruit or vegetable-grabbing/gripping/detaching/pulling mechanism, e.g. forceps, clamps, a guillotine mechanism, robotic fingers, or a vacuum pump that pulls the fruit/vegetable away from the plant.
[00221] In certain embodiments of the harvesting device 200 of the invention, the harvesting arm further comprises a fruit cutting unit (e.g. secateurs, saw, scissors, shears, or laser) for assisting in the removal of the fruit or vegetable off the crop plant.
[00222] In further aspects of the present invention, the harvesting autonomous airborne vehicle 200 further comprises a collection container 80. After disconnecting or removing the fruit or vegetable from the fruit or vegetable-bearing plant, the drone can either take the fruit/vegetable to a collection point, or to a collection unit or element or box or container integrated with the drone (as shown in Fig. 2), or throw/drop it to the ground, or throw/drop it to a collection base/trampoline which is installed around the crop plant within the greenhouse. In certain embodiments, the harvesting drone delivers the harvested fruit(s) or vegetable(s) to a collection container 80 mounted on the drone. In certain embodiments there are several collection containers 80. In some embodiments the UAV performs fruit or vegetable quality analysis and delivers or carries the harvested fruit or vegetable to the appropriate container, according to the fruit's or vegetable's quality; this enables quality assessment and sorting of the fruits/vegetables during their picking process in real time, according to predetermined criteria or parameters.
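A hedged sketch of such real-time sorting follows: each harvested fruit is routed to one of several containers by a quality grade. The grading thresholds, container names and sample measurements are illustrative assumptions, not disclosed criteria:

```python
def grade(weight_g, defect_score):
    """Assign a quality grade from on-drone measurements.
    `defect_score` is an assumed 0..1 quality-defect estimate."""
    if defect_score > 0.3:
        return "reject"
    return "premium" if weight_g >= 15.0 else "standard"


# One container 80 per grade, as in the multi-container embodiment.
containers = {"premium": [], "standard": [], "reject": []}

# Synthetic measurements for three picked fruits.
for fruit_id, weight_g, defect_score in [
        ("f1", 18.0, 0.05), ("f2", 12.0, 0.10), ("f3", 16.0, 0.45)]:
    containers[grade(weight_g, defect_score)].append(fruit_id)

print(containers)
```

The weight could come from the weight/quantity sensor 90 described below; the defect score would come from the image-based quality analysis.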
[00223] It is further within the scope that the harvesting autonomous airborne vehicle 200 is further mounted with at least one sensor, for example, in this embodiment, one or more weight/quantity sensors 90 and/or one or more plant & surrounding distance or mapping sensors 95. Other examples of sensors within the scope of the current invention include, in a non-limiting manner, an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor, a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
[00224] According to one embodiment, the harvesting drone 200 is configured to pick the fruit or vegetable according to the data detected by the scouting drone 100, which is processed into output comprising fruits or vegetables marked as ready for picking or harvesting. The one or more fruits or vegetables harvested by the harvesting drone 200 are delivered to the collection container 80.
[00225] According to another embodiment of the present invention, the harvesting autonomous airborne vehicle 200 is adapted to working in rows of planted agricultural produce, in defined areas in an enclosed space or structure (e.g. greenhouse).
[00226] According to another embodiment, the harvesting autonomous airborne vehicle 200 is equipped with means 60 for communicating with the scouting drone or any other drone in the fleet.
[00227] According to another embodiment, the harvesting autonomous airborne vehicle 200 is equipped with harvesting means or elements 75 such as a robotic arm for picking fruit(s) or vegetable(s) and optionally with at least one fruit or vegetable collection container 80.
[00228] According to further embodiments of the present invention, the harvesting autonomous airborne vehicle 200 has the ability of 3D identification in space (e.g. via a 3D camera or sensor) and of communication with other members of the fleet of drones.
[00229] Reference is now made to Fig. 3 schematically illustrating an embodiment that may be implemented in the system of the present invention, in accordance with certain aspects of the present invention. This embodiment schematically depicts a computerized managing or central system or central control and management unit 300. According to some aspects, a computing device comprising a server on-premises (server on-prem) manager main body 12 is mounted with at least one drone battery charging platform 25, a GPS 30, at least one antenna module 40 for receiving or transmitting electromagnetic/ radio waves, at least one Bluetooth device 50 and at least one drones communication device 60.
[00230] In certain aspects of the present invention, the central managing unit 300 is configured to control and monitor the conditions in the greenhouse and/or performance of the one or more autonomous airborne vehicles in the greenhouse. Such a unit 300 may be positioned on site or remotely. According to further aspects, the central control and management unit 300 may comprise at least one On-premises managing software and/or at least one off-premises or remote or on-cloud managing software.
[00231] In addition, the central unit 300 is configured to manage and interface with the fleet of automated drones.
[00232] The managing unit 300 may comprise a user interface for the grower, manager or operator in the greenhouse, means for communication with an on-site or cloud-based software, means for managing statistical data on the condition of the
agricultural produce in the greenhouse at any given moment and cross-referencing them with data on greenhouse parameters such as temperature, irrigation, fertilization and others. The computerized managing unit 300 may enable a feedback mode of action in the greenhouse. In some aspects of the present invention, the unit 300 encompasses a controller module or function.
[00233] Reference is now made to Fig. 4, a schematic illustration of exemplified embodiments of the system 400 for indoor management of growing agricultural produce of the present invention. In this embodiment a harvester drone comprising main body 11 and means 60 for communicating with one or more other drones, is in communication with a scouting drone within the fleet. The scouting drone comprises main body 10 and means 60 for communicating with one or more other drones. The harvesting drone is in further communication with the main server on-site managing unit 12, which is in communication with user interface 65. In certain aspects of the present invention, a main server on-cloud manager 13 is in communication with the harvesting drone (comprising main body 11 and means 60 for communicating with one or more other drones), the scouting drone (comprising main body 10 and means 60 for communicating with one or more other drones), with the main server on-site managing unit 12 and with the user interface 65.
[00234] Reference is now made to Fig. 5 schematically depicting an exemplified robotic arm 500 according to some embodiments of the autonomous airborne vehicle of the present invention. In this embodiment, the robotic arm 500 comprises elements including the robotic arm main body 13, harvesting element 85, for example at least one suction, cutter and/or grasper, and weight/quantity sensor(s) 90 for evaluating harvested fruit/vegetable weight or quantity, or any other preselected sensor.
[00235] According to one embodiment, a robotic arm engaged with the autonomous airborne vehicle of the present invention is used for picking fruits/vegetables that are ready for picking within the greenhouse.
[00236] According to further aspects of the present invention, the robotic arm is configured for harvesting by various methods, such as by engulfing a fruit while cutting/tearing it (e.g. at the stipe) from the plant; by one or two loops that hold the fruit/vegetable and pull it; or by a vacuum pump. Particular harvesting methods are exemplified below:
[00237] • Gripping the fruit/vegetable with pulling and rotating movements in parallel - the fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container.
[00238] • Holding the fruit/vegetable and detaching it with a clipper or cutter. The fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container.
[00239] • In one embodiment the autonomous airborne vehicle is equipped with a vacuum pipe, picking the fruit/vegetable with the assistance of a vacuum action. The fruit/vegetable is then transferred to a local or specific collection container and optionally from there to a general collection container. In this embodiment, a vacuum pump may be used to pull a fruit/vegetable from a plant.
[00240] In certain embodiments of the harvesting device of the invention, the harvesting arm comprises a fruit/vegetable grabbing/gripping mechanism (e.g. forceps, clamps, or robotic fingers). In some embodiments of the harvesting device of the invention, the fruit/vegetable pulling is done with a vacuum pump that pulls the fruit/vegetable away from the fruit or vegetable bearing plant.
[00241] Accordingly, the drone harvesting arm according to some embodiments of the invention has a palm/gripping mechanism that can hold a fruit/vegetable. The palm may comprise a few fingers, a flexible cap, or a vacuum mechanism.
[00242] According to further aspects of the invention, the robotic arm activity and configuration may include:
[00243] According to one embodiment, the robotic arm has several degrees of freedom that allow gentle movement within the plant branches, without harming or damaging the plant or the fruits/vegetables of other plants.
[00244] According to a further embodiment, the robotic arm is built from a lightweight material that allows the autonomous airborne vehicle or drone to carry it at a minimum energy cost.
[00245] According to yet another embodiment, the robotic arm is configured to transfer the fruit/vegetable using gravity forces to a collection container located within the greenhouse.
[00246] In other aspects of the present invention the autonomous airborne vehicle or drone may have various configurations, including:
[00247] In one embodiment, the autonomous airborne vehicle is mounted with a multispectral camera equipped with AI/machine learning capabilities or systems, which allows three-dimensional movement in a complicated environment and detection of ripe fruit/vegetable for harvesting.
[00248] In a further embodiment, the autonomous airborne vehicle of the present invention is configured to automatically manage agricultural produce collected in a container (based on weight or quantity parameters) and provide instructions on transferring the collection to a central container.
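The container-management logic in the paragraph above can be illustrated with a minimal sketch. All names and threshold values here are assumptions for illustration and are not taken from the specification:

```python
# Hypothetical sketch: deciding when collected produce should be transferred
# to the central container, based on weight or quantity parameters.
# MAX_WEIGHT_KG and MAX_COUNT are assumed limits, not values from the patent.

MAX_WEIGHT_KG = 2.0   # assumed weight limit of the on-board collection container
MAX_COUNT = 12        # assumed item-count limit of the on-board container

def needs_transfer(weight_kg: float, count: int) -> bool:
    """Return True when either limit is reached, i.e. the vehicle should
    carry its local collection to the central collection container."""
    return weight_kg >= MAX_WEIGHT_KG or count >= MAX_COUNT
```

In practice the weight and quantity inputs would come from the weight/quantity sensor(s) 90 described with the robotic arm of Fig. 5.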
[00249] In yet another embodiment, the autonomous airborne vehicle is configured to manage the battery status and switch it to charging mode on a suitable surface automatically.
[00250] In yet another embodiment of the present invention, the autonomous airborne vehicle is capable of management of communication with a central control and monitoring system in order to report data / work status / faults in the system automatically.
[00251] According to further aspects, the system of the present invention and/or the autonomous airborne vehicle of the present invention comprises computer vision-based detection capabilities.
[00252] According to one embodiment, the computer vision-based detection system is configured to identify the position of the autonomous airborne vehicle within the greenhouse space, e.g. along a longitudinal axis (specific row), a width axis (specific plant in a row) and a height axis (across a single plant, e.g. a specific fruit/vegetable within a plant).
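The row/plant/height position scheme described above can be sketched as follows; the grid dimensions and function name are hypothetical and used only to illustrate the idea of discretizing a position estimate into greenhouse indices:

```python
# Hypothetical sketch: mapping an estimated (x, y, z) vehicle position to a
# (row, plant, height) index inside a greenhouse. The row pitch and plant
# spacing are illustrative assumptions, not values from the patent.

ROW_PITCH_M = 1.6      # assumed distance between crop rows (longitudinal axis)
PLANT_SPACING_M = 0.4  # assumed distance between plants in a row (width axis)

def locate(x_m: float, y_m: float, z_m: float) -> dict:
    """Convert a vehicle position in metres to discrete greenhouse indices."""
    return {
        "row": int(x_m // ROW_PITCH_M),        # which row the vehicle is in
        "plant": int(y_m // PLANT_SPACING_M),  # which plant along that row
        "height_m": round(z_m, 2),             # height along the single plant
    }
```

For example, `locate(5.1, 2.3, 1.27)` places the vehicle in row 3, at plant 5 of that row, 1.27 m up the plant.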
[00253] According to another embodiment, the computer vision-based detection system is configured to monitor specific marker geographic locations in a work area using X, Y, Z coordinate positions.
[00254] According to yet another embodiment, the computer vision-based detection system enables the autonomous airborne vehicle to function in both day and night conditions by using an infrared camera adjusted for dark conditions.
[00255] According to yet another embodiment, the computer vision-based detection system is configured to photograph the plant at the level of a group of plants (more than one plant), single plant, a cluster within one plant, or at the level of a single fruit/vegetable.
[00256] It is further within the scope that the system of the present invention is capable of identifying, by computer vision-based technology, a fruit/vegetable as an object in space.
[00257] Using an autonomous airborne vehicle equipped with a computer vision-based detection system, information on fruit/vegetable status within a greenhouse, for example ripening status (readiness to be picked), is provided to the central control and monitoring system of the greenhouse (or of several greenhouses).
[00258] It is further within the scope that the computer vision-based detection system of the present invention, which is combined with AI/machine learning algorithms, is configured to infer or extrapolate or predict, from a single fruit/vegetable image or video, the condition or status of the fruit/vegetable and to cross-reference or compare it with other sample images or data examined and classified by a professional breeder or cultivator.
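One simple way to picture the comparison against breeder-classified samples is a nearest-centroid rule over image-derived features. This is a hedged sketch only: the actual system would use the AI/machine learning models described in the paragraph above, and every class name and feature value here is invented for illustration:

```python
# Hypothetical sketch: comparing an image-derived feature vector (e.g. a hue
# and a firmness estimate) with reference samples previously classified by a
# professional breeder. A nearest-centroid rule stands in for the trained
# AI/ML model; all classes and numbers are assumptions.
import math

# Assumed breeder-labelled reference centroids: (hue, firmness)
BREEDER_CLASSES = {
    "unripe":   (0.30, 0.90),
    "ripe":     (0.05, 0.55),
    "overripe": (0.02, 0.20),
}

def classify(features):
    """Return the breeder-defined class whose centroid is closest in
    Euclidean distance to the observed feature vector."""
    return min(
        BREEDER_CLASSES,
        key=lambda c: math.dist(features, BREEDER_CLASSES[c]),
    )
```

A feature vector such as `(0.06, 0.50)` falls closest to the assumed "ripe" centroid, and that classification would then be reported to the central control and monitoring system.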
[00259] In another embodiment, the computer vision-based autonomous airborne vehicle detection system of the present invention is configured to predict the yield (e.g. fruits or vegetables) expected from a certain agricultural growing area throughout all stages of harvesting of the agricultural produce.
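In its simplest form, the yield-prediction idea reduces to extrapolating counts from a scouted sample of plants to the whole growing area. A minimal sketch, with made-up figures and a hypothetical function name:

```python
# Hypothetical sketch: extrapolating fruit counts observed on a scouted
# sample of plants to an entire growing area. The function name and all
# example figures are illustrative assumptions, not from the patent.

def predict_yield(counted_fruit: int, plants_scouted: int, total_plants: int) -> int:
    """Extrapolate the total expected fruit from a scouted sample:
    scale the per-plant count up to the total number of plants."""
    return round(counted_fruit / plants_scouted * total_plants)
```

For instance, 240 fruits counted on a sample of 20 plants, in a greenhouse of 500 plants, extrapolates to an expected yield of 6000 fruits; a real system would refine this with ripeness staging across the harvest.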
[00260] It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination.
[00261] Unless otherwise defined, all technical and scientific terms used herein have the same meanings as are commonly understood by one of ordinary skill in the art to which this invention belongs. Although methods similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods are described herein.
[00262] All publications, patent applications, patents, and other references mentioned herein are incorporated by reference in their entirety. In case of conflict, the patent specification, including definitions, will prevail. In addition, the materials, methods, and examples are illustrative only and not intended to be limiting.
[00263] It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather the scope of the present invention is defined by the appended claims and includes both combinations and sub-combinations of the various features described hereinabove as well as variations and modifications thereof, which would occur to persons skilled in the art upon reading the foregoing description.
Claims
1. A system for indoor management of growing agricultural produce comprising: a) at least one autonomous airborne vehicle, wherein said vehicle is designed to commute inside an enclosed space comprising said growing agricultural produce; b) a data acquisition module operably engaged with the at least one autonomous airborne vehicle, wherein said module is configured to collect data related to said growing agricultural produce; c) at least one controller operably engaged with the at least one autonomous airborne vehicle; d) a computing device, in communication with the data acquisition module and the at least one controller; and, optionally, e) an operator in communication with the at least one controller.

2. The system of claim 1, wherein the enclosed space has outer walls and a ceiling.

3. The system of any one of claims 1 and 2, wherein said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.

4. The system of any one of claims 1 to 3, wherein the agricultural produce comprises at least one of fruit-bearing crops, vegetable-bearing crops and flowering crops.

5. The system of any one of claims 1 to 4, wherein the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.

6. The system of claim 5, wherein the harvesting element is a fruit or vegetable harvesting element configured to detach fruit or vegetable from the fruit or vegetable bearing crop.

7. The system of claim 5, wherein the at least one harvesting element is a robotic arm.

8. The system of any one of claims 5 to 7, wherein the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.
9. The system of any one of claims 1 to 8, wherein the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.

10. The system of any one of claims 1 to 9, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).

11. The system of any one of claims 1 to 10, wherein the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.

12. The system of claim 11, wherein data related to crop performance is selected from fruit or vegetable quality, flowering stage, crop size, color of leaves, crop diseases or a combination thereof.

13. The system of claim 12, wherein the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.

14. The system of claim 11, wherein the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.

15. The system of claim 1, wherein the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.

16. The system of claim 15, wherein the at least one image acquisition element is a multispectral camera.
17. The system of claim 15, wherein the at least one image acquisition element is selected from the group consisting of a hyperspectral camera, an infra-red (IR) camera, an RGB camera, a visible light frequency range camera, a near infrared (NIR) frequency range camera, a monochrome camera, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), a UV frequency range camera and any combination thereof.
18. The system of claim 1, wherein the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not image data.
19. The system of claim 1, wherein the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.
20. The system of any one of claims 1 to 19, further comprising a transducer wherein said transducer is configured to transmit the data collected by the data acquisition module to the computing device.
21. The system of any one of claims 1 to 20, wherein the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.
22. The system of any one of claims 1 to 21, wherein the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.
23. The system of any one of claims 1 to 22 comprising more than one autonomous airborne vehicle.
24. The system of claim 23, comprising a fleet of autonomous airborne vehicles.
25. The system of claim 24, wherein the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicle and/or one or more harvesting autonomous airborne vehicle.
26. The system of any one of claims 1 to 25 wherein the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.
27. The system of any one of claims 4 to 26, wherein the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable, e.g. cherry tomato for loose and cluster harvesting, cherry and mini blocky pepper, baby cucumber and any snack fruit and vegetable.

28. The system of any one of claims 1 to 27, wherein the operator is a human operator.

29. The system of any one of claims 1 to 27, wherein the operator is an automated operator.

30. The system of any one of claims 1 to 29, further comprising at least one container operably engaged with the one or more autonomous airborne vehicles.

31. The system of any one of claims 1 to 30, wherein said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.

32. The system of any one of claims 1 to 31, wherein said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with one or more other autonomous airborne vehicles within said enclosed space.

33. The system of any one of claims 1 to 32, wherein said system further comprises at least one On-premises managing software and/or at least one off-premises or remote or on-cloud managing software.

34. The system of any one of claims 1 to 33, wherein said system further comprises a user interface such as a graphical user interface (GUI).

35. A method for indoor management of growing agricultural produce comprising: a) providing the system of any one of claims 1 to 34; b) providing an enclosed space comprising growing agricultural produce, wherein said space comprises outer walls and a ceiling; c) directing the at least one autonomous airborne vehicle towards the growing agricultural produce; d) collecting data related to the growing agricultural produce by the data acquisition module; e) transmitting the collected data to the computer; f) processing the collected data and providing instructions to the controller based on the collected data; and, g) executing the instructions by the controller to thereby effectively manage the growing agricultural produce.
36. The method of claim 35, wherein the growing agricultural produce comprises fruit or vegetable-bearing crops.
37. The method of any one of claims 35 and 36, wherein the indoor management of growing agricultural produce comprises harvesting fruits or vegetables from the fruit or vegetable-bearing crops by the at least one autonomous airborne vehicle.

38. The method of any one of claims 35 to 37, wherein said system is designed to fulfill at least one of the functions selected from harvesting the agricultural produce, diluting the agricultural produce, collecting data related to the agricultural produce, pollinating the agricultural produce, or any combination thereof.

39. The method of any one of claims 35 to 38, wherein the agricultural produce comprises at least one of fruit-bearing crops, vegetable crops and flowering crops.

40. The method of any one of claims 35 to 39, wherein the autonomous airborne vehicle further comprises at least one harvesting element configured to extend from the vehicle and to become physically engaged with the agricultural produce.

41. The method of claim 40, wherein the harvesting element is a fruit or vegetable harvesting element configured to detach fruit or vegetable from the fruit or vegetable bearing crop.

42. The method of claim 40, wherein the harvesting element is selected from the group consisting of a suction, a cutter, a grasper and any combination thereof.

43. The method of claim 40, wherein the at least one harvesting element is a robotic arm.

44. The method of any one of claims 35 to 43, wherein the autonomous airborne vehicle further comprises a movement-controlling element configured to enable the vehicle to commute within the enclosed space.

45. The method of any one of claims 35 to 44, wherein said system comprises an autonomous navigation unit such as a Global Positioning System (GPS).

46. The method of any one of claims 35 to 45, wherein the data collected by the data acquisition module is selected from data related to crop performance, environmental data, or a combination thereof.
47. The method of claim 46, wherein data related to crop performance is selected from fruit or vegetable quality, flowering stage, crop size, color of leaves, crop diseases or a combination thereof.

48. The method of claim 47, wherein the data related to fruit or vegetable quality comprises fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.

49. The method of claim 46, wherein the environmental data is selected from temperature, humidity, irradiation, location of the vehicle, location of the agricultural produce within the enclosed space and any combination thereof.

50. The method of any one of claims 35 to 49, wherein the data acquisition module comprises at least one image acquisition element, wherein said image acquisition element is configured to collect image data related to the agricultural produce.

51. The method of claim 50, wherein the at least one image acquisition element is a multispectral camera.

52. The method of claim 50, wherein the at least one image acquisition element is selected from the group consisting of a hyperspectral camera, an infra-red (IR) camera, an RGB camera, a visible light frequency range camera, a near infrared (NIR) frequency range camera, a monochrome camera, specific light wavelengths (e.g. LED and/or laser and/or halogen and/or xenon and/or fluorescent), a UV frequency range camera and any combination thereof.

53. The method of claim 35, wherein the data acquisition module comprises at least one sensor configured to collect data related to the agricultural produce, wherein the data is not image data.
54. The method of claim 35, wherein the data acquisition module comprises data acquired from one or more of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.

55. The method of any one of claims 35 to 54, wherein said system further comprises a transducer, wherein said transducer is configured to transmit the data collected by the data acquisition module to the computing device.

56. The method of any one of claims 35 to 55, wherein the computing device comprises a memory and a processor and is configured to process the data collected by the data acquisition module and to transmit instructions to the controller based on the processed data.

57. The method of any one of claims 1 to 56, wherein the controller is configured to fulfill at least one of the functions selected from: receiving instructions from the computer, providing instructions to the operator, providing instructions to the movement-controlling element of the autonomous airborne vehicle, turning on the fruit or vegetable harvesting element of the autonomous airborne vehicle, turning off the fruit or vegetable harvesting element of the autonomous airborne vehicle, and communicating with one or more other autonomous airborne vehicles.

58. The method of any one of claims 35 to 57 comprising more than one autonomous airborne vehicle.

59. The method of claim 58, comprising a fleet of autonomous airborne vehicles.

60. The method of claim 59, wherein the fleet of autonomous airborne vehicles comprises one or more scouting autonomous airborne vehicles and/or one or more harvesting autonomous airborne vehicles.

61. The method of any one of claims 35 to 60, wherein the enclosed space is selected from a greenhouse, a plastic or glass house, and a net or mesh house.

62. The method of any one of claims 36 to 61, wherein the fruit or vegetable-bearing crop is selected from the group consisting of tomato, pepper, cucumber, strawberry, raspberry, blueberry and any snack fruit or vegetable.

63. The method of any one of claims 35 to 62, wherein the operator is a human operator.

64. The method of any one of claims 35 to 63, wherein the operator is an automated operator.

65. The method of any one of claims 35 to 64, wherein said system further comprises at least one container operably engaged with the one or more autonomous airborne vehicles.

66. The method of any one of claims 35 to 65, wherein said autonomous airborne vehicle further comprises a wireless communication module or device, such as a Bluetooth device.
67. The method of any one of claims 35 to 66, wherein said autonomous airborne vehicle further comprises a communication module to wirelessly communicate with one or more other autonomous airborne vehicles within said enclosed space.
68. The method of any one of claims 35 to 67, wherein said system further comprises at least one On-premises managing software and/or at least one off-premises or remote or on-cloud managing software.

69. The method of any one of claims 35 to 68, wherein said system further comprises a user interface such as a graphical user interface (GUI).

70. An autonomous airborne vehicle comprising: a) a power source; b) a data acquisition module; c) a computing system comprising a memory and a processor; d) a transducer; and, optionally, e) a controller; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.

71. An autonomous airborne vehicle comprising: a) a power source; b) a data acquisition module; c) a transducer; d) a fruit or vegetable harvesting element; and, optionally, e) a controller; and f) a computing system comprising a memory and a processor; wherein said autonomous airborne vehicle is designed to commute inside an enclosed space characterized by having outer walls and a ceiling; and, wherein said enclosed space comprises fruit or vegetable-bearing growing agricultural produce.

72. The autonomous airborne vehicle of claim 71, wherein said data acquisition module comprises at least one image acquisition element.

73. The autonomous airborne vehicle of claim 72, wherein said at least one image acquisition element comprises a multi-function camera such as a hyperspectral, IR and/or RGB camera.

74. The autonomous airborne vehicle of claim 71, further comprising a navigation unit such as a Global Positioning System (GPS).
75. The autonomous airborne vehicle of claim 71, further comprising a communication device for communicating with one or more other autonomous airborne vehicles.

76. The autonomous airborne vehicle of claim 71, further comprising a fruit and/or vegetable collection container.

77. The autonomous airborne vehicle of claim 71, further comprising at least one sensor selected from the group consisting of an optical sensor, an acoustic sensor, a chemical sensor, a geo-location sensor, an environmental sensor, a weather sensor; a refractometer, a pH meter, acidometer, penetrometer, durometer, temperature meter, weight sensor, quantity sensor, pressure sensor, plant and/or surrounding detection sensor, plant and/or surrounding distance sensor, device or sensor detecting odors and/or flavors, or any combination thereof.

78. A computer implemented method of indoor management of growing fruit or vegetable-bearing crops comprising: a) providing the system of any one of claims 1 to 34; b) collecting image data by the data acquisition module; c) processing the collected data and assessing the state of the fruit or vegetable-bearing crops based on a set of parameters indicative of the state of the fruit or vegetable-bearing crops extracted from the collected image data; d) providing instructions to the controller based on the state of the growing fruit or vegetable-bearing crops; and, e) managing the growing fruit-bearing crops.

79. The method of claim 78, wherein said assessing the state of the growing agricultural produce comprises using a computerized system including at least one of a machine learning system, a deep learning system, an optical flow technique, a computer vision technique, a convolutional neural network (CNN), a recurrent neural network (RNN), or a machine learning dataflow library, and wherein analyzing the acquired image data by said system comprises autonomously predicting one or more phenotypic traits of said fruit or vegetable bearing agricultural produce.
80. The method of claim 78, wherein said assessing the state of the growing agricultural produce comprises using a trained neural network.
81. The method of claim 78, wherein said step of processing comprises steps of computing said image data using a computer implemented algorithm trained to generate output based on the image data.

82. The method of claim 81, wherein said computer implemented algorithm is trained to generate output based on predetermined feature vectors or attributes extracted from the image data.

83. The method of claim 78, wherein said method comprises steps of implementing with said algorithm a training process according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one image acquisition element, wherein each respective training image of the plurality of training images is associated with the state of said fruit or vegetable-bearing crop depicted in the respective training image.

84. The method of claim 83, wherein said training process comprises steps of: a) capturing images of the fruit or vegetable-bearing crop using an imaging sensor; b) classifying images into desired categories by applying a tag associated with parameters or attributes indicative of the state of the fruit or vegetable-bearing crop extracted from the image data; and c) applying a computer vision algorithm to determine a set of feature vectors associated with each desired category.

85. The method of claim 84, comprising steps of applying a machine learning process with the computer implemented trained algorithm to determine the state of the fruit or vegetable-bearing crop.
86. The method of claim 85, wherein said machine learning process comprises computing, by the at least one neural network, a tag of at least one desired category for the at least one fruit-bearing crop, wherein the tag of at least one classification category is computed at least according to weights of the at least one neural network, wherein the at least one neural network is trained according to a training dataset comprising a plurality of training images of a plurality of fruit or vegetable-bearing crops captured by the at least one imaging sensor, wherein each respective training image of the plurality of training images is associated with said tag of at least one desired category of at least one fruit or vegetable-bearing crop depicted in the respective training image; and generating, according to the tag of at least one classification category, instructions for execution by the controller.

87. The method of claim 78, wherein the state of the fruit or vegetable comprises parameters including fruit or vegetable size, fruit or vegetable shape, fruit or vegetable length, fruit or vegetable color, fruit or vegetable location, BRIX content, fruit or vegetable ripeness, BRIX (%), Firmness (sh), Dry Weight (%), Quality Defect, Condition Defect, Color range, Acidity, Hue, Color Variance, Appearance, Color Coverage (%), Starch (%), Weight, Juice (%), Seeds (%), Temperature, TSS / Acidity, number of fruits or vegetables per crop, yield, health, state of the plant, performance, or any combination thereof.

88. A computer implemented algorithm comprising code to perform the steps of the method of any one of claims 78 to 87.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202063107118P | 2020-10-29 | 2020-10-29 | |
| US63/107,118 | 2020-10-29 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2022091092A1 (en) | 2022-05-05 |
Family
ID=81382064
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IL2021/051273 (WO2022091092A1, Ceased) | 2020-10-29 | 2021-10-27 | System and method for indoor crop management |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2022091092A1 (en) |
2021-10-27: WO PCT/IL2021/051273 patent/WO2022091092A1/en, not active (Ceased)
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| AU2017311695A1 (en) * | 2016-08-18 | 2019-04-11 | Tevel Aerobotics Technologies Ltd | Device, system and method for harvesting and diluting using aerial drones, for orchards, plantations and green houses |
| US20190166765A1 (en) * | 2016-08-18 | 2019-06-06 | Tevel Advanced Technologies Ltd. | System and method for mapping and building database for harvesting-dilution tasks using aerial drones |
| US20190303668A1 (en) * | 2018-03-30 | 2019-10-03 | Iunu, Inc. | Visual observer of unmanned aerial vehicle for monitoring horticultural grow operations |
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115454164A (en) * | 2022-09-15 | 2022-12-09 | 马鞍山粤美智造电子科技有限公司 | Automatic dish warming and temperature controlling system based on intelligent identification |
| WO2024142219A1 (en) * | 2022-12-27 | 2024-07-04 | 株式会社クボタ | Work system and flying body |
| CN116784103A (en) * | 2023-07-26 | 2023-09-22 | 广东海洋大学 | Intelligent Wi-Fi-controlled fruit picking and detection device |
| CN116784103B (en) * | 2023-07-26 | 2024-03-01 | 广东海洋大学 | Intelligent Wi-Fi-controlled fruit picking and detection device |
| CN118533786A (en) * | 2024-07-26 | 2024-08-23 | 洛阳奥帆农业科技有限公司 | Vegetable greenhouse planting environment detection device |
| CN118568325A (en) * | 2024-08-05 | 2024-08-30 | 威海中玻镀膜玻璃股份有限公司 | Glass production data storage method based on information security |
| CN120315499A (en) * | 2025-06-17 | 2025-07-15 | 山东泽林农业科技有限公司 | A greenhouse all-weather environmental monitoring and control system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2022091092A1 (en) | System and method for indoor crop management | |
| US12437392B2 (en) | Mobile sensing system for crop monitoring | |
| US20220107298A1 (en) | Systems and methods for crop health monitoring, assessment and prediction | |
| Defterli | Review of robotic technology for strawberry production | |
| Bergerman et al. | Robotics in agriculture and forestry | |
| Van Henten et al. | Robotics in protected cultivation | |
| CA3190046A1 (en) | Stereo-spatio-temporal crop condition measurements for plant growth and health optimization | |
| WO2021198731A1 (en) | An artificial-intelligence-based method of agricultural and horticultural plants' physical characteristics and health diagnosing and development assessment. | |
| Ashwini et al. | Transforming agriculture with smart farming: a comprehensive review of agriculture robots for research applications | |
| Kootstra et al. | Robotics in agriculture | |
| Mhamed et al. | Developments of the automated equipment of apple in the orchard: A comprehensive review | |
| Hughes et al. | Field robotics for harvesting: A review of field robotics approaches for harvesting | |
| US20240397880A1 (en) | Aerial mobile sensing system for crop monitoring | |
| Mathushika et al. | Smart farming using artificial intelligence, the internet of things, and robotics: a comprehensive review | |
| Hemming et al. | Advances in the use of robotics in greenhouse cultivation | |
| Negrete | Artificial vision in Mexican agriculture, a new techlogy for increase food security | |
| KUMAR et al. | Robots for harvesting of horticultural crop: A review: Harvesting fruits by robots | |
| Hemming | Automation and robotics in the protected environment, current developments and challenges for the future | |
| Eminoğlu et al. | Smart farming application in fruit harvesting | |
| Kosmopoulos et al. | The SOUP project: current state and future activities | |
| IL301099A (en) | Methods and equipment for artificial pollination | |
| Harjeet et al. | Machine vision technology, deep learning, and IoT in agricultural industry | |
| Kalbande et al. | Smart systems as futuristic approach towards agriculture development: a review | |
| Burks et al. | Orchard and vineyard production automation | |
| Burks et al. | Opportunity of robotics in precision horticulture. |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 21885525; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: PCT application non-entry in European phase | Ref document number: 21885525; Country of ref document: EP; Kind code of ref document: A1 |