WO2023230730A1 - System and method for precision application of residual herbicide through inference - Google Patents


Info

Publication number
WO2023230730A1
WO2023230730A1 (PCT/CA2023/050761)
Authority
WO
WIPO (PCT)
Prior art keywords
field
weed
weeds
drone
processor
Application number
PCT/CA2023/050761
Other languages
French (fr)
Inventor
Daniel Mccann
Terry ABERHART
Original Assignee
Daniel Mccann
Application filed by Daniel Mccann filed Critical Daniel Mccann
Publication of WO2023230730A1 publication Critical patent/WO2023230730A1/en

Classifications

    • A HUMAN NECESSITIES
    • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01M CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
    • A01M7/00 Special adaptations or arrangements of liquid-spraying apparatus for purposes covered by this subclass
    • A01M7/0089 Regulating or controlling systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/17 Terrestrial scenes taken from planes or by drones
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/188 Vegetation

Definitions

  • This invention relates to field treatment methods and systems, and more specifically to systems and methods for applying residual herbicides.
  • the term 'residual' applies to a number of herbicides that have long-lasting activity in the soil. These herbicides are often applied directly to the soil before crops are planted (pre-emergence). The residual (or pre-emergence) herbicides mitigate yield loss due to weed competition, provide a longer time for the crop to establish, and reduce the selection pressure for resistance to post-emergence herbicides. In addition, including pre-emergence herbicides can minimize post-emergence herbicide applications and protect against early-season weed competition when weather or busy schedules prohibit a timely post-emergence application.
  • spraying systems such as Green-on-Green technology efficiently eliminate weeds that grow during the current season, but cannot see what is happening at the seed level or under the soil, rendering them ineffective at preventing breakouts in subsequent years.
  • Figure 1 is a block diagram of the current farm management process.
  • the present disclosure provides a system, process and method to provide precision application of residual herbicides. This results in higher efficiency of such herbicides while reducing the amount of herbicide used.
  • the precision application of residual herbicides provides limited application of herbicide to the patches of weed based on the previous data collected from the field.
  • the precision application of residual herbicides may include application of such herbicide based on the type of weed patches present in the field using the images previously taken from the field.
  • the precision treatment as disclosed herein is even more important because residual herbicides are generally more expensive than other herbicides, they must be applied based on the category of weeds present, and patches of weeds can act like a weed seed bomb and expand throughout the field.
  • the present disclosure provides a method for precision application of residual herbicide.
  • the method comprising processing images of a field to create a weedmap having locations of weed clusters in the field; applying geofence regions encompassing said weed clusters; merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of a sprayer to perform spraying; adding a buffer area to said amalgamated shapes; creating a map for residual herbicide application including said amalgamated shapes and said buffer areas; and spraying said field with residual herbicide based on said map using said sprayer.
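The claimed sequence (weedmap, geofencing, amalgamation to meet a sprayer minimum, buffering) can be sketched in ordinary code. The sketch below is illustrative only and not the disclosed implementation: geofences are modeled as circles of an assumed radius, amalgamation as a union of overlapping circles, and all names and thresholds are placeholder assumptions.

```python
import math

def build_spray_map(cluster_xy, geofence_m=3.0, buffer_m=2.0, min_area_m2=25.0):
    """Group weed clusters whose geofence circles overlap, drop amalgamated
    shapes below the sprayer's minimum footprint, and add a buffer area."""
    n = len(cluster_xy)
    parent = list(range(n))  # union-find over overlapping geofences

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    # Merge geofence circles that overlap into amalgamated shapes.
    for i in range(n):
        for j in range(i + 1, n):
            (x1, y1), (x2, y2) = cluster_xy[i], cluster_xy[j]
            if math.hypot(x1 - x2, y1 - y2) <= 2 * geofence_m:
                parent[find(i)] = find(j)

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(cluster_xy[i])

    spray_zones = []
    for pts in groups.values():
        # Crude footprint estimate: one geofence circle per cluster.
        area = len(pts) * math.pi * geofence_m ** 2
        if area >= min_area_m2:  # sprayer minimum requirement
            spray_zones.append({"clusters": pts,
                                "radius": geofence_m + buffer_m})  # buffered
    return spray_zones
```

A production version would operate on real shapefile polygons rather than circles; the structure of the steps, however, mirrors the claim.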
  • the adding a buffer area to the amalgamated shapes comprises performing statistical analysis on said images or previously stored data.
  • creating the weedmap using location of the weed clusters in the collected images of the field comprises using an artificial intelligence framework to identify weeds in said field images and clustering said weeds.
  • creating said weedmap using location of the weed clusters in the collected images of the field comprises using said artificial intelligence framework to identify a shape of the weed and clustering said weeds based on said shape of the weed.
  • the creating said weedmap using location of the weed clusters in the collected images of the field may comprise using said artificial intelligence framework to identify species of the weeds and clustering them based on said species of the weeds.
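The clustering variants above (by location, by shape, by species) can be illustrated with a simple distance-linkage grouping. The function below is a hedged sketch, not the disclosed artificial intelligence framework: species labels are assumed to come from an upstream classifier, and the linkage distance is an arbitrary example value.

```python
import math

def cluster_by_species(detections, link_dist=5.0):
    """detections: list of (x, y, species) tuples from a weed classifier.
    Returns {species: [cluster, ...]} where each cluster is a list of
    points chained together within link_dist of one another."""
    by_species = {}
    for x, y, sp in detections:
        by_species.setdefault(sp, []).append((x, y))

    clusters = {}
    for sp, pts in by_species.items():
        remaining, groups = list(pts), []
        while remaining:
            seed = [remaining.pop()]
            grew = True
            while grew:  # grow the cluster until no nearby point remains
                grew = False
                for p in remaining[:]:
                    if any(math.hypot(p[0] - q[0], p[1] - q[1]) <= link_dist
                           for q in seed):
                        seed.append(p)
                        remaining.remove(p)
                        grew = True
            groups.append(seed)
        clusters[sp] = groups
    return clusters
```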
  • the method may further comprise removing at least one low probability cluster from said weedmap based on a condition of the field at said at least one low probability cluster.
  • the condition of the field at said at least one low probability cluster comprises a negative condition reducing probability of weeds growing at a location of said at least one low probability cluster.
  • the condition of the field at said at least one low probability cluster may comprise a level of salinity of soil at said at least one low probability cluster.
  • the condition of the field at the at least one low probability cluster may comprise a terrain property reducing efficiency of said residual herbicide.
  • condition of the field at said at least one low probability cluster may comprise a topography of said field preventing proper spraying of a location of said at least one low probability cluster.
  • condition of the field at the at least one low probability cluster comprises a topography of said field reducing possibility of weed growing.
  • the method may further comprise collecting the images and said location data with a data collection system.
  • the collecting said images and said location data with said data collection system comprises collecting said images and said location data using an aerial vehicle.
  • the spraying said field with said residual herbicide based on said map using said sprayer comprises spraying said field with said residual herbicide based on said map using said aerial vehicle.
  • the aerial vehicle may be an autonomous aerial vehicle.
  • the present disclosure provides a field treatment system for precision application of residual herbicide.
  • the system comprises a sprayer unit receiving at least one residual herbicide; a control unit comprising a processor; and a non-transitory computer-readable medium containing instructions that, when executed by the processor, cause the processor to perform: processing images of a field to create a weedmap having locations of weed clusters in the field; applying geofence regions encompassing said weed clusters; merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of said sprayer to perform spraying; adding a buffer area to the amalgamated shapes; creating a map for residual herbicide application including said amalgamated shapes and said buffer areas; and spraying said field with residual herbicide based on said map using said sprayer.
  • the one or more sprayer units may include a data collection system, a navigation system, a propulsion system, a targeting system, a treatment system, and a power source.
  • the data collection system may provide data and may also include one or more of a positioning sensor, and a camera.
  • the positioning sensor may be selected from one or more of an altimeter, an ultrasonic sensor, a radar, a lidar, an accelerometer, a global positioning sensor, and the at least one camera.
  • the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use an artificial intelligence framework to identify weeds in said field images and cluster said weeds.
  • the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use said artificial intelligence framework to identify a type of the weed and cluster said weeds based on their type.
  • the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use said artificial intelligence framework to identify species of the weeds and cluster them based on said species of the weeds.
  • one or more of the data collection system, the navigation system, and the targeting system are stored within a tangible computer- readable medium and are executed by a processor within the at least one autonomous drone.
  • the autonomous drone may be an aerial drone, a rolling drone, or a combination of the aerial drone and the rolling drone.
  • the field treatment system may also include an agricultural sensor to measure soil acidity, a soil moisture, a soil temperature, a conductivity, a wind direction, a wind speed, and/or radiation.
  • the system may use, in addition to images collected, any element present including the agricultural sensors or other sensors to evaluate the possibility of weeds growing to provide more efficient spraying. For example, if after measuring the salinity, acidity of the soil the system concludes that the probability of weeds growing in certain areas of field is lower than certain threshold, it would remove that area from a spraying zone by modifying the weedmap or by removing that area from a buffer zone.
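The pruning step described above, removing areas where sensor readings make weed growth unlikely, might look like the following sketch. The linear salinity model, the 8 dS/m limit, and the 0.2 probability threshold are illustrative assumptions, not values from the disclosure.

```python
def weed_growth_probability(salinity_ds_m, base=0.9, salinity_limit=8.0):
    # Illustrative model: probability falls linearly to zero as soil
    # salinity (dS/m) approaches an assumed tolerance limit for weeds.
    return max(0.0, base * (1.0 - salinity_ds_m / salinity_limit))

def prune_spray_zones(zones, min_probability=0.2):
    """zones: list of (zone_id, salinity_ds_m). Zones whose estimated
    weed-growth probability falls below the threshold are removed from
    the spraying map, as described above."""
    return [zone_id for zone_id, salinity in zones
            if weed_growth_probability(salinity) >= min_probability]
```

In practice the probability estimate would combine several agricultural sensor inputs (acidity, moisture, topography), not salinity alone.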
  • Figure 1 is a block diagram of the current farm management process;
  • Figure 2 is a physical component architecture diagram of a treatment system having a drone, a base station, and a rolling drone;
  • Figure 3 is a block diagram of various electronic components of the drone;
  • Figures 4 and 5 are images of the field of view of the drone demonstrating target detection;
  • Figure 6 is an image of the field of view of the drone demonstrating target identification with confidence intervals;
  • Figures 7 to 9 are example images demonstrating a combination of a plant type identification map and a vegetation map;
  • Figures 10 to 14 are example images demonstrating steps of a herbicide treatment application map;
  • Figure 15 illustrates a flowchart of a method for precision application of residual herbicide in accordance with one embodiment of the present disclosure;
  • Figure 16 shows a schematic view of where the weed clusters occur in a field, from which the probability of weed pressure existing there in the following season can be inferred;
  • Figure 17 illustrates the field in Figure 16 wherein residual herbicide prescriptions can be applied only to the areas that require it; and
  • Figure 18 shows concentric circles C, B and A wherein combine harvesting may result in spread of weed seeds.
  • An example treatment system 250 disclosed herein may comprise any number and combination of the technologies, systems, subsystems, components, processes, computations, and other items discussed or referred to herein and may also be modified or augmented with existing technologies known in the art upon review of the content herein and still be within the scope and intent of the content disclosed herein.
  • the description herein may be specific to a treatment system 250 comprising one or more aerial drones and rolling drones 600 merely for convenience.
  • the field treatment identification and prioritization techniques described herein may equally apply to a conventional field treatment system such as a conventional sprayer and the like.
  • the treatment system 250 may comprise one or more aerial drones 202, one or more base stations 300, and/or one or more rolling drones 600.
  • the drone 202 may be an aerial drone 202 capable of autonomous flying over a field.
  • the aerial drone 202 may land on or near the base station 300 in order to receive electrical power and/or herbicide from the base station 300.
  • the rolling drone 600 may likewise be capable of autonomous movement around the field and may dock with the base station 300 in order to receive electrical power and/or pesticide from the base station 300.
  • the base station 300 may retrieve data from the aerial drone 202 and/or the rolling drone 600.
  • the rolling drone 600 may act as a mobile base station 300 for the one or more aerial drones 202. It will be appreciated by those skilled in the art that in some embodiments one or more of the drones 202 and 600 may work independently to recognize and deliver any type of herbicide including residual herbicide accordingly.
  • the treatment system 250 may have the base station 300 separated into one or more discrete stations 270, 280, 290.
  • the base station 300 may be separated into a battery/fuel management base station 270, a drone pesticide management system base station 280, and an on-site ground station management processing computer 290. It may be appreciated that these three base stations 270, 280, 290 may be combined into a single base station 300.
  • the field scanning drones 202 may be aerial drones, as illustrated with reference to Figures 2 and 3, instrumented with one or more flight cameras 256, a compass 258, and a GPS 260.
  • the field scanning drone 202 may comprise one or more plant scanning cameras 830 separate from the flight cameras 256. In other aspects, the plant scanning cameras 830 and the flight cameras 256 may be the same camera.
  • the field scanning drone 202 may traverse the field gathering field data in order to wirelessly relay the data to an on-site ground station management processing computer 290.
  • the field scanning drone 202 may dock with a battery/fuel management base station 270 in order to receive one or more new batteries and/or fuel.
  • the data collection system may comprise any one of or any combination of one or more cameras 254, 256, 830, one or more sensors 806, 812, and/or other data gathering devices. It is to be understood that the data collection system may include an array of various different sensors 806, 812 configured to collect data within a predefined proximal distance from the drone 202, 600, and transmit the sensor/image data back to the internal software systems of the drone 202, 600 (e.g., the targeting system 292, the spraying control, the spray vectors engine) and/or to the base station 300 and/or a display device of mission command center for outputting to an operator.
  • the camera(s) 830 may be affixed or integrally formed with a body of the drone 202, 600. In another aspect, the camera(s) 830 may be located on a gyroscope or other stabilizing apparatus to minimize jitter and/or shaking of the camera(s) 830.
  • the camera(s) 830 may comprise a lens, a filter, and an imaging device, such as a CCD or CMOS imager.
  • the filter may only permit certain wavelengths of light to pass through and be captured by the imaging device.
  • the filter may only permit infrared light to pass through.
  • the filter may only permit ultraviolet light to pass through.
  • the filter may only permit visible light to pass through.
  • the visible light filter may be a filter mosaic in order to permit the image sensor to capture red-green-blue (RGB) colored light.
  • the filter mosaic may also include infrared and/or ultraviolet light filters, and/or any number of filters (such as 10 bands) that divide light into specific frequency bands.
  • the frame rate of the imaging device may be selected based on the number of filters, such as 30 frames- per-second (fps) per filter.
  • the imaging device may have five filters and therefore the imaging device may have a frame rate of at least 150-fps.
  • the frame rate may be higher or lower for a particular filter.
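The frame-rate relation above is simple multiplication: for five filters at 30 fps per filter, the imaging device needs at least 150 fps overall. The function name below is illustrative only:

```python
def required_frame_rate(num_filters, fps_per_filter=30):
    # A sensor cycling through N filters needs N x the per-filter rate.
    return num_filters * fps_per_filter
```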
  • the camera(s) 830 may capture image data at 30 frames-per-second at a 4k resolution or greater.
  • the processor 802 may be configured to perform image processing on the captured image data as described in further detail below.
  • drone 202, 600 may comprise one or more light-emitting diodes (LEDs) for projecting light from the drone 202, 600 into the field of view of at least one of the cameras 830.
  • the LEDs may project infrared light, ultraviolet light, red light, blue light, green light, white light, and/or any combination thereof.
  • the processor 802 may modulate the LEDs and/or control an on/off state.
  • the LEDs may start with wavelengths not visible to most pests, such as insects, in order to more accurately determine their position without disturbing the pests.
  • the processor 802 may read position data from one or more positioning sensor(s) 806, such as an altimeter, ultrasonic sensors, radar, lidar, accelerometers, etc.
  • the positioning sensor(s) 806 may be a pair of cameras 830 capturing binocular vision from the drone 202, 600.
  • the processor 802 may triangulate a position of one or more features external to the aerial drone 202 in order to assist with navigation by a navigation system 808.
  • the navigation system 808 may provide instructions to the one or more motors 810. In this aspect, the navigation system 808 may be performed using the processor 802. In another aspect, the navigation system 808 may be independent of the processor 802.
  • the navigation system 808 may comprise one or more navigation and/or positioning sensors 806, such as a GPS system, an altimeter, ultrasonic sensors, radar, lidar, etc.
  • the positioning sensor 806 may be a pair of cameras 830 capturing binocular vision from a separate drone 202, 600 or a remotely located and fixed- position binocular camera system 830, such as a pole-mounted camera system.
  • the processor 802 may triangulate one or more locations of one or more features external to the drone 202, 600 and triangulate a drone position using the one or more features external to the drone 202, 600 in order to assist with navigation by the navigation system 808.
  • the navigation system 808 may receive input from the data collection system to assist with navigation.
  • the navigation system 808 may track a specific location of the drone 202, 600 relative to a previous location and may do so continuously in order to command the drone motors 810 to propel the drone 202, 600 to follow a desired path from the base station 300 to a treatment area and then within the treatment area.
  • the navigation system 808 may provide instructions to control the movement of the drone 202, 600.
  • the navigation system 808 may determine a first drone location and/or orientation, then be provided a desired second drone location and/or orientation, calculate a propulsion to move the drone from the first location to the second location and issue commands to move the drone 202, 600 in any number of desired directions, orientations, velocities and/or accelerations.
  • the navigation system 808 may comprise internal processors (not shown) to calculate the propulsion and/or may rely on processing resources 802 external to the navigation system 808 to calculate the propulsion with the navigation system 808.
  • the navigation system 808 may issue commands to the drone mechanical system 850, such as motors 810 and gears 822, to control the propulsion system 850.
  • the control and movement may include commands directed to pitch, elevation, yaw, azimuth, forward, backward, left, right, etc.
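A minimal, assumed sketch of the pose-to-propulsion computation described above: a proportional controller turns the difference between the first and second drone location/orientation into clamped velocity and yaw-rate commands. The gains and velocity limit are placeholders, not disclosed values.

```python
def propulsion_command(current_xy, target_xy, current_yaw, target_yaw,
                       gain_pos=0.5, gain_yaw=0.8, v_max=3.0):
    # Proportional position error -> velocity command.
    vx = gain_pos * (target_xy[0] - current_xy[0])
    vy = gain_pos * (target_xy[1] - current_xy[1])
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > v_max:  # clamp to a safe maximum velocity
        vx, vy = vx * v_max / speed, vy * v_max / speed
    # Proportional heading error -> yaw-rate command.
    yaw_rate = gain_yaw * (target_yaw - current_yaw)
    return vx, vy, yaw_rate
```

A real flight controller would add integral/derivative terms and vibration filtering from the accelerometers and gyros mentioned above.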
  • the accelerometers may be used to detect and respond to drone 202, 600 accelerations and vibrations. Such accelerations and vibrations may be caused by weather, terrain, other external influences, and/or mechanical vibration and movement of the drone 202, 600.
  • the drone 202, 600 may include rate gyros to stabilize the drone 202, 600 and magnetometers and accelerometers used for canceling gyro drift.
  • the global positioning system components or other positioning devices 806 may be included to determine the drone location, heading, and velocity to compute spraying solutions, and to target known treatment target coordinates.
  • the drone 202, 600 may comprise the drone mechanical system 850 and the drone mechanical system 850 may comprise a propulsion system 850.
  • the mechanical system 850 may comprise motors 810 driving a transmission system 822, including gears 822.
  • the drone 202, 600 may have one or more agricultural sensors 812 located on a sensor probe (not shown).
  • the processor 802 may periodically instruct the navigation system 808 to land the drone 202 or instruct the probe to move into the soil for the rolling drone 600 at positions in a field.
  • the processor 802 may read agricultural data from one or more agricultural sensors 812, such as soil acidity, soil moisture, temperature, conductivity, wind, gamma radiation sensor, and/or other radiation sensors, etc. used to construct a soil profile and/or a plant profile.
  • the sensors 812 may be able to remotely sense without requiring physical contact with the soil. For example, one or more sensor readings may be performed by measuring radiation, magnetic fields, and/or spectral analysis.
  • a liquid application system (not shown) may apply a liquid, such as water, to the soil to facilitate softening the soil for collection.
  • the processor 802 may perform image processing on the captured image data at a location in order to determine one or more of these characteristics as described in further detail herein.
  • the processor 802 may communicate via a wireless transceiver 814.
  • the wireless transceiver 814 may communicate using Wi-Fi, Bluetooth, 3G, LTE, 5G and/or a proprietary radio protocol and system, etc.
  • the processor 802 may communicate with the base station 300 in order to relay status data, such as fuel, battery life, herbicide amount, position, etc. and/or agricultural data.
  • the status data and/or agricultural data may be stored in internal memory (e.g. an SD card and/or a hard drive) until the processor 802 is within communication range (e.g. the wireless transceiver 814 has a stable connection with the base station 300 or the drone 202, 600 docks with the base station 300).
  • the processor 802 may record the GPS/RTK coordinate data and/or other spatial sensing data (e.g. accelerometers, etc.) to determine the spray location without the use of cameras.
  • the GPS/RTK coordinate data may then subsequently be used by a spray drone 202, 600 that performs treatment of the one or more identified weeds.
  • An AI framework 292 may modify the priorities within one or more mission rules. For example, targets may have different characteristics such as type or size or proximity to the drone or proximity to a non-targeted plant or object. Any one or all of these may generate different spraying priorities. Thus, the AI framework 292 may be required to prioritize the targets as the targets are identified.
  • the prioritization process may be included in the identification or verification steps or may be a separate step. The prioritization may result in targets being tagged for later treatment or ignored. The prioritization may affect the order in which the targets are sprayed, or which spray nozzle is used. In some aspects, the prioritization may determine a type of treatment.
  • the drone 202, 600 may detect objects and identify and verify one or more targets, using the camera 830 and/or the sensors 806, 812 and may use additional data sources.
  • the image data from cameras 830 and the sensor data from the sensors 806, 812 may be used to detect one or more objects.
  • the same data or additional data may be used to identify the object as a target or potential target.
  • the object may be tagged for further analysis prior to being added to the target list, being tagged or being ignored.
  • the further analysis may be performed using the same or additional data such that the drone is made to collect additional data for analysis.
  • a predictive first analysis may be performed that requires fewer analysis resources and reduced analysis time.
  • the predictive first analysis can be used to optimize the drone resources 800 and only commit drone system resources 800 to objects that are predicted to be targets.
  • the predictive first analysis may be followed by a second analysis or a series of analyses prior to being added, or not, to the target list.
  • An object may be added to the target list based on one, two, or any number of analysis cycles consistent with mission rules.
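The two-stage flow described above can be sketched as a cheap predictive pass followed by a costlier confirming analysis. Both scoring functions and both thresholds are assumptions for illustration, not disclosed values.

```python
def build_target_list(objects, cheap_score, full_score,
                      prefilter=0.3, confirm=0.8):
    """Commit expensive analysis resources only to objects the cheap
    predictive first analysis flags as likely targets."""
    targets = []
    for obj in objects:
        if cheap_score(obj) < prefilter:
            continue  # predicted non-target: skip the costly analysis
        if full_score(obj) >= confirm:  # second, more expensive analysis
            targets.append(obj)
    return targets
```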
  • the target list may be verified prior to or after a spray vector has been calculated.
  • the base station 300 may detect objects and identify and verify one or more targets, receiving data from the cameras 830 and/or the sensor units 806 of the drones 202, 600 and may use additional data sources.
  • the image data and the sensor data may be used to detect one or more objects.
  • the same data or additional data may be used to identify the object as the target or potential target.
  • the object may be tagged for further analysis prior to being added to the target list, be tagged as a non-target, or be tagged to be ignored. Further analysis may be performed using the same or additional data such that the drone 202, 600 is made to collect additional data for analysis. In this way a predictive first analysis may be performed that requires fewer analysis resources and reduced analysis time.
  • the predictive first analysis can be used to optimize one or more resources of the drone 202, 600 and only commit the resources to objects that are predicted to be targets.
  • the predictive first analysis may be followed by a second analysis or a series of analyses prior to being added, or not, to the target list.
  • An object may be added to the target list based on one, two, or any number of analysis cycles consistent with mission rules.
  • the target list may be verified prior to or after a spray vector has been calculated.
  • the targeting system 292 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, track real time or near-real-time relative target location, calculate and converge on spraying solutions, and control drone spraying.
  • the targeting system 292 may receive data from the cameras 830 and/or the sensor units 806, 816.
  • the data may include drone location data, drone movement vectors, drone vibration data, weather data, target images, distance/range data, infrared data, and any other sensor data described herein.
  • the drone 202, 600 may include a rules data store which may include identification rules for plants, pests or other target types.
  • the rules data store may include target selection and target priority rules.
  • the rules data store may include spraying rules, and other chemical application rules specific to the mission, the chemical(s) being applied, the target, and any other camera/sensor data input.
  • drone 202, 600 may identify a desired contact area for the treatment to be applied to the target.
  • the desired contact area may be a portion of the target based on target-specific characteristics such as those used for verification or may be a result of the verification step.
  • the desired contact area may be determined at any point in the process.
  • the contact area may be any particular shape or size relative to the target.
  • the target area may be determined based on the mission objectives and parameters. For example, if the mission is to spray weeds with a herbicide, a contact area for a targeted weed may include a portion of a leaf, an entire leaf, a group of leaves, stem, root(s), or the entire plant.
  • the base station 300 may identify the desired contact area for the drone 202, 600 to treat the target.
  • An object detection may involve an analysis of the image data, sensor data, etc., to detect one or more objects that may be targets within a proximity of the drone 202, 600 based on the mission rules.
  • the target identification may involve comparing object data and characteristics to a target database or target identification rules to recognize desired targets and distinguish targets from non-targets.
  • the target identification rules may be based on one or more GPS/RTK coordinates, relative locations to other objects, and/or visual characteristics.
  • the object may be detected and compared to the onboard plant database to identify the object as a weed or pest and distinguish the object from a non-target desirable plant and/or a weed or pest that has already been treated.
  • the identified weed may be added to the target list for verification or tagged for future treatment depending on the mission rules. If the object detected is not matched to the onboard plant database, the data may be relayed to the base station 300 or the mission command center 292 for further analysis with a more extensive plant database. The onboard plant database of each drone 202, 600 may be subsequently updated with the newly identified plant in order to facilitate more efficient determination of the plant by other drones 202, 600.
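The onboard-database lookup with base-station fallback described above can be sketched as follows: only unmatched detections are relayed, and the answer is cached so later lookups resolve locally. The names are illustrative assumptions, not the disclosed system.

```python
def identify_plant(features, onboard_db, query_base_station):
    """Match a detection against the onboard plant database; fall back
    to the base station's more extensive database and cache the result."""
    if features in onboard_db:
        return onboard_db[features]
    label = query_base_station(features)  # relay for further analysis
    onboard_db[features] = label          # update the onboard database
    return label
```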
  • the processor 802 may process image data from the cameras 830 using an artificial intelligence (AI) framework 292 such as described herein in order to detect pests and/or areas of undesirable growth and flag a pest area as a treatment area.
  • the navigation system 808 may be instructed to land or lower or hover the aerial drone 202 within spraying (or treatment distance) once the aerial drone 202 reaches that point on the planned path.
  • the navigation system 808 may be instructed to deviate from the planned path by a certain threshold, which may be based on a proportion to row spacing and/or crop canopy size. In another aspect, the navigation system 808 may plan to land the aerial drone 202 at pests not on the planned path during a return path to the base station 300. If most of the field is in a specific color space (e.g. "green" for plants and "black" for dirt), the AI framework 292 may determine a geometrically significant feature in another color space (e.g. "gray" for gravel road, or "blue" for pond, or "red" for tractor).
  • an initial image 1700 may be captured by the data collection system using one or more of the cameras 256, 830 and processed by the object detection of the AI framework 292.
  • Figure 5 shows the image 1702 following the object detection processing.
  • the object detection has identified crop plants 1704 (e.g. surrounded by white boxes) and weeds 1706 (e.g. surrounded by black boxes) and surrounded each identified plant 1704 and weed 1706 with one or more calculated bounding boxes. A center point may be determined from each bounding box and may correspond to the spray target area.
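As an illustrative sketch only (the function name and pixel coordinates are hypothetical, not part of the disclosure), the center-point determination from a bounding box can be expressed as:

```python
def bbox_center(bbox):
    """Return the (x, y) center of a bounding box given as (x_min, y_min, x_max, y_max)."""
    x_min, y_min, x_max, y_max = bbox
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

# A weed bounded at pixels (40, 60) to (80, 100) would be targeted at its center.
print(bbox_center((40, 60, 80, 100)))  # -> (60.0, 80.0)
```

The center of the box is a reasonable default spray target when the bounding box tightly encloses the plant.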
  • a probability score may be calculated or determined in a primary and/or a secondary processing engine of the AI framework 292.
  • the algorithms may involve semantic segmentation, instance segmentation, and/or object detection as previously described.
  • the output of the secondary processing engine may comprise a confidence interval, and/or pixel mask, and/or bounding box for each of the identified crop plants 1704 and/or each of the identified weeds 1706. GPS or other geolocation coordinates may also be appended to each crop plant 1704 and/or each identified weed 1706 so that each can be located in the future.
  • an image containing one or more plants 1704 may be passed through the AI framework 292 that has been previously trained to identify canola plants.
  • the output from the AI framework 292 may correspond to a probability or confidence that each respective crop plant 1704 is a canola plant.
  • the confidence may range from 0 to 100%.
  • the identified weed 1706 may be passed through the AI framework 292 in order to determine a probability or confidence that the identified weed 1706 is indeed a weed. Therefore, each of the identified crop plants 1704 and/or each of the identified weeds 1706 may have an associated confidence.
  • while in some aspects the targets have been identified prior to determining a confidence, other aspects may identify the targets and the associated confidence simultaneously using the AI framework 292.
  • although each of the identified crop plants 1704 and/or each of the identified weeds 1706 may have an associated confidence, problems may still occur during spraying. Even when the AI framework 292 may be trained to provide very high accuracies, such as 99%, this accuracy results in roughly one out of every 100 weeds not being treated in the field. Over a large field, this accuracy may leave hundreds or thousands of untreated weeds. A risk exists that these untreated weeds may proliferate and/or damage the crop. Moreover, overtraining the AI framework 292 may result in an inflexible framework that is unable to adapt to different plants and/or weeds at different stages.
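The scale of the missed-weed problem described above follows from simple arithmetic; the sketch below is illustrative only, and the weed count is a hypothetical figure:

```python
def expected_missed_weeds(weed_count, detection_accuracy):
    """Expected number of untreated weeds given a per-weed detection accuracy."""
    return weed_count * (1.0 - detection_accuracy)

# Even at 99% accuracy, a field with 200,000 weeds leaves about 2,000 untreated.
print(round(expected_missed_weeds(200_000, 0.99)))  # -> 2000
```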
  • the artificial intelligence system 292 may detect all or substantially all of the plants in the field.
  • if the AI framework 292 is unable to reliably detect plants correctly as plants, an inefficient or undesirable outcome may occur contrary to the priorities specified by the farmer using the graphical user interface.
  • one optimization to the accuracy of the AI framework 292 may be to combine the AI framework 292 with secondary vegetation detection methods and/or image processing techniques to ensure all plants are adequately detected in the field prior to application of the AI framework 292.
  • an example image of a section of field is presented showing weeds 1706 and crop plants 1704.
  • the AI framework 292 may perform a plant detection on the image as shown in Figure 8; however, the AI framework 292 has failed to identify two weeds 1710 that are below a threshold size. When the secondary vegetation detection method is applied, as shown in Figure 9, the missing weeds 1710 may be more clearly visible. By combining the map in Figure 8 with the vegetation map shown in Figure 9, refined accuracy of the treatment system may be achieved.
  • a green detection process, chlorophyll/vegetation detection process, or other heuristic may locate the positions of all pixels in the image data representing possible vegetation/plant life. The green detection process may create a vegetation map (AA) shown in Figure 10.
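One common heuristic for such a green detection process is an excess-green index (ExG = 2G − R − B) threshold; the sketch below assumes this heuristic and a hypothetical threshold value, and is not necessarily the exact process used in the disclosure:

```python
def vegetation_mask(rgb_pixels, threshold=20):
    """Flag pixels as vegetation using the excess-green index ExG = 2G - R - B."""
    return [[(2 * g - r - b) > threshold for (r, g, b) in row] for row in rgb_pixels]

image = [[(30, 120, 40), (90, 85, 80)],   # vivid green leaf, brownish dirt
         [(25, 140, 30), (100, 95, 90)]]
print(vegetation_mask(image))  # -> [[True, False], [True, False]]
```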
  • classification system 2010 of the AI framework 292 may detect the plants in the image data and/or identify the plant type and stage in a plant type and stage map (BB) shown in Figure 11.
  • the vegetation map (AA) may then be combined with the plant type and stage map (BB) to produce a joint map (JM) shown in Figure 12.
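The combination of the vegetation map (AA) with the plant type and stage map (BB) might be sketched as follows; the per-pixel representation and the "unknown_plant" label are assumptions for illustration:

```python
def joint_map(vegetation_map, class_map):
    """Combine a binary vegetation map (AA) with a classification map (BB) into a joint map (JM).

    Labelled vegetation keeps its class; vegetation the classifier missed is
    tagged 'unknown_plant' so it is still reviewed/treated; bare soil is None.
    """
    jm = []
    for aa_row, bb_row in zip(vegetation_map, class_map):
        jm.append([(label or "unknown_plant") if is_veg else None
                   for is_veg, label in zip(aa_row, bb_row)])
    return jm

AA = [[True, True, False]]
BB = [["canola", None, None]]   # the classifier missed the second plant
print(joint_map(AA, BB))  # -> [['canola', 'unknown_plant', None]]
```

This mirrors how the vegetation map recovers small weeds that the classifier alone would miss.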
  • the target verification may comprise image registration and/or geocoordinate registration whereby the previously captured sensor data of the target and/or geocoordinates may be saved and compared to the newly captured sensor data. For example, matching two photos of the same target plant taken at different times, where the photos might differ slightly (e.g. different angle, different position in the photo, movement by wind, etc.).
  • the image registration and/or geocoordinate registration may ensure that multiple passes do not spray the same target plant more than once, or may be used to assess the health of the target plant in order to determine whether a more effective treatment is necessary.
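A minimal sketch of the geocoordinate-registration check (the local metre coordinates and the tolerance value are assumptions; a real system would first project GPS fixes into such a frame):

```python
import math

def already_treated(target_xy, sprayed_log, tolerance_m=0.5):
    """True if a newly detected target matches a previously sprayed position."""
    return any(math.dist(target_xy, s) <= tolerance_m for s in sprayed_log)

sprayed = [(12.0, 40.3), (87.5, 13.1)]         # hypothetical prior-pass spray log
print(already_treated((12.2, 40.1), sprayed))  # -> True  (same plant, slight GPS jitter)
print(already_treated((30.0, 30.0), sprayed))  # -> False (new target)
```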
  • the present disclosure provides a method to leverage plant distribution combined with plant species detection to optimize the multi-year application of residual herbicide based on historical data trends.
  • weed seeds can remain in the ground for up to thirty years, creating challenges for the application of annual herbicide, as herbicide is not completely effective at eradicating these seeds. The same regions of soil will need to be treated multiple times over multiple growing seasons, leading to chemical saturation in the soil, increased herbicide resistance, and risk of herbicide carryover. Also, different weeds (i.e. grassy weeds like Wild Oats and broadleaf weeds like Kochia) require different residual chemistries. Without knowing the species or type of plant, farmers are forced to apply expensive chemicals to areas that may not need them, creating waste.
  • Residual herbicides are typically some of the most expensive, and there is no known prior art that can determine, at the species level, how to optimally apply residual herbicide to the regions most likely to be at risk of emergence in the following year.
  • Prescriptions from satellite imagery can provide a broad-brush risk-based system, but they are not precise enough to differentiate between weeds and crop, or between different weed species, leading to incorrect decisions or wasted chemical by over-applying the wrong chemical to the wrong weeds.
  • Spraying systems such as Green-on-Green technology are good at efficiently eliminating weeds that grow during the current season, but they cannot see what is happening at the seed level or under the soil. This renders them ineffective at preventing breakouts in subsequent years and provides no data for subsequent-year residual herbicides or resistance, requiring multiple passes and wasted time.
  • the present method and system benefit from the fact that weeds typically grow in clusters, which provides clues to the likelihood of emergence in subsequent years.
  • the method tracks the species of the weeds inside a cluster and geofences them, so that a geometric region can be inferred where subsequent emergence in following years is probable. This can be used to limit the application of herbicide to the relevant regions only, saving a considerable amount of chemical. This can be further improved by applying only the herbicide required for the plant species in that area. The precision and efficiency can be further improved by measuring the salinity and topography of the soil as an input to the probability. For example, in high-salinity areas, treatment may not be important, as the weeds may not grow. Also, the species and the GPS location may be required to apply the appropriate type of herbicide. Different levels of precision may be used in the presented method, including but not limited to weed location, broadleaf vs. grass leaf, and species level.
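Clustering weed detections and geofencing each cluster could be sketched as below; the greedy single-linkage clustering, the 5 m radius, and the rectangular geofence are all illustrative assumptions:

```python
import math

def cluster_weeds(points, radius=5.0):
    """Greedy single-linkage clustering: weeds within `radius` metres join a cluster."""
    clusters = []
    for p in points:
        merged = None
        for c in clusters:
            if c and any(math.dist(p, q) <= radius for q in c):
                if merged is None:
                    c.append(p)
                    merged = c
                else:           # p links two clusters: fold the second into the first
                    merged.extend(c)
                    c.clear()
        if merged is None:
            clusters.append([p])
    return [c for c in clusters if c]

def geofence(cluster):
    """Axis-aligned bounding box (x_min, y_min, x_max, y_max) around a weed cluster."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    return (min(xs), min(ys), max(xs), max(ys))

weeds = [(0, 0), (2, 1), (3, 2), (50, 50)]   # field coordinates in metres
print([geofence(c) for c in cluster_weeds(weeds)])  # -> [(0, 0, 3, 2), (50, 50, 50, 50)]
```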
  • the method includes using a drone, plane, or satellite survey to capture images of a field (prescription maps - variable rate); identifying the weeds using methods such as green sensors or an AI framework; and determining the species of plants in a field or cluster (e.g.
  • the residual weed treatment method may benefit from the prioritized treatment method disclosed herein.
  • the user may provide thresholds or priorities for use of residual herbicide based on the probability of weed presence as disclosed herein.
  • a method for precision application of residual herbicide includes processing images of a field to create a weedmap having the locations of weed clusters in the field, followed by applying geofence regions encompassing said weed clusters at S314. Then, at S316, the shapefiles of the weed clusters are merged to create amalgamated shapes meeting the minimum requirements of a sprayer to perform spraying, and a buffer area is added to the amalgamated shapes at S318. In the next step, S320, a map is created for residual herbicide application including said amalgamated shapes and the buffer areas. Lastly, the map may be used to spray the field with residual herbicide using said sprayer at step S322.
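Steps S316 and S318 (merging cluster shapes and buffering them to the sprayer's minimum pass width) might look like the following sketch, using axis-aligned boxes in place of real shapefile polygons; all numeric parameters are hypothetical:

```python
def overlaps(a, b):
    """True when two axis-aligned boxes (x1, y1, x2, y2) intersect or touch."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def merge_boxes(boxes):
    """Repeatedly merge intersecting boxes into amalgamated shapes (S316)."""
    boxes = list(boxes)
    changed = True
    while changed:
        changed = False
        out = []
        while boxes:
            a = boxes.pop()
            for i, b in enumerate(boxes):
                if overlaps(a, b):   # absorb a into b's slot and re-scan
                    boxes[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    changed = True
                    break
            else:
                out.append(a)
        boxes = out
    return boxes

def add_buffer(box, buffer_m, min_width):
    """Grow a shape by a buffer (S318) and enforce the sprayer's minimum pass width."""
    x1, y1, x2, y2 = (box[0] - buffer_m, box[1] - buffer_m,
                      box[2] + buffer_m, box[3] + buffer_m)
    pad = max(0.0, (min_width - (x2 - x1)) / 2.0)
    return (x1 - pad, y1, x2 + pad, y2)

fences = [(0, 0, 3, 2), (2, 1, 6, 4), (50, 50, 51, 51)]
plan = [add_buffer(b, buffer_m=1.0, min_width=12.0) for b in merge_boxes(fences)]
```

A production system would more likely perform these operations on shapefile polygons with a GIS library, but the merge-then-buffer logic is the same.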
  • the concentric circles C, B and A represent a progressively decreasing probability of where the weed seeds may land after combine harvesting as distance increases from the GPS location where the weed was located.
  • This figure is for illustrative purposes only but represents how the probability is influenced by wind direction and the path the combine harvester is taking.
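One way to model the decreasing, wind-biased probability zones is a Gaussian footprint stretched along the wind/combine travel direction; the functional form and all parameters below are illustrative assumptions, not values from the disclosure:

```python
import math

def emergence_probability(dx, dy, wind_dir_deg=0.0, spread=5.0, wind_stretch=3.0):
    """Probability-like score that seeds land at offset (dx, dy) metres from the weed."""
    theta = math.radians(wind_dir_deg)
    # rotate the offset into a wind-aligned frame
    along = dx * math.cos(theta) + dy * math.sin(theta)
    across = -dx * math.sin(theta) + dy * math.cos(theta)
    return math.exp(-((along / (spread * wind_stretch)) ** 2 + (across / spread) ** 2))

print(emergence_probability(0, 0))  # -> 1.0 (highest at the recorded GPS location)
print(emergence_probability(10, 0) > emergence_probability(0, 10))  # -> True (slower decay downwind)
```

Thresholding such a score at two or three levels reproduces the concentric zones A, B and C.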
  • The processing described herein may be implemented within one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
  • One skilled in the art may choose implementations including hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof.
  • Software, firmware, middleware, scripting language, and/or microcode implementations may have the program code or code segments to perform the necessary tasks stored in a machine-readable medium such as a storage medium.
  • a code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements.
  • a code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents.
  • Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
  • modules (e.g., procedures, functions, algorithms, etc.) may perform the functions described herein.
  • Any machine-readable medium tangibly embodying instructions may be used in implementing the processes and methodologies and techniques described herein.
  • software codes may be stored in a memory.
  • Memory may be implemented within a processor or external to a processor.
  • “Memory” as used herein refers to any type of long-term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
  • the term “storage medium” may represent one or more memories for storing data, including read only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices and/or other machine readable mediums for storing information.
  • machine-readable medium includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of storing, containing, or carrying instruction(s) and/or data.
  • the computer systems described herein may use without limitation one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like).
  • the computer systems described herein may use and/or configure storage devices to implement any appropriate data stores, including without limitation various file systems, database structures, and database control, manipulation, or optimization methodologies.
  • the methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations.


Abstract

The present disclosure provides a system and method for precision application of residual herbicide which comprise processing images of a field to create a weedmap of the location of weed clusters in the field; applying geofence regions encompassing the weed clusters; merging shapefiles of the weed clusters to create amalgamated shapes meeting minimum requirements of a sprayer to perform spraying; adding a buffer area to the amalgamated shapes; creating a map for residual herbicide application including the amalgamated shapes and the buffer areas; and spraying the field with residual herbicide based on the map using the sprayer.

Description

SYSTEM AND METHOD FOR PRECISION APPLICATION OF RESIDUAL
HERBICIDE THROUGH INFERENCE
PRIORITY
[0001] The present application claims priority to US provisional patent application No. 63/348,660, filed on June 3, 2022, the contents of which are explicitly incorporated by reference in their entirety.
FIELD
[0002] This invention relates to field treatment methods and systems, and more specifically to systems and methods for applying residual herbicides.
BACKGROUND
[0003] The term 'residual' applies to a number of herbicides that have long-lasting activity in the soil. These herbicides are often applied directly to the soil prior to planting crops (pre-emergent). The residual (or pre-emergence) herbicides mitigate yield loss due to weed competition, provide a longer time for the crop to establish, and reduce the selection pressure for resistance to post-emergence herbicides. In addition, including pre-emergence herbicides can minimize post-emergence herbicide applications and protect against early-season weed competition when weather or busy schedules prohibit a timely post-emergence application.
[0004] Application of a pre-emergence herbicide is particularly important in fields where herbicide-resistant weeds are present or suspected. An effective pre-emergence application can go a long way to controlling a problem weed population, especially when postemergence herbicide options are limited or limited in effectiveness.
[0005] Research has shown that adding a residual herbicide helps keep fields free of yield-robbing weeds longer and improves yields and resistance management practices.
[0006] However, residual herbicides are generally more expensive than other herbicides. Furthermore, they must be applied based on the category of weeds present.
[0007] Also, spraying systems such as Green-on-Green technology are good at eliminating weeds that grow during the current season efficiently, but cannot see what is happening at the seed level or under the soil, rendering them ineffective at preventing breakouts in subsequent years.
[0008] Therefore, a need exists for a method and system to reduce the amount of residual herbicide used while increasing the efficiency of the residual herbicides through advanced methods of treatment.
[0009] Benefits of the aspects described herein may address disadvantages of the current farm management process. Other advantages may be apparent to a person of skill in the art upon understanding the aspects as described herein.
SUMMARY
[0010] The aspects as described herein, in any and/or all combinations consistent with the understanding of one skilled in the art on review of the present application, are disclosed.
[0011] The present disclosure provides a system, process and method to provide precision application of residual herbicides. This results in higher efficiency of such herbicides while reducing the amount of herbicide to be used. In some embodiments, the precision application of residual herbicides provides limited application of herbicide to the patches of weeds based on previous data collected from the field.
[0012] In some other embodiments of the present disclosure, the precision application of residual herbicides may include application of such herbicide based on the type of weed patches present in the field using the images previously taken from the field.
[0013] The precision treatment as disclosed herein is even more important because residual herbicides are generally more expensive than other herbicides, they must be applied based on the category of weeds present, and the patches of weed can work like a weed seed bomb and expand throughout the field.
[0014] In one broad aspect, the present disclosure provides a method for precision application of residual herbicide. The method comprises processing images of a field to create a weedmap having locations of weed clusters in the field; applying geofence regions encompassing said weed clusters; merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of a sprayer to perform spraying; adding a buffer area to said amalgamated shapes; creating a map for residual herbicide application including said amalgamated shapes and said buffer areas; and spraying said field with residual herbicide based on said map using said sprayer. In some examples of the present method, adding a buffer area to the amalgamated shapes comprises performing statistical analysis on said images or previously stored data.
[0015] In some examples, creating the weedmap using locations of the weed clusters in the collected images of the field comprises using an artificial intelligence framework to identify weeds in said field images and clustering said weeds.
[0016] In some examples, creating said weedmap using locations of the weed clusters in the collected images of the field comprises using said artificial intelligence framework to identify a shape of the weed and clustering said weeds based on said shape of the weed.
[0017] In some examples, the creating said weedmap using location of the weed clusters in the collected images of the field may comprise using said artificial intelligence framework to identify species of the weeds and clustering them based on said species of the weeds.
[0018] In some examples, the method may further comprise removing at least one low probability cluster from said weedmap based on a condition of the field at said at least one low probability cluster. In some examples, the condition of the field at said at least one low probability cluster comprises a negative condition reducing probability of weeds growing at a location of said at least one low probability cluster. In some examples, the condition of the field at said at least one low probability cluster may comprise a level of salinity of soil at said at least one low probability cluster. In one example, the condition of the field at the at least one low probability cluster may comprise a terrain property reducing efficiency of said residual herbicide.
[0019] In some embodiments, the condition of the field at said at least one low probability cluster may comprise a topography of said field preventing proper spraying of a location of said at least one low probability cluster.
[0020] In one example, the condition of the field at the at least one low probability cluster comprises a topography of said field reducing possibility of weed growing. In some examples, the method may further comprise collecting the images and said location data with a data collection system. In one example, the collecting said images and said location data with said data collection system comprises collecting said images and said location data using an aerial vehicle.
[0021] In one other example, spraying said field with said residual herbicide based on said map using said sprayer comprises spraying said field with said residual herbicide based on said map using said aerial vehicle. In some examples of the present method, the aerial vehicle may be an autonomous aerial vehicle.
[0022] In one other broad aspect, the present disclosure provides a field treatment system for precision application of residual herbicide. The system comprises a sprayer unit receiving at least one residual herbicide; and a control unit comprising a processor and a non-transitory computer-readable medium containing instructions that, when executed by the processor, cause the processor to perform: processing images of a field to create a weedmap having locations of weed clusters in the field; applying geofence regions encompassing said weed clusters; merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of said sprayer to perform spraying; adding a buffer area to the amalgamated shapes; creating a map for residual herbicide application including said amalgamated shapes and said buffer areas; and spraying said field with residual herbicide based on said map using said sprayer.
[0023] In some examples of the treatment system, the one or more sprayer unit may include a data collection system, a navigation system, a propulsion system, a targeting system, a treatment system, and a power source. In some embodiments of the field treatment system, the data collection system may provide data and may also include one or more of a positioning sensor, and a camera.
[0024] In some examples of the treatment system, the positioning sensor may be selected from one or more of an altimeter, an ultrasonic sensor, a radar, a lidar, an accelerometer, a global positioning sensor, and the at least one camera.
[0025] In some examples of the treatment system, the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use an artificial intelligence framework to identify weeds in said field images and cluster said weeds.
[0026] In some examples of the treatment system, the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use said artificial intelligence framework to identify the type of each weed and cluster said weeds based on their type.
[0027] In some examples of the treatment system, the non-transitory computer-readable medium may contain further instructions that, when executed by the processor, cause the processor to use said artificial intelligence framework to identify species of the weeds and cluster them based on said species of the weeds.
[0028] In some embodiments of the present disclosure, one or more of the data collection system, the navigation system, and the targeting system are stored within a tangible computer- readable medium and are executed by a processor within the at least one autonomous drone.
[0029] In some embodiments of the present disclosure, the autonomous drone may be an aerial drone, a rolling drone, or a combination of the aerial drone and the rolling drone.
[0030] In some examples of the present disclosure, the field treatment system may also include an agricultural sensor to measure soil acidity, soil moisture, soil temperature, conductivity, wind direction, wind speed, and/or radiation.
[0031] It would be appreciated by those skilled in the art that the system may use, in addition to the images collected, any element present, including the agricultural sensors or other sensors, to evaluate the possibility of weeds growing in order to provide more efficient spraying. For example, if, after measuring the salinity and acidity of the soil, the system concludes that the probability of weeds growing in certain areas of the field is lower than a certain threshold, it would remove that area from a spraying zone by modifying the weedmap or by removing that area from a buffer zone.
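The salinity-based pruning described above could be sketched as below; the cluster representation, the salinity lookup, and the threshold are hypothetical:

```python
def prune_weedmap(clusters, salinity_at, salinity_threshold=8.0):
    """Drop clusters where soil salinity makes weed emergence unlikely."""
    keep = []
    for c in clusters:
        n = len(c["points"])
        cx = sum(p[0] for p in c["points"]) / n
        cy = sum(p[1] for p in c["points"]) / n
        if salinity_at(cx, cy) < salinity_threshold:
            keep.append(c)   # plausible growth: keep this cluster in the spray map
    return keep

clusters = [{"id": 1, "points": [(0, 0), (2, 2)]},
            {"id": 2, "points": [(100, 100)]}]
salinity = lambda x, y: 12.0 if x > 50 else 3.0   # hypothetical saline corner of the field
print([c["id"] for c in prune_weedmap(clusters, salinity)])  # -> [1]
```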
DESCRIPTION OF THE DRAWINGS
[0032] While the invention is claimed in the concluding portions hereof, example embodiments are provided in the accompanying detailed description which may be best understood in conjunction with the accompanying diagrams where like parts in each of the several diagrams are labeled with like numbers, and where:
[0033] Figure 1 is a block diagram of the current farm management process;
[0034] Figure 2 is a physical component architecture diagram of a treatment system having a drone, a base station, and a rolling drone;
[0035] Figure 3 is a block diagram of various electronic components of the drone;
[0036] Figures 4 and 5 are images of the field of view of the drone demonstrating target detection;
[0037] Figure 6 is an image of the field of view of the drone demonstrating target identification with confidence intervals;
[0038] Figures 7 to 9 are example images demonstrating a combination of a plant type identification map and a vegetation map;
[0039] Figures 10 to 14 are example images demonstrating steps of a herbicide treatment application map;
[0040] Figure 15 illustrates a flowchart of a method for precision application of residual herbicide in accordance with one embodiment of the present disclosure;
[0041] Figure 16 shows a schematic view of where the weed clusters occur in a field, from which the probability that weed pressure will exist there in the following season can be inferred;
[0042] Figure 17 illustrates the field in Figure 16, wherein residual herbicide prescriptions can be applied only to the areas that require it;
[0043] Figure 18 shows concentric circles C, B and A wherein combine harvesting may result in spread of weed seeds.
DETAILED DESCRIPTION
[0044] An example treatment system 250 disclosed herein may comprise any number and combination of the technologies, systems, subsystems, components, processes, computations, and other items discussed or referred to herein and may also be modified or augmented with existing technologies known in the art upon review of the content herein and still be within the scope and intent of the content disclosed herein. The description herein may be specific to a treatment system 250 comprising one or more aerial drones and rolling drones 600 merely for convenience. In some aspects, the field treatment identification and prioritization techniques described herein may equally apply to a conventional field treatment system such as a conventional sprayer and the like.
[0045] With reference to Figure 2, the treatment system 250 may comprise one or more aerial drones 202, one or more base stations 300, and/or one or more rolling drones 600. In this aspect, the drone 202 may be an aerial drone 202 capable of autonomous flying over a field. The aerial drone 202 may land on or near the base station 300 in order to receive electrical power and/or herbicide from the base station 300. Similarly, the rolling drone 600 may likewise be capable of autonomous movement around the field and may dock with the base station 300 in order to receive electrical power and/or pesticide from the base station 300. In some aspects, the base station 300 may retrieve data from the aerial drone 202 and/or the rolling drone 600. In some aspects, the rolling drone 600 may act as a mobile base station 300 for the one or more aerial drones 202. It will be appreciated by those skilled in the art that in some embodiments one or more of the drones 202 and 600 may work independently to recognize and deliver any type of herbicide including residual herbicide accordingly.
[0046] The treatment system 250 may have the base station 300 separated into one or more discrete stations 270, 280, 290. The base station 300 may be separated into a battery/fuel management base station 270, a drone pesticide management system base station 280, and an on-site ground station management processing computer 290. It may be appreciated that these three base stations 270, 280, 290 may be combined into a single base station 300. In this aspect, there may be one or more field scanning drones 202 and one or more field treatment drones 600. The field scanning drones 202 may be aerial drones, as illustrated with reference to Figures 2 and 3, instrumented with one or more flight cameras 256, a compass 258, and a GPS 260. In some aspects, the field scanning drone 202 may comprise one or more plant scanning cameras 830 separate from the flight cameras 256. In other aspects, the plant scanning cameras 830 and the flight cameras 256 may be the same camera. The field scanning drone 202 may traverse the field gathering field data in order to wirelessly relay the data to an on-site ground station management processing computer 290. The field scanning drone 202 may dock with a battery/fuel management base station 270 in order to receive one or more new batteries and/or fuel.
[0047] The data collection system may comprise any one of or any combination of one or more cameras 254, 256, 830, one or more sensors 806, 812, and/or other data gathering devices. It is to be understood that the data collection system may include an array of various different sensors 806, 812 configured to collect data within a predefined proximal distance from the drone 202, 600, and transmit the sensor/image data back to the internal software systems of the drone 202, 600 (e.g., the targeting system 292, the spraying control, the spray vectors engine) and/or to the base station 300 and/or a display device of mission command center for outputting to an operator.
[0048] In some aspects, the camera(s) 830 may be affixed or integrally formed with a body of the drone 202, 600. In another aspect, the camera(s) 830 may be located on a gyroscope or other stabilizing apparatus to minimize jitter and/or shaking of the camera(s) 830.
[0049] The camera(s) 830 may comprise a lens, a filter, and an imaging device, such as a CCD or CMOS imager. In some aspects, the filter may only permit certain wavelengths of light to pass through and be captured by the imaging device. For example, the filter may only permit infrared light to pass through. In another example, the filter may only permit ultraviolet light to pass through. In yet another example, the filter may only permit visible light to pass through. The visible light filter may be a filter mosaic in order to permit the image sensor to capture red-green-blue (RGB) colored light. In another aspect, the filter mosaic may also include infrared and ultraviolet light filters, and/or any number of filters (such as 10 bands) that divide light into specific frequency bands. The frame rate of the imaging device may be selected based on the number of filters, such as 30 frames-per-second (fps) per filter. In this aspect, the imaging device may have five filters and therefore the imaging device may have a frame rate of at least 150 fps. In other aspects, the frame rate may be higher or lower for a particular filter. According to some aspects, the camera(s) 830 may capture image data at 30 frames-per-second at a 4K resolution or greater. The processor 802 may be configured to perform image processing on the captured image data as described in further detail below.
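The frame-rate sizing in this paragraph is simple multiplication; a one-line sketch (the figures match the five-filter example above):

```python
def required_frame_rate(fps_per_band, num_bands):
    """Minimum imager frame rate when filter bands are captured sequentially."""
    return fps_per_band * num_bands

print(required_frame_rate(30, 5))  # -> 150 (fps for a five-filter mosaic at 30 fps per band)
```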
[0050] In some aspects, drone 202, 600 may comprise one or more light-emitting diodes (LEDs) for projecting light from the drone 202, 600 into the field of view of at least one of the cameras 830. The LEDs may project infrared light, ultraviolet light, red light, blue light, green light, white light, and/or any combination thereof. In some aspects, the processor 802 may modulate the LEDs and/or control an on/off state. In some aspects, the LEDs may first project wavelengths not visible to most pests, such as insects, in order to more accurately determine their position without disturbing the pests.
[0051] The processor 802 may read position data from one or more positioning sensor(s) 806, such as an altimeter, ultrasonic sensors, radar, lidar, accelerometers, etc. In some aspects, the positioning sensor(s) 806 may be a pair of cameras 830 capturing binocular vision from the drone 202, 600. In some aspects, the processor 802 may triangulate a position of one or more features external to the aerial drone 202 in order to assist with navigation by a navigation system 808. The navigation system 808 may provide instructions to the one or more motors 810. In this aspect, the navigation system 808 may be performed using the processor 802. In another aspect, the navigation system 808 may be independent of the processor 802.
[0052] In another aspect, the navigation system 808 may comprise one or more navigation and/or positioning sensors 806, such as a GPS system, an altimeter, ultrasonic sensors, radar, lidar, etc. In some aspects, the positioning sensor 806 may be a pair of cameras 830 capturing binocular vision from a separate drone 202, 600 or a remotely located and fixed- position binocular camera system 830, such as a pole-mounted camera system.
[0053] In some aspects, the processor 802 may triangulate one or more locations of one or more features external to the drone 202, 600 and triangulate a drone position using the one or more features external to the drone 202, 600 in order to assist with navigation by the navigation system 808. The navigation system 808 may receive input from the data collection system to assist with navigation. The navigation system 808 may track a specific location of the drone 202, 600 relative to a previous location and may do so continuously in order to command the drone motors 810 to propel the drone 202, 600 to follow a desired path from the base station 300 to a treatment area and then within the treatment area.
[0054] The navigation system 808 may provide instructions to control the movement of the drone 202, 600. The navigation system 808 may determine a first drone location and/or orientation, then be provided a desired second drone location and/or orientation, calculate a propulsion to move the drone from the first location to the second location and issue commands to move the drone 202, 600 in any number of desired directions, orientations, velocities and/or accelerations. The navigation system 808 may comprise internal processors (not shown) to calculate the propulsion and/or may rely on processing resources 802 external to the navigation system 808 to calculate the propulsion with the navigation system 808. The navigation system 808 may issue commands to the drone mechanical system 850, such as motors 810 and gears 822, to control the propulsion system 850. The control and movement may include commands directed to pitch, elevation, yaw, azimuth, forward, backward, left, right, etc.
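The first-location-to-second-location propulsion calculation described in paragraph [0054] can be sketched as simple displacement math; this is a minimal illustration under assumed local (x, y, z) coordinates in metres, not the disclosed navigation system, and a real controller would also handle orientation, velocity, and acceleration limits:

```python
import math

def move_command(first, second):
    """Compute a displacement-based command to move a drone from its
    current position `first` to a desired position `second`, both
    (x, y, z) tuples in metres. Returns the straight-line distance,
    an azimuth heading command, and the required climb."""
    dx, dy, dz = (s - f for f, s in zip(first, second))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    heading = math.degrees(math.atan2(dy, dx)) % 360  # azimuth, degrees
    return {"distance_m": round(distance, 3),
            "heading_deg": round(heading, 1),
            "climb_m": dz}

# Level move of 3 m east and 4 m north: 5 m at roughly 53 degrees.
print(move_command((0.0, 0.0, 2.0), (3.0, 4.0, 2.0)))
```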
[0055] The accelerometers may be used to detect and respond to drone 202, 600 accelerations and vibrations. Such accelerations and vibrations may be caused by weather, terrain, other external influences, and/or mechanical vibration and movement of the drone 202, 600. The drone 202, 600 may include rate gyros to stabilize the drone 202, 600 and magnetometers and accelerometers used for canceling gyro drift. The global positioning system components or other positioning devices 806 may be included to determine the drone location, heading, and velocity to compute spraying solutions, and to target known treatment target coordinates.
[0056] The drone 202, 600 may comprise the drone mechanical system 850 and the drone mechanical system 850 may comprise a propulsion system 850. The mechanical system 850 may comprise motors 810 driving a transmission system 822, including gears 822.
[0057] The drone 202, 600 may have one or more agricultural sensors 812 located on a sensor probe (not shown). The processor 802 may periodically instruct the navigation system 808 to land the drone 202, or instruct the probe to move into the soil for the rolling drone 600, at positions in a field. When the drone 202, 600 has landed or come within a sufficient distance, depending on whether or not the sensor 812 requires contact with the field, the processor 802 may read agricultural data from one or more agricultural sensors 812, such as soil acidity, soil moisture, temperature, conductivity, wind, a gamma radiation sensor, and/or other radiation sensors, etc., used to construct a soil profile and/or a plant profile.
[0058] In other aspects, the sensors 812 may be able to remotely sense without requiring physical contact with the soil. For example, one or more sensor readings may be performed by measuring radiation, magnetic fields, and/or spectral analysis. In some aspects, a liquid application system (not shown) may apply a liquid, such as water, to the soil to facilitate softening the soil for collection.
[0059] According to some aspects, the processor 802 may perform image processing on the captured image data at a location in order to determine one or more of these characteristics as described in further detail herein.
[0060] The processor 802 may communicate via a wireless transceiver 814. The wireless transceiver 814 may communicate using Wi-Fi, Bluetooth, 3G, LTE, 5G, and/or a proprietary radio protocol and system, etc. The processor 802 may communicate with the base station 300 in order to relay status data, such as fuel, battery life, herbicide amount, position, etc., and/or agricultural data. In another aspect, the status data and/or agricultural data may be stored in internal memory (e.g., an SD card and/or a hard drive) until the processor 802 is within communication range (e.g., the wireless transceiver 814 has a stable connection with the base station 300) or until the drone 202, 600 docks with the base station 300.
[0061] In one aspect, on detection of the weed by the processor 802, the processor 802 may record the GPS/RTK coordinate data and/or other spatial sensing data (e.g. accelerometers, etc.) to determine the spray location without the use of cameras. The GPS/RTK coordinate data may then subsequently be used by a spray drone 202, 600 that performs treatment of the one or more identified weeds.
[0062] An AI framework 292 may modify the priorities within one or more mission rules. For example, targets may have different characteristics such as type or size or proximity to the drone or proximity to a non-targeted plant or object. Any one or all of these may generate different spraying priorities. Thus, the AI framework 292 may be required to prioritize the targets as the targets are identified. The prioritization process may be included in the identification or verification steps or may be a separate step. The prioritization may result in targets being tagged for later treatment or ignored. The prioritization may affect the order in which the targets are sprayed, or which spray nozzle is used. In some aspects, the prioritization may determine a type of treatment.
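One way to realize the prioritization described in paragraph [0062] is a weighted score over the listed target characteristics (type, size, proximity to the drone, proximity to a non-target). The sketch below is illustrative only: the field names, weights, and scoring formula are assumptions, not the disclosed mission rules.

```python
def prioritize_targets(targets, weights=None):
    """Order candidate targets by a weighted score of their
    characteristics. Larger weeds score higher; distant targets and
    targets near a desirable crop plant are penalized."""
    weights = weights or {"size": 1.0, "proximity": -0.5, "near_crop": -2.0}

    def score(t):
        return (t["size_cm2"] * weights["size"]
                + t["distance_m"] * weights["proximity"]
                + (weights["near_crop"] if t["near_crop"] else 0.0))

    return sorted(targets, key=score, reverse=True)

targets = [
    {"id": "w1", "size_cm2": 12.0, "distance_m": 1.0, "near_crop": False},
    {"id": "w2", "size_cm2": 30.0, "distance_m": 4.0, "near_crop": True},
    {"id": "w3", "size_cm2": 8.0, "distance_m": 0.5, "near_crop": False},
]
print([t["id"] for t in prioritize_targets(targets)])  # ['w2', 'w1', 'w3']
```

The resulting order could then drive nozzle selection or tag low-scoring targets for later treatment, as the paragraph describes.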
[0063] In one aspect, the drone 202, 600 may detect objects and identify and verify one or more targets, using the camera 830 and/or the sensors 806, 812 and may use additional data sources. For example, the image data from cameras 830 and the sensor data from the sensors 806, 812 may be used to detect one or more objects. The same data or additional data may be used to identify the object as a target or potential target. The object may be tagged for further analysis prior to being added to the target list, being tagged or being ignored. The further analysis may be performed using the same or additional data such that the drone is made to collect additional data for analysis.
[0064] In this way, a predictive first analysis may be performed that requires fewer analysis resources and reduced analysis time. The predictive first analysis can be used to optimize the drone resources 800 and only commit drone system resources 800 to objects that are predicted to be targets. The predictive first analysis may be followed by a second analysis or a series of analyses prior to the object being added, or not, to the target list. An object may be added to the target list based on one, two, or any number of analysis cycles consistent with mission rules. The target list may be verified prior to or after a spray vector has been calculated.
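The staged analysis of paragraphs [0063] and [0064] (a cheap predictive pass, then a full analysis only on the survivors) can be sketched generically; the callables and toy fields below are illustrative assumptions:

```python
def two_stage_filter(objects, cheap_check, expensive_check):
    """Predictive first analysis: run a low-cost check over all detected
    objects, then commit the expensive analysis only to objects the
    first pass predicts to be targets. Both checks are caller-supplied
    callables returning True for a (predicted) target."""
    candidates = [o for o in objects if cheap_check(o)]   # fast, low-resource pass
    return [o for o in candidates if expensive_check(o)]  # full analysis on shortlist

# Toy example: the cheap pass keeps green-dominant detections, the
# expensive pass stands in for the full classifier.
objects = [{"green": 0.8, "weed_prob": 0.95},
           {"green": 0.1, "weed_prob": 0.99},
           {"green": 0.7, "weed_prob": 0.2}]
hits = two_stage_filter(objects,
                        lambda o: o["green"] > 0.5,
                        lambda o: o["weed_prob"] > 0.9)
print(len(hits))  # 1: only the first object survives both stages
```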
[0065] In another aspect, the base station 300 may detect objects and identify and verify one or more targets, receiving data from the cameras 830 and/or the sensor units 806 of the drones 202, 600, and may use additional data sources. For example, the image data and the sensor data may be used to detect one or more objects. The same data or additional data may be used to identify the object as the target or potential target. The object may be tagged for further analysis prior to being added to the target list, or be tagged as a non-target, or be tagged to be ignored. Further analysis may be performed using the same or additional data such that the drone 202, 600 is made to collect additional data for analysis. In this way, a predictive first analysis may be performed that requires fewer analysis resources and reduced analysis time. The predictive first analysis can be used to optimize one or more resources of the drone 202, 600 and only commit the resources to objects that are predicted to be targets. The predictive first analysis may be followed by a second analysis or a series of analyses prior to the object being added, or not, to the target list. An object may be added to the target list based on one, two, or any number of analysis cycles consistent with mission rules. The target list may be verified prior to or after a spray vector has been calculated.
[0066] The targeting system 292 may receive input data from various data sources, and analyze the data to identify, select, and prioritize targets, track real time or near-real-time relative target location, calculate and converge on spraying solutions, and control drone spraying. The targeting system 292 may receive data from the cameras 830 and/or the sensor units 806, 816. The data may include drone location data, drone movement vectors, drone vibration data, weather data, target images, distance/range data, infrared data, and any other sensor data described herein. The drone 202, 600 may include a rules data store which may include identification rules for plants, pests or other target types. The rules data store may include target selection and target priority rules. The rules data store may include spraying rules, and other chemical application rules specific to the mission, the chemical(s) being applied, the target, and any other camera/sensor data input.
[0067] In one aspect, drone 202, 600 may identify a desired contact area for the treatment to be applied to the target. The desired contact area may be a portion of the target based on target-specific characteristics such as those used for verification or may be a result of the verification step. The desired contact area may be determined at any point in the process. The contact area may be any particular shape or size relative to the target. The target area may be determined based on the mission objectives and parameters. For example, if the mission is to spray weeds with a herbicide, a contact area for a targeted weed may include a portion of a leaf, an entire leaf, a group of leaves, stem, root(s), or the entire plant. In another aspect, the base station 300 may identify the desired contact area for the drone 202, 600 to treat the target.
[0068] An object detection may involve an analysis of the image data, sensor data, etc., to detect one or more objects that may be targets within a proximity of the drone 202, 600 based on the mission rules. The target identification may involve comparing object data and characteristics to a target database or target identification rules to recognize desired targets and distinguish targets from non-targets. The target identification rules may be based on one or more GPS/RTK coordinates, relative locations to other objects, and/or visual characteristics. For example, the object may be detected and compared to the onboard plant database to identify the object as a weed or pest and distinguish the object from a non-target desirable plant and/or a weed or pest that has already been treated. Further, the identified weed may be added to the target list for verification or tagged for future treatment depending on the mission rules. If the object detected is not matched to the onboard plant database, the data may be relayed to the base station 300 or the mission command center 292 for further analysis with a more extensive plant database. The onboard plant database of each drone 202, 600 may be subsequently updated with the newly identified plant in order to facilitate more efficient determination of the plant by other drones 202, 600.
[0069] While the aerial drone 202 is passing over the field, the processor 802 may be processing image data from the cameras 830 using an artificial intelligence (AI) framework 292 such as described herein in order to detect pests and/or areas of undesirable growth and flag a pest area as a treatment area. When the processor 802 determines a pest or weed is located on a planned path, the navigation system 808 may be instructed to land, lower, or hover the aerial drone 202 within spraying (or treatment) distance once the aerial drone 202 reaches that point on the planned path. In another example, when the processor 802 determines a pest or weed is not located on the planned path, the navigation system 808 may be instructed to deviate from the planned path by a certain threshold, which may be based on a proportion to row spacing and/or crop canopy size. In another aspect, the navigation system 808 may plan to land the aerial drone 202 at pests not on the planned path during a return path to the base station 300. If most of the field is in a specific color space (e.g. "green" for plants and "black" for dirt), the AI framework 292 may determine a geometrically significant feature in another color space (e.g. "gray" for gravel road, or "blue" for pond, or "red" for tractor).
[0070] In the example presented in Figures 4 and 5, one or more weeds may be identified. The techniques described may equally apply to other types of pests, such as insects, disease, and/or damaged plants. As shown in Figure 4, an initial image 1700 may be captured by the data collection system using one or more of the cameras 256, 830 and processed by the object detection of the AI framework 292. Figure 5 shows the image 1702 following the object detection processing. The object detection has identified crop plants 1704 (e.g. surrounded by white boxes) and identified weeds 1706 (e.g. surrounded by black boxes) and surrounded those identified plants 1704 and weeds 1706 with one or more bounding boxes that have been calculated. A center point may be determined from the bounding box and may correspond to the spray target area.
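The center-point determination from a bounding box described in paragraph [0070] reduces to simple midpoint arithmetic; a minimal sketch, assuming the common (x_min, y_min, x_max, y_max) pixel convention:

```python
def spray_target_center(bbox):
    """Center point of a detection bounding box
    (x_min, y_min, x_max, y_max) in image pixels, used as the
    spray target area for the detected plant."""
    x_min, y_min, x_max, y_max = bbox
    return ((x_min + x_max) / 2.0, (y_min + y_max) / 2.0)

print(spray_target_center((120, 80, 180, 140)))  # (150.0, 110.0)
```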
[0071] Turning to Figure 6, for each of the identified crop plants 1704 and/or each of the identified weeds 1706, a probability score may be calculated or determined in a primary and/or a secondary processing engine of the AI framework 292. The algorithms may involve semantic segmentation, instance segmentation, and/or object detection as previously described. The output of the secondary processing engine may comprise a confidence interval, and/or pixel mask, and/or bounding box for each of the identified crop plants 1704 and/or each of the identified weeds 1706. GPS or other geolocation coordinates may also be appended to each crop plant 1704 and/or each identified weed 1706 in order to be located in the future. For example, an image containing one or more plants 1704 may be passed through the AI framework 292 that has been previously trained to identify canola plants. The output from the AI framework 292 may correspond to a probability or confidence that each respective crop plant 1704 is a canola plant. In practice, the confidence may range from 0 to 100%. In another example, the identified weed 1706 may be passed through the AI framework 292 in order to determine a probability or confidence that the identified weed 1706 is indeed a weed. Therefore, each of the identified crop plants 1704 and/or each of the identified weeds 1706 may have an associated confidence. Although, as described herein, the targets are identified prior to determining a confidence, other aspects may identify the targets and the associated confidence simultaneously using the AI framework 292.
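The per-detection output described in paragraph [0071] (a confidence plus appended geolocation coordinates) can be sketched as below. The threshold value, field names, and the accept/review flag are illustrative assumptions, not part of the disclosure:

```python
def annotate_detections(detections, threshold=0.5):
    """Attach an accepted/needs-review flag to each detection based on
    its confidence, while preserving its label and GPS coordinates for
    later location and treatment."""
    return [{**d, "accepted": d["confidence"] >= threshold} for d in detections]

dets = [{"label": "canola", "confidence": 0.97, "gps": (52.13, -106.67)},
        {"label": "weed", "confidence": 0.42, "gps": (52.13, -106.66)}]
print([d["accepted"] for d in annotate_detections(dets)])  # [True, False]
```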
[0072] Although each of the identified crop plants 1704 and/or each of the identified weeds 1706 may have an associated confidence, problems may occur during spraying. Even when the AI framework 292 is trained to provide very high accuracy, such as 99%, roughly one out of every 100 weeds will not be treated in the field. Over a large field, this accuracy may leave hundreds or thousands of untreated weeds. A risk exists that these untreated weeds may proliferate and/or damage the crop. Moreover, overtraining the AI framework 292 may result in an inflexible framework that may be unable to adapt to different plants and/or weeds at different stages.
[0073] As described herein, the artificial intelligence system 292 may detect all or substantially all of the plants in the field. When the AI framework 292 is unable to reliably detect plants correctly as plants, an inefficient or undesirable outcome may occur, contrary to the priorities specified by the farmer using the graphical user interface. As presented in Figures 4 to 6, one optimization to the accuracy of the AI framework 292 may be to combine the AI framework 292 with secondary vegetation detection methods and/or image processing techniques to ensure all plants are adequately detected in the field prior to application of the AI framework 292. In Figure 7, an example image of a section of field is presented showing weeds 1706 and crop plants 1704. The AI framework 292 may perform a plant detection on the image as shown in Figure 8; however, the AI framework 292 has failed to identify two weeds 1710 that are below a threshold size. When the secondary vegetation detection method is applied, as shown in Figure 9, the missing weeds 1710 may be more clearly visible. By combining the map in Figure 8 with the vegetation map shown in Figure 9, refined accuracy of the treatment system may be achieved. In one aspect, a green detection process, chlorophyll/vegetation detection process, or other heuristic may locate one or more positions of all pixels in the image data representing possible vegetable/plant life. The green detection process may create a vegetation map (AA) shown in Figure 10. Sequentially, or in parallel, the classification system 2010 of the AI framework 292 may detect the plants in the image data and/or identify the plant type and stage in a plant type and stage map (BB) shown in Figure 11. The vegetation map (AA) may then be combined with the plant type and stage map (BB) to produce a joint map (JM) shown in Figure 12.
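The AA/BB combination of paragraph [0073] can be sketched as a per-pixel merge: where the classifier produced a label it wins, and pixels the green-detection heuristic flagged but the classifier missed are kept rather than dropped. The grids, labels, and merge policy below are illustrative assumptions, not the disclosed implementation:

```python
def joint_map(vegetation_map, classified_map):
    """Combine a binary vegetation map (AA) with a classifier's plant
    type map (BB) into a joint map (JM). Pixels flagged as vegetation
    but missed by the classifier are marked 'unclassified_vegetation'
    so small weeds are not silently lost."""
    jm = []
    for veg_row, cls_row in zip(vegetation_map, classified_map):
        row = []
        for veg, cls in zip(veg_row, cls_row):
            if cls is not None:
                row.append(cls)                        # classifier's label wins
            elif veg:
                row.append("unclassified_vegetation")  # missed by the classifier
            else:
                row.append(None)                       # bare soil
        jm.append(row)
    return jm

aa = [[True, True], [False, True]]          # green-detection output (AA)
bb = [["crop", None], [None, None]]         # classifier output (BB)
print(joint_map(aa, bb))
```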
[0074] In some aspects, the target verification may comprise image registration and/or geocoordinate registration, whereby the previously captured sensor data of the target and/or geocoordinates may be saved and compared to the newly captured sensor data. For example, two photos of the same target plant captured at different times may be matched even though they are slightly different (e.g. different angle, different position in the photo, the plant moved by wind, etc.). The image registration and/or geocoordinate registration may ensure that multiple passes do not spray the same target plant more than once or may be used to determine a health of the target plant in order to determine if a more effective treatment may be necessary.
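The geocoordinate-registration check of paragraph [0074] can be sketched as a distance test against a log of treated positions. The 15 cm match radius and the equirectangular distance approximation are illustrative assumptions:

```python
import math

def already_treated(new_target, treated_log, radius_m=0.15):
    """Treat a newly detected target as the same plant as a logged one
    if it lies within `radius_m` metres, so that multiple passes do
    not spray the same plant more than once. Coordinates are
    (latitude, longitude) in decimal degrees."""
    def dist(a, b):
        # Equirectangular approximation; adequate at centimetre scales.
        lat = math.radians((a[0] + b[0]) / 2.0)
        dx = math.radians(b[1] - a[1]) * math.cos(lat) * 6371000.0
        dy = math.radians(b[0] - a[0]) * 6371000.0
        return math.hypot(dx, dy)
    return any(dist(new_target, t) < radius_m for t in treated_log)

log = [(52.130000, -106.670000)]
print(already_treated((52.1300001, -106.6700001), log))  # True: same spot
print(already_treated((52.131000, -106.670000), log))    # False: ~110 m away
```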
[0075] In one other improvement, the present disclosure provides a method to leverage plant distribution combined with plant species detection to optimize the multi-year application of residual herbicide based on historical data trends.
[0076] In many cases, weed seeds can remain in the ground for up to thirty years, creating challenges for the application of annual herbicide. Because herbicide is not completely effective at eradicating these seeds, the same regions of soil will need to be treated multiple times over multiple growing seasons, leading to chemical saturation in the soil, increased herbicide resistance, and risk of herbicide carryover. Also, different weeds (e.g. grassy weeds like Wild Oats and broadleaf weeds like Kochia) require different residual chemistries. Without knowing the species or type of plant, farmers are forced to apply expensive chemicals to areas that may not need them, creating waste.
[0077] Residual herbicides are typically some of the most expensive, and there is no known prior art that can determine, at the species level, how to optimally apply residual herbicide to the regions most likely to have the risk of emergence in the following year.
[0078] Prescriptions from satellite imagery can provide a broad-brush, risk-based system, but they are not precise enough to differentiate between weeds and crop, or between different weed species, leading to incorrect decisions or wasted chemical by over-applying the wrong chemical to the wrong weeds.
[0079] Spraying systems such as Green-on-Green technology are efficient at eliminating weeds that grow during the current season, but they cannot see what is happening at the seed level or under the soil. This renders them ineffective at preventing breakouts in subsequent years and provides no data for subsequent-year residual herbicides or resistance, requiring multiple passes and wasted time.
[0080] The present method and system benefits from the fact that weeds typically grow in clusters, which provide clues to subsequent year likelihood of emergence.
[0081] In some examples of the present disclosure, by tracking the species of the weeds inside a cluster and geofencing them, a geometric region can be inferred where subsequent emergence in following years is probable. This can be used to limit the application of herbicide to the relevant regions only, saving a considerable amount of chemical. This can be further improved by applying only the herbicide required for the plant species to that area. The precision and efficiency can be further improved by measuring the salinity and topography of the soil as an input to the probability. For example, high salinity areas may not be important, as the weeds may not grow there. Also, the species and the GPS location may be required to apply the appropriate type of herbicide. Different levels of precision may be used in the presented method, including but not limited to weed location, broadleaf vs. grass leaf, and species level.
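One simple way to infer a geofenced region from a weed cluster, as paragraph [0081] describes, is a convex hull of the cluster positions expanded by a buffer. The sketch below uses local metre coordinates and a centroid-based expansion as simplifying assumptions; the disclosure does not specify this particular geometry, and a production system would buffer in map coordinates with a proper geometry library:

```python
def convex_hull(points):
    """Andrew's monotone chain convex hull over (x, y) points,
    returned in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def geofence(cluster, buffer_m=2.0):
    """Geofence a weed cluster: the convex hull of the cluster's
    positions (local metres), with each hull vertex pushed outward
    from the hull centroid by `buffer_m` (an illustrative buffer)."""
    hull = convex_hull(cluster)
    cx = sum(p[0] for p in hull) / len(hull)
    cy = sum(p[1] for p in hull) / len(hull)
    out = []
    for x, y in hull:
        d = ((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 or 1.0
        s = (d + buffer_m) / d
        out.append((cx + (x - cx) * s, cy + (y - cy) * s))
    return out

cluster = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]  # interior weed is absorbed
print(geofence(cluster, buffer_m=1.0))
```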
[0082] In one exemplary embodiment, the method includes using a drone, plane, or satellite survey to capture images of a field (prescription maps - variable rate); identifying the weeds using methods such as green sensors or an AI framework; determining the species of plants in a field or cluster (e.g. all plants or by sampling); geolocating the weeds and creating a weedmap with precise GPS positions of the plants; using a mathematical technique to create a geofenced region encompassing the cluster; invoking a shapefile amalgamation algorithm; adding a buffer area to the geofenced region; taking into account terrain conditions affecting weed growth, including salinity (no weed would grow in high salinity), topography, and any other reason that weeds would not grow, including modifiers such as soil moisture and condition; factoring in multi-year data collected from the field; and creating a map for residual application of herbicide which can be loaded into sprayers.
[0083] It will be appreciated by those skilled in the art that the residual weed treatment method may benefit from the prioritized treatment method disclosed herein. In some examples, the user may provide thresholds or priorities for use of residual herbicide based on the probability of weed presence as disclosed herein.
[0084] The details of field treatment methods and system have been also disclosed by the applicant in the international PCT patent application having serial number PCT/CA2020/050276 with the publication number WO/2020/172756, incorporated herein by reference.
[0085] In some examples of the present disclosure, a method for precision application of residual herbicide is provided. As illustrated in Figure 15, at S312 the method includes processing images of a field to create a weedmap having locations of weed clusters in the field, followed by applying geofence regions encompassing said weed clusters at S314. Then, at S316, the shapefiles of the weed clusters are merged to create amalgamated shapes meeting minimum requirements of a sprayer to perform spraying, and a buffer area is added to the amalgamated shapes at S318. In the next step, S320, a map is created for residual herbicide application including said amalgamated shapes and the buffer areas. Lastly, the map may be used to spray the field with residual herbicide based on said map using said sprayer at step S322.
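Steps S314 to S320 can be sketched with axis-aligned rectangles standing in for the geofenced shapes: buffer each cluster, merge the overlapping results, and widen any shape below the sprayer's minimum section width. Rectangular shapes, the parameter values, and the widening rule are simplifying assumptions, not the disclosed shapefile amalgamation algorithm:

```python
def merge_overlapping(rects):
    """Amalgamate overlapping axis-aligned rectangles
    (x_min, y_min, x_max, y_max) into their combined bounding boxes,
    repeating until no pair overlaps."""
    rects = list(rects)
    merged = True
    while merged:
        merged = False
        out = []
        while rects:
            a = rects.pop()
            for i, b in enumerate(rects):
                if a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]:
                    rects[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    merged = True
                    break
            else:
                out.append(a)
        rects = out
    return rects

def prescription_map(cluster_rects, buffer_m=1.0, min_section_m=3.0):
    """Buffer each geofenced cluster, merge overlapping results, and
    widen any amalgamated shape narrower than the sprayer's minimum
    section width. Coordinates are local metres."""
    buffered = [(x0 - buffer_m, y0 - buffer_m, x1 + buffer_m, y1 + buffer_m)
                for x0, y0, x1, y1 in cluster_rects]
    shapes = []
    for x0, y0, x1, y1 in merge_overlapping(buffered):
        pad = max(0.0, (min_section_m - (x1 - x0)) / 2.0)
        shapes.append((x0 - pad, y0, x1 + pad, y1))
    return shapes

# Two nearby clusters amalgamate once buffered; the distant one stays separate.
print(prescription_map([(0, 0, 2, 2), (3, 0, 5, 2), (20, 20, 21, 21)]))
```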
[0086] As illustrated in Figure 16, by understanding where the weed clusters occur, we can infer the probability that weed pressure will exist there in the following season. As illustrated in Figure 17, using the method disclosed here, a residual herbicide prescription can be applied only to the areas that require it. This shows a saving that may be up to 77% versus spraying the whole field. The different colors represent the targeted herbicides.
[0087] The presence of weeds in a field is related to the distribution of weed seeds, arriving in the field from germinated plants either through drift, wind, or contamination from people, equipment, or animals. While this can create a seemingly random distribution of weeds in the field, in the vast majority of cases, there is a correlation between these carrier methods and the presence of weed seeds.
[0088] In some examples of the present method, statistical models can be created which can exploit this correlation and determine the probability of weeds emerging in certain areas. One example is the presence of germinated weeds prior to combine harvesting in a wheat field. During a combine harvesting operation, crop biomass is separated from the wheat kernels. However, since the process of combine harvesting indiscriminately harvests all plants, weeds will also be ingested into the harvester. Since a wheat kernel is much larger than thistle seeds, the thistle seeds will not be captured, but rather pass through the system and be stochastically distributed in the nearby soil, in the commodity, or on the machine. These weed seeds may activate in the current year, or lie dormant until subsequent years, requiring residual herbicide treatment to control.
[0089] However, by knowing the GPS location of the original thistle, the path of the combine harvester, and the wind speed and direction, it is possible to develop a statistical probability distribution as to where the weed seeds might be located in the soil. Thus, instead of applying residual herbicide to the entire field, it becomes necessary to apply it only to the regions where the seed distribution is most statistically probable. This approach can achieve similar weed control outcomes to indiscriminate broadcast application but use far less chemical, as no chemical will be applied to areas where weeds are statistically unlikely to occur.
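A minimal sketch of the wind-influenced seed distribution of paragraphs [0089] and [0090]: a Gaussian kernel centered downwind of the original weed. The Gaussian form, the drift scale, and the local-metre coordinates are illustrative assumptions, not the disclosed statistical model:

```python
import math

def seed_probability(weed_xy, query_xy, wind_vec, drift_scale=5.0):
    """Relative probability that seed from a weed at `weed_xy` lands at
    `query_xy` (local metres), modelled as an isotropic Gaussian whose
    centre drifts downwind by `wind_vec`. Echoes the concentric zones
    A, B, C of Figure 18: highest near the shifted centre, decaying
    with distance."""
    cx = weed_xy[0] + wind_vec[0]  # plume centre drifts with the wind
    cy = weed_xy[1] + wind_vec[1]
    d2 = (query_xy[0] - cx) ** 2 + (query_xy[1] - cy) ** 2
    return math.exp(-d2 / (2.0 * drift_scale ** 2))

# With a 3 m/s easterly drift, a point downwind of the weed is far more
# likely to receive seed than the mirror point upwind.
downwind = seed_probability((0, 0), (3, 0), wind_vec=(3, 0))
upwind = seed_probability((0, 0), (-3, 0), wind_vec=(3, 0))
print(downwind > upwind)  # True
```

Thresholding this probability over a grid of field positions would yield the residual-application regions described above.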
[0090] In Figure 18, the concentric circles C, B and A represent a progressively decreasing probability of where the weed seeds may result after combine harvesting as you move away from the GPS location where the weed was located. This figure is for illustrative purposes only but represents how the probability is influenced by wind direction and the path the combine harvester is taking.
[0091] Although the steps of detecting multiple objects, identifying, verifying, and prioritizing multiple targets, calculating multiple spray vectors, determining the success of multiple sprays, and using the spray success determinations as an input to subsequent spray vector calculations may be shown as single systems or subsystems, one skilled in the art will understand that these systems, subsystems, or portions thereof can be combined, used in a different sequence than shown here, simplified, and in some cases omitted to achieve a less complex and less resource-intensive design.
[0092] Various components, subcomponents and parts can be used to achieve, implement and practice the processes, computations, techniques, steps, means and purposes described herein. The embodiments and inventions contained herein may be practiced in various forms and approaches as selected by one skilled in the art. For example, these processes, computations, techniques, steps, means and purposes described herein may be achieved and implemented in hardware, software, firmware or a combination thereof. The computing components and processes described herein can be distributed across a fixed or mobile network or both at the same time or different times. For example, some processing may be performed in one location using a first processor while other processing may be performed by another processor remote from the first processor. Other components of computer system may be similarly distributed. As such, computer system may be interpreted as a distributed computing system that performs processing in multiple locations. In some instances, computer system may be interpreted as a single computing device.
[0093] One skilled in the art may choose hardware implementations for the processing units using one or more application-specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, other electronic units designed to perform the functions described above, and/or a combination thereof.
[0094] One skilled in the art may choose implementations including hardware, software, scripting languages, firmware, middleware, microcode, hardware description languages, and/or any combination thereof. Software, firmware, middleware, scripting language, and/or microcode implementations may have the program code or code segments to perform the necessary tasks stored in a machine-readable medium such as a storage medium. A code segment or machine-executable instruction may represent a procedure, a function, a subprogram, a program, a routine, a subroutine, a module, a software package, a script, a class, or any combination of instructions, data structures, and/or program statements. A code segment may be coupled to another code segment or a hardware circuit by passing and/or receiving information, data, arguments, parameters, and/or memory contents. Information, arguments, parameters, data, etc. may be passed, forwarded, or transmitted via any suitable means including memory sharing, message passing, token passing, network transmission, etc.
[0095] One skilled in the art may choose implementations including firmware and/or software utilizing modules (e.g., procedures, functions, algorithms, etc.) that perform the processes described herein. Any machine-readable medium tangibly embodying instructions may be used in implementing the processes and methodologies and techniques described herein. For example, software codes may be stored in a memory. Memory may be implemented within a processor or external to a processor. “Memory” as used herein refers to any type of long-term, short term, volatile, nonvolatile, or other storage medium and is not to be limited to any particular type of memory or number of memories, or type of media upon which memory is stored.
[0096] Moreover, as disclosed herein, the term “storage medium” may represent one or more memories for storing data, including read-only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, and/or other machine-readable mediums for storing information. The term “machine-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, and/or various other storage mediums capable of containing or carrying instruction(s) and/or data.
[0097] The computer systems described herein may use, without limitation, one or more general-purpose processors and/or one or more special-purpose processors (such as digital signal processing chips, graphics acceleration processors, video decoders, and/or the like). The computer systems described herein may use and/or configure storage devices to implement any appropriate data stores, including without limitation various file systems, database structures, and database control, manipulation, or optimization methodologies.

[0098] The methods, systems, and devices discussed above are examples. Various configurations may omit, substitute, or add various procedures or components as appropriate. For instance, in alternative configurations, the methods may be performed in an order different from that described, and/or various stages may be added, omitted, and/or combined. Also, features described with respect to certain configurations may be combined in various other configurations. Different aspects and elements of the configurations may be combined in a similar manner. Also, technology evolves and, thus, many of the elements are examples and do not limit the scope of the disclosure or claims. The foregoing is considered as illustrative only of the principles of the invention.
[0099] Further, since numerous changes and modifications will readily occur to those skilled in the art, it is not desired to limit the invention to the exact construction and operation shown and described, and accordingly, all such suitable changes or modifications in structure or operation which may be resorted to are intended to fall within the scope of the claimed invention.

Claims

We claim:
1. A method for precision application of residual herbicide, the method comprising:
- processing images of a field to create a weedmap having locations of weed clusters in the field;
- applying geofence regions encompassing said weed clusters;
- merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of a sprayer to perform spraying;
- adding a buffer area to said amalgamated shapes;
- creating a map for residual herbicide application including said amalgamated shapes and said buffer areas;
- spraying said field with residual herbicide based on said map using said sprayer.
2. The method in claim 1, wherein adding a buffer area to said amalgamated shapes comprises performing statistical analysis on said images or previously stored data.
3. The method in claim 1 or 2, wherein said creating said weedmap using location of the weed clusters in the collected images of the field comprises using an artificial intelligence framework to identify weeds in said field images and clustering said weeds.
4. The method in claim 1 or 2, wherein said creating said weedmap using location of the weed clusters in the collected images of the field comprises using said artificial intelligence framework to identify a shape of the weed and clustering said weeds based on said shape of the weed.
5. The method in claim 1 or 2, wherein said creating said weedmap using location of the weed clusters in the collected images of the field comprises using said artificial intelligence framework to identify species of the weeds and clustering them based on said species of the weeds.
6. The method in any one of claims 1 to 5, further comprising removing at least one low probability cluster from said weedmap based on a condition of the field at said at least one low probability cluster.
7. The method in claim 6, wherein said condition of the field at said at least one low probability cluster comprises a negative condition reducing probability of weeds growing at a location of said at least one low probability cluster.
8. The method in claim 7, wherein said condition of the field at said at least one low probability cluster comprises a level of salinity of soil at said at least one low probability cluster.
9. The method in claim 6, wherein said condition of the field at said at least one low probability cluster comprises a terrain property reducing efficiency of said residual herbicide.
10. The method in claim 9, wherein said condition of the field at said at least one low probability cluster comprises a topography of said field preventing proper spraying of a location of said at least one low probability cluster.
11. The method in claim 9, wherein said condition of the field at said at least one low probability cluster comprises a topography of said field reducing possibility of weed growing.
12. The method in any one of claims 1 to 11, further comprising collecting said images and said location data with a data collection system.
13. The method in claim 12, wherein collecting said images and said location data with said data collection system comprises collecting said images and said location data using an aerial vehicle.
14. The method in claim 13, where said spraying said field with said residual herbicide based on said map using said sprayer comprises spraying said field with said residual herbicide based on said map using said aerial vehicle.
15. The method in claim 13 or 14, wherein said aerial vehicle is an autonomous aerial vehicle.
16. A field treatment system for precision application of residual herbicide, the system comprising:
-a sprayer unit receiving at least one residual herbicide;
-a control unit comprising:
- a processor; and
- a non-transitory computer-readable medium containing instructions that, when executed by the processor, cause the processor to perform:
- processing images of a field to create a weedmap having locations of weed clusters in the field;
- applying geofence regions encompassing said weed clusters;
- merging shapefiles of said weed clusters to create amalgamated shapes meeting minimum requirements of said sprayer to perform spraying;
- adding a buffer area to said amalgamated shapes;
- creating a map for residual herbicide application including said amalgamated shapes and said buffer areas;
- spraying said field with residual herbicide based on said map using said sprayer.
17. The field treatment system according to claim 16, wherein the at least one sprayer unit comprises a data collection system, a navigation system, a propulsion system, a targeting system, a treatment system, and a power source.
18. The field treatment system according to claim 17, wherein the data collection system provides data and comprises at least one of: at least one positioning sensor and at least one camera.
19. The field treatment system according to claim 18, wherein the at least one positioning sensor is selected from at least one of an altimeter, an ultrasonic sensor, a radar, a lidar, an accelerometer, a global positioning sensor, and the at least one camera.
20. The field treatment system in any one of claims 16 to 19, wherein said non-transitory computer-readable medium contains further instructions that, when executed by the processor, cause the processor to perform using an artificial intelligence framework to identify weeds in said field images and clustering said weeds.
21. The field treatment system in any one of claims 16 to 20, wherein said non-transitory computer-readable medium contains further instructions that, when executed by the processor, cause the processor to perform using said artificial intelligence framework to identify a type of the weed and clustering said weeds based on their type.
22. The field treatment system in any one of claims 16 to 20, wherein said non-transitory computer-readable medium contains further instructions that, when executed by the processor, cause the processor to perform using said artificial intelligence framework to identify species of the weeds and clustering them based on said species of the weeds.
23. The field treatment system in any one of claims 17 to 22, wherein at least one of the data collection system, the navigation system, and the targeting system is stored within a tangible computer-readable medium and is executed by a processor within the at least one autonomous drone.
24. The field treatment system in claim 23, wherein the at least one autonomous drone is selected from at least one of: an aerial drone, a rolling drone, and a combination of the aerial drone and the rolling drone.
25. The field treatment system in any one of claims 17 to 24, further comprising at least one agricultural sensor to measure at least one of soil acidity, soil moisture, soil temperature, conductivity, wind direction, wind speed, and radiation.
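The pipeline recited in claims 1 and 16 (cluster weed detections, amalgamate overlapping shapes, add a buffer area, and grow shapes to the sprayer's minimum workable dimensions) can be outlined as follows. This is an illustrative sketch only, not the claimed implementation: the single-link clustering, the rectangular spray zones, and all names and thresholds (`link_dist`, `buffer_m`, `min_side`) are assumptions introduced here for clarity.

```python
# Illustrative sketch of the claim-1 pipeline (not the patented implementation):
# weed detections -> clusters -> rectangular zones -> merged (amalgamated)
# zones -> buffered zones -> zones grown to the sprayer's minimum dimensions.
from math import hypot

def cluster_weeds(points, link_dist=2.0):
    """Group weed locations (x, y) into clusters by single-link distance."""
    clusters = []
    for p in points:
        hits = [c for c in clusters
                if any(hypot(p[0] - q[0], p[1] - q[1]) <= link_dist for q in c)]
        merged = [p]
        for c in hits:
            merged.extend(c)
            clusters.remove(c)
        clusters.append(merged)
    return clusters

def bbox(cluster):
    """Axis-aligned bounding box (x0, y0, x1, y1) of one cluster."""
    xs = [p[0] for p in cluster]
    ys = [p[1] for p in cluster]
    return (min(xs), min(ys), max(xs), max(ys))

def overlaps(a, b):
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

def merge_boxes(boxes):
    """Amalgamate intersecting rectangles until none overlap."""
    boxes = list(boxes)
    changed = True
    while changed:
        changed = False
        for i in range(len(boxes)):
            for j in range(i + 1, len(boxes)):
                if overlaps(boxes[i], boxes[j]):
                    a, b = boxes[i], boxes[j]
                    boxes[i] = (min(a[0], b[0]), min(a[1], b[1]),
                                max(a[2], b[2]), max(a[3], b[3]))
                    del boxes[j]
                    changed = True
                    break
            if changed:
                break
    return boxes

def pad_to_minimum(zone, min_side):
    """Grow a zone so each side meets the sprayer's minimum workable width."""
    x0, y0, x1, y1 = zone
    if x1 - x0 < min_side:
        mid = (x0 + x1) / 2
        x0, x1 = mid - min_side / 2, mid + min_side / 2
    if y1 - y0 < min_side:
        mid = (y0 + y1) / 2
        y0, y1 = mid - min_side / 2, mid + min_side / 2
    return (x0, y0, x1, y1)

def spray_map(points, link_dist=2.0, buffer_m=1.0, min_side=3.0):
    """Build buffered, sprayer-compatible zones from raw weed locations."""
    zones = merge_boxes(bbox(c) for c in cluster_weeds(points, link_dist))
    zones = [(x0 - buffer_m, y0 - buffer_m, x1 + buffer_m, y1 + buffer_m)
             for (x0, y0, x1, y1) in zones]
    return [pad_to_minimum(z, min_side) for z in zones]
```

In this sketch the buffer margin is a fixed constant; per claim 2, it could instead be derived from statistical analysis of the images or of previously stored field data.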
PCT/CA2023/050761 2022-06-03 2023-06-02 System and method for precision application of residual herbicide through inference WO2023230730A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263348660P 2022-06-03 2022-06-03
US63/348,660 2022-06-03

Publications (1)

Publication Number Publication Date
WO2023230730A1 true WO2023230730A1 (en) 2023-12-07

Family

ID=89026418

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2023/050761 WO2023230730A1 (en) 2022-06-03 2023-06-02 System and method for precision application of residual herbicide through inference

Country Status (1)

Country Link
WO (1) WO2023230730A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090226036A1 (en) * 2005-04-29 2009-09-10 Jozsef Gaal Setup for constructing a weed map
WO2021009136A1 (en) * 2019-07-15 2021-01-21 Basf Agro Trademarks Gmbh Method for generating an application map for treating a field with an agricultural equipment
WO2021062459A1 (en) * 2019-10-04 2021-04-08 Single Agriculture Pty Ltd Weed mapping
US20210120731A1 (en) * 2019-10-29 2021-04-29 International Business Machines Corporation Multi-dimension artificial intelligence agriculture advisor
WO2021089825A1 (en) * 2019-11-08 2021-05-14 Basf Agro Trade Marks Method for automated buffer zone management


Similar Documents

Publication Publication Date Title
EP3741214B1 (en) Method for plantation treatment based on image recognition
US11937524B2 (en) Applying multiple processing schemes to target objects
US11751559B2 (en) Detecting and treating a target from a moving platform
US20220377970A1 (en) Payload selection to treat multiple plant objects having different attributes
US11526997B2 (en) Targeting agricultural objects to apply units of treatment autonomously
US10206324B2 (en) Autonomous agricultural robot (agbot) for decision making and courses of action considering real-time conditions
US11449976B2 (en) Pixel projectile delivery system to replicate an image on a surface using pixel projectiles
US20210186006A1 (en) Autonomous agricultural treatment delivery
US20210153500A1 (en) Plant treatment techniques
US20210185942A1 (en) Managing stages of growth of a crop with micro-precision via an agricultural treatment delivery system
US11653590B2 (en) Calibration of systems to deliver agricultural projectiles
Christensen et al. Sensing for weed detection
WO2023230730A1 (en) System and method for precision application of residual herbicide through inference
US10959058B1 (en) Object tracking systems and methods
WO2023069841A1 (en) Autonomous detection and control of vegetation
Mondal et al. Autonomous architecture for uav-based agricultural survey
US20240074428A1 (en) System and method for adjustable targeting in field treatment
Rovira-Más et al. Crop scouting and surrounding awareness for specialty crops
Niu et al. The Unmanned Ground Vehicles (UGVs) for Digital Agriculture
Li et al. Abnormal Crops Image Data Acquisition Strategy by Exploiting Edge Intelligence and Dynamic-Static Synergy in Smart Agriculture
Potena Perception and environment modeling in robotic agriculture contexts
Siavalas et al. Unmanned aerial vehicles for agricultural automation
Gan An Autonomous Immature Green Citrus Fruit Yield Mapping System
CN117876903A (en) Pesticide spraying method, system, electronic device, medium and program product
CN115500334A (en) Sprayer, plant disease and insect pest identification method and identification equipment

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23814609

Country of ref document: EP

Kind code of ref document: A1