WO2019078733A1 - Pasture satellite measurements - Google Patents

Pasture satellite measurements

Info

Publication number
WO2019078733A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
biomass
farm
paddocks
computing device
Prior art date
Application number
PCT/NZ2018/050128
Other languages
French (fr)
Inventor
Grant Peter Stewart ANDERSON
Joshua Michael MILLER
Robert Brooks Stephenson
Original Assignee
Farmshots Llc
Livestock Improvement Corporation Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Farmshots Llc and Livestock Improvement Corporation Limited
Publication of WO2019078733A1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/10: Segmentation; Edge detection
    • G06T7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/02: Agriculture; Fishing; Mining
    • A: HUMAN NECESSITIES
    • A01: AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01K: ANIMAL HUSBANDRY; CARE OF BIRDS, FISHES, INSECTS; FISHING; REARING OR BREEDING ANIMALS, NOT OTHERWISE PROVIDED FOR; NEW BREEDS OF ANIMALS
    • A01K29/00: Other apparatus for animal husbandry
    • A01K29/005: Monitoring or measuring activity, e.g. detecting heat or mating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20: Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29: Geographical information databases
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing
    • G06T2207/10036: Multispectral image; Hyperspectral image
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30181: Earth observation
    • G06T2207/30188: Vegetation; Agriculture
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/97: Determining parameters from multiple pictures

Definitions

  • the processing includes identifying paddocks currently being grazed by livestock and wherein the generating includes providing a visual indication of the identified paddocks.
  • the processing includes processing the satellite data to correct a solar angle and reflectance influence.
  • the processing includes processing the satellite data to detect clouds covering portions of a satellite image corresponding to the farm.
  • the generating includes generating the biomass value representation using the calculated biomass value or the updated biomass value for each pixel of the satellite image.
  • the generating may further include generating the feed wedge and the table including biomass values for the one or more paddocks using the calculated biomass values for each of the one or more paddocks.
  • the invention consists in a system for generating a management zone map for a particular farm including one or more paddocks, the system including: a source of satellite data comprising one or more ground images or tiles and associated metadata; a source of ground data; and, a processing device comprising at least one memory and at least one processor that use the satellite data and ground data to generate a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
  • the instructions stored in the memory further direct the at least one processor to receive ground data.
  • the processing device may further include instructions that direct the at least one processor to process the satellite data using at least the georeferenced farm map and the ground data.
  • the instructions stored in the memory further direct the at least one processor to identify paddocks currently being grazed by livestock and to generate a management zone map including visual indications for the identified paddocks.
  • the invention consists in a non-transitory machine readable medium accessible by one or more processors and storing instructions that direct the one or more processors to perform the method including: receiving satellite data; receiving a georeferenced farm map; processing the satellite data using at least the georeferenced farm map; and, generating a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
  • the receiving may include receiving ground data and the processing may include processing the satellite data using the georeferenced map and the ground data.
  • the processing may include identifying at least one of the one or more paddocks currently being grazed by livestock and the generating may include providing a visual identification of the identified paddock(s) on the graphical user interface.
  • the processing may include processing the satellite data to correct a solar angle and a reflectance influence.
  • the biomass value of a pixel may be defined by a ratio between a near infrared data spectral value and a green spectral data value.
  • the processing may include determining an updated biomass value for each pixel by applying a linear regression model.
  • the processing may include determining a biomass value for each or some of the one or more paddocks using the determined biomass value or the updated biomass value for each pixel.
  • the biomass value of a particular paddock may be determined by averaging the determined biomass value or the updated biomass value of each pixel belonging to the particular paddock.
  • the generating may include generating the feed wedge and/or the biomass values of the graphical user interface using the determined biomass values for each or some of the one or more paddocks.
  • the computing device may be operative to identify paddocks currently being grazed by livestock and to generate a graphical user interface including one or more visual indications for the identified paddocks.
  • the computing device may be further operative to identify sunlit and/or shadowed portions of the one or more paddocks, and to generate a graphical user interface comprising one or more visual indications indicating the shadowed portions.
  • the instructions stored in the memory may further direct the at least one processor to: identify at least one of the one or more paddocks currently being grazed by livestock; and generate a visual identification of the identified paddock(s) on the graphical user interface.
  • the generated graphical user interface may comprise one or more visual indications identifying shadowed portions of the one or more paddocks.
  • the satellite data may comprise one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
  • Figure 1 is a simplified, partly pictorial, partly block diagram illustration of a system constructed and operative in accordance with an embodiment of the invention
  • Figure 2 is a block diagram of a pasture management system, constructed and operative in accordance with an embodiment of the invention
  • Figures 3A to 3D are simplified flow chart diagrams illustrating a method for generating and displaying a management zone map for a particular farm, constructed and operative in accordance with embodiments of the invention
  • Figure 4B is a simplified illustration of a feed wedge of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention
  • Figure 4C is a simplified table illustrating calculated biomass values for a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention
  • Figures 5A to 5C are pictorial illustrations of a biomass value representation of a plurality of paddocks at different times, constructed and operative in accordance with an embodiment of the invention
  • Figure 6 is a graph showing biomass values measured using satellite and ground data
  • Figure 7A is a pictorial illustration of a cloud detection map of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention
  • Figure 7B is a pictorial illustration of a biomass value representation of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention
  • Figure 7C is a simplified illustration of a feed wedge of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention
  • Figures 8A to 8D are pictorial illustrations of a method of processing images to identify sunlit and/or shadowed portions of an image, constructed and operative in accordance with embodiments of the invention.
  • the server 140 may be configured to store, analyze and/or process the ground images 111, the metadata, and the additional data so as to generate a graphical user interface (GUI), hereinafter referred as a management zone map of a particular farm.
  • the generated management zone map may be a user interface providing an end user (e.g., a farmer) with visual information relevant to the pasture biomass.
  • the visual information may comprise a biomass value representation for one or more paddocks, a feed wedge and a table including biomass (i.e., dry matter) values for the one or more paddocks.
  • the management zone map may also comprise visual indications corresponding to the current locations of the livestock.
  • the system of Fig. 1 comprises one or more client devices 150 which are typically configured to communicate in operation with the server 140.
  • the client devices 150 may comprise any device with computing power which operates appropriate software.
  • the client devices 150 may comprise a desktop computer, a tablet computer, a handheld device, or other appropriate system operable to communicate with the server 140 and to generate the management zone map.
  • the client devices 150 may be configured to retrieve/receive and store the management zone map generated by the server 140.
  • the client devices 150 may further comprise or be associated with a display device operable to render the generated management zone map.
  • the client devices 150 may be configured to perform some or all of the functionalities of the server 140.
  • the client devices 150 may be in direct communication with the satellite provider server 120 and/or external sources to retrieve/receive and store the ground images 111, metadata and additional data.
  • the management zone map may be generated by the client devices 150. Accordingly, it will be apparent that any suitable processing device of Fig. 1 (e.g. server 140 and/or client devices 150) may be utilized to generate the management zone map.
  • Pasture composition varies between farms and also changes over the course of a year.
  • Pasture vegetation types may include ryegrasses or other grass species such as fescues.
  • Non-grass species may include clover, and crops such as chicory, plantain, etc.
  • the processing device 200 is therefore configured to execute instructions to generate a management zone map 251 using the satellite data 212 and the ground data 213 based on any of several possible models and algorithms to overcome the aforementioned problems.
  • the processing device 200 is configured to calculate biomass values for the different paddocks taking into consideration the pasture types of the different paddocks (provided as part of the ground data 213), including non-grass and root crops.
  • boundaries between the different paddocks of the farm are added to the georeferenced map so as to produce a georeferenced farm map.
  • file types for the georeferenced farm map include XML (Extensible Markup Language), KML (Keyhole Markup Language), Shapefile, GeoJSON (Geo JavaScript Object Notation), etc.
  • the satellite data are processed by the processing device 200.
  • Satellite data depend on many factors: satellites depend on the sun to illuminate their subjects, and the sun is rarely, if ever, directly overhead a field when a satellite captures a ground image. Satellite imagery is also affected by atmospheric phenomena such as clouds and haze. These effects lead to an unknown bias or offset in readings obtained by satellites. There is therefore a need to process the satellite data in order to eliminate, or at least reduce, these effects.
  • the processing device 200 may be configured to execute instructions that calculate the biomass value of each pixel using a vegetation index defined by the ratio between the NIR spectral data value and the green spectral data value of the pixel. Following this calculation, each pixel is associated with a calculated biomass value. These biomass values may then be used to generate the management zone map (steps 340-341) and calculate the biomass values for each paddock (step 335).
  • the processing device 200 is configured to generate and display the management zone map.
  • the processing device 200 is configured, at step 341 (Fig. 3C), to generate a biomass value representation for one or more paddocks, a feed wedge and a table including biomass (i.e. dry matter) values for the one or more paddocks using the satellite data processed at step 330.
  • the processing device 200 may be configured to generate the biomass value representation for the one or more paddocks using the biomass values of each pixel obtained from step 334.
  • FIGs. 5A to 5C are biomass value representations of a plurality of paddocks.
  • Figs. 5A to 5C illustrate a method of controlling the feed intake of livestock, involving non-permanent fencing, called break feeding (strip grazing).
  • a paddock is divided into several portions, and the portions are sequentially grazed by the livestock.
  • farmers generally use electric fences that are moved either manually or automatically (e.g. Lely Voyager).
  • Figs. 5A to 5C show the evolution of the biomass values on the biomass value representation 500 for paddock 501, while Fig. 6 shows the decrease in biomass over time for paddock 501 measured using a plate meter (ground data) and satellite imagery (satellite data).
  • the processing device 200 is configured to generate a management zone map that identifies the paddocks that are showing break feeding.
  • the processing device 200 may be configured to process the data so as to detect paddocks having heterogeneous patterns and/or paddocks that have a distinct line defining two paddock areas having different biomass values, thereby indicating that a particular paddock is being grazed at the time when the image was captured (a simple heuristic for this detection is sketched at the end of this section).
  • Figs. 7A to 7D respectively depict a cloud detection representation of a plurality of paddocks (Fig. 7A), a biomass value representation of a plurality of paddocks (Fig. 7B), a feed wedge of a plurality of paddocks (Fig. 7C), and calculated biomass values for a plurality of paddocks sorted from highest to lowest (Fig. 7D).
  • Fig. 7A further includes a visual indication 705 illustrating that some portions or parts of the paddocks/farm are covered by clouds, detected as explained hereinabove in relation to step 333 of Fig. 3A.
  • Fig. 7B is a biomass value representation 700 for a plurality of paddocks of the farm generated and rendered as explained hereinabove in relation to Figs. 3A-3C while Fig. 7C shows a feed wedge 710 for the plurality of paddocks or the farm depicted on the biomass values representation of Fig. 7B.
  • the feed wedge 710 provides a visual representation of a current pasture biomass situation of the plurality of paddocks and/or the farm, by ranking paddocks based on the average pasture cover.
  • the processing device 200 may be configured to discard some of the metadata.
  • Fig. 8A illustrates a portion or part 800 of the captured and/or processed ground image 111 (steps 331, 332, or 333 of Fig. 3A) comprising: a first area 806 covered by, e.g., trees; a second area 807 corresponding to shadowed portions or parts of the paddocks/farm (i.e., shadows created by the trees covering the first area 806); and, third 808 and fourth 809 areas corresponding to sunlit portions or parts of the paddocks/farm.
  • the processing device 200 may be configured to process the portion or part 800 shown in Fig. 8A.
  • the processing device 200 may be configured to execute instructions that process the image 800 to create two sub-images or masks depicted on Figs. 8B and 8C.
  • the processing device 200 may be configured to calculate the biomass values of the different areas using any suitable method as described hereinabove in relation to steps 334 and/or 335 of Fig. 3B.
  • the calculated biomass value of each pixel is then compared to a first threshold to determine whether the pixel is to be categorized as a pixel of low or high biomass (i.e., dry matter) value.
  • the processing device 200 may be further configured to execute instructions that determine the luminance (the intensity of light emitted by a surface) of the different areas of image 800 to create the second sub-image of Fig. 8C. To do so, the processing device 200 may use, for example but not limited to, the metadata (e.g., RGB or R spectral values) associated with the satellite data. The calculated luminance value of each pixel is then compared to a second threshold to determine whether the pixel is to be categorized as a pixel of low or high luminance value.
  • each pixel is assigned a binary value, e.g., a '0' or '1' bit or a 'True' or 'False' value, depending on whether it corresponds to a high or low luminance value.
  • the processing device 200 is configured to identify whether a particular pixel belongs to a shadowed or a sunlit portion and/or whether a particular area is a sunlit or shadowed portion. Pixels belonging to a shadowed portion and/or areas identified as a shadowed portion may be discarded and therefore not used for subsequent calculations (e.g., calculating the biomass value of a particular paddock) and/or representation on the graphical user interface.
  • the configuration/operation of the processing device 200 described in relation to Figs. 8A-8D is an exemplary implementation and is not limiting.
  • the processing device 200 may be configured to operate with more than one threshold for the biomass and luminance values.
  • the processing device 200 is configured to identify other portions of the paddocks/farm depending on the combination of biomass and luminance values.
  • Fig. 8D shows area 817 identified as a sunlit portion having low pasture and area 818 identified as a sunlit portion having high pasture.
  • Those skilled in the art will appreciate that any other suitable categories may be provided and programmed in a memory of the processing device 200 in order to identify different areas of the processed images.
  • Figs. 9A-9B illustrate two different visual representations that can be obtained based on the image processing detailed hereinabove in relation to Figs. 8A to 8D.
  • Fig. 9A is similar to Fig. 8D but at a farm level and shows portions of the paddocks identified as sunlit pasture portions 917, 918 in light grey as well as portions of the paddocks identified as shadowed portions 916 in white.
  • the processing device 200 may be configured to discard some or all the metadata associated with the shadowed portions 916.
  • the biomass and updated biomass values may then be calculated at steps 334 and 335 of Fig. 3B using the remaining data and metadata corresponding to the paddock portions 917, 918.
  • at step 340 of Fig. 3A (and/or steps 341 and 342 of Fig. 3C), a management zone map is generated and one or more elements of the management zone map are displayed and/or transmitted to a client device 150 for display.
  • the processing device 200 may be able to generate a biomass value representation comprising the shadowed portions of the paddocks/farm. Such a biomass value representation is depicted in Fig. 9B.
  • the management zone map may further comprise the visual representation of Fig. 9A such that the end user can understand in a timely and efficient manner which parts or portions of the paddocks/farm have been included in the calculations.
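By way of non-limiting illustration of the grazing detection referred to above (a paddock under break feeding tends to show a distinct line separating two areas of different biomass), the following Python sketch flags such paddocks with a simple spread-plus-gradient heuristic. The function, thresholds, and synthetic data are assumptions for illustration only; the disclosure does not specify a particular algorithm.

```python
import numpy as np

def looks_break_fed(biomass, labels, pid, spread_thr=600.0, edge_thr=400.0):
    """Illustrative heuristic, not the disclosed algorithm.

    Flags a paddock whose biomass values are widely spread AND contain a
    strong internal gradient, i.e. two areas of clearly different biomass
    separated by a sharp line, as described for break feeding above.
    """
    inside = labels == pid
    values = biomass[inside]
    spread = np.percentile(values, 90) - np.percentile(values, 10)

    grad_rows, grad_cols = np.gradient(biomass)      # per-pixel gradients
    edge_strength = np.hypot(grad_rows, grad_cols)[inside].max()

    return spread > spread_thr and edge_strength > edge_thr

# Synthetic paddock: left half grazed (low biomass), right half ungrazed.
biomass = np.full((10, 10), 1200.0)
biomass[:, 5:] = 2800.0
labels = np.ones((10, 10), dtype=int)
print(looks_break_fed(biomass, labels, pid=1))       # -> True
```

In practice the two thresholds would be tuned against labelled examples of break-fed paddocks rather than fixed in advance.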

Abstract

In one embodiment, a method implemented on a computing device for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks is provided. The method includes: receiving data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm; processing the satellite data using at least the georeferenced map; and generating a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.

Description

PASTURE SATELLITE MEASUREMENTS
INCORPORATION BY REFERENCE TO ANY PRIORITY APPLICATIONS
[0001] This application claims priority to U.S. Provisional Application No. 62/573,347, filed 17 October 2017, which is incorporated herein by reference in its entirety. Any and all applications for which a foreign or domestic priority claim is identified in the Application Data Sheet as filed with the present application are hereby incorporated by reference under 37 C.F.R. § 1.57.
FIELD OF DISCLOSURE
[0002] The present invention generally relates to pasture management, more particularly to methods and systems for pasture management including hardware and software components.
BACKGROUND
[0003] Pasture management is a key factor in farm performance. The ability to accurately measure the pasture biomass is important for the farmer in order to calculate forage availability, stocking rates (i.e. number of animals on a given amount of land over a certain period of time), and feed requirements. This information may be used by the farmer in order to create pasture management tools such as feed wedges and rotation planners.
[0004] Pasture feed wedges provide a visual representation of a current pasture biomass situation, by ranking paddocks based on the average pasture cover. Typically, a pasture feed wedge is a graph of paddock covers (expressed in kg DM/ha) and paddock descriptors for a dairy farm for a selected day, sorted by paddock from longest to shortest pasture cover. Target pre- and post-grazing covers may be added to the graph so that the farmer can identify where there is a shortfall or surplus of pasture feed.
[0005] Rotation planners are typically used in conjunction with feed wedges to calculate the rotation of paddocks. With rotational grazing, the livestock's daily intake is governed mainly by the amount and quality of pasture offered. Leaving high residuals results in poor pasture utilization and a loss of feed quality in subsequent grazings. In addition, the feed requirements of animals can change throughout the season (e.g. feed requirements are higher when livestock produce milk) and need to be taken into consideration to calculate the rotation of paddocks.
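By way of non-limiting illustration, a feed wedge of the kind described in paragraph [0004] can be drawn programmatically. In this Python sketch the paddock names, cover values, and target covers are hypothetical and do not come from the disclosure.

```python
import matplotlib.pyplot as plt

# Hypothetical paddock covers in kg DM/ha.
covers = {"P1": 3100, "P2": 2850, "P3": 2600, "P4": 2300, "P5": 1900, "P6": 1600}

# Rank paddocks from longest to shortest pasture cover.
ranked = sorted(covers.items(), key=lambda kv: kv[1], reverse=True)
names, values = zip(*ranked)

fig, ax = plt.subplots()
ax.bar(names, values, color="seagreen")

# Target pre- and post-grazing covers drawn as horizontal lines so a
# shortfall or surplus of pasture feed is immediately visible.
ax.axhline(2900, linestyle="--", label="target pre-grazing cover")
ax.axhline(1500, linestyle=":", label="target post-grazing cover")
ax.set_ylabel("pasture cover (kg DM/ha)")
ax.legend()
plt.show()
```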
[0006] Existing methods of pasture measurement include visual assessment, use of plate meters during farm walks, or use of sensors mounted on machinery towed by vehicles (e.g. C-Dax pasture meter). Another technology currently being developed in pasture measurement is the use of unmanned aerial vehicles (UAVs, e.g. drones). These UAVs are operative to fly along programmed routes over a farm, and automatically capture images. The captured images can then be processed to provide estimates of the pasture biomass.
[0007] However, all of these traditional methods are labour intensive to varying degrees and are prone to inaccuracies. Further, the use of UAVs may be restricted by weather conditions and/or civilian aviation authorities. There is therefore a need for an improved method and means for pasture measurement to realise the significant benefits that precision pasture management offers.
SUMMARY
[0008] It is an object of the present invention to provide a method, a system and an apparatus which at least go some way towards overcoming the above disadvantages or which will at least provide the public with a useful choice.
[0009] In a first aspect, the invention consists in a method for processing satellite images to generate a management zone map for a particular farm including one or more paddocks, the method including: receiving satellite data; receiving a georeferenced farm map; processing the satellite data using at least the georeferenced farm map; and, generating a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
[0010] In one embodiment, ground data are further received and the processing includes processing the satellite data using at least the georeferenced farm map and the ground data.
[0011] In another embodiment, the processing includes identifying paddocks currently being grazed by livestock and wherein the generating includes providing a visual indication of the identified paddocks.
[0012] In a further embodiment, the method further includes displaying the generated management zone map on a screen for use by an end user.
[0013] In an embodiment, the processing includes processing the satellite data to correct a solar angle and reflectance influence.
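Paragraph [0013] does not disclose a particular correction. As an assumed, non-limiting illustration, a common first-order approach scales raw digital numbers to at-sensor reflectance with the provider's gain and offset and then divides by the cosine of the solar zenith angle, so that images captured under a low sun are comparable with images captured near noon.

```python
import numpy as np

def correct_reflectance(dn, gain, offset, solar_zenith_deg):
    """First-order solar-angle correction (illustrative only)."""
    toa = dn * gain + offset                        # at-sensor reflectance
    cos_sz = np.cos(np.deg2rad(solar_zenith_deg))   # solar illumination factor
    return toa / cos_sz

# Example: a synthetic 2x2 band captured with the sun 40 degrees off zenith.
band = np.array([[1200.0, 1350.0], [1100.0, 1280.0]])
print(correct_reflectance(band, gain=2e-5, offset=-0.1, solar_zenith_deg=40.0))
```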
[0014] In another embodiment, the processing includes overlaying the georeferenced farm map onto a satellite image of the satellite data to identify a portion of the satellite image corresponding to the farm.
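As a non-limiting sketch of the overlay of paragraph [0014], the paddock polygons of a georeferenced farm map can be burned into a label raster aligned with the satellite image, so that every pixel is tagged with the paddock it belongs to (0 meaning outside the farm). The use of shapely and rasterio, and the coordinates shown, are assumptions; the disclosure names no libraries.

```python
import numpy as np
from shapely.geometry import Polygon
from rasterio.features import rasterize
from rasterio.transform import from_origin

# Hypothetical paddock polygons in the image's coordinate reference system.
paddocks = {
    1: Polygon([(0, 0), (50, 0), (50, 40), (0, 40)]),
    2: Polygon([(50, 0), (100, 0), (100, 40), (50, 40)]),
}

# Affine transform of the satellite tile: origin at (0, 40), 1-unit pixels.
transform = from_origin(0, 40, 1, 1)

# Burn each paddock id into a label raster; fill=0 marks non-farm pixels.
labels = rasterize(
    [(geom, pid) for pid, geom in paddocks.items()],
    out_shape=(40, 100),
    transform=transform,
    fill=0,
    dtype="int32",
)
print(np.unique(labels))  # -> [1 2]: every pixel assigned to a paddock
```

Once pixels are labelled this way, per-paddock statistics reduce to boolean indexing, as in the biomass sketch following paragraph [0016].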
[0015] In a further embodiment, the processing includes processing the satellite data to detect clouds covering portions of a satellite image corresponding to the farm.
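Paragraph [0015] does not specify the detection algorithm. The following deliberately crude placeholder, an assumption for illustration only, flags pixels that are bright and nearly colourless, which is characteristic of cloud; production systems use far more robust, sensor-specific methods.

```python
import numpy as np

def simple_cloud_mask(rgb, brightness_thr=0.8, spread_thr=0.1):
    """Flag likely-cloud pixels in a reflectance image (illustrative only)."""
    brightness = rgb.mean(axis=-1)                  # clouds are bright
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)    # and nearly grey
    return (brightness > brightness_thr) & (spread < spread_thr)

rgb = np.random.rand(64, 64, 3)   # synthetic RGB reflectance image
mask = simple_cloud_mask(rgb)
print(f"{mask.mean():.1%} of pixels flagged as cloud")
```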
[0016] In an embodiment, the processing includes calculating a biomass value for each pixel of the satellite image. The biomass value of a pixel may be defined by a ratio between a near infrared data spectral value and a green spectral data value. The processing may include calculating an updated biomass value for each pixel by applying a linear regression model. The processing may further include calculating a biomass value for each of the one or more paddocks using the calculated biomass value or the updated biomass value for each pixel. The biomass value of a particular paddock may be calculated by averaging the calculated biomass value or the updated biomass value of each pixel belonging to the particular paddock.
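A minimal Python sketch of the calculations in paragraph [0016]: the per-pixel vegetation index is the ratio of the NIR value to the green value, a linear regression maps the index to an updated biomass value, and a paddock's biomass is the average over its pixels. The slope and intercept shown are illustrative; in practice they would be fitted against ground data such as plate-meter readings.

```python
import numpy as np

def paddock_biomass(nir, green, labels, slope, intercept):
    """Per-paddock biomass from NIR and green bands."""
    index = nir / np.maximum(green, 1e-6)        # NIR/green vegetation index
    updated = slope * index + intercept          # linear regression update
    return {
        pid: float(updated[labels == pid].mean())  # average over the paddock
        for pid in np.unique(labels)
        if pid != 0                                # 0 = outside the farm
    }

# Synthetic 4x4 tile split into two paddocks.
nir = np.random.uniform(0.3, 0.6, (4, 4))
green = np.random.uniform(0.05, 0.15, (4, 4))
labels = np.array([[1, 1, 2, 2]] * 4)
print(paddock_biomass(nir, green, labels, slope=450.0, intercept=300.0))
```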
[0017] In a further embodiment, the generating includes generating the biomass value representation using the calculated biomass value or the updated biomass value for each pixel of the satellite image. The generating may further include generating the feed wedge and the table including biomass values for the one or more paddocks using the calculated biomass values for each of the one or more paddocks.
[0018] In an embodiment, the satellite data includes one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
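The satellite data items listed in paragraph [0018] can be grouped into a single record per tile; the field names of this hypothetical Python dataclass are illustrative and not taken from any provider's schema.

```python
from dataclasses import dataclass
from datetime import datetime
import numpy as np

@dataclass
class SatelliteTile:
    """One ground-image tile plus the metadata listed in [0018]."""
    captured_at: datetime   # date and time at which the tile was captured
    geolocation: tuple      # e.g. (latitude, longitude) of the tile origin
    projection: str         # e.g. an EPSG code such as "EPSG:4326"
    altitude: np.ndarray    # per-pixel altitude
    rgb: np.ndarray         # per-pixel Red, Green, Blue spectral values
    nir: np.ndarray         # per-pixel Near Infra-Red spectral values
```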
[0019] In another embodiment, the ground data includes one or more of: soil data; crop data; and climate data.
[0020] In a second aspect, the invention consists in a system for generating a management zone map for a particular farm including one or more paddocks, the system including: a source of satellite data comprising one or more ground images or tiles and associated metadata; a source of ground data; and, a processing device comprising at least one memory and at least one processor that use the satellite data and ground data to generate a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
[0021] In an embodiment, the processing device may be operative to identify paddocks currently being grazed by livestock and to generate a management zone map including visual indications for the identified paddocks.
[0022] In a third aspect, the invention consists in a processing device for generating a management zone map for a particular farm including one or more paddocks, the processing device including: at least one processor; memory accessible by each of the at least one processors; and instructions stored in the memory that direct the at least one processor to: receive satellite data; receive a georeferenced farm map; process the satellite data using at least the georeferenced farm map; and, generate a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
[0023] In one embodiment, the instructions stored in the memory further direct the at least one processor to receive ground data. The processing device may further include instructions that direct the at least one processor to process the satellite data using at least the georeferenced farm map and the ground data.
[0024] In another embodiment, the instructions stored in the memory further direct the at least one processor to identify paddocks currently being grazed by livestock and to generate a management zone map including visual indications for the identified paddocks.
[0025] In a fourth aspect, the invention consists in a non-transitory machine readable medium accessible by one or more processors and storing instructions that direct the one or more processors to perform the method including: receiving satellite data; receiving a georeferenced farm map; processing the satellite data using at least the georeferenced farm map; and, generating a management zone map comprising at least a biomass value representation of the farm, a feed wedge, and a table including biomass values for the one or more paddocks.
[0026] In a fifth aspect, the invention consists in a method implemented on a computing device for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the method including: receiving data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm; processing the satellite data using at least the georeferenced map; and generating a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
[0027] In one embodiment, the graphical user interface may further include a feed wedge and/or biomass values for each or some of the one or more paddocks of the farm.
[0028] In another embodiment, the method may further include rendering the generated graphical user interface on a display screen associated with the computing device for use by an end user.
[0029] In a further embodiment, the receiving may include receiving ground data and the processing may include processing the satellite data using the georeferenced map and the ground data.
[0030] In an embodiment, the processing may include identifying at least one of the one or more paddocks currently being grazed by livestock and the generating may include providing a visual identification of the identified paddock(s) on the graphical user interface.
[0031] In another embodiment, the processing may include processing the satellite data to correct a solar angle and a reflectance influence.
[0032] In a further embodiment, the processing may include overlaying the georeferenced map onto the satellite data to identify a portion of the satellite data corresponding to the farm. The processing may include processing the identified portion of the satellite data corresponding to the farm to identify sunlit and/or shadowed portions. The sunlit and/or shadowed portions may be identified using biomass and luminance values. The shadowed portions may be identified as portions of the image corresponding to the farm having low biomass and luminance values. The processing may further include discarding data associated with the identified shadowed portions. The processing may include determining a biomass value for each pixel of the portion of the satellite data corresponding to the farm. The processing may include determining a biomass value for each pixel corresponding to the identified sunlit portions. The biomass value of a pixel may be defined by a ratio between a near infrared data spectral value and a green spectral data value. The processing may include determining an updated biomass value for each pixel by applying a linear regression model. The processing may include determining a biomass value for each or some of the one or more paddocks using the determined biomass value or the updated biomass value for each pixel. The biomass value of a particular paddock may be determined by averaging the determined biomass value or the updated biomass value of each pixel belonging to the particular paddock. The generating may include generating the feed wedge and/or the biomass values of the graphical user interface using the determined biomass values for each or some of the one or more paddocks.
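A minimal sketch of the shadow classification in paragraph [0032], following the stated rule that shadowed portions have both low biomass and low luminance; the thresholds and synthetic data are illustrative assumptions.

```python
import numpy as np

def shadow_mask(biomass, luminance, biomass_thr=800.0, luminance_thr=0.3):
    """Pixels with low biomass AND low luminance are treated as shadowed."""
    return (biomass < biomass_thr) & (luminance < luminance_thr)

biomass = np.random.uniform(0, 3000, (8, 8))   # kg DM/ha, synthetic
luminance = np.random.uniform(0, 1, (8, 8))    # relative brightness
shadow = shadow_mask(biomass, luminance)

# Discard data for shadowed pixels before averaging, as [0032] describes.
sunlit_mean = biomass[~shadow].mean()
print(f"mean biomass over sunlit pixels: {sunlit_mean:.0f} kg DM/ha")
```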
[0033] In an embodiment, the generated graphical user interface may include one or more visual indications identifying shadowed portions of the one or more paddocks.
[0034] In another embodiment, the satellite data may include one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
[0035] In a further embodiment, the ground data may include one or more of: soil data; crop data; and climate data.
[0036] In a sixth aspect, the invention consists in a system for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the system including: a source of satellite data comprising one or more ground images or tiles and associated metadata; a source of ground data; and, a computing device comprising at least one memory and at least one processor that use the satellite data and ground data to generate a graphical user interface, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
[0037] In an embodiment, the computing device may be operative to identify paddocks currently being grazed by livestock and to generate a graphical user interface including one or more visual indications for the identified paddocks.
[0038] In another embodiment, the computing device may be further operative to identify sunlit and/or shadowed portions of the one or more paddocks, and to generate a graphical user interface comprising one or more visual indications indicating the shadowed portions.
[0039] In a seventh aspect, the invention consists of a computing device for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the computing device including: at least one processor; memory accessible by each of the at least one processor; and instructions stored in the memory that direct the at least one processor to: receive data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm; process the satellite data using at least the georeferenced map; and generate a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
[0040] In one embodiment, the graphical user interface may further include a feed wedge and/or biomass values for each or some of the one or more paddocks of the farm.
[0041] In another embodiment, the computing device may be further associated with a display and the instructions stored in the memory may further direct the at least one processor to render the generated graphical user interface on the display.
[0042] In a further embodiment, the instructions stored in the memory may further direct the at least one processor to: receive ground data; and process the satellite data using the georeferenced map and the ground data.
[0043] In an embodiment, the instructions stored in the memory may further direct the at least one processor to: identify at least one of the one or more paddocks currently being grazed by livestock; and generate a visual identification of the identified paddock(s) on the graphical user interface.
[0044] In another embodiment, the instructions stored in the memory may further direct the at least one processor to process the satellite data to correct a solar angle and a reflectance influence.
[0045] In a further embodiment, the instructions stored in the memory may further direct the at least one processor to overlay the georeferenced map onto the satellite data to identify a portion of the satellite data corresponding to the farm. The instructions stored in the memory may further direct the at least one processor to process the identified portion of the satellite data corresponding to the farm to identify sunlit and/or shadowed portions. The sunlit and/or shadowed portions may be identified using biomass and luminance values. The shadowed portions may be identified as portions of the image corresponding to the farm having low biomass and luminance values. The instructions stored in the memory may further direct the at least one processor to discard data associated with the identified shadowed portions. The instructions stored in the memory may further direct the at least one processor to determine a biomass value for each pixel of the portion of the satellite data corresponding to the farm. The instructions stored in the memory may further direct the at least one processor to determine a biomass value for each pixel corresponding to the identified sunlit portions. The biomass value of a pixel may be defined by a ratio between a near infrared data spectral value and a green spectral data value. The instructions stored in the memory may further direct the at least one processor to determine an updated biomass value for each pixel by applying a linear regression model. The instructions stored in the memory may further direct the at least one processor to determine a biomass value for each or some of the one or more paddocks using the determined biomass value or the updated biomass value for each pixel. The biomass value of a particular paddock may be determined by averaging the determined biomass value or the updated biomass value of each pixel belonging to the particular paddock. The instructions stored in the memory may further direct the at least one processor to generate the feed wedge and/or the biomass values of the graphical user interface using the determined biomass values for each or some of the one or more paddocks.
[0046] In one embodiment, the generated graphical user interface may comprise one or more visual indications identifying shadowed portions of the one or more paddocks.
[0047] In another embodiment, the generated graphical user interface may comprise the feed wedge and/or the biomass values of each or some of the one or more paddocks that do not comprise shadowed portions.
[0048] In a further embodiment, the satellite data may comprise one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
[0049] In one embodiment, the ground data may comprise one or more of: soil data; crop data; and climate data.
In an eighth aspect, the invention consists in one or more computer readable tangible storage media encoded with software comprising computer executable instructions and, when the software is executed, the software is operable to: receive data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm; process the satellite data using at least the georeferenced map; and, generate a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
[0050] To those skilled in the art to which the invention relates, many changes in construction and widely differing embodiments and applications of the invention will suggest themselves without departing from the scope of the invention as defined in the appended claims. The disclosures and the descriptions herein are purely illustrative and are not intended to be in any sense limiting.
BRIEF DESCRIPTION OF THE DRAWINGS
[0051] One preferred form of the present invention will now be described with reference to the accompanying drawings in which:
[0052] Figure 1 is a simplified, partly pictorial, partly block diagram illustration of a system constructed and operative in accordance with an embodiment of the invention;
[0053] Figure 2 is a block diagram of a pasture management system, constructed and operative in accordance with an embodiment of the invention;
[0054] Figures 3A to 3D are simplified flow chart diagrams illustrating a method for generating and displaying a management zone map for a particular farm, constructed and operative in accordance with embodiments of the invention;
[0055] Figure 4A is a pictorial illustration of a biomass value representation of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0056] Figure 4B is a simplified illustration of a feed wedge of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0057] Figure 4C is a simplified table illustrating calculated biomass values for a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0058] Figures 5A to 5C are pictorial illustrations of a biomass value representation of a plurality of paddocks at different times, constructed and operative in accordance with an embodiment of the invention;
[0059] Figure 6 is a graph showing biomass values measured using satellite and ground data;
[0060] Figure 7A is a pictorial illustration of a cloud detection map of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0061] Figure 7B is a pictorial illustration of a biomass value representation of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0062] Figure 7C is a simplified illustration of a feed wedge of a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0063] Figure 7D is a simplified table illustrating calculated biomass values for a plurality of paddocks, constructed and operative in accordance with an embodiment of the invention;
[0064] Figures 8A to 8D are pictorial illustrations of a method of processing images to identify sunlit and/or shadowed portions of an image, constructed and operative in accordance with embodiments of the invention; and
[0065] Figures 9A and 9B are pictorial illustrations showing sunlit and shadowed portions of a plurality of paddocks.
DETAILED DESCRIPTION
[0066] In the following description, numerous specific details are set forth in order to provide a thorough understanding of the various principles of the present invention. However, those skilled in the art will appreciate that not all these details are necessarily always required for practicing the present invention.
[0067] Reference is now made to Fig. 1, which is a simplified, partly pictorial, partly block diagram illustration of a system constructed and operative in accordance with an embodiment of the present invention.
[0068] The system 100 of Fig. 1 comprises a plurality of client devices 150, a communications network 130, a satellite provider server 120 and a server 140. For the sake of depiction, only three client devices 150 are shown, although it will be apparent to those skilled in the art that the system 100 may comprise fewer or more than three client devices 150. Similarly, only one satellite provider server 120 and one server 140 are shown. However, it will be apparent to those skilled in the art that the system 100 may comprise more than one satellite provider server 120 and/or more than one server 140.
[0069] The satellite 110 is typically configured to communicate with the satellite provider server 120 via any suitable communication network. The satellite 110 is depicted in Fig. 1 as being in operative communication with the satellite provider server 120. The satellite 110 is typically configured to capture ground images 111 and communicate the captured ground images 111 along with metadata to the satellite provider server 120. Non-limiting examples of metadata associated with the ground images 111 include a time and a date at which the image was captured, geolocation information, a projection, an altitude for each pixel, RGB (Red, Green, and Blue) spectral data values for each pixel, NIR (Near Infra-Red) spectral data value for each pixel, etc.
[0070] The satellite provider server 120 is configured to receive and store the ground images 111 captured by the satellite along with associated metadata. In addition, the satellite provider server 120 is further configured to transmit the ground images 111 and metadata on request to any local or remote devices such as, for example but not limited to, the server 140.
[0071] Communications network 130 may be any suitable communications network enabling the satellite provider server 120, the server 140 and/or the client devices 150 to communicate with each other. Communications network 130 may include one or more networks or types of networks capable of carrying communications and/or data signals between the components of client devices 150 and the server 140 and/or between the components of the server 140 and the satellite provider server 120. For example, communications network 130 may include, but is not limited to, a cable network, an optical fiber network, a hybrid fiber coax network, a wireless network (e.g. a Wi-Fi and/or mobile telephone network), a satellite network, a wireless broadcast network (e.g., a satellite media broadcasting network or terrestrial broadcasting network), a provider-specific network, the Internet, a local area network, any other suitable network, and/or any combination or subcombination of these networks.
[0072] As explained hereinabove, the server 140 is configured to communicate with the satellite provider server 120 via the communications network 130. For instance, the server 140 may request and receive data, such as, for example, one or more ground images 111 and associated metadata, from the satellite provider server 120. Additionally or alternatively, the server 140 may be further configured to communicate with other devices (not shown in Fig. 1) via the communications network 130 or any other suitable communications network. For example, the server 140 may be configured to retrieve and/or receive and store additional data from external sources. The additional data may include, for example but not limited to, a georeferenced farm map identifying different paddocks of a particular farm. It will be appreciated by those skilled in the art that the term paddock should not be interpreted in an exclusive sense, but rather in an inclusive sense to refer to any delimited areas or zones of a farm. These zones or areas may be physically or virtually defined. For example, some of the boundaries used to define these areas or zones may correspond to natural or man-made delimitations of the farm. In addition, or alternatively, some of these boundaries may be defined by the end user (e.g., the farmer) at any suitable time. This also includes virtual fencing, where no physical obstacles are provided but rather a GPS-enabled collar and a mobile application are used to fence, move, or monitor livestock. Virtual fencing includes virtually defining boundaries on a computing device, which in turn transmits them to each collar. Each animal can then be notified by its collar when it has reached a boundary. Further, the server 140 may be configured to store, analyze and/or process the ground images 111, the metadata, and the additional data so as to generate a graphical user interface (GUI), hereinafter referred to as a management zone map of a particular farm. The generated management zone map may be a user interface providing an end user (e.g., a farmer) with visual information relevant to the pasture biomass. The visual information may comprise a biomass value representation for one or more paddocks, a feed wedge and a table including biomass (i.e., dry matter) values for the one or more paddocks. In addition, the management zone map may also comprise visual indications corresponding to the current locations of the livestock. For example, livestock usually wear identification and/or geolocation tags (e.g., Global Positioning System) enabling a farmer to know their current location. This geolocation data can be retrieved and provided to the server 140 and/or the client devices 150 to generate visual indications indicating a position of the livestock on the generated GUI.
[0073] Lastly, the system of Fig. 1 comprises one or more client devices 150 which are typically configured to communicate in operation with the server 140. The client devices 150 may comprise any device with computing power which operates appropriate software. For example, and without limiting the generality of the foregoing, the client devices 150 may comprise a desktop computer, a tablet computer, a handheld device, or other appropriate system operable to communicate with the server 140 and to generate the management zone map. For example, the client devices 150 may be configured to retrieve/receive and store the management zone map generated by the server 140. The client devices 150 may further comprise or be associated with a display device operable to render the generated management zone map. Those skilled in the art will appreciate that in other embodiments of the present invention, the client devices 150 may be configured to perform some or all of the functionalities of the server 140. For example, the client devices 150 may be in direct communication with the satellite provider server 120 and/or external sources to retrieve/receive and store the ground images 111, metadata and additional data. Additionally, provided that the client devices 150 have sufficient computing power (graphics processing unit and/or central processing unit), the management zone map may be generated by the client devices 150. Accordingly, it will be apparent that any suitable processing device of Fig. 1 (e.g. server 140 and/or client devices 150) may be utilized to generate the management zone map.
[0074] Reference is now made to Fig. 2, which is a block diagram of a pasture management system, constructed and operative in accordance with an embodiment of the present invention. The pasture management system comprises a processing device 200 which is configured to execute instructions to perform processes that generate, and optionally display, the management zone map and interact with other devices connected to the communications network 130 as shown in Fig. 1. The processing device 200 may be, for example but not limited to, the server 140 or one of the client devices 150 depicted in Fig. 1. One skilled in the art will recognize that a particular processing device may include other components that are omitted for brevity without departing from this invention. The processing device 200 includes a processor 205, a non-volatile memory 210, and a volatile memory 215. The processor 205 is a processor, microprocessor, controller, or a combination of processors, microprocessors, and/or controllers that performs instructions stored in the volatile memory 215 or the non-volatile memory 210 to manipulate data stored in the memory. The non-volatile memory 210 can store the processor instructions utilized to configure the processing device 200 to perform processes including processes in accordance with embodiments of the invention and/or data for the processes being utilized. In other embodiments, the processing system software and/or firmware can be stored in any of a variety of non-transient computer readable media appropriate to a specific application. A network interface is a component that allows the processing device 200 to transmit and receive data over a network based upon the instructions performed by the processor 205. Although a single processing device 200 is illustrated in Fig. 2, any of a variety of processing devices may be configured to provide the methods and systems in accordance with embodiments of the present invention.
[0075] In Fig. 2, ground data 213 and satellite data 212 are inputs to the processing device 200. The output from the processing device is a management zone map 251 of a particular farm, i.e., a graphical user interface providing an end user (e.g., a farmer) with visual information relevant to the pasture biomass. The visual information may comprise a biomass value representation for one or more paddocks of the farm, a feed wedge, and a table including biomass (i.e., dry matter) values for the one or more paddocks.
[0076] Non-limiting examples of satellite data 212 include ground images 111 or ground image tiles, a time and a date at which the images were captured, geolocation information, a projection, an altitude for each pixel, RGB (Red, Green, and Blue) spectral data values for each pixel, a NIR (Near Infra-Red) spectral data value for each pixel, etc. Non-limiting examples of ground data 213 include soil data, crop data, and climate data; such data may also be provided to the processing device 200, although not all of these data may be needed to generate a management zone map 251. All of the data sources (212, 213), and other data not shown, are georeferenced. For example, each data point (soil type, crop type, climate history, NDVI from various sources, etc.) may be associated with a location specified in latitude and longitude or any other convenient mapping coordinate system. Also, the various data may be supplied at different spatial resolutions.
[0077] It is rarely possible to obtain ground data 213 and satellite data 212 measured at the same time. If only a few days separate the measurements, the resulting errors may be small enough to be ignored. However, better results may be obtained by using models and algorithms to propagate data forward or backward in time as needed to compare asynchronous sources. Another problem for pasture cover estimation is the variation in pasture types. Pasture composition varies between farms and also changes over the course of a year. Pasture vegetation types may include ryegrasses or other grass species such as fescues. Non-grass species may include clover, and crops such as chicory, plantain, etc. Root crops such as turnips and fodder beet are also used by farmers as part of their feed rotation, and the dry matter of these crops should be included in calculations such as feed budgets. The processing device 200 is therefore configured to execute instructions to generate a management zone map 251 using the satellite data 212 and the ground data 213 based on any of several possible models and algorithms to overcome the aforementioned problems. In particular, the processing device 200 is configured to calculate biomass values for the different paddocks taking into consideration the pasture types of the different paddocks (provided as part of the ground data 213), including non-grass and root crops.
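By way of illustration of propagating a measurement in time to compare asynchronous sources, the sketch below shifts a ground-measured pasture cover to the satellite capture date using an assumed constant daily growth rate; the function, its name, and the values shown are illustrative assumptions rather than a model prescribed herein.

from datetime import date

def propagate_cover(cover_kg_dm_ha: float, measured_on: date,
                    target: date, growth_kg_dm_ha_day: float) -> float:
    """Shift a pasture cover estimate forward or backward in time,
    assuming an (illustrative) constant daily growth rate."""
    days = (target - measured_on).days  # negative => propagate backward
    return cover_kg_dm_ha + growth_kg_dm_ha_day * days

# e.g., a plate-meter reading taken three days before the satellite pass:
aligned_cover = propagate_cover(2400.0, date(2018, 9, 23), date(2018, 9, 26), 55.0)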
[0078] In use, an end user (e.g., a farmer) may use the processing device 200 to request, download, and/or retrieve by any suitable means the management zone map 251 from any physical location (e.g., server, hardware device, cloud, etc.). As a result, the management zone map 251 is rendered on a display screen of the processing device 200, or on a display associated with the processing device 200, for visualization and/or use by the end user. The management zone map 251 provides the end user with information on the biomass (dry matter) of the paddocks in an easy and time-efficient manner. In addition, the processing device 200 may be configured to receive inputs from the end user to alter some or all elements of the management zone map 251. For example, but not limited to, the end user may want to add target pre- and/or post-grazing covers so as to identify where there is a shortfall or surplus of pasture feed. To do so, the processing device 200 may be configured to allow the end user to use any appropriate pointing methods as are known in the art (e.g., but not limited to: mice, eye tracking methods, etc.) as well as direct manipulation by the user's fingers.
[0079] Reference is now made to Fig. 3A which is a simplified flow chart diagram illustrating a method for generating and displaying a management zone map for a particular farm, constructed and operative in accordance with embodiments of the present invention.
[0080] The process starts at step 300 and, at step 310, a georeferenced map relevant to a particular farm is retrieved by and/or received at the processing device 200. The georeferenced map associates a physical map or raster image of a map with spatial locations. For example, an internal coordinate system of a physical map or an aerial photo image can be related to a ground system of geographic coordinates. The relevant coordinate transforms are typically stored within the image file, such as, for example but not limited to, in a GeoPDF and/or GeoTIFF file. In addition, boundaries between the different paddocks of the farm are added to the georeferenced map so as to produce a georeferenced farm map. Non-limiting examples of file types for the georeferenced farm map include XML (Extensible Markup Language), KML (Keyhole Markup Language), Shapefile, GeoJSON (Geo JavaScript Object Notation), etc.
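For example, a georeferenced farm map in GeoJSON form might carry one polygon feature per paddock. The snippet below is a minimal, hypothetical example; the paddock identifier and WGS84 coordinates are purely illustrative.

import json

farm_map = {
    "type": "FeatureCollection",
    "features": [{
        "type": "Feature",
        "properties": {"paddock_id": "P01"},  # illustrative paddock name
        "geometry": {
            "type": "Polygon",
            # longitude/latitude ring, closed on the first vertex
            "coordinates": [[
                [175.001, -37.801], [175.004, -37.801],
                [175.004, -37.804], [175.001, -37.804],
                [175.001, -37.801],
            ]],
        },
    }],
}

with open("farm_map.geojson", "w") as f:
    json.dump(farm_map, f)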
[0081] At step 320, the processing device 200 may retrieve and/or receive satellite data from the satellite provider server 120. For example, the ground images 111 captured by the satellite 110 along with associated metadata may be provided to the processing device 200. As explained hereinabove in relation to Fig. 1, the metadata associated with the ground images 111 may include a time and a date at which the image was captured, geolocation information, a projection, RGB (Red, Green, and Blue) spectral data values for each pixel, a NIR (Near Infra-Red) spectral data value for each pixel, etc. In some embodiments, the ground images 111 may be processed at the satellite provider server 120 so that the ground images 111 are segmented into tiles, and the server 140 may request the satellite provider server 120 to provide one or more of the ground image tiles and associated metadata. The one or more ground image tiles may correspond to a subset of the ground images 111 relevant to the farm. Although step 320 is depicted in Fig. 3A as being performed after step 310, it will be apparent to those skilled in the art that steps 310 and 320 may be performed by the processing device 200 in any suitable order.
[0082] At step 330, the satellite data are processed by the processing device 200. Satellite measurements depend on many factors: satellites rely on the sun to illuminate their subjects, and the sun is rarely, if ever, directly overhead a field when a satellite captures a ground image. Satellite imagery is also affected by atmospheric phenomena such as clouds and haze. These effects lead to an unknown bias or offset in readings obtained by satellites. There is therefore a need to process the satellite data in order to eliminate or at least reduce these effects.
[0083] In a first step (step 331 of Fig. 3B), the ground images 111 or the ground image tiles are processed to correct the solar angle and reflectance influence. To do so, the time and date at which the ground images 111 were captured are used, and the processing device 200 is configured to execute instructions that calculate corrected RGB spectral data values and NIR spectral data values for each pixel. At step 332 of Fig. 3B, the georeferenced farm map is overlaid onto the ground images 111 or the ground image tiles to identify which parts correspond to the farm/paddocks. The processing device 200 is configured to execute instructions that enable identification of portions of the ground images 111 or ground image tiles to select for further processing and portions of the ground images 111 or ground image tiles to cut out/discard. As a result, the processing device 200 is operative at this stage to discard the non-relevant portions of the images 111, tiles, and associated metadata and to only keep the satellite data relevant to the particular farm/paddocks. At step 333 of Fig. 3B, the selected portions of the images 111 or ground image tiles and associated metadata are processed to account for the cloud influence. The processing device 200 may be configured to execute instructions that detect/locate clouds and/or portions or parts of images that correspond to portions or parts of the farm/paddocks covered by clouds. As a result, non-usable metadata corresponding to these portions or parts may be discarded at this stage and therefore not used in the various calculations described in the following steps.
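A minimal sketch of steps 331 to 333 follows, assuming the paddock and cloud masks have already been rasterized onto the image grid; the division by the cosine of the solar zenith angle is one common first-order solar-angle normalization and is shown here as an assumption, not necessarily the exact correction employed.

import numpy as np

def preprocess(bands: np.ndarray, solar_zenith_deg: float,
               paddock_mask: np.ndarray, cloud_mask: np.ndarray) -> np.ndarray:
    """bands: (4, H, W) array holding R, G, B, and NIR values.

    Step 331: first-order solar-angle correction.
    Step 332: keep only pixels falling inside the paddocks.
    Step 333: discard pixels flagged as cloud or haze.
    """
    corrected = bands / np.cos(np.radians(solar_zenith_deg))
    keep = paddock_mask & ~cloud_mask           # boolean (H, W) selection
    return np.where(keep, corrected, np.nan)    # NaN marks discarded pixels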
[0084] At step 334, biomass values are calculated for each pixel using the relevant RGB spectral data values and NIR spectral data values obtained from steps 331 to 333. The processing device 200 is configured to execute instructions that calculate, for each pixel, a biomass value indicating the greenness (i.e., the relative density and health of vegetation) of a particular pixel. Those skilled in the art will appreciate that a biomass value may be calculated using different vegetation indices such as, for example, but not limited to:
• the simple Difference Vegetation Index (DVI):

Simple DVI = NIR - G

which does not take into consideration the difference between reflectance and radiance caused by atmosphere or shadows;

• the Normalized Difference Vegetation Index (NDVI):

NDVI = (NIR - R) / (NIR + R); and

• the Ratio Vegetation Index (RVI):

RVI = NIR / R

which reduces effects of atmosphere and topography.

In one embodiment, the processing device 200 may be configured to execute instructions that calculate the biomass value of each pixel using a vegetation index defined by the ratio

NIR / G

i.e., the ratio between the near infrared spectral data value and the green spectral data value of the pixel.
Following this calculation, each pixel is associated with a calculated biomass value. These biomass values may then be used to generate the management zone map (steps 340-341) and calculate the biomass values for each paddock (step 335).
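As a concrete sketch of these per-pixel index calculations (the small epsilon guarding against division by zero is an implementation assumption, not prescribed hereinabove):

import numpy as np

EPS = 1e-6  # illustrative guard against division by zero

def vegetation_indices(nir: np.ndarray, red: np.ndarray, green: np.ndarray):
    """Per-pixel vegetation indices from NIR, red, and green band arrays."""
    dvi = nir - green                         # simple DVI as defined above
    ndvi = (nir - red) / (nir + red + EPS)    # normalized difference index
    rvi = nir / (red + EPS)                   # ratio vegetation index
    nir_green = nir / (green + EPS)           # NIR/G ratio of one embodiment
    return dvi, ndvi, rvi, nir_green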
[0085] At step 335, biomass values (in kg DM/ha) for each paddock are calculated using the biomass values for each pixel obtained at step 334. Those skilled in the art will appreciate that any suitable method for calculating the biomass values for each paddock may be used. In one embodiment, the processing device 200 is configured to execute instructions that apply a linear regression model to the calculated pixel biomass values, thereby resulting in updated biomass values for each pixel. The processing device 200 is also configured to calculate, for each paddock, a biomass value by averaging the calculated biomass values (obtained from step 334) or the updated biomass values (step 335) of the pixels belonging to a particular paddock. The calculated or updated pixel biomass values and the paddock biomass values will then be used to generate the management zone map at steps 340-341. The processing device 200 may further be configured to receive and/or retrieve ground data and use them, at steps 334 and/or 335, to calculate or adjust the biomass values or the updated biomass values.
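One possible realization of step 335 is sketched below, assuming calibration pairs of pixel index values and ground-truth dry matter measurements (e.g., plate-meter readings) are available; the function names and the use of a simple least-squares fit are illustrative assumptions.

import numpy as np

def calibrate(index_samples: np.ndarray, ground_truth_kg_dm_ha: np.ndarray):
    """Fit biomass = slope * index + intercept against ground measurements."""
    slope, intercept = np.polyfit(index_samples, ground_truth_kg_dm_ha, 1)
    return slope, intercept

def paddock_biomass(pixel_index: np.ndarray, paddock_ids: np.ndarray,
                    slope: float, intercept: float) -> dict:
    """Regress every pixel, then average the updated values per paddock."""
    pixel_biomass = slope * pixel_index + intercept  # updated pixel values
    return {int(pid): float(np.nanmean(pixel_biomass[paddock_ids == pid]))
            for pid in np.unique(paddock_ids)}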
[0086] At step 340 of Fig. 3A, the processing device 200 is configured to generate and display the management zone map. For example, the processing device 200 is configured, at step 341 (Fig. 3C), to generate a biomass value representation for one or more paddocks, a feed wedge, and a table including biomass (i.e., dry matter) values for the one or more paddocks using the satellite data processed at step 330. In one embodiment, the processing device 200 may be configured to generate the biomass value representation for the one or more paddocks using the biomass values of each pixel obtained from step 334. Each pixel may be mapped to a particular color depending on its biomass value: red for no or a very low value; yellow for a low value; light green for a medium value; dark green for a high value, etc. In another embodiment, the processing device 200 may be configured to generate the biomass value representation for the one or more paddocks using the updated biomass values of each pixel obtained from step 335.
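A sketch of one such color mapping follows; the bin edges are illustrative assumptions rather than values given hereinabove.

def biomass_color(kg_dm_ha: float) -> str:
    """Map a per-pixel biomass value onto the color ramp described above."""
    if kg_dm_ha < 1500:
        return "red"         # no or very low value
    if kg_dm_ha < 2200:
        return "yellow"      # low value
    if kg_dm_ha < 2800:
        return "lightgreen"  # medium value
    return "darkgreen"       # high value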
[0087] The processing device 200 may further be configured to generate the feed wedge and the table including biomass (i.e., dry matter) values for the one or more paddocks using the calculated biomass values obtained at step 334 or the updated biomass values obtained at step 335. Those skilled in the art will appreciate that, although the generation step 340 is shown as a separate step, it is apparent that the different elements of the management zone map may be generated at any suitable time (e.g., at step 334 or 335 for the biomass value representation and, at step 335, for the feed wedge and/or the table including biomass (i.e., dry matter) values) and merely assembled together at step 340.
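Since a feed wedge ranks paddocks from highest to lowest average cover, a minimal sketch of its underlying data is given below; the target pre- and post-grazing covers are illustrative assumptions supplied, e.g., by the end user.

def feed_wedge(paddock_cover: dict, pre_target: float = 3000.0,
               post_target: float = 1500.0) -> list:
    """Rank paddocks by cover and flag surplus/shortfall against targets."""
    ranked = sorted(paddock_cover.items(), key=lambda kv: kv[1], reverse=True)
    return [(pid, cover,
             "surplus" if cover > pre_target
             else "shortfall" if cover < post_target
             else "on target")
            for pid, cover in ranked]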
[0088] The processing device 200 is then either configured to render one or more elements of the management zone map on a display or to transmit the one or more elements of the management zone map to one or more client devices 150 for display at step 342 of Fig. 3C. Examples of the different elements of the management zone map are illustrated in Figs. 4A to 4C and Figs. 7B to 7D, Figs. 4A and 7B depicting a biomass value representation of a plurality of paddocks, Figs. 4B and 7C depicting a feed wedge of a plurality of paddocks, and Figs. 4C and 7D depicting calculated biomass values for a plurality of paddocks sorted from highest to lowest. The process ends at step 350.
[0089] Reference is now made to Figs. 5A to 5C, which are biomass value representations of a plurality of paddocks. Figs. 5A to 5C illustrate a method of controlling the feed intake of livestock, involving non-permanent fencing, called break feeding (strip grazing). In this method, a paddock is divided into several portions, and the portions are sequentially grazed by the livestock. To create break feeding divisions, farmers generally use electric fences that are moved either manually or automatically (e.g., Lely Voyager).
[0090] Figs. 5A to 5C show the evolution of the biomass values on the biomass value representation 500 for paddock 501, while Fig. 6 shows the decrease in biomass over time for paddock 501 measured using a plate meter (ground data) and satellite imagery (satellite data). In an embodiment of the present invention, the processing device 200 is configured to generate a management zone map that identifies the paddocks that are showing break feeding. The processing device 200 may be configured to process the data so as to detect paddocks having heterogeneous patterns and/or paddocks that have a distinct line defining two paddock areas having different biomass values, thereby indicating that a particular paddock was being grazed at the time when the image was captured. For example, the processing device 200 may be configured to generate a biomass value representation comprising visual indications allowing the end user to identify the break feeding paddocks in a timely and efficient manner. Additionally and/or alternatively, the processing device 200 may be configured to identify the break feeding paddocks on the generated feed wedge and/or on the table including biomass (i.e., dry matter) values. In another embodiment, the processing device 200 may be configured to identify one or more paddocks currently being grazed and, following this identification: generate a feed wedge showing different portions (e.g., a first portion corresponding to the paddock area 502 having been grazed and a second portion corresponding to the paddock area 503 not having been grazed yet) of the one or more paddocks; and/or generate a table including biomass (i.e., dry matter) values for each portion of the one or more paddocks currently being grazed.
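One hypothetical heuristic for flagging such a paddock is to test whether its pixel biomass values are highly dispersed, a possible signature of a grazed/ungrazed split; the relative-spread test and threshold below are illustrative assumptions only.

import numpy as np

def looks_break_fed(pixel_biomass: np.ndarray, rel_spread: float = 0.25) -> bool:
    """Flag a paddock whose pixel biomass spread is large relative to its
    mean, suggesting two populations (grazed versus ungrazed areas)."""
    values = pixel_biomass[~np.isnan(pixel_biomass)]
    if values.size == 0:
        return False
    return float(np.std(values)) > rel_spread * float(np.mean(values))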
[0091] Reference is now made to Figs. 7A to 7D, which illustrate different elements of a management zone map according to one embodiment of the invention, Fig. 7A depicting a cloud detection representation of a plurality of paddocks, Fig. 7B depicting a biomass value representation of a plurality of paddocks, Fig. 7C depicting a feed wedge of a plurality of paddocks, and Fig. 7D depicting calculated biomass values for a plurality of paddocks sorted from highest to lowest. Fig. 7A further includes a visual indication 705 illustrating that some portions or parts of the paddocks/farm are covered by clouds. As explained hereinabove in relation to step 333 of Fig. 3B, the processing device 200 may be configured to execute instructions that detect/locate clouds and/or portions or parts of images that correspond to portions or parts of the farm/paddocks covered by clouds. Once detected or identified, the processing device 200 may be configured to generate the cloud detection representation 730 with one or more visual indications 705 identifying portions or parts of the paddocks/farm that are covered by clouds. Although the visual indication 705 is depicted in Fig. 7A as a cloud, those skilled in the art will appreciate that any suitable visual indication (of any shape or any color) may be used to represent portions or parts of the paddocks/farm being covered by clouds.
[0092] Fig. 7B is a biomass value representation 700 for a plurality of paddocks of the farm, generated and rendered as explained hereinabove in relation to Figs. 3A-3C, while Fig. 7C shows a feed wedge 710 for the plurality of paddocks or the farm depicted on the biomass value representation of Fig. 7B. The feed wedge 710 provides a visual representation of a current pasture biomass situation of the plurality of paddocks and/or the farm, by ranking paddocks based on the average pasture cover. As explained hereinabove in relation to step 333 of Fig. 3B, the processing device 200 may be configured to discard some of the metadata. In one embodiment, the processing device 200 may therefore be configured to discard all the metadata associated with paddocks comprising portions or parts covered by clouds. As a result, the data relevant to these paddocks are ignored in the calculations and the paddocks are removed (i.e., not shown) on the generated feed wedge 710. Fig. 7D shows biomass values for the one or more paddocks. Similarly, the paddocks identified as having portions or parts covered by clouds can be ignored and therefore not shown. Alternatively, the paddocks can be presented without any biomass values but instead with an indication (e.g., CLOUD / HAZE) informing the end user that these paddocks were identified as being covered by clouds at the time when the images were captured.
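For example, a hypothetical table builder might substitute the CLOUD / HAZE marker for the value of any paddock whose pixels were partly discarded at step 333; the cloud-fraction parameter below is an illustrative assumption.

def table_entry(paddock_id, cover_kg_dm_ha: float,
                cloud_fraction: float, max_cloud: float = 0.0):
    """Render one table row; cloud-affected paddocks show a marker
    instead of a biomass value."""
    if cloud_fraction > max_cloud:
        return (paddock_id, "CLOUD / HAZE")
    return (paddock_id, f"{cover_kg_dm_ha:.0f} kg DM/ha")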
[0093] Reference is now made to Figs. 8A-8D which are pictorial illustrations showing how captured images may be processed according to an embodiment of the present invention. Images captured by the satellite may comprise sunlit and/or shadowed portions, where the shadows can be created by the local relief (e.g., hills), vegetation (e.g., trees or shelter belts), physical structures (e.g., buildings), or even clouds. In turn, shadows may have a negative impact on the pasture biomass calculations and there is therefore a need to process the satellite data in order to eliminate or at least reduce this impact. In a further embodiment of the present invention, the processing device 200 is therefore further configured to detect shadows in the captured images and process the images to account for the shadow influence. Shadow detection and image processing can be done, for example, at any suitable time such as at the same time as or subsequently to steps 333, 334, and 335 of Fig. 3B.
[0094] Fig. 8A illustrates a portion or part 800 of the captured and/or processed ground image 111 (steps 331, 332, or 333 of Fig. 3B) comprising: a first area 806 covered by, e.g., trees; a second area 807 corresponding to shadowed portions or parts of the paddocks/farm (i.e., shadows created by the trees covering the first area 806); and third 808 and fourth 809 areas corresponding to sunlit portions or parts of the paddocks/farm. The processing device 200 may be configured to process the portion or part 800 shown in Fig. 8A.
[0095] The processing device 200 may be configured to execute instructions that process the image 800 to create two sub-images or masks depicted on Figs. 8B and 8C. To create the first sub-image of Fig. 8B, the processing device 200 may be configured to calculate the biomass values of the different areas using any suitable method as described hereinabove in relation to steps 334 and/or 335 of Fig. 3B. The calculated biomass value of each pixel is then compared to a first threshold to determine whether the pixel is to be categorized as a pixel of low or high biomass (i.e., dry matter) value. For example, a pixel having a calculated biomass value greater than the first threshold is identified as having a high biomass value, whereas a pixel having a calculated biomass value less than the first threshold is identified as having a low biomass value. As a result, the processing device 200 may, for example, identify area 811 (corresponding to first area 806 of Fig. 8A) as comprising pixels of high biomass values, and areas 812 (corresponding to second 807 and fourth 809 areas of Fig. 8A) as comprising pixels of low biomass values. The first sub-image or mask of Fig. 8B can then be created as a bit mask where each pixel is assigned a binary value, e.g., a '0' or '1' bit or a 'True' or 'False' value, depending on whether it corresponds to a high or low biomass value.
[0096] In addition, the processing device 200 may be further configured to execute instructions that determine the luminance (the intensity of light emitted by a surface) of the different areas of image 800 to create the second sub-image of Fig. 8C. To do so, the processing device 200 may use, for example but not limited to, the metadata (e.g., RGB or NIR spectral values) associated with the satellite data. The calculated luminance value of each pixel is then compared to a second threshold to determine whether the pixel is to be categorized as a pixel of low or high luminance value. For example, a pixel having a calculated luminance value greater than the second threshold is identified as having a high luminance value, whereas a pixel having a calculated luminance value less than the second threshold is identified as having a low luminance value. As a result, the processing device 200 may, for example, identify areas 813 and 815 (first 806, third 808, and fourth 809 areas of Fig. 8A) as comprising pixels of high luminance values, and area 814 (second area 807 of Fig. 8A) as comprising pixels of low luminance values. The second sub-image or mask of Fig. 8C can then be created as a bit mask where each pixel is assigned a binary value, e.g., a '0' or '1' bit or a 'True' or 'False' value, depending on whether it corresponds to a high or low luminance value.
[0097] The processing device 200 is further configured to identify the shadowed portions of the paddocks/farm by combining the luminance and biomass values of each pixel. The processing device 200 may be programmed to identify the shadowed portions of the paddocks/farm as areas of an image comprising pixels of low biomass and luminance values. Fig. 8D illustrates a further step of the image processing that is obtained by combining and/or overlaying the processed images of Figs. 8B and 8C. Fig. 8D shows a first area 816 identified by the processing device 200 as a shadowed portion of the paddock/farm as it comprises pixels of low biomass and luminance values based on the binary values assigned when creating the sub-images or masks of Figs. 8B and 8C. In turn, the processing device 200 is configured to identify whether a particular pixel belongs to a shadowed or a sunlit portion and/or whether a particular area is a sunlit or shadowed portion. Pixels belonging to a shadowed portion and/or areas identified as a shadowed portion may be discarded and therefore not used for subsequent calculations (e.g., calculating the biomass value of a particular paddock) and/or representation on the graphical user interface. Those skilled in the art will appreciate that the configuration/operation of the processing device 200 described in relation to Figs. 8A-8D is an exemplary implementation and is not limiting. For example, the processing device 200 may be configured to operate with more than one threshold for the biomass and luminance values. A plurality of thresholds, values, ranges of values, or categories may be defined and used for the calculations instead of using the binary classification for the pixels. In addition, or alternatively, these thresholds, ranges, or categories may be updated and/or adjusted automatically and/or manually by an operator (e.g., the end user) at any suitable time.
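The two bit masks of Figs. 8B and 8C and their combination can be sketched as follows; the Rec. 709 luma weights are one common luminance approximation, and both thresholds are illustrative assumptions.

import numpy as np

def shadow_mask(nir, red, green, blue,
                t_biomass: float = 2.0, t_luminance: float = 0.15):
    """Combine a biomass bit mask (Fig. 8B) and a luminance bit mask
    (Fig. 8C) to flag shadowed pixels (Fig. 8D)."""
    high_biomass = (nir / (green + 1e-6)) > t_biomass          # Fig. 8B mask
    luminance = 0.2126 * red + 0.7152 * green + 0.0722 * blue  # Rec. 709 luma
    high_luminance = luminance > t_luminance                   # Fig. 8C mask
    return ~high_biomass & ~high_luminance  # low biomass AND low luminance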
[0098] Further, the processing device 200 is configured to identify other portions of the paddocks/farm depending on the combination of biomass and luminance values. Fig. 8D shows area 817 identified as a sunlit portion having low pasture and area 818 identified as a sunlit portion having high pasture. Those skilled in the art will appreciate that any other suitable categories may be provided and programmed in a memory of the processing device 200 in order to identify different areas of the processed images.
[0099] Figs. 9A-9B illustrate two different visual representations that can be obtained based on the image processing detailed hereinabove in relation to Figs. 8A to 8D. Fig. 9A is similar to Fig. 8D but at a farm level and shows portions of the paddocks identified as sunlit pasture portions 917, 918 in light grey as well as portions of the paddocks identified as shadowed portions 916 in white. As explained hereinabove in relation to step 333 of Fig. 3B, the processing device 200 may be configured to discard some or all of the metadata associated with the shadowed portions 916. The biomass and updated biomass values may then be calculated at steps 334 and 335 of Fig. 3B using the remaining data and metadata corresponding to the paddock portions 917, 918.
[00100] The process then proceeds to step 340 of Fig. 3A (and/or steps 341 and 342 of Fig. 3C) where a management zone map is generated and one or more elements of the management zone map are displayed and/or transmitted to a client device 150 for display. In the embodiment illustrated in Figs. 8A-8D and 9A, the processing device 200 may be able to generate a biomass value representation comprising the shadowed portions of the paddocks/farm. Such a biomass value representation is depicted in Fig. 9B. In addition, the management zone map may further comprise the visual representation of Fig. 9A such that the end user can understand in a timely and efficient manner which parts or portions of the paddocks/farm have been included in the calculations.
[00101] It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination.
[00102] It will be appreciated by persons skilled in the art that the present invention is not limited by what has been particularly shown and described hereinabove. Rather, the scope of the invention is defined by the appended claims and equivalents thereof.

Claims

WHAT IS CLAIMED IS:
1. A method implemented on a computing device for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the method comprising:
receiving data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm;
processing the satellite data using at least the georeferenced map; and
generating a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
2. The method of claim 1, wherein the graphical user interface further comprises a feed wedge and/or biomass values for each or some of the one or more paddocks of the farm.
3. The method of any one of claim 1 or 2, further comprising rendering the generated graphical user interface on a display screen associated with the computing device for use by an end user.
4. The method of any one of the preceding claims, wherein the receiving comprises receiving ground data and wherein the processing comprises processing the satellite data using the georeferenced map and the ground data.
5. The method of any one of the preceding claims, wherein the processing comprises identifying at least one of the one or more paddocks currently being grazed by livestock and wherein the generating comprises providing a visual identification of the identified paddock(s) on the graphical user interface.
6. The method of any one of the preceding claims, wherein the processing comprises processing the satellite data to correct a solar angle and a reflectance influence.
7. The method of any one of the preceding claims, wherein the processing comprises overlaying the georeferenced map onto the satellite data to identify a portion of the satellite data corresponding to the farm.
8. The method of claim 7, wherein the processing comprises processing the identified portion of the satellite data corresponding to the farm to identify sunlit and/or shadowed portions.
9. The method of claim 8, wherein the sunlit and/or shadowed portions are identified using biomass and luminance values.
10. The method of claim 9, wherein the shadowed portions are identified as portions of the image corresponding to the farm having low biomass and luminance values.
11. The method of any one of claims 8-10, wherein the processing further comprises discarding data associated with the identified shadowed portions.
12. The method of any one of claims 7-11, wherein the processing comprises determining a biomass value for each pixel of the portion of the satellite data corresponding to the farm.
13. The method of claim 12 when dependent on any one of claims 8-11, wherein the processing comprises determining a biomass value for each pixel corresponding to the identified sunlit portions.
14. The method of any one of claim 12 or 13, wherein the biomass value of a pixel is defined by a ratio between a near infrared data spectral value and a green spectral data value.
15. The method of any one of claims 12-14, wherein the processing comprises determining an updated biomass value for each pixel by applying a linear regression model.
16. The method of any one of claims 12-15, wherein the processing comprises determining a biomass value for each or some of the one or more paddocks using the determined biomass value or the updated biomass value for each pixel.
17. The method of claim 16, wherein the biomass value of a particular paddock is determined by averaging the determined biomass value or the updated biomass value of each pixel belonging to the particular paddock.
18. The method of any one of claims 16-17 when dependent on claim 2, wherein the generating comprises generating the feed wedge and/or the biomass values of the graphical user interface using the determined biomass values for each or some of the one or more paddocks.
19. The method of any one of the preceding claims, wherein the generated graphical user interface comprises one or more visual indications identifying shadowed portions of the one or more paddocks.
20. The method of any one of claims 2-18, wherein the generated graphical interface comprises the feed wedge and/or the biomass values of each or some of the one or more paddocks that do not comprise shadowed portions.
21. The method of any one of the preceding claims, wherein the satellite data comprises one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; a NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
22. The method of any one of claims 4-21, wherein the ground data comprises one or more of: soil data; crop data; and climate data.
23. A system for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the system comprising:
a source of satellite data comprising one or more ground images or tiles and associated metadata;
a source of ground data; and,
a computing device comprising at least one memory and at least one processor that use the satellite data and ground data to generate a graphical user interface, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
24. The system of claim 23, wherein the computing device is operative to identify paddocks currently being grazed by livestock and to generate a graphical user interface comprising one or more visual indications for the identified paddocks.
25. The system of any one of claim 23 or 24, wherein the computing device is further operative to identify sunlit and/or shadowed portions of the one or more paddocks, and to generate a graphical user interface comprising one or more visual indications indicating the shadowed portions.
26. A computing device for visualizing an amount of pasture available for grazing by livestock in a farm including one or more paddocks, the computing device comprising: at least one processor;
memory accessible by each of the at least one processor; and
instructions stored in the memory that direct the at least one processor to:
receive data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm;
process the satellite data using at least the georeferenced map; and
generate a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
27. The computing device of claim 26, wherein the graphical user interface further comprises a feed wedge and/or biomass values for each or some of the one or more paddocks of the farm.
28. The computing device of any one of claim 26 or 27, further being associated to a display and wherein the instructions stored in the memory further direct the at least one processor to render the generated graphical user interface on the display.
29. The computing device of any one of claims 26-28, wherein the instructions stored in the memory further direct the at least one processor to: receive ground data; and process the satellite data using the georeferenced map and the ground data.
30. The computing device of any one of the claims 26-29, wherein the instructions stored in the memory further direct the at least one processor to: identify at least one of the one or more paddocks currently being grazed by livestock; and generate a visual identification of the identified paddock(s) on the graphical user interface.
31. The computing device of any one of claims 26-30, wherein the instructions stored in the memory further direct the at least one processor to process the satellite data to correct a solar angle and a reflectance influence.
32. The computing device of any one of claims 26-31, wherein the instructions stored in the memory further direct the at least one processor to overlay the georeferenced map onto the satellite data to identify a portion of the satellite data corresponding to the farm.
33. The computing device of claim 32, wherein the instructions stored in the memory further direct the at least one processor to process the identified portion of the satellite data corresponding to the farm to identify sunlit and/or shadowed portions.
34. The computing device of claim 33, wherein the sunlit and/or shadowed portions are identified using biomass and luminance values.
35. The computing device of claim 34, wherein the shadowed portions are identified as portions of the image corresponding to the farm having low biomass and luminance values.
36. The computing device of any one of claims 33-35, wherein the instructions stored in the memory further direct the at least one processor to discard data associated with the identified shadowed portions.
37. The computing device of any one of claims 32-36, wherein the instructions stored in the memory further direct the at least one processor to determine a biomass value for each pixel of the portion of the satellite data corresponding to the farm.
38. The computing device of claim 37 when dependent on any one of claims 33-37, wherein the instructions stored in the memory further direct the at least one processor to determine a biomass value for each pixel corresponding to the identified sunlit portions.
39. The computing device of any one of claim 37 or 38, wherein the biomass value of a pixel is defined by a ratio between a near infrared data spectral value and a green spectral data value.
40. The computing device of any one of claims 37-39, wherein the instructions stored in the memory further direct the at least one processor to determine an updated biomass value for each pixel by applying a linear regression model.
41. The computing device of any one of claims 37-40, wherein the instructions stored in the memory further direct the at least one processor to determine a biomass value for each or some of the one or more paddocks using the determined biomass value or the updated biomass value for each pixel.
42. The computing device of claim 41, wherein the biomass value of a particular paddock is determined by averaging the determined biomass value or the updated biomass value of each pixel belonging to the particular paddock.
43. The computing device of any one of claim 41 or 42 when dependent on claim 27, wherein the instructions stored in the memory further direct the at least one processor to generate the feed wedge and/or the biomass values of the graphical user interface using the determined biomass values for each or some of the one or more paddocks.
44. The computing device of any one of the claims 26-43, wherein the generated graphical user interface comprises one or more visual indications identifying shadowed portions of the one or more paddocks.
45. The computing device of any one of claims 27-43, wherein the generated graphical interface comprises the feed wedge and/or the biomass values of each or some of the one or more paddocks that do not comprise shadowed portions.
46. The computing device of any one of the claims 26-45, wherein the satellite data comprises one or more of: one or more ground images; one or more ground image tiles; a time at which an image or tile was captured; a date at which an image or tile was captured; geolocation information; a projection; an altitude for each pixel of an image or tile; RGB (Red, Green, and Blue) spectral data values for each pixel of an image or tile; a NIR (Near Infra-Red) spectral data value for each pixel of an image or tile.
47. The computing device of any one of claims 29-46, wherein the ground data comprises one or more of: soil data; crop data; and climate data.
48. One or more computer readable tangible storage media encoded with software comprising computer executable instructions and, when the software is executed, the software is operable to:
receive data relevant to the particular farm, the data comprising at least satellite data of the farm and a georeferenced map identifying the one or more paddocks of the farm; process the satellite data using at least the georeferenced map; and
generate a graphical user interface using the processed data, the graphical user interface comprising at least one biomass value representation of the farm, the biomass value representation being indicative of an amount of pasture available for grazing by livestock for each or some of the one or more paddocks of the farm.
PCT/NZ2018/050128 2017-10-17 2018-09-26 Pasture satellite measurements WO2019078733A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762573347P 2017-10-17 2017-10-17
US62/573,347 2017-10-17

Publications (1)

Publication Number Publication Date
WO2019078733A1 true WO2019078733A1 (en) 2019-04-25

Family

ID=66173418

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/NZ2018/050128 WO2019078733A1 (en) 2017-10-17 2018-09-26 Pasture satellite measurements

Country Status (1)

Country Link
WO (1) WO2019078733A1 (en)



Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120101861A1 (en) * 2010-10-25 2012-04-26 Lindores Robert J Wide-area agricultural monitoring and prediction
US20160309646A1 (en) * 2015-04-24 2016-10-27 360 Yield Center, Llc Agronomic systems, methods and apparatuses

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
CLARK, D ET AL.: "PASTURE MONITORING FROM SPACE", PROCEEDINGS OF THE SOUTH ISLAND DAIRY EVENT, 2006, pages 108 - 123, XP055597077, Retrieved from the Internet <URL:https://side.org.nz/wp-content/uploads/2014/05/3.1-PASTURE-MONITORING-FROM-SPACE.doc> [retrieved on 20181219] *
PASTURES FROM SPACE - FEED-ON-OFFER AT THE PADDOCK LEVEL, 8 April 2016 (2016-04-08), XP055597083, Retrieved from the Internet <URL:https://web.archive.org/web/20160408103316/http://www.pasturesfromspace.csiro.au/Feed-On-Offer.htm> [retrieved on 20181219] *
PUNALEKAR, S M ET AL.: "Application of Sentinel-2A data for pasture biomass monitoring using a physically based radiative transfer model", REMOTE SENSING OF ENVIRONMENT, vol. 218, 2018, pages 207 - 220, XP055597098, Retrieved from the Internet <URL:https://www.sciencedirect.com/science/article/pii/S0034425718304486> [retrieved on 20181219] *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111209821A (en) * 2019-12-30 2020-05-29 首都师范大学 Public-oriented grassland grass growth multispectral imaging monitoring handheld system and method
CN111209821B (en) * 2019-12-30 2022-08-12 首都师范大学 Popular grassland grass growth multispectral imaging monitoring handheld system and method
WO2022007117A1 (en) * 2020-07-06 2022-01-13 周爱丽 Livestock distribution density real-time measurement platform and method
WO2022087702A1 (en) * 2020-10-27 2022-05-05 Cunha Rafael Carvalho Da Device for measuring pasture grass biomass quality and quantity using information from the field and information from satellite vegetation indices
WO2022115916A1 (en) * 2020-12-04 2022-06-09 Finchain.Ai Pty Ltd Livestock monitoring and management
EP4245133A1 (en) * 2022-03-16 2023-09-20 Digitanimal, S.L. Characterization of pasture for improved and sustainable grazing and feeding management of livestock
WO2023175095A1 (en) * 2022-03-16 2023-09-21 Digitanimal, S.L. Characterization of pasture for improved and sustainable grazing and feeding management of livestock


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18868694

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 17/08/2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18868694

Country of ref document: EP

Kind code of ref document: A1