US12165222B2 - Imagery-based boundary identification for agricultural fields - Google Patents

Imagery-based boundary identification for agricultural fields

Info

Publication number
US12165222B2
US12165222B2 (application US17/681,126; US202217681126A)
Authority
US
United States
Prior art keywords
rasters
index
time series
raster
geographic region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active, expires
Application number
US17/681,126
Other versions
US20220180526A1 (en)
Inventor
Bobby Harold Braswell
Tina A. Cormier
Damien Sulla-Menashe
Keith Frederick Ma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Indigo Ag Inc
Original Assignee
Indigo Ag Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Indigo Ag Inc filed Critical Indigo Ag Inc
Priority to US17/681,126 priority Critical patent/US12165222B2/en
Publication of US20220180526A1 publication Critical patent/US20220180526A1/en
Assigned to INDIGO AG, INC. reassignment INDIGO AG, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Cormier, Tina A., Ma, Keith Frederick, SULLA-MENASHE, Damien, BRASWELL, Bobby Harold
Assigned to CORTLAND CAPITAL MARKET SERVICES LLC, AS AGENT reassignment CORTLAND CAPITAL MARKET SERVICES LLC, AS AGENT SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: INDIGO AG, INC., INDIGO AGRICULTURE, INC.
Assigned to INDIGO AG, INC., INDIGO AGRICULTURE, INC. reassignment INDIGO AG, INC. RELEASE BY SECURED PARTY (SEE DOCUMENT FOR DETAILS). Assignors: CORTLAND CAPITAL MARKET SERVICES LLC
Application granted granted Critical
Publication of US12165222B2 publication Critical patent/US12165222B2/en
Active legal-status Critical Current
Adjusted expiration legal-status Critical

Classifications

    • G PHYSICS
        • G06 COMPUTING OR CALCULATING; COUNTING
            • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
                • G06Q 10/00 Administration; Management
                    • G06Q 10/06 Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
                • G06Q 50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
                    • G06Q 50/02 Agriculture; Fishing; Forestry; Mining
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 5/00 Image enhancement or restoration
                    • G06T 5/70 Denoising; Smoothing
                • G06T 7/00 Image analysis
                    • G06T 7/10 Segmentation; Edge detection
                        • G06T 7/11 Region-based segmentation
                        • G06T 7/13 Edge detection
                        • G06T 7/162 Segmentation; Edge detection involving graph-based methods
                        • G06T 7/174 Segmentation; Edge detection involving the use of two or more images
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/10 Image acquisition modality
                        • G06T 2207/10032 Satellite or aerial image; Remote sensing
                        • G06T 2207/10036 Multispectral image; Hyperspectral image
                    • G06T 2207/30 Subject of image; Context of image processing
                        • G06T 2207/30181 Earth observation
                        • G06T 2207/30188 Vegetation; Agriculture
            • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
                • G06V 10/00 Arrangements for image or video recognition or understanding
                    • G06V 10/20 Image preprocessing
                        • G06V 10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
                        • G06V 10/30 Noise filtering
                        • G06V 10/32 Normalisation of the pattern dimensions
                        • G06V 10/34 Smoothing or thinning of the pattern; Morphological operations; Skeletonisation
                    • G06V 10/40 Extraction of image or video features
                        • G06V 10/60 Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model
                • G06V 20/00 Scenes; Scene-specific elements
                    • G06V 20/10 Terrestrial scenes
                        • G06V 20/188 Vegetation
    • A HUMAN NECESSITIES
        • A01 AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
            • A01B SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
                • A01B 79/00 Methods for working soil
                    • A01B 79/005 Precision agriculture

Definitions

  • Embodiments of the present disclosure relate to remote sensing, and more specifically, to imagery-based boundary identification for agricultural fields.
  • a time series of surface reflectance rasters for a geographic region is received.
  • at least one index raster is determined, yielding at least one time series of index rasters.
  • the at least one time series of index rasters is divided into a plurality of consecutive time windows.
  • the at least one time series of index rasters is composited within each of the plurality of time windows, yielding a composite index raster for each of the at least one time series of index rasters in each of the plurality of time windows.
  • the composite index rasters are segmented into a plurality of spatially compact regions of the geographic region.
  • a plurality of polygons is generated from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
  • the time series of surface reflectance rasters comprises satellite data. In some embodiments, the time series of surface reflectance rasters spans a growing season in the geographic region. In some embodiments, receiving the time series of surface reflectance rasters comprises determining surface reflectance from uncorrected reflectance data.
  • the at least one index raster comprises a normalized difference vegetation index raster. In some embodiments, the at least one index raster comprises a land surface water index raster. In some embodiments, the at least one index raster comprises a mean brightness raster. In some embodiments, determining the at least one index raster comprises downsampling the surface reflectance rasters.
  • the plurality of consecutive time windows correspond to early, mid-, and late phases of a growing season in the geographic region.
  • compositing comprises averaging the at least one time series of index rasters within each of the plurality of time windows.
  • segmenting comprises filling missing pixels in the composite index rasters. In some embodiments, filling missing pixels comprises applying linear interpolation to the composite index rasters. In some embodiments, segmenting comprises normalizing the composite index rasters. In some embodiments, segmenting comprises denoising. In some embodiments, denoising comprises applying a spatial low pass filter. In some embodiments, segmenting comprises graph-based segmentation. In some embodiments, segmenting comprises Felzenszwalb segmentation.
  • generating the plurality of polygons comprises applying spatial smoothing to the plurality of spatially compact regions.
  • FIG. 1 illustrates a field delineation pipeline according to embodiments of the present disclosure.
  • FIG. 2 illustrates an image preprocessing method according to embodiments of the present disclosure.
  • FIG. 3 illustrates an image segmentation method according to embodiments of the present disclosure.
  • FIGS. 4A-B illustrate exemplary image segmentations according to embodiments of the present disclosure.
  • FIG. 5 illustrates a segmentation postprocessing method according to embodiments of the present disclosure.
  • FIGS. 6A-B illustrate exemplary image segmentations according to embodiments of the present disclosure.
  • FIGS. 7A-B illustrate exemplary field boundaries according to embodiments of the present disclosure.
  • FIG. 8 illustrates a method for agricultural field boundary identification according to embodiments of the present disclosure.
  • FIG. 9 depicts a computing node according to an embodiment of the present disclosure.
  • Farm fields represent a fundamental spatial unit of agriculture. Observation, analysis, and modeling of agricultural characteristics at the farm field level require specification of their geographic boundaries. However, there is no standard reference data set for farm field boundaries.
  • a field boundary refers to a spatially compact (that is, closed and bounded) unit of the landscape that exhibits an approximately uniform temporal and spectral pattern for a given growing season.
  • a field boundary delineates a bounded area with a common crop type and management.
  • a field boundary may differ from what visual inspection of a single image might suggest. For example, visual inspection is likely to be strongly affected by the presence of roads, streams, paths, or other such boundaries, irrespective of whether a crop was actually grown on the land in question.
  • Various image processing algorithms may be used to partition images into coherent spatial units (segmentation) based on detection of boundaries in the image.
  • satellite data at sufficient spatial resolution may be used to create farm field boundaries.
  • a given static image does not contain sufficient information to produce an accurate boundary.
  • two adjacent fields might appear as one field in an image.
  • satellite data generally contain multiple spectral reflectance bands, which can be combined algebraically to produce indices, but no single index is guaranteed to provide the distinguishing power to resolve between fields. Even if satellite information alone were sufficient to delineate fields, a general segmentation approach would delineate not fields alone, but all image content with a discernible boundary.
  • the present disclosure provides systems, methods, and computer program products for automated identification of field boundaries in remote sensing imagery that exploits spectral, temporal, and spatial patterns in the data to create a geospatial data set (e.g., polygonal features) indicative of field boundaries.
  • the field delineation pipeline includes three sequential steps: preprocessing 101, in which precursor images are created to enable an accurate characterization of boundaries; segmentation 102, in which an appropriate combination of filters is applied to the imagery prior to segmentation; and post-processing 103, in which contextual and geometric screening is performed to remove boundaries that are determined not to be fields.
  • Referring to FIG. 2, an exemplary preprocessing method is illustrated according to embodiments of the present disclosure. It will be appreciated that the quality and particular characteristics of the input imagery used to create boundaries are important to accurate results from the delineation algorithms provided herein.
  • Remote sensing data are retrieved from one or more datastores 201.
  • remote sensing data comprise satellite data including surface reflectance at a plurality of resolutions, at a plurality of times.
  • the datastore is the NASA Harmonized Landsat and Sentinel-2 (HLS) product archive. HLS takes advantage of the complementary overpass times of Landsat and Sentinel-2 to provide denser coverage in time, but with uniform radiometric and geospatial characteristics.
  • datastore 201 contains uncorrected reflectance data, which is converted to surface reflectance prior to use (e.g., by cloud mapping and atmospheric correction). It will be appreciated, however, that a variety of alternative satellite systems are suitable for providing data as set out herein.
  • remote sensing data are fetched and stored in a local cache 202 for further processing. It will be appreciated, however, that in some embodiments, data may be read directly from a local datastore, or may be streamed directly from a remote data store without the need for local caching.
  • the Geospatial Intelligence Production Solution is used for data retrieval.
  • GIPS is an open source solution that provides a uniform interface to a wide range of satellite, weather, and other geospatial data sources.
  • It will be appreciated that a variety of alternative APIs and platforms may be used to retrieve suitable satellite data.
  • the remote sensing data is processed to compute 203 one or more indices 204 for each point in time for which data is available at each pixel of the input images.
  • the indices are computed from surface reflectance images (e.g., from HLS), and in some embodiments comprise the normalized difference vegetation index (NDVI), the land surface water index (LSWI), and mean brightness (BRGT).
  • These three indices represent the three principal axes of variability of optical data, and may be referred to as greenness, wetness, and brightness.
  • each of the three indices contains a plurality of snapshots in time. Each snapshot is a raster, or image, whose pixel intensity indicates the index value.
  • different indices are selected, resulting in a different number of bands.
  • the brightness band described above is omitted.
  • Brightness, greenness, and wetness are generally the most dominant modes of variability for optical remote sensing bands.
  • Enhanced Vegetation Index (EVI) or EVI2 may be used in place of NDVI.
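  • The index computations above can be expressed in a few lines. The following is a minimal sketch, assuming the surface reflectance bands are available as 2-D float arrays; the exact band combination for BRGT is not specified in the disclosure, so the mean of the visible bands is used here as an assumption:

```python
import numpy as np

def compute_indices(red, green, blue, nir, swir):
    """Compute greenness, wetness, and brightness rasters from
    surface reflectance bands (each a 2-D float array in [0, 1])."""
    eps = 1e-9  # guard against division by zero over dark pixels
    ndvi = (nir - red) / (nir + red + eps)    # greenness
    lswi = (nir - swir) / (nir + swir + eps)  # wetness
    brgt = (red + green + blue) / 3.0         # brightness (assumed: visible-band mean)
    return ndvi, lswi, brgt
```

Applied to every available acquisition date, this yields the three index time series described above.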
  • IOU (Intersection Over Union) is used herein as a measure of agreement between manually and automatically delineated fields.
  • GDF1 is used herein to refer to the mean comparison of each manually delineated field to autodelineated fields.
  • GDF2 is used herein to refer to the mean comparison of each autodelineated field to manual fields. In the present example, GDF2 IOU increased by 13%-84%, yielding a weighted mean increase of 33%. GDF1 IOU was about the same.
  • Remote sensing data may be available on an irregular schedule, for example due to orbital periods of a given constellation.
  • the HLS source images are provided irregularly in time, and may contain gaps which propagate into the indices.
  • the index images are composited 205 within pre-specified time windows, enabling delivery of a small number of high-value variables for use in the downstream algorithms. It will be appreciated that various techniques may be used to composite the source images prior to index computation. However, compositing the index images is advantageous as it reduces noise and lowers the dimensionality of the problem, thereby enabling more efficient computation.
  • the predetermined time windows correspond to phases of the growing season. In some embodiments, the time windows correspond to the early growing season, the mid-season, and the late growing season for a given crop. In an exemplary embodiment, a first window spans April and May, a second window spans June and July, and a third window spans August and September. It will be appreciated that these exemplary windows are calibrated to the northern hemisphere, and would be transposed by six months for use in the southern hemisphere. It will also be appreciated that while these windows are suitable for the continental US, they may be shortened or lengthened for certain crops at certain higher or lower latitudes.
  • a user is able to define the number of time windows, and the start and end date of each window separately. This approach allows the delineation of field boundaries for each specific season, or at multiple times within a season, capturing potential changes in the use of the land. For example, a field could be farmed in its entirety for a cash crop, then part of the field subsequently could be used for a cover crop. Similarly, different indices may be used for different conditions or different geographies.
  • compositing 205 comprises performing a temporal linear interpolation to reduce potential bias from having the distribution of measurements in time significantly different for different places.
  • linear interpolation is performed between available observations, which, due to clouds and overpass constraints, may not be evenly distributed in time. After interpolation, for each pixel, the average in time within a window is taken. In an exemplary embodiment in which three indices are assessed over three time windows, the result is a nine-band (3 indices × 3 windows) image stack 206.
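  • The interpolate-then-average compositing step can be sketched as follows. This is a minimal per-pixel sketch, assuming observation times are given in days and gaps are marked with NaN; a production implementation would vectorize the pixel loop:

```python
import numpy as np

def composite_window(times, series, t0, t1):
    """Composite one index time series within the window [t0, t1] (days).
    times: (T,) observation days; series: (T, H, W) index rasters with
    NaN where clouds or overpass gaps left no observation."""
    grid = np.arange(t0, t1 + 1)              # daily interpolation grid
    T, H, W = series.shape
    flat = series.reshape(T, H * W)
    out = np.full(H * W, np.nan)
    for p in range(H * W):
        ok = ~np.isnan(flat[:, p])
        if ok.sum() >= 2:
            # linear interpolation between available observations,
            # then the within-window mean
            out[p] = np.interp(grid, times[ok], flat[ok, p]).mean()
    return out.reshape(H, W)
```

Running this for each index and each window yields the nine-band image stack described above.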
  • the above process may be performed for a global data set, or only for certain areas of interest.
  • the resulting image stack is downsampled to a predetermined resolution in order to limit the overall storage size necessary to maintain the image stacks.
  • the target resolution is 0.15 degrees. This resolution allows for storage of a global dataset while providing sufficient resolution for further downstream processing.
  • an exemplary segmentation method is illustrated according to embodiments of the present disclosure.
  • a series of numerical image processing steps are performed.
  • the combination of pre-segmentation steps contributes to the reliability of the segmentation results.
  • gaps are filled in the available multi-temporal multi-index imagery. Even after compositing, some data sets contain residual missing pixels which must be addressed. In some embodiments, gap-filling comprises applying linear interpolation to gap fill these residual missing values. The post-compositing gaps are typically very small (1-10 pixels), making linear interpolation sufficient.
  • normalizing comprises rescaling. In some embodiments, normalizing comprises quantizing. In some embodiments, all bands are normalized by subtracting the mean and dividing the result by its standard deviation.
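  • The band-wise normalization described above (subtracting each band's mean and dividing by its standard deviation) can be sketched as:

```python
import numpy as np

def normalize_stack(stack):
    """Z-score each band of a (bands, H, W) image stack: subtract the
    band mean and divide by the band standard deviation."""
    mean = stack.mean(axis=(1, 2), keepdims=True)
    std = stack.std(axis=(1, 2), keepdims=True)
    return (stack - mean) / std
```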
  • the images are filtered.
  • filtering comprises applying a denoising filter.
  • the denoising filter is the scikit-image restoration.denoise_bilateral filter for spatial and variable-wise smoothing. This filter removes noise by applying a spatial low pass filter that does not smooth over features that appear consistently in the nine bands.
  • an edge-preserving, denoising filter is used (such as those provided by scikit-image restoration). Such filters average pixels based on their spatial closeness and radiometric similarity.
  • spatial closeness is measured by the Gaussian function of the Euclidean distance between two pixels and a configurable standard deviation value (denoted sigma_spatial in scikit-image restoration). A larger value of this standard deviation results in averaging of pixels with larger spatial differences.
  • the standard deviation value is 0.1, 0.5, 0.8, or 0.9. A value of 0.5 or lower results in the auto-delineated fields having four or more times as many polygons as manually delineated fields, which is undesirable.
  • radiometric similarity is measured by the Gaussian function of the Euclidean distance between two color values and a configurable standard deviation value (denoted sigma_color in scikit-image restoration).
  • a larger value of the standard deviation for range distance results in averaging of pixels with larger radiometric differences.
  • the image is converted using the img_as_float function (of the scikit-image library), and thus the standard deviation is with respect to the range [0, 1]. If the value is None, the standard deviation of the image is used. In various embodiments, the standard deviation value is None, 0.3, 0.5, or 0.8. In testing, changing this parameter did not change GDF1 IOU, GDF2 IOU, or the resulting number of field polygons.
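  • The denoising step can be sketched with the named scikit-image filter. For simplicity this sketch filters each band independently; the pipeline described above filters all nine bands jointly, and the sigma values are illustrative picks from the ranges discussed:

```python
import numpy as np
from skimage.restoration import denoise_bilateral

def denoise_stack(stack, sigma_color=0.3, sigma_spatial=0.9):
    """Edge-preserving denoising of a (bands, H, W) stack in [0, 1].
    Sketch: applies the bilateral filter per band; pixels are averaged
    by spatial closeness and radiometric similarity."""
    return np.stack([
        denoise_bilateral(band, sigma_color=sigma_color,
                          sigma_spatial=sigma_spatial)
        for band in stack
    ])
```

Because the filter is edge preserving, strong boundaries between fields survive while within-field speckle is averaged away.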
  • image segmentation is performed.
  • segmentation is performed by graph-based image segmentation.
  • the Felzenszwalb method for efficient graph-based image segmentation is used.
  • segmentation is implemented using the scikit-image segmentation.felzenszwalb algorithm. This exemplary algorithm creates a single layer representing raster classes with labels such that the labeled classes are spatially compact parcels of land. The resulting parcels correspond to distinct units of the landscape, ready for postprocessing to generate vector features of farm fields.
  • the observation level is configurable via a scale value.
  • Higher scale generally means fewer, larger segments. Segment size within an image can vary greatly depending on local contrast, so scale is scene dependent. However, in some embodiments a consistent scale value is applied for all scenes.
  • Exemplary scale values include 550, 600, 650, 700, and 750. Across this range, the maximum difference in GDF1 IOU is only about 1-2%. There is no clear dominant scale that produces significantly better results across all test tiles. GDF2 IOU is more strongly and inversely related to scale (~5-6% difference between 550 and 750).
  • the width (standard deviation) of a Gaussian kernel used for smoothing the image prior to segmentation is configurable. This value may be denoted sigma. Exemplary values of sigma include 0.5, 0.8, and 0.9. Across this parameter range, the maximum difference in GDF1 IOU is about 1%. However, the number of polygons varies significantly, with higher sigma (smoothing) values correlated with fewer, larger segments. It is preferable to use higher values that still yield segments that do not cross field boundaries, such as 0.9.
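  • The segmentation step can be sketched with the named scikit-image implementation. The scale and sigma defaults below are exemplary values from the ranges discussed above; min_size is an assumed minimum segment size in pixels (the disclosure does not state one):

```python
import numpy as np
from skimage.segmentation import felzenszwalb

def segment_stack(stack_hwc, scale=650, sigma=0.9, min_size=20):
    """Graph-based (Felzenszwalb) segmentation of an (H, W, bands)
    composite index stack. Returns an integer label raster whose
    labeled classes are spatially compact parcels of land."""
    return felzenszwalb(stack_hwc, scale=scale, sigma=sigma,
                        min_size=min_size)
```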
  • Referring to FIGS. 4A-B, exemplary segmentations with variable sigma values are illustrated.
  • a sigma value of 0.8 is used.
  • a sigma value of 0.9 is used. As shown, a high value results in larger contiguous segments.
  • Referring to FIG. 5, an exemplary postprocessing method is illustrated according to embodiments of the present disclosure.
  • the segmented images are polygonised to create candidate field polygons in vector format based on raster classes.
  • the segmented images are polygonised using GDAL polygonize.
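  • The polygonisation step can be illustrated with a toy, shapely-only polygoniser that unions the unit-square pixels of each raster class; production code would use GDAL polygonize (or an equivalent such as rasterio) with the raster's geotransform:

```python
import numpy as np
from shapely.geometry import box
from shapely.ops import unary_union

def polygonise(labels):
    """Toy polygoniser: for each raster class, union its pixels
    (treated as unit squares in row/column coordinates) into one
    candidate field geometry. Stand-in for GDAL polygonize."""
    polys = {}
    for lab in np.unique(labels):
        rows, cols = np.nonzero(labels == lab)
        polys[int(lab)] = unary_union(
            [box(c, r, c + 1, r + 1) for r, c in zip(rows, cols)])
    return polys
```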
  • the polygons are spatially cleaned. Spatial cleaning includes checking topological validity, fixing broken geometries (such as non-closed polygons), and removing complex shapes that are unlikely to represent all or part of a farm field.
  • complex shapes are identified by computing the area of the convex hull surrounding the polygon, divided by the area of the polygon.
  • In some embodiments, the perimeter-to-area ratio is used. Additional suitable metrics include: eccentricity (the maximum of the set of shortest distances from each vertex to all vertices in the polygon); equivalent diameter (the diameter of the smallest circle containing the polygon); and the ratio of minor axis to major axis of the smallest ellipse containing the polygon.
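  • Two of the shape metrics above, the convex-hull area ratio and the perimeter-to-area ratio, can be sketched with shapely:

```python
from shapely.geometry import Polygon

def hull_area_ratio(poly):
    """Area of the convex hull divided by the polygon area: near 1 for
    compact, field-like shapes, and large for tortuous artifacts."""
    return poly.convex_hull.area / poly.area

def perimeter_area_ratio(poly):
    """Perimeter divided by area: high for long, narrow slivers."""
    return poly.length / poly.area
```

Candidate polygons whose metrics exceed chosen thresholds can then be discarded as unlikely to represent fields.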
  • Heuristics may be applied to evaluate whether a polygon falls within the reasonable range of field geometries. For example, polygons outside of predetermined size thresholds may be discarded. Similarly, polygons with too high an aspect ratio may be discarded. Across all parameters tested, the maximum difference in GDF1 and GDF2 IOU is about 3-4%. Larger opening values result in fewer polygons; however, parts of legitimate fields may be missed. Larger values also result in rounded field edges.
  • spatial cleaning comprises applying spatial smoothing such as a buffer and reverse buffer cycle.
  • the buffer size (or opening) is configurable.
  • the buffering removes morphologically inconsistent pieces of polygons, such as long, narrow strips between fields that erroneously connect two distinct fields. The larger the opening, the more artifacts are removed, but additional edges of the remaining field boundaries are rounded. Accordingly, there is a tradeoff between boundary accuracy/fidelity and problematic artifacts.
  • Exemplary buffer size values include 1, 5, 10, and 20 meters.
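  • The buffer and reverse-buffer cycle (a morphological opening) can be sketched with shapely; the opening value is in the units of the polygon's coordinate system (metres for a projected CRS):

```python
from shapely.geometry import box
from shapely.ops import unary_union

def spatial_opening(geom, opening):
    """Buffer inward then outward ('opening'): removes strips and spurs
    narrower than about 2 * opening, at the cost of rounding corners."""
    return geom.buffer(-opening).buffer(opening)
```

For example, a narrow strip that erroneously connects two fields vanishes under the inward buffer and is not restored by the outward buffer, while the bulk of each field survives with slightly rounded corners.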
  • Referring to FIGS. 6A-B, exemplary segmentations using variable buffer sizes are shown.
  • a buffer size of 1 is used.
  • Region 601 is an example of geometry that should be removed during the spatial cleaning step.
  • FIG. 6B shows the result after spatial cleaning with a buffer size of 20. Region 601 is removed, and the remaining fields appear with rounded corners.
  • the polygons are screened to remove non-crop records.
  • screening comprises comparing the polygons against a reference layer of crop data, such as the USDA Cropland Data Layer (CDL).
  • polygons that lie outside known croplands are discarded.
  • the resulting field polygons are stored for further use, such as visualization.
  • the field polygons are organized in tiles for efficient retrieval of relevant data for a given problem.
  • the field polygons are stored with additional metadata, such as a historical crop type or other attributes derived from remote sensing data or drawn from additional data layers.
  • Referring to FIGS. 7A-B, exemplary field boundaries are illustrated according to an embodiment of the present disclosure.
  • FIG. 7A shows a multitemporal/multispectral image tile such as would result from the preprocessing stage described above.
  • FIG. 7B shows the same image tile with automatically delineated fields superimposed.
  • a time series of surface reflectance rasters for a geographic region is received.
  • at least one index raster is determined, yielding at least one time series of index rasters.
  • the at least one time series of index rasters is divided into a plurality of consecutive time windows.
  • the at least one time series of index rasters is composited within each of the plurality of time windows, yielding a composite index raster for each of the at least one time series of index rasters in each of the plurality of time windows.
  • the composite index rasters are segmented into a plurality of spatially compact regions of the geographic region.
  • a plurality of polygons is generated from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
  • computing node 10 is only one example of a suitable computing node and is not intended to suggest any limitation as to the scope of use or functionality of embodiments described herein. Regardless, computing node 10 is capable of being implemented and/or performing any of the functionality set forth hereinabove.
  • In computing node 10 there is a computer system/server 12, which is operational with numerous other general-purpose or special-purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 12 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.
  • Computer system/server 12 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system.
  • program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types.
  • Computer system/server 12 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer system storage media including memory storage devices.
  • computer system/server 12 in computing node 10 is shown in the form of a general-purpose computing device.
  • the components of computer system/server 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including system memory 28 to processor 16.
  • Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures.
  • bus architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Peripheral Component Interconnect (PCI) bus, Peripheral Component Interconnect Express (PCIe), and Advanced Microcontroller Bus Architecture (AMBA).
  • Computer system/server 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 12, and it includes both volatile and non-volatile media, removable and non-removable media.
  • System memory 28 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32 .
  • Computer system/server 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media.
  • Storage system 34 can be provided for reading from and writing to non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”).
  • A magnetic disk drive can be provided for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”).
  • An optical disk drive can be provided for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media.
  • Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the disclosure.
  • Program/utility 40, having a set (at least one) of program modules 42, may be stored in memory 28, by way of example and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment.
  • Program modules 42 generally carry out the functions and/or methodologies of embodiments as described herein.
  • Computer system/server 12 may also communicate with one or more external devices 14 such as a keyboard, a pointing device, a display 24, etc.; one or more devices that enable a user to interact with computer system/server 12; and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 12 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 22. Still yet, computer system/server 12 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 20.
  • Network adapter 20 communicates with the other components of computer system/server 12 via bus 18.
  • It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer system/server 12. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
  • The present disclosure may be embodied as a system, a method, and/or a computer program product.
  • The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
  • The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • A computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages.
  • The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server.
  • The remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • Electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • Each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • The functions noted in the block may occur out of the order noted in the figures.
  • Two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Business, Economics & Management (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Strategic Management (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Tourism & Hospitality (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Primary Health Care (AREA)
  • Mining & Mineral Resources (AREA)
  • Marine Sciences & Fisheries (AREA)
  • Animal Husbandry (AREA)
  • Agronomy & Crop Science (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Game Theory and Decision Science (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Development Economics (AREA)
  • Educational Administration (AREA)
  • Environmental Sciences (AREA)
  • Image Processing (AREA)

Abstract

Imagery-based boundary identification for agricultural fields is provided. In various embodiments, a time series of surface reflectance rasters for a geographic region is received. For each of the surface reflectance rasters, at least one index raster is determined, yielding at least one time series of index rasters. The at least one time series of index rasters is divided into a plurality of consecutive time windows. The at least one time series of index rasters is composited within each of the plurality of time windows, yielding a composite index raster for each of the at least one time series of index rasters in each of the plurality of time windows. The composite index rasters are segmented into a plurality of spatially compact regions of the geographic region. A plurality of polygons is generated from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is a continuation of International Application No. PCT/US2020/048188, filed Aug. 27, 2020, which claims the benefit of U.S. Provisional Application No. 62/892,110, filed Aug. 27, 2019, each of which is hereby incorporated by reference in its entirety.
BACKGROUND
Embodiments of the present disclosure relate to remote sensing, and more specifically, to imagery-based boundary identification for agricultural fields.
BRIEF SUMMARY
According to embodiments of the present disclosure, methods of and computer program products for agricultural field boundary identification are provided. A time series of surface reflectance rasters for a geographic region is received. For each of the surface reflectance rasters, at least one index raster is determined, yielding at least one time series of index rasters. The at least one time series of index rasters is divided into a plurality of consecutive time windows. The at least one time series of index rasters is composited within each of the plurality of time windows, yielding a composite index raster for each of the at least one time series of index rasters in each of the plurality of time windows. The composite index rasters are segmented into a plurality of spatially compact regions of the geographic region. A plurality of polygons is generated from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
In some embodiments, the time series of surface reflectance rasters comprises satellite data. In some embodiments, the time series of surface reflectance rasters spans a growing season in the geographic region. In some embodiments, receiving the time series of surface reflectance rasters comprises determining surface reflectance from uncorrected reflectance data.
In some embodiments, the at least one index raster comprises a normalized difference vegetation index raster. In some embodiments, the at least one index raster comprises a land surface water index raster. In some embodiments, the at least one index raster comprises a mean brightness raster. In some embodiments, determining the at least one index raster comprises downsampling the surface reflectance rasters.
In some embodiments, the plurality of consecutive time windows correspond to early, mid-, and late phases of a growing season in the geographic region. In some embodiments, compositing comprises averaging the at least one time series of index rasters within each of the plurality of time windows.
In some embodiments, segmenting comprises filling missing pixels in the composite index rasters. In some embodiments, filling missing pixels comprises applying linear interpolation to the composite index rasters. In some embodiments, segmenting comprises normalizing the composite index rasters. In some embodiments, segmenting comprises denoising. In some embodiments, denoising comprises applying a spatial low pass filter. In some embodiments, segmenting comprises graph-based segmentation. In some embodiments, segmenting comprises Felzenszwalb segmentation.
In some embodiments, generating the plurality of polygons comprises applying spatial smoothing to the plurality of spatially compact regions.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
FIG. 1 illustrates a field delineation pipeline according to embodiments of the present disclosure.
FIG. 2 illustrates an image preprocessing method according to embodiments of the present disclosure.
FIG. 3 illustrates an image segmentation method according to embodiments of the present disclosure.
FIGS. 4A-B illustrate exemplary image segmentations according to embodiments of the present disclosure.
FIG. 5 illustrates a segmentation postprocessing method according to embodiments of the present disclosure.
FIGS. 6A-B illustrate exemplary image segmentations according to embodiments of the present disclosure.
FIGS. 7A-B illustrate exemplary field boundaries according to embodiments of the present disclosure.
FIG. 8 illustrates a method for agricultural field boundary identification according to embodiments of the present disclosure.
FIG. 9 depicts a computing node according to an embodiment of the present disclosure.
DETAILED DESCRIPTION
Farm fields represent a fundamental spatial unit of agriculture. Observation, analysis, and modeling of agricultural characteristics at the farm field level require specification of field geographic boundaries. However, there is no standard reference data set for farm field boundaries.
While field boundaries can be created by manual digitization, whether by on-site data collection or manual markup of aerial or satellite imagery coupled with on-site verification, this process is not scalable to large areas. Individual US counties can contain thousands of fields, and the US as a whole contains millions of fields. Manually digitizing hundreds or thousands of fields would be prohibitively time consuming. Moreover, agricultural activity often shifts from season to season, utilizing different parts of a given field. Accordingly, the geographic definition of the field must be allowed to change over time, putting a temporal constraint on the validity of a given field identification.
As used herein, a field boundary refers to a spatially compact (that is, closed and bounded) unit of the landscape that exhibits an approximately uniform temporal and spectral pattern for a given growing season. In particular, a field boundary delineates a bounded area with common crop type and management. A field boundary may differ from what visual inspection of a single image might suggest. For example, visual inspection is likely to be strongly affected by the presence of roads, streams, paths, or other such boundaries, irrespective of whether a crop was actually grown on the land in question.
Various image processing algorithms may be used to partition images into coherent spatial units (segmentation) based on detection of boundaries in the image. In this way, satellite data at sufficient spatial resolution may be used to create farm field boundaries. However, a given static image does not contain sufficient information to produce an accurate boundary. For example, at any given time, two adjacent fields might appear as one field in an image. Similarly, satellite data generally contain multiple spectral reflectance bands, which can be combined algebraically to produce indices, but no single index is guaranteed to provide the distinguishing power to resolve between fields. Even if satellite information alone were sufficient to delineate fields, a general segmentation approach would not delineate fields alone, but all image content with a discernible boundary.
Accordingly, there is a need for field boundary detection approaches that leverage spectral, temporal, and spatial information in remote sensing imagery to create high confidence boundaries suitable for downstream processes, free of non-field boundaries.
The present disclosure provides systems, methods, and computer program products for automated identification of field boundaries in remote sensing imagery that exploits spectral, temporal, and spatial patterns in the data to create a geospatial data set (e.g., polygonal features) indicative of field boundaries.
With reference now to FIG. 1 , an exemplary field delineation pipeline is illustrated according to embodiments of the present disclosure. In various embodiments, the field delineation pipeline includes three sequential steps: preprocessing 101, in which precursor images are created to enable an accurate characterization of boundaries; segmentation 102, in which an appropriate combination of filters is applied to the imagery prior to segmentation; and post-processing 103, in which contextual and geometric screening is performed to remove boundaries that are determined to not be fields.
Referring to FIG. 2, an exemplary preprocessing method is illustrated according to embodiments of the present disclosure. It will be appreciated that the quality and particular characteristics of the input imagery used to create boundaries are important to accurate results from the delineation algorithms provided herein.
Remote sensing data are retrieved from one or more datastore 201. In various embodiments, remote sensing data comprise satellite data including surface reflectance at a plurality of resolutions, at a plurality of times. In some embodiments, the datastore is the NASA Harmonized Landsat and Sentinel-2 (HLS) product archive. HLS takes advantage of the complementary overpass times of Landsat and Sentinel-2 to provide denser coverage in time, but with uniform radiometric and geospatial characteristics. In some embodiments, datastore 201 contains uncorrected reflectance data, which is converted to surface reflectance prior to use (e.g., by cloud mapping and atmospheric correction). It will be appreciated, however, that a variety of alternative satellite systems are suitable for providing data as set out herein.
In various embodiments, remote sensing data are fetched and stored in a local cache 202 for further processing. It will be appreciated, however, that in some embodiments, data may be read directly from a local datastore, or may be streamed directly from a remote data store without the need for local caching.
In some embodiments, the Geospatial Intelligence Production Solution (GIPS) is used for data retrieval. GIPS is an open source solution that provides a uniform interface to a wide range of satellite, weather, and other geospatial data sources. However, it will be appreciated that a variety of alternative APIs and platforms may be used to retrieve suitable satellite data.
The remote sensing data is processed to compute 203 one or more indices 204 for each point in time for which data is available at each pixel of the input images. In some embodiments, surface reflectance images (e.g., from HLS) are processed to create a three-band product consisting of normalized difference vegetation index (NDVI), land surface water index (LSWI), and mean brightness (BRGT). These three indices represent the three principal axes of variability of optical data, and may be referred to as greenness, wetness, and brightness. In the example shown, each of three indices contains a plurality of snapshots in time. Each snapshot is a raster, or image, whose pixel intensity indicates the index value.
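The three-index computation at 203-204 can be sketched as follows. NDVI and LSWI follow their standard normalized-difference definitions; the text does not pin down an exact brightness formula, so taking BRGT as the mean of the visible and near-infrared reflectances is an assumption here.

```python
import numpy as np

def compute_indices(blue, green, red, nir, swir):
    """Build the three-band greenness/wetness/brightness product from
    surface reflectance rasters (values in [0, 1])."""
    eps = 1e-9  # avoid division by zero over masked or dark pixels
    ndvi = (nir - red) / (nir + red + eps)    # greenness
    lswi = (nir - swir) / (nir + swir + eps)  # wetness
    # BRGT taken as mean visible/NIR reflectance (assumed formula)
    brgt = np.mean(np.stack([blue, green, red, nir]), axis=0)
    return np.stack([ndvi, lswi, brgt])

# toy 2x2 surface reflectance rasters
blue = np.full((2, 2), 0.08)
green = np.full((2, 2), 0.12)
red = np.array([[0.10, 0.20], [0.05, 0.30]])
nir = np.array([[0.40, 0.25], [0.45, 0.35]])
swir = np.array([[0.20, 0.22], [0.15, 0.33]])

indices = compute_indices(blue, green, red, nir, swir)
print(indices.shape)  # (3, 2, 2): one raster per index
```

Applied to each snapshot in the time series, this yields one raster per index per acquisition date, matching the structure described above.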
In alternative embodiments, different indices are selected, resulting in a different number of bands. For example, in some embodiments, the brightness band described above is omitted. Brightness, greenness, and wetness are generally the most dominant modes of variability for optical remote sensing bands. However, it will be appreciated that a variety of different combinations of bands and specific computation of bands may be used for field delineation according to the present disclosure. For example, Enhanced Vegetation Index (EVI) or EVI2 may be used in place of NDVI.
Omitting the brightness band reduces the number of subfield segments by 13%-43%, depending on terrain, resulting in a weighted mean decrease of 22% over an exemplary selection of tiles. Further comparison to manual field boundaries may be made using Intersection Over Union (IOU), which is a measure of agreement between two sets of polygons. It refers to the ratio of the intersected area of the polygons to the union. For the two groups of fields (hand-drawn and auto-delineated) this measure is asymmetric with respect to each set. GDF1 is used herein to refer to the mean comparison of each manually delineated field to auto-delineated fields. GDF2 is used herein to refer to the mean comparison of each auto-delineated field to manual fields. In the present example, GDF2 IOU increased by 13%-84%, yielding a weighted mean increase of 33%. GDF1 IOU was about the same.
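The asymmetric IOU comparison can be illustrated with a small sketch. The text does not specify how fields from one set are paired with the other; matching each polygon to its single best-overlapping counterpart is an assumption.

```python
from shapely.geometry import box

def mean_best_iou(source, target):
    """Mean, over `source` polygons, of each polygon's best IOU against
    any polygon in `target`. Asymmetric: swapping the arguments changes
    which set the average is taken over."""
    scores = []
    for s in source:
        best = 0.0
        for t in target:
            union = s.union(t).area
            if union > 0:
                best = max(best, s.intersection(t).area / union)
        scores.append(best)
    return sum(scores) / len(scores) if scores else 0.0

manual = [box(0, 0, 10, 10)]                    # one hand-drawn field
auto = [box(0, 0, 10, 5), box(20, 20, 21, 21)]  # half-match plus a spurious segment

gdf1 = mean_best_iou(manual, auto)  # 0.5: the manual field's best match
gdf2 = mean_best_iou(auto, manual)  # 0.25: the spurious segment drags the mean down
print(gdf1, gdf2)
```

The spurious auto-delineated segment lowers GDF2 but not GDF1, which is why the two directions of comparison carry different information.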
Remote sensing data may be available on an irregular schedule, for example due to orbital periods of a given constellation. The HLS source images are provided irregularly in time, and may contain gaps which propagate into the indices. To address this variability, in some embodiments, the index images are composited 205 within pre-specified time windows, enabling delivery of a small number of high-value variables for use in the downstream algorithms. It will be appreciated that various techniques may be used to composite the source images prior to index computation. However, compositing the index images is advantageous as it reduces noise and lowers the dimensionality of the problem, thereby enabling more efficient computation.
In some embodiments, the predetermined time windows correspond to phases of the growing season. In some embodiments, the time windows correspond to the early growing season, the mid-season, and the late growing season for a given crop. In an exemplary embodiment, a first window spans April and May, a second window spans June and July, and a third window spans August and September. It will be appreciated that these exemplary windows are calibrated to the northern hemisphere, and would be transposed by six months for use in the southern hemisphere. It will also be appreciated that while these windows are suitable for the continental US, they may be shortened or lengthened for certain crops at certain higher or lower latitudes.
In various embodiments, a user is able to define the number of time windows, and the start and end date of each window separately. This approach allows the delineation of field boundaries for each specific season, or at multiple times within a season, capturing potential changes in the use of the land. For example, a field could be farmed in its entirety for a cash crop, then part of the field subsequently could be used for a cover crop. Similarly, different indices may be used for different conditions or different geographies.
In various embodiments, compositing 205 comprises performing a temporal linear interpolation to reduce potential bias from having the distribution of measurements in time significantly different for different places. In some such embodiments, linear interpolation is performed between available observations, which due to clouds and overpass constraints, may not be evenly distributed in time. After interpolation, for each pixel, the average in time within a window is taken. In an exemplary embodiment in which three indices are assessed over three time windows, the result is a nine band (3 indices×3 windows) image stack 206.
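A minimal single-pixel sketch of the interpolate-then-average compositing described above, using the exemplary April-May, June-July, and August-September windows; the daily interpolation grid is an illustrative choice, not prescribed by the text.

```python
import numpy as np

# Three index time series observed on irregular days of year for one
# pixel (clouds and overpass timing make the spacing uneven).
obs_days = np.array([95, 130, 170, 205, 240, 265])
series = np.random.default_rng(0).random((3, obs_days.size))  # (index, obs)

# Windows matching the exemplary early/mid/late-season split.
windows = [(91, 151), (152, 212), (213, 273)]  # Apr-May, Jun-Jul, Aug-Sep
daily = np.arange(91, 274)

bands = []
for idx_series in series:
    # temporal linear interpolation onto a daily grid...
    interp = np.interp(daily, obs_days, idx_series)
    for lo, hi in windows:
        # ...then the average in time within each window
        mask = (daily >= lo) & (daily <= hi)
        bands.append(interp[mask].mean())

pixel_stack = np.array(bands)  # 3 indices x 3 windows = 9 bands per pixel
print(pixel_stack.shape)  # (9,)
```

Repeating this per pixel over the full rasters produces the nine-band image stack 206.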
It will be appreciated that the above process may be performed for a global data set, or only for certain areas of interest. In some embodiments, the resulting image stack is downsampled to a predetermined resolution in order to limit the overall storage size necessary to maintain the image stacks. In some embodiments, the target resolution is 0.15 degrees. This resolution allows for storage of a global dataset while providing sufficient resolution for further downstream processing.
Referring to FIG. 3 , an exemplary segmentation method is illustrated according to embodiments of the present disclosure. As set out below, once the image data are available from the preprocessing stage, a series of numerical image processing steps are performed. The combination of pre-segmentation steps (gap-filling, scaling, noise filtering) contribute to the reliability of the segmentation results.
At 301, gaps are filled in the available multi-temporal multi-index imagery. Even after compositing, some data sets contain residual missing pixels which must be addressed. In some embodiments, gap-filling comprises applying linear interpolation to gap fill these residual missing values. The post-compositing gaps are typically very small (1-10 pixels), making linear interpolation sufficient.
At 302, the images corresponding to each index are normalized. This addresses the potential for each variable to have a different dynamic range. In some embodiments, normalizing comprises rescaling. In some embodiments, normalizing comprises quantizing. In some embodiments, all bands are normalized by subtracting the mean and dividing the result by its standard deviation.
At 303, the images are filtered. In some embodiments, filtering comprises applying a denoising filter. In some embodiments, the denoising filter is the scikit-image restoration.denoise_bilateral filter for spatial and variable-wise smoothing. This filter removes noise by applying a spatial low pass filter that does not smooth over features that appear consistently in the nine bands.
In various embodiments, an edge-preserving, denoising filter is used (such as those provided by scikit-image restoration). Such filters average pixels based on their spatial closeness and radiometric similarity. In various embodiments, spatial closeness is measured by the Gaussian function of the Euclidean distance between two pixels and a configurable standard deviation value (denoted sigma_spatial in scikit-image restoration). A larger standard deviation results in averaging of pixels separated by larger spatial distances. In various embodiments, the standard deviation value is 0.1, 0.5, 0.8, or 0.9. A value of 0.5 or lower results in situations in which the auto-delineated fields may have four or more times as many polygons as manually delineated fields, which is undesirable.
In various embodiments, radiometric similarity is measured by the Gaussian function of the Euclidean distance between two color values and a configurable standard deviation value (denoted sigma_color in scikit-image restoration). A larger standard deviation results in averaging of pixels with larger radiometric differences. In various embodiments, the image is converted using the img_as_float function (of the scikit-image library) and thus the standard deviation is with respect to the range [0, 1]. If the value is None, the standard deviation of the image is used. In various embodiments, the standard deviation value is None, 0.3, 0.5, or 0.8. In testing, changing this parameter did not change GDF1 IOU, GDF2 IOU, or the resulting number of field polygons.
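The pre-segmentation steps at 301-303 (gap filling, normalization, denoising) might be sketched as follows. Rescaling each band to [0, 1] stands in for the normalization step (the text also mentions z-scoring), and the bilateral filter is applied band by band here rather than jointly across all nine bands, which is a simplification.

```python
import numpy as np
from skimage.restoration import denoise_bilateral

def preprocess_stack(stack):
    """Gap-fill, rescale, and denoise a (bands, H, W) composite stack."""
    out = []
    for band in stack:
        band = band.copy()
        # Fill small residual gaps by linear interpolation along rows;
        # post-compositing gaps are typically only 1-10 pixels.
        for row in band:
            nans = np.isnan(row)
            if nans.any() and not nans.all():
                row[nans] = np.interp(np.flatnonzero(nans),
                                      np.flatnonzero(~nans), row[~nans])
        # Normalize by rescaling to [0, 1] so bands share a dynamic range.
        band = (band - band.min()) / (band.max() - band.min())
        # Edge-preserving spatial smoothing.
        band = denoise_bilateral(band, sigma_color=0.1, sigma_spatial=0.9)
        out.append(band)
    return np.stack(out)

rng = np.random.default_rng(1)
stack = rng.random((9, 16, 16))
stack[0, 3, 4] = np.nan  # a residual missing pixel
clean = preprocess_stack(stack)
print(clean.shape, bool(np.isnan(clean).any()))  # (9, 16, 16) False
```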
At 304, image segmentation is performed. In some embodiments, segmentation is performed by graph-based image segmentation. In some embodiments, the Felzenszwalb method for efficient graph-based image segmentation is used. In some embodiments, segmentation is implemented using the scikit-image segmentation.felzenszwalb algorithm. This exemplary algorithm creates a single layer representing raster classes with labels such that the labeled classes are spatially compact parcels of land. The resulting parcels correspond to distinct units of the landscape, ready for postprocessing to generate vector features of farm fields.
In various embodiments, such as those implemented with the segmentation.felzenszwalb algorithm, the observation level is configurable via a scale value. Higher scale generally means fewer, larger segments. Segment size within an image can vary greatly depending on local contrast, so scale is scene dependent. However, in some embodiments a consistent scale value is applied for all scenes. Exemplary scale values include 550, 600, 650, 700, and 750. Across these ranges, the max difference in GDF1 IOU is only about 1-2%. There is not a clear dominant scale that produces significantly better results across all test tiles. GDF2 IOU is more strongly and inversely related to scale (~5-6% difference between 550 and 750).
In various embodiments, such as those implemented with the segmentation.felzenszwalb algorithm, the diameter (standard deviation) of a Gaussian kernel used for smoothing the image prior to segmentation is configurable. This value may be denoted as sigma. Exemplary values of sigma include 0.5, 0.8, and 0.9. Across this parameter range, the maximum difference in GDF1 IOU is about 1%. However, the number of polygons varies significantly, with higher sigma (smoothing) values correlated with fewer, larger segments. It is preferable to use higher values that still maintain segments that do not cross field boundaries, such as 0.9.
Referring to FIGS. 4A-B, exemplary segmentations with variable sigma values are illustrated. In FIG. 4A, a sigma value of 0.8 is used. In FIG. 4B, a sigma value of 0.9 is used. As shown, the higher sigma value results in larger contiguous segments.
In various embodiments, such as those implemented with the segmentation.felzenszwalb algorithm, the minimum component size is configurable, which is enforced using postprocessing. This value may be denoted as min_size. Exemplary values include 500, 600, and 700. Across this parameter range, the maximum difference in both GDF1 IOU and GDF2 IOU is about 1-2%. There is also an insignificant change in the number of polygons. In all cases, min_size=700 has the best IOU, but not by a significant margin.
Referring to FIG. 5 , an exemplary postprocessing method is illustrated according to embodiments of the present disclosure.
At 501, the segmented images are polygonized to create candidate field polygons in vector format based on raster classes. In some embodiments, the segmented images are polygonized using GDAL polygonize.
At 502, the polygons are spatially cleaned. Spatial cleaning includes checking topological validity, fixing broken geometries (such as non-closed polygons), and removing complex shapes that are unlikely to represent all or part of a farm field. In some embodiments, complex shapes are identified by computing the area of the convex hull surrounding the polygon divided by the area of the polygon. In some embodiments, perimeter-to-area ratio is used. Additional suitable metrics include: eccentricity (the maximum of the set of shortest distances from each vertex to all other vertices in the polygon); equivalent diameter (the diameter of the smallest circle containing the polygon); and the ratio of minor axis to major axis of the smallest ellipse containing the polygon. Heuristics may be applied to evaluate whether a polygon falls within a reasonable range of field geometries. For example, polygons outside of predetermined size thresholds may be discarded. Similarly, polygons with too high an aspect ratio may be discarded. Across all parameters tested, the maximum difference in GDF1 and GDF2 IOU is about 3-4%. Larger opening values (discussed below) result in fewer polygons; however, parts of legitimate fields may be missed, and field edges become rounded.
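Two of the metrics above can be sketched with shapely (an assumed library choice, not named in the disclosure): the convexity ratio is the convex-hull area over the polygon's own area, and the perimeter-to-area ratio flags long, thin artifacts:

```python
from shapely.geometry import Polygon

def convexity_ratio(poly: Polygon) -> float:
    """Area of the convex hull divided by the polygon's own area.

    Near 1.0 for compact, field-like shapes; large for complex shapes
    unlikely to represent all or part of a farm field.
    """
    return poly.convex_hull.area / poly.area

def perimeter_area_ratio(poly: Polygon) -> float:
    return poly.length / poly.area

# A compact square versus a long, thin sliver of equal area.
square = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
sliver = Polygon([(0, 0), (100, 0), (100, 1), (0, 1)])

print(convexity_ratio(square))       # 1.0 (already convex)
print(perimeter_area_ratio(square))  # 0.4
print(perimeter_area_ratio(sliver))  # 2.02
```

A heuristic threshold on either metric then decides whether a candidate polygon is retained or discarded.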
In some embodiments, spatial cleaning comprises applying spatial smoothing such as a buffer and reverse buffer cycle. In various embodiments, the buffer size (or opening) is configurable. The buffering removes morphologically inconsistent pieces of polygons, such as long, narrow strips between fields that erroneously connect two distinct fields. The larger the opening, the more artifacts are removed, but additional edges of the remaining field boundaries are rounded. Accordingly, there is a tradeoff between boundary accuracy/fidelity and problematic artifacts. Exemplary buffer size values include 1, 5, 10, and 20 meters.
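The buffer and reverse-buffer cycle can be sketched with shapely (again an assumed library choice): a negative buffer followed by a positive buffer of the same size removes appendages narrower than twice the buffer size, at the cost of rounding corners:

```python
from shapely.geometry import Polygon

# A field polygon with a long, 2-unit-wide strip erroneously attached.
field = Polygon([(0, 0), (50, 0), (50, 50), (0, 50)])
strip = Polygon([(50, 24), (120, 24), (120, 26), (50, 26)])
merged = field.union(strip)

# Opening with buffer size 5: erode, then dilate. The narrow strip
# (width 2 < 2 * 5) disappears; the square survives, with rounded corners.
opened = merged.buffer(-5).buffer(5)

print(round(merged.area))         # 2640 (2500 + 140)
print(opened.area < merged.area)  # True: strip removed, corners rounded
```

Choosing the buffer size trades artifact removal against the corner rounding described above.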
Referring to FIGS. 6A-B, exemplary segmentations using variable buffer size are shown. In FIG. 6A, a buffer size of 1 is used. Region 601 is an example of geometry that should be removed during the spatial cleaning step. FIG. 6B shows the result after spatial cleaning with a buffer size of 20. Region 601 is removed, and the remaining fields appear with rounded corners.
At 503, the polygons are screened to remove non-crop records. In some embodiments, screening comprises comparing the polygons against a reference layer of crop data, such as the USDA Cropland Data Layer (CDL). In some embodiments, polygons that lie outside known croplands are discarded.
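Screening against a crop reference layer can be sketched in numpy; the class codes and the 50% threshold below are illustrative assumptions, not values prescribed by the disclosure:

```python
import numpy as np

# Illustrative reference crop layer: one integer class code per pixel
# (CDL-style; here 1 = corn, 5 = soybeans, 0 = non-crop).
cdl = np.array([
    [1, 1, 0, 0],
    [1, 5, 0, 0],
    [5, 5, 0, 0],
])
CROP_CLASSES = [1, 5]  # assumed set of cropland codes

def is_cropland(pixel_mask: np.ndarray, min_crop_fraction: float = 0.5) -> bool:
    """Keep a polygon if enough of its pixels fall in known croplands."""
    values = cdl[pixel_mask]
    crop_fraction = np.isin(values, CROP_CLASSES).mean()
    return bool(crop_fraction >= min_crop_fraction)

# Rasterized footprints of two candidate polygons.
left = np.zeros_like(cdl, dtype=bool)
left[:, :2] = True   # lies over crop classes
right = np.zeros_like(cdl, dtype=bool)
right[:, 2:] = True  # lies outside known croplands

print(is_cropland(left))   # True: retained
print(is_cropland(right))  # False: discarded as non-crop
```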
At 504, the resulting field polygons are stored for further use, such as visualization. In some embodiments, the field polygons are organized in tiles for efficient retrieval of relevant data for a given problem. In some embodiments, the field polygons are stored with additional metadata, such as a historical crop type or other attributes derived from remote sensing data or drawn from additional data layers.
Referring to FIGS. 7A-B, exemplary field boundaries are illustrated according to an embodiment of the present disclosure. FIG. 7A shows a multitemporal/multispectral image tile such as would result from the preprocessing stage described above. FIG. 7B shows the same image tile with automatically delineated fields superimposed.
Referring to FIG. 8 , a method for agricultural field boundary identification is illustrated according to embodiments of the present disclosure. At 801, time series of surface reflectance rasters for a geographic region is received. At 802, for each of the surface reflectance rasters, at least one index raster is determined, yielding at least one time series of index rasters. At 803, the at least one time series of index rasters is divided into a plurality of consecutive time windows. At 804, the at least one time series of index rasters is composited within each of the plurality of time windows, yielding a composite index raster for each of the at least one time series of index rasters in each of the plurality of time windows. At 805, the composite index rasters are segmented into a plurality of spatially compact regions of the geographic region. At 806, a plurality of polygons is generated from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
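Steps 803 and 804 above (dividing an index time series into consecutive windows and compositing within each window) can be sketched as follows; the choice of three windows and of an average composite are assumptions consistent with the early/mid/late-season and averaging embodiments described elsewhere in the disclosure:

```python
import numpy as np

# Illustrative NDVI time series: 12 dates of a 2x2 raster, with NaNs
# standing in for cloud-masked observations.
rng = np.random.default_rng(1)
ndvi = rng.random((12, 2, 2))
ndvi[3, 0, 0] = np.nan  # a masked pixel

# Divide into three consecutive time windows (e.g., early, mid-, and
# late season) and composite each window by averaging over valid dates.
windows = np.split(ndvi, 3, axis=0)  # three windows of 4 dates each
composites = np.stack([np.nanmean(w, axis=0) for w in windows])

print(composites.shape)  # (3, 2, 2): one composite raster per window
```

The resulting composite index rasters are the inputs to the segmentation at 805.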
Referring now to FIG. 9 , a schematic of an example of a computing node is shown. Computing node 10 is only one example of a suitable computing node and is not intended to suggest any limitation as to the scope of use or functionality of embodiments described herein. Regardless, computing node 10 is capable of being implemented and/or performing any of the functionality set forth hereinabove.
In computing node 10 there is a computer system/server 12, which is operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with computer system/server 12 include, but are not limited to, personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, and distributed cloud computing environments that include any of the above systems or devices, and the like.
Computer system/server 12 may be described in the general context of computer system-executable instructions, such as program modules, being executed by a computer system. Generally, program modules may include routines, programs, objects, components, logic, data structures, and so on that perform particular tasks or implement particular abstract data types. Computer system/server 12 may be practiced in distributed cloud computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed cloud computing environment, program modules may be located in both local and remote computer system storage media including memory storage devices.
As shown in FIG. 9 , computer system/server 12 in computing node 10 is shown in the form of a general-purpose computing device. The components of computer system/server 12 may include, but are not limited to, one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including system memory 28 to processor 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, Peripheral Component Interconnect (PCI) bus, Peripheral Component Interconnect Express (PCIe), and Advanced Microcontroller Bus Architecture (AMBA).
Computer system/server 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer system/server 12, and it includes both volatile and non-volatile media, removable and non-removable media.
System memory 28 can include computer system readable media in the form of volatile memory, such as random access memory (RAM) 30 and/or cache memory 32. Computer system/server 12 may further include other removable/non-removable, volatile/non-volatile computer system storage media. By way of example only, storage system 34 can be provided for reading from and writing to a non-removable, non-volatile magnetic media (not shown and typically called a “hard drive”). Although not shown, a magnetic disk drive for reading from and writing to a removable, non-volatile magnetic disk (e.g., a “floppy disk”), and an optical disk drive for reading from or writing to a removable, non-volatile optical disk such as a CD-ROM, DVD-ROM or other optical media can be provided. In such instances, each can be connected to bus 18 by one or more data media interfaces. As will be further depicted and described below, memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the disclosure.
Program/utility 40, having a set (at least one) of program modules 42, may be stored in memory 28 by way of example, and not limitation, as well as an operating system, one or more application programs, other program modules, and program data. Each of the operating system, one or more application programs, other program modules, and program data or some combination thereof, may include an implementation of a networking environment. Program modules 42 generally carry out the functions and/or methodologies of embodiments as described herein.
Computer system/server 12 may also communicate with one or more external devices 14 such as a keyboard, a pointing device, a display 24, etc.; one or more devices that enable a user to interact with computer system/server 12; and/or any devices (e.g., network card, modem, etc.) that enable computer system/server 12 to communicate with one or more other computing devices. Such communication can occur via Input/Output (I/O) interfaces 22. Still yet, computer system/server 12 can communicate with one or more networks such as a local area network (LAN), a general wide area network (WAN), and/or a public network (e.g., the Internet) via network adapter 20. As depicted, network adapter 20 communicates with the other components of computer system/server 12 via bus 18. It should be understood that although not shown, other hardware and/or software components could be used in conjunction with computer system/server 12. Examples include, but are not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data archival storage systems.
The present disclosure may be embodied as a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
The descriptions of the various embodiments of the present disclosure have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (18)

What is claimed is:
1. A method comprising:
receiving a time series of surface reflectance rasters for a geographic region;
determining, for each of the surface reflectance rasters, at least one index raster to produce one or more time series of index rasters;
dividing one or more time series of index rasters into a plurality of consecutive time windows;
compositing one or more time series of index rasters within each of the plurality of time windows, to produce a composite index raster for each of one or more time series of index rasters in each of the plurality of time windows;
segmenting the composite index rasters into a plurality of spatially compact regions of the geographic region by denoising the composite index raster using a spatial low pass filter; and
generating a plurality of polygons from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
2. The method of claim 1, wherein the time series of surface reflectance rasters comprises satellite data.
3. The method of claim 1, wherein the time series of surface reflectance rasters spans a growing season in the geographic region.
4. The method of claim 1, wherein receiving the time series of surface reflectance rasters comprises determining surface reflectance from uncorrected reflectance data.
5. The method of claim 1, wherein the at least one index raster comprises a normalized difference vegetation index raster.
6. The method of claim 1, wherein the at least one index raster comprises a land surface water index raster.
7. The method of claim 1, wherein the at least one index raster comprises a mean brightness raster.
8. The method of claim 1, wherein determining the at least one index raster comprises downsampling the surface reflectance rasters.
9. The method of claim 1, wherein the plurality of consecutive time windows correspond to early, mid-, and late phases of a growing season in the geographic region.
10. The method of claim 1, wherein compositing comprises averaging the at least one time series of index rasters within each of the plurality of time windows.
11. The method of claim 1, wherein segmenting comprises filling missing pixels in the composite index rasters.
12. The method of claim 11, wherein filling missing pixels comprises applying linear interpolation to the composite index rasters.
13. The method of claim 1, wherein segmenting comprises normalizing the composite index rasters.
14. The method of claim 1, wherein segmenting comprises graph-based segmentation.
15. The method of claim 1, wherein segmenting comprises Felzenszwalb segmentation.
16. The method of claim 1, wherein generating the plurality of polygons comprises applying spatial smoothing to the plurality of spatially compact regions.
17. A system comprising:
a datastore comprising a time series of surface reflectance rasters for a geographic region; and
a computing node comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor of the computing node to cause the processor to perform a method comprising:
receiving the time series of surface reflectance rasters for a geographic region;
determining, for each of the surface reflectance rasters, at least one index raster to produce one or more time series of index rasters;
dividing one or more time series of index rasters into a plurality of consecutive time windows;
compositing one or more time series of index rasters within each of the plurality of time windows, to produce a composite index raster for each of one or more time series of index rasters in each of the plurality of time windows;
segmenting the composite index rasters into a plurality of spatially compact regions of the geographic region by denoising the composite index raster using a spatial low pass filter; and
generating a plurality of polygons from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
18. A computer program product for agricultural field boundary identification, the computer program product comprising a computer readable storage medium having program instructions embodied therewith, the program instructions executable by a processor to cause the processor to perform a method comprising:
receiving the time series of surface reflectance rasters for a geographic region;
determining, for each of the surface reflectance rasters, at least one index raster to produce one or more time series of index rasters;
dividing one or more time series of index rasters into a plurality of consecutive time windows;
compositing one or more time series of index rasters within each of the plurality of time windows, to produce a composite index raster for each of one or more time series of index rasters in each of the plurality of time windows;
segmenting the composite index rasters into a plurality of spatially compact regions of the geographic region by denoising the composite index raster using a spatial low pass filter; and
generating a plurality of polygons from the plurality of spatially compact regions, each of the plurality of polygons corresponding to an agricultural field in the geographic region.
US17/681,126 2019-08-27 2022-02-25 Imagery-based boundary identification for agricultural fields Active 2041-08-10 US12165222B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/681,126 US12165222B2 (en) 2019-08-27 2022-02-25 Imagery-based boundary identification for agricultural fields

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962892110P 2019-08-27 2019-08-27
PCT/US2020/048188 WO2021041666A1 (en) 2019-08-27 2020-08-27 Imagery-based boundary identification for agricultural fields
US17/681,126 US12165222B2 (en) 2019-08-27 2022-02-25 Imagery-based boundary identification for agricultural fields

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/048188 Continuation WO2021041666A1 (en) 2019-08-27 2020-08-27 Imagery-based boundary identification for agricultural fields

Publications (2)

Publication Number Publication Date
US20220180526A1 US20220180526A1 (en) 2022-06-09
US12165222B2 true US12165222B2 (en) 2024-12-10

Family

ID=74685241

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/681,126 Active 2041-08-10 US12165222B2 (en) 2019-08-27 2022-02-25 Imagery-based boundary identification for agricultural fields

Country Status (2)

Country Link
US (1) US12165222B2 (en)
WO (1) WO2021041666A1 (en)


Citations (98)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6422508B1 (en) 2000-04-05 2002-07-23 Galileo Group, Inc. System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods
US20030019408A1 (en) 2001-02-28 2003-01-30 Clyde Fraisse Method for prescribing site-specific fertilizer application in agricultural fields
US20050234691A1 (en) 2004-04-20 2005-10-20 Singh Ramesh P Crop yield prediction
US20070014488A1 (en) 2004-07-09 2007-01-18 Ching-Chien Chen Automatically and accurately conflating road vector data, street maps, and orthoimagery
US7212670B1 (en) * 2002-05-03 2007-05-01 Imagetree Corp. Method of feature identification and analysis
US20070229524A1 (en) 2006-02-13 2007-10-04 Geoffrey Hendrey Draggable maps
US20080174593A1 (en) 2007-01-18 2008-07-24 Harris Corporation, Corporation Of The State Of Delaware. System and method for processing map images
US20090037441A1 (en) 2007-07-31 2009-02-05 Microsoft Corporation Tiled packaging of vector image data
US20090259709A1 (en) 2002-10-07 2009-10-15 Nikitin Alexei V Method and apparatus for adaptive real-time signal conditioning, processing, analysis, quantification, comparison, and control
US7675549B1 (en) 2006-12-08 2010-03-09 Itt Manufacturing Enterprises, Inc. Imaging architecture for region and time of interest collection and dissemination
US20100082564A1 (en) 2008-10-01 2010-04-01 Navteq North America, Llc Spatial Index for Locating Geographic Data Parcels Stored on Physical Storage Media
US20100321399A1 (en) 2009-06-18 2010-12-23 Patrik Ellren Maps from Sparse Geospatial Data Tiles
US20100332430A1 (en) 2009-06-30 2010-12-30 Dow Agrosciences Llc Application of machine learning methods for mining association rules in plant and animal data sets containing molecular genetic markers, followed by classification or prediction utilizing features created from these association rules
US20120143504A1 (en) 2010-12-07 2012-06-07 Google Inc. Method and apparatus of route guidance
US20120191773A1 (en) 2011-01-26 2012-07-26 Google Inc. Caching resources
AU2012101249A4 (en) 2012-08-17 2012-09-20 Beijing Normal University Method for Generating High Spatial Resolution NDVI Time Series Data
US20130332205A1 (en) 2012-06-06 2013-12-12 David Friedberg System and method for establishing an insurance policy based on various farming risks
US20140039967A1 (en) 2006-11-07 2014-02-06 The Curators Of The University Of Missouri Method of predicting crop yield loss due to n-deficiency
US20140095261A1 (en) 2012-09-27 2014-04-03 FI2 Solutions, LLC Methods, apparatus and systems for determining stand population, stand consistency and stand quality in an agricultural crop and alerting users
US20140222374A1 (en) 2011-09-20 2014-08-07 Monford Ag Systems Limited System and Method for Measuring Parameters Relating to Agriculture
US20140263822A1 (en) 2013-03-18 2014-09-18 Chester Charles Malveaux Vertical take off and landing autonomous/semiautonomous/remote controlled aerial agricultural sensor platform
US8965812B2 (en) 2007-10-09 2015-02-24 Archer Daniels Midland Company Evaluating commodity conditions using aerial image data
US9113590B2 (en) 2012-08-06 2015-08-25 Superior Edge, Inc. Methods, apparatus, and systems for determining in-season crop status in an agricultural crop and alerting users
US20160050840A1 (en) 2014-08-22 2016-02-25 The Climate Corporation Methods for agronomic and agricultural monitoring using unmanned aerial systems
US20160071410A1 (en) 2014-09-05 2016-03-10 The Climate Corporation Updating execution of tasks of an agricultural prescription
US20160073573A1 (en) 2014-09-12 2016-03-17 The Climate Corporation Methods and systems for managing agricultural activities
US20160157414A1 (en) 2014-12-05 2016-06-09 Deere & Company Scouting systems
US20160171680A1 (en) 2014-12-16 2016-06-16 The Board of Trustees of the Land Stanford Junior University Systems and Methods for Satellite Image Processing to Estimate Crop Yield
US20160180473A1 (en) * 2011-05-13 2016-06-23 Hydrobio, Inc. Systems to prescribe and deliver fertilizer over agricultural fields and related methods
US9381646B1 (en) 2012-07-05 2016-07-05 Bernard Fryshman Insect and other small object image recognition and instant active response with enhanced application and utility

Patent Citations (108)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6422508B1 (en) 2000-04-05 2002-07-23 Galileo Group, Inc. System for robotic control of imaging data having a steerable gimbal mounted spectral sensor and methods
US20030019408A1 (en) 2001-02-28 2003-01-30 Clyde Fraisse Method for prescribing site-specific fertilizer application in agricultural fields
US7212670B1 (en) * 2002-05-03 2007-05-01 Imagetree Corp. Method of feature identification and analysis
US20090259709A1 (en) 2002-10-07 2009-10-15 Nikitin Alexei V Method and apparatus for adaptive real-time signal conditioning, processing, analysis, quantification, comparison, and control
US20050234691A1 (en) 2004-04-20 2005-10-20 Singh Ramesh P Crop yield prediction
US20070014488A1 (en) 2004-07-09 2007-01-18 Ching-Chien Chen Automatically and accurately conflating road vector data, street maps, and orthoimagery
US20180049043A1 (en) 2005-10-04 2018-02-15 Steven M. Hoffberg Multifactorial optimization system and method
US20070229524A1 (en) 2006-02-13 2007-10-04 Geoffrey Hendrey Draggable maps
US20140039967A1 (en) 2006-11-07 2014-02-06 The Curators Of The University Of Missouri Method of predicting crop yield loss due to n-deficiency
US7675549B1 (en) 2006-12-08 2010-03-09 Itt Manufacturing Enterprises, Inc. Imaging architecture for region and time of interest collection and dissemination
US20080174593A1 (en) 2007-01-18 2008-07-24 Harris Corporation, Corporation Of The State Of Delaware. System and method for processing map images
US20090037441A1 (en) 2007-07-31 2009-02-05 Microsoft Corporation Tiled packaging of vector image data
USRE47742E1 (en) 2007-10-09 2019-11-26 Archer Daniels Midland Company (ADM) Evaluating commodity conditions using aerial image data
USRE46968E1 (en) 2007-10-09 2018-07-24 Archer Daniels Midland Company Evaluating commodity conditions using aerial image data
US8965812B2 (en) 2007-10-09 2015-02-24 Archer Daniels Midland Company Evaluating commodity conditions using aerial image data
US20100082564A1 (en) 2008-10-01 2010-04-01 Navteq North America, Llc Spatial Index for Locating Geographic Data Parcels Stored on Physical Storage Media
US20100321399A1 (en) 2009-06-18 2010-12-23 Patrik Ellren Maps from Sparse Geospatial Data Tiles
US20100332430A1 (en) 2009-06-30 2010-12-30 Dow Agrosciences Llc Application of machine learning methods for mining association rules in plant and animal data sets containing molecular genetic markers, followed by classification or prediction utilizing features created from these association rules
US20120143504A1 (en) 2010-12-07 2012-06-07 Google Inc. Method and apparatus of route guidance
US20120191773A1 (en) 2011-01-26 2012-07-26 Google Inc. Caching resources
US9756844B2 (en) 2011-05-13 2017-09-12 The Climate Corporation Method and system to map biological pests in agricultural fields using remotely-sensed data for field scouting and targeted chemical application
US20160180473A1 (en) * 2011-05-13 2016-06-23 Hydrobio, Inc. Systems to prescribe and deliver fertilizer over agricultural fields and related methods
US20140222374A1 (en) 2011-09-20 2014-08-07 Monford Ag Systems Limited System and Method for Measuring Parameters Relating to Agriculture
US20130332205A1 (en) 2012-06-06 2013-12-12 David Friedberg System and method for establishing an insurance policy based on various farming risks
US9381646B1 (en) 2012-07-05 2016-07-05 Bernard Fryshman Insect and other small object image recognition and instant active response with enhanced application and utility
US9563945B2 (en) 2012-07-05 2017-02-07 Bernard Fryshman Object image recognition and instant active response with enhanced application and utility
US9113590B2 (en) 2012-08-06 2015-08-25 Superior Edge, Inc. Methods, apparatus, and systems for determining in-season crop status in an agricultural crop and alerting users
AU2012101249A4 (en) 2012-08-17 2012-09-20 Beijing Normal University Method for Generating High Spatial Resolution NDVI Time Series Data
US20140095261A1 (en) 2012-09-27 2014-04-03 FI2 Solutions, LLC Methods, apparatus and systems for determining stand population, stand consistency and stand quality in an agricultural crop and alerting users
US20160216245A1 (en) 2012-11-07 2016-07-28 Brian Harold Sutton Infrared aerial thermography for use in monitoring plant health and growth
US9629306B2 (en) 2012-12-17 2017-04-25 The Climate Corporation Plot placement systems and methods
US9658201B2 (en) 2013-03-07 2017-05-23 Blue River Technology Inc. Method for automatic phenotype measurement and selection
US20140263822A1 (en) 2013-03-18 2014-09-18 Chester Charles Malveaux Vertical take off and landing autonomous/semiautonomous/remote controlled aerial agricultural sensor platform
US9582002B2 (en) 2013-11-20 2017-02-28 Rowbot Systems Llc Robotic platform and method for performing multiple functions in agricultural systems
US9489576B2 (en) 2014-03-26 2016-11-08 F12 Solutions, LLC. Crop stand analysis
US20170089761A1 (en) 2014-06-18 2017-03-30 Gary L. McQuilkin Spectral imaging system for remote and noninvasive detection of target substances using spectral filter arrays and image capture arrays
US9582873B2 (en) 2014-06-30 2017-02-28 Trimble Inc. Active imaging systems for plant growth monitoring
US9880140B2 (en) 2014-08-19 2018-01-30 Clearag, Inc. Continual crop development profiling using dynamical extended range weather forecasting with routine remotely-sensed validation imagery
US20160050840A1 (en) 2014-08-22 2016-02-25 The Climate Corporation Methods for agronomic and agricultural monitoring using unmanned aerial systems
US20160071410A1 (en) 2014-09-05 2016-03-10 The Climate Corporation Updating execution of tasks of an agricultural prescription
US20170090068A1 (en) 2014-09-12 2017-03-30 The Climate Corporation Estimating soil properties within a field using hyperspectral remote sensing
US20160073573A1 (en) 2014-09-12 2016-03-17 The Climate Corporation Methods and systems for managing agricultural activities
US10564316B2 (en) 2014-09-12 2020-02-18 The Climate Corporation Forecasting national crop yield during the growing season
US11762125B2 (en) * 2014-09-12 2023-09-19 Climate Llc Forecasting national crop yield during the growing season
US9519861B1 (en) 2014-09-12 2016-12-13 The Climate Corporation Generating digital models of nutrients available to a crop over the course of the crop's development based on weather and soil data
US20160157414A1 (en) 2014-12-05 2016-06-09 Deere & Company Scouting systems
US20160171680A1 (en) 2014-12-16 2016-06-16 The Board of Trustees of the Leland Stanford Junior University Systems and Methods for Satellite Image Processing to Estimate Crop Yield
US20160232621A1 (en) 2015-02-06 2016-08-11 The Climate Corporation Methods and systems for recommending agricultural activities
US20160302351A1 (en) 2015-04-17 2016-10-20 360 Yield Center, Llc Agronomic systems, methods and apparatuses
US20170041407A1 (en) 2015-04-20 2017-02-09 Agverdict, Inc. Systems and Methods for Efficiently Generating a Geospatial Data Map for Use in Agricultural Operations
US20170257426A1 (en) 2015-04-20 2017-09-07 Agverdict, Inc. Systems and Methods for Cloud-Based Agricultural Data Processing and Management
US20160309646A1 (en) 2015-04-24 2016-10-27 360 Yield Center, Llc Agronomic systems, methods and apparatuses
US20170270446A1 (en) * 2015-05-01 2017-09-21 360 Yield Center, Llc Agronomic systems, methods and apparatuses for determining yield limits
US9745060B2 (en) 2015-07-17 2017-08-29 Topcon Positioning Systems, Inc. Agricultural crop analysis drone
US20170199528A1 (en) 2015-09-04 2017-07-13 Nutech Ventures Crop height estimation with unmanned aerial vehicles
KR101703442B1 (en) * 2015-09-17 2017-02-06 한국항공우주연구원 Apparatus and method of extracting a seamline for mosaicking of satellite images
US20170083747A1 (en) 2015-09-21 2017-03-23 The Climate Corporation Ponding water detection on satellite imagery
US20190019008A1 (en) 2015-09-21 2019-01-17 The Climate Corporation Ponding water detection on satellite imagery
US20170109395A1 (en) 2015-10-14 2017-04-20 The Climate Corporation Computer-generated accurate yield map data using expert filters and spatial outlier detection
US20170105335A1 (en) 2015-10-16 2017-04-20 The Climate Corporation Method for recommending seeding rate for corn seed using seed type and sowing row width
US20170112043A1 (en) 2015-10-23 2017-04-27 Deere & Company System and method for residue detection and implement control
US20170124463A1 (en) 2015-10-28 2017-05-04 The Climate Corporation Computer-implemented calculation of corn harvest recommendations
US20170161627A1 (en) 2015-12-02 2017-06-08 The Climate Corporation Forecasting field level crop yield during a growing season
US20170168157A1 (en) 2015-12-10 2017-06-15 The Climate Corporation Generating estimates of uncertainty for radar based precipitation estimates
US20170169523A1 (en) 2015-12-14 2017-06-15 The Climate Corporation Generating digital models of relative yield of a crop based on nitrate values in the soil
US20170177938A1 (en) 2015-12-16 2017-06-22 Regents Of The University Of Minnesota Automated detection of nitrogen deficiency in crop
US20170196171A1 (en) 2016-01-07 2017-07-13 The Climate Corporation Generating digital models of crop yield based on crop planting dates and relative maturity values
US20170206415A1 (en) 2016-01-15 2017-07-20 Blue River Technology Inc. Plant feature detection using captured images
US20170213141A1 (en) 2016-01-22 2017-07-27 The Climate Corporation Forecasting national crop yield during the growing season using weather indices
US10902655B1 (en) 2016-02-01 2021-01-26 United Parcel Service Of America, Inc. Editing cached map tiles
US20170228475A1 (en) 2016-02-05 2017-08-10 The Climate Corporation Modeling trends in crop yields
US20170231213A1 (en) 2016-02-17 2017-08-17 International Business Machines Corporation Pest abatement utilizing an aerial drone
US20170258005A1 (en) 2016-03-14 2017-09-14 Cutter Technologies, LLC Field monitoring, analysis, and treatment system
US9928578B1 (en) * 2016-03-30 2018-03-27 Descartes Labs, Inc. Using boundary maps to refine imagery
US20170287436A1 (en) 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device
US20170287437A1 (en) 2016-04-04 2017-10-05 Yandex Europe Ag Method and system of downloading image tiles onto a client device
US20180075545A1 (en) 2016-09-09 2018-03-15 Cibo Technologies, Inc. Systems for adjusting agronomic inputs using remote sensing, and related apparatus and methods
US20180070527A1 (en) * 2016-09-09 2018-03-15 Cibo Technologies, Inc. Systems for learning farmable zones, and related methods and apparatus
CN106529451B (en) 2016-10-28 2019-06-18 山东省农业可持续发展研究所 Winter wheat-summer corn planting mode remote sensing identification method
US20180137675A1 (en) 2016-11-16 2018-05-17 Here Global B.V. Method for determining polygons that overlap with a candidate polygon or point
US20200008371A1 (en) 2016-11-16 2020-01-09 The Climate Corporation Identifying management zones in agricultural fields and generating planting plans for the zones
US10445877B2 (en) * 2016-12-30 2019-10-15 International Business Machines Corporation Method and system for crop recognition and boundary delineation
US20180211156A1 (en) 2017-01-26 2018-07-26 The Climate Corporation Crop yield estimation using agronomic neural network
US20190050948A1 (en) 2017-08-08 2019-02-14 Indigo Ag, Inc. Machine learning in agricultural planting, growing, and harvesting contexts
US11100579B1 (en) 2017-08-09 2021-08-24 Bushel, Inc. Commodity tracking system and method
CN108764688B (en) * 2018-05-21 2021-11-23 浙江大学 Winter wheat wet waterlogging remote sensing monitoring method based on satellite-ground multi-source rainfall data fusion
WO2020055950A1 (en) 2018-09-11 2020-03-19 The Climate Corporation Risk-adjusted hybrid seed selection and crop yield optimization by field
CN109360117A (en) 2018-10-08 2019-02-19 西充恒河农牧业开发有限公司 A crop growing pattern recognition method
US20220067614A1 (en) 2018-12-19 2022-03-03 The Board Of Trustees Of The University Of Illinois Apparatus and method for crop yield prediction
CN109685081A (en) * 2018-12-27 2019-04-26 中国土地勘测规划院 A joint change detection method for identifying bare fallow land by remote sensing
CN110287869A (en) * 2019-06-25 2019-09-27 吉林大学 Crop classification method for high-resolution remote sensing images based on deep learning
WO2021007352A1 (en) 2019-07-08 2021-01-14 Indigo Ag, Inc. Crop yield forecasting models
WO2021007352A8 (en) 2019-07-08 2021-08-12 Indigo Ag, Inc. Crop yield forecasting models
US20220261928A1 (en) 2019-07-08 2022-08-18 Indigo Ag, Inc. Crop yield forecasting models
WO2021041666A1 (en) 2019-08-27 2021-03-04 Indigo Ag, Inc. Imagery-based boundary identification for agricultural fields
US20220180526A1 (en) 2019-08-27 2022-06-09 Indigo Ag, Inc. Imagery-based boundary identification for agricultural fields
WO2021062147A1 (en) 2019-09-27 2021-04-01 Indigo Ag, Inc. Modeling field irrigation with remote sensing imagery
US20220215659A1 (en) 2019-09-27 2022-07-07 Indigo Ag, Inc. Imputation of remote sensing time series for low-latency agricultural applications
WO2021062177A1 (en) 2019-09-27 2021-04-01 Indigo Ag, Inc. Imputation of remote sensing time series for low-latency agricultural applications
US20220210987A1 (en) 2019-09-27 2022-07-07 Indigo Ag, Inc. Modeling field irrigation with remote sensing imagery
CN110909679A (en) 2019-11-22 2020-03-24 中国气象科学研究院 Remote sensing identification method and system for fallow crop rotation information of winter wheat historical planting area
US20230092057A1 (en) 2020-05-01 2023-03-23 Indigo Ag, Inc. Dynamic data tiling
WO2021222763A1 (en) 2020-05-01 2021-11-04 Indigo Ag, Inc. Dynamic data tiling
WO2022020448A1 (en) 2020-07-21 2022-01-27 Indigo Ag, Inc. Remote sensing algorithms for mapping regenerative agriculture
US20220138767A1 (en) 2020-10-30 2022-05-05 Cibo Technologies, Inc. Method and system for carbon footprint monitoring based on regenerative practice implementation
US20220237888A1 (en) 2021-01-27 2022-07-28 Tata Consultancy Services Limited Method and system for providing generalized approach for crop mapping across regions with varying characteristics
US20220342536A1 (en) 2021-04-26 2022-10-27 Bushel, Inc. User interface for adjusting component proportions
US20220343229A1 (en) 2021-04-27 2022-10-27 Gevo, Inc. Systems and methods for automatic carbon intensity calculation and tracking

Non-Patent Citations (29)

* Cited by examiner, † Cited by third party
Title
Aji, A. et al., "Hadoop-GIS: A High Performance Spatial Data Warehousing System over MapReduce," Proceedings VLDB Endowment 6(11), Aug. 2013, pp. 1009-1020.
Becker-Reshef, I. et al., "Prior Season Crop Type Masks for Winter Wheat Yield Forecasting: A US Case Study," Remote Sensing, 10: 1659, Oct. 19, 2018, pp. 1-20.
Bermudez, C., "Development of a remote sensing protocol for inventorying cover crop adoptions," Iowa State University Master of Science Thesis, 2016, pp. 1-80.
Bratten, "LandsatLinkr 0.1.4 User Guide," Guide version 0.1.4a draft, 2015.
European Patent Office, Extended European Search Report, EP Patent Application No. 20836088.3, Jun. 14, 2023, ten pages.
European Patent Office, Extended European Search Report, EP Patent Application No. 21796375.0, Dec. 19, 2023, nine pages.
Gao, F. et al., "A within-season approach for detecting early growth stages in corn and soybean using high temporal and spatial resolution imagery," Remote Sens Environ, vol. 242: 111752, Mar. 2020, pp. 1-19.
Github, "cogeo-mosaic," Jun. 13, 2019, pp. 1-11, [Online] Retrieved from the Internet <URL: https://github.com/developmentseed/mosaicjson-spec>.
Github, "Marblecutter," Jul. 31, 2017, pp. 1-4, [Online] Retrieved from the Internet <URL:https://github.com/mojodna/marblecutter>.
Github, "rio-tiler," Oct. 14, 2020, pp. 1-14, [Online] Retrieved from the Internet <URL:https://github.com/cogeotiff/rio-tiler>.
Github, "Terracotta," Mar. 6, 2018, pp. 1-5, [Online] Retrieved from the Internet <URL:https://github.com/DHI/terracotta>.
International Search Report and Written Opinion for International Application No. PCT/US2020/048188 mailed Nov. 13, 2020.
Kuzyakova, I.F. et al., "Time series analysis and mixed models for studying the dynamics of net N mineralization in a soil catena at Gondelsheim (S-W Germany)," Geoderma, vol. 136, Sep. 7, 2006, pp. 803-818.
Lin, X. et al., "Carbon Emissions Estimation and Spatiotemporal Analysis of China at City Level Based on Multi-Dimensional Data and Machine Learning," Remote Sensing 14(13), 3014, Jun. 23, 2022, pp. 1-18.
Luo, D. et al., "Integrated Carbon Footprint and Economic Performance of Five Types of Dominant Cropping Systems in China's Semiarid Zone," Sustainability 14(10), 5844, May 11, 2022, pp. 1-17.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2021/042542, Oct. 24, 2021, nine pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2023/078118, Feb. 22, 2024, 11 pages.
PCT International Search Report and Written Opinion, PCT Application No. PCT/US2023/085208, Apr. 29, 2024, eight pages.
PCT International Search Report and Written Opinion, PCT International Application No. PCT/US2020/041256, Oct. 5, 2020, 13 pages.
PCT International Search Report and Written Opinion, PCT International Application No. PCT/US2020/052706, Dec. 7, 2020, eight pages.
PCT International Search Report and Written Opinion, PCT International Application No. PCT/US2020/052755, Feb. 11, 2021, eight pages.
PCT International Search Report and Written Opinion, PCT International Application No. PCT/US2021/030196, Aug. 4, 2021, 12 pages.
Ritchie et al., "Sensitivities of Normalized Difference Vegetation Index and a Green/Red Ratio Index to Cotton Ground Cover Fraction," Crop Science, vol. 50, May-Jun. 2010. (Year: 2010). *
United States Office Action, U.S. Appl. No. 18/051,789, Apr. 18, 2024, six pages.
Vorobiova, N.S. et al., "NDVI time series modeling in the problem of crop identification by satellite images," Information Technology and Nanotechnology, 2016, pp. 428-436.
Wang, H. "A Large-scale Dynamic Vector and Raster Data Visualization Geographic Information System Based on Parallel Map Tiling," FIU Electronic Theses and Dissertations, Nov. 8, 2011, pp. 1-77.
Wardlow et al., "Discriminating cropping patterns in the US Central Great Plains region using time-series MODIS 250-meter NDVI data - Preliminary Results," In Proceedings, Pecora 15 and Land Satellite Information IV Conference, 2002, pp. 1-12.
Xie, Y. et al., "Mapping irrigated cropland extent across the conterminous United States at 30 m resolution using a semi-automatic training approach on Google Earth Engine," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 155, Sep. 2019, pp. 136-149.
Zhang et al., "Mapping paddy rice planting areas through time series analysis of MODIS land surface temperature and vegetation index data," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 106, 2015, pp. 157-171.

Also Published As

Publication number Publication date
WO2021041666A1 (en) 2021-03-04
US20220180526A1 (en) 2022-06-09

Similar Documents

Publication Publication Date Title
US12165222B2 (en) Imagery-based boundary identification for agricultural fields
US11748861B2 (en) Enhancing resolution and correcting anomalies of remote sensed data
US10489689B1 (en) Iterative relabeling using spectral neighborhoods
US11527062B2 (en) Method and system for crop recognition and boundary delineation
US11164310B2 (en) Method and system for crop recognition and boundary delineation
US11635510B1 (en) Sparse phase unwrapping
EP3408828B1 (en) Systems and methods for detecting imaged clouds
Hormese et al. Automated road extraction from high resolution satellite images
US10572976B2 (en) Enhancing observation resolution using continuous learning
Gladkova et al. Quantitative restoration for MODIS band 6 on Aqua
Yusuf et al. Spectral information analysis of image fusion data for remote sensing applications
CN117808708A (en) Cloud and fog remote sensing image processing method, device, equipment and medium
US11922678B2 (en) Carbon estimation
Wang et al. Multiscale single image dehazing based on adaptive wavelet fusion
CN116958416A (en) Three-dimensional modeling method, device, system and storage medium
Gladkova et al. SST pattern test in ACSPO clear-sky mask for VIIRS
US12277193B1 (en) Synthesizing training data for training a change detection model
Prasad et al. An analysis of image quality enhancement techniques in an optical based satellite image
Politz et al. Exploring ALS and DIM data for semantic segmentation using CNNs
Salazar Colores et al. Statistical multidirectional line dark channel for single‐image dehazing
CN116523945A (en) Building height detection method and device, storage medium and electronic equipment
Kamod et al. Denoise auto-encoder based speckle reduction for risat-1 sar imagery
Borra et al. Satellite image enhancement and analysis
US12266167B1 (en) Radiometric terrain correction
Khoomboon et al. A land cover mapping algorithm for thin to medium cloud-covered remote sensing images using a level set method

Legal Events

Date Code Title Description
FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

AS Assignment

Owner name: INDIGO AG, INC., MASSACHUSETTS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BRASWELL, BOBBY HAROLD;CORMIER, TINA A.;SULLA-MENASHE, DAMIEN;AND OTHERS;SIGNING DATES FROM 20220628 TO 20230426;REEL/FRAME:063767/0755

AS Assignment

Owner name: CORTLAND CAPITAL MARKET SERVICES LLC, AS AGENT, ILLINOIS

Free format text: SECURITY INTEREST;ASSIGNORS:INDIGO AGRICULTURE, INC.;INDIGO AG, INC.;REEL/FRAME:064559/0438

Effective date: 20230809

AS Assignment

Owner name: INDIGO AGRICULTURE, INC., MASSACHUSETTS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CORTLAND CAPITAL MARKET SERVICES LLC;REEL/FRAME:065344/0780

Effective date: 20231020

Owner name: INDIGO AG, INC., MASSACHUSETTS

Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:CORTLAND CAPITAL MARKET SERVICES LLC;REEL/FRAME:065344/0780

Effective date: 20231020

Owner name: INDIGO AG, INC., MASSACHUSETTS

Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:CORTLAND CAPITAL MARKET SERVICES LLC;REEL/FRAME:065344/0780

Effective date: 20231020

Owner name: INDIGO AGRICULTURE, INC., MASSACHUSETTS

Free format text: RELEASE OF SECURITY INTEREST;ASSIGNOR:CORTLAND CAPITAL MARKET SERVICES LLC;REEL/FRAME:065344/0780

Effective date: 20231020

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY