US20190286905A1 - Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops - Google Patents

Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops

Info

Publication number
US20190286905A1
Authority
US
United States
Prior art keywords
implementations
crop
vegetation
machine
component
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US16/434,935
Other versions
US10803313B2 (en)
Inventor
Michael Ritter
Michael Milton
Peter Matusov
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hiphen Corp
Original Assignee
Slantrange Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Slantrange Inc filed Critical Slantrange Inc
Priority to US16/434,935
Assigned to SlantRange, Inc. (assignment of assignors interest; assignors: Peter Matusov, Michael Milton, Michael Ritter)
Publication of US20190286905A1
Application granted
Publication of US10803313B2
Assigned to Hiphen Corp (change of name from SlantRange, Inc.)
Legal status: Active
Anticipated expiration

Classifications

    • G06K 9/00657
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G06K 9/6232
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/174 - Segmentation; Edge detection involving the use of two or more images
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/194 - Segmentation; Edge detection involving foreground-background segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/60 - Analysis of geometric attributes
    • G06T 7/62 - Analysis of geometric attributes of area, perimeter, diameter or volume
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/7715 - Feature extraction, e.g. by transforming the feature space, e.g. multi-dimensional scaling [MDS]; Mappings, e.g. subspace methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/10 - Terrestrial scenes
    • G06V 20/188 - Vegetation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10016 - Video; Image sequence
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10032 - Satellite or aerial image; Remote sensing
    • G06T 2207/10036 - Multispectral image; Hyperspectral image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10032 - Satellite or aerial image; Remote sensing
    • G06T 2207/10041 - Panchromatic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30181 - Earth observation
    • G06T 2207/30188 - Vegetation; Agriculture
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30242 - Counting objects in image

Definitions

  • This disclosure relates to systems and methods for determining statistics of plant populations based on overhead optical measurements.
  • Farming practices may become more efficient by informing growers with more accurate and thorough information on the status of their crops. For example, timely and accurate knowledge of the emergent plant density and size distribution and their spatial variances across the field may enable growers and agronomists to a) determine more accurately how the emergent crop population differs from the planned population and where replanting may be necessary; b) detect poor germination areas which can then be investigated for bad seed, bad soil, or other disadvantageous conditions; c) detect malfunctioning planting implements for corrective action; d) more accurately and selectively apply inputs such as fertilizers, fungicides, herbicides, pesticides, and other inputs; e) more thoroughly understand how combinations of seed types, planting densities, soil chemistries, irrigation, fertilizers, chemicals, etc. contribute to crop populations which optimize production yields.
  • Current solutions for estimating plant population statistics (or "stand count") may include humans manually counting individual plants at multiple, yet sparse, locations across a field.
  • The area surveyed by this method may be less than 1% of the total field area.
  • An estimate for the entire field may be determined through interpolation, which consequently may lead to large errors, as entire portions of the field may not be surveyed and may include unplanted, misplanted, or damaged areas.
  • More recently, airborne observations have been employed. While airborne methods may benefit by replacing the sparse sampling and interpolation limitations of manual counting with 100% coverage, they have been greatly limited by their ability to a) resolve individual plants; b) discriminate individual plants of the crop species from other plants (weeds) or detritus in the field for automated counting; and c) accurately determine the area of the measured region due to inaccuracies in aircraft altitude and ground elevation measurements. These limitations have led to very large errors in population statistics.
  • Exemplary implementations of the present disclosure may employ a remote sensing system (e.g. multispectral, hyperspectral, panchromatic, and/or other sensors) mounted to an airborne or other overhead platform and automated computer vision techniques to detect, resolve, and discriminate crop plants for counting and sizing over large areas of agricultural fields.
  • the system may comprise one or more hardware processors configured by machine-readable instructions to receive output signals provided by one or more remote sensing devices mounted to an overhead platform.
  • the output signals may convey information related to one or more images of a land area where crops are grown.
  • the one or more images may be spatially resolved.
  • the output signals may include one or more channels corresponding to one or more spectral ranges.
  • the one or more hardware processors may be configured by machine-readable instructions to distinguish vegetation from background clutter based on the one or more channels.
  • the vegetation may include one or both of a crop population and a non-crop population.
  • the background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, or other detritus.
  • the one or more hardware processors may be configured by machine-readable instructions to segregate image regions corresponding to the vegetation from image regions corresponding to the background clutter.
  • the one or more hardware processors may be configured by machine-readable instructions to determine a plant count per unit area.
  • the method may be performed by one or more hardware processors configured by machine-readable instructions.
  • the method may include receiving output signals provided by one or more remote sensing devices mounted to an overhead platform.
  • the output signals may convey information related to one or more images of a land area where crops are grown.
  • the one or more images may be spatially resolved.
  • the output signals may include one or more channels corresponding to one or more spectral ranges.
  • the method may include distinguishing vegetation from background clutter based on the one or more channels.
  • the vegetation may include one or both of a crop population and a non-crop population.
  • the background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, or other detritus.
  • the method may include segregating image regions corresponding to the vegetation from image regions corresponding to the background clutter.
  • the method may include determining a plant count per unit area.
  • FIG. 1 illustrates a system configured for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • FIG. 2 illustrates spectral images obtained from an airborne platform, in accordance with one or more implementations.
  • FIG. 3 illustrates segregation of vegetation from background clutter, in accordance with one or more implementations.
  • FIG. 4 illustrates segregation of large rafts of non-crop population from the crop population, in accordance with one or more implementations.
  • FIG. 5 illustrates detection and characterization of crop rows, in accordance with one or more implementations.
  • FIG. 6 illustrates segregation of vegetation growing within rows from vegetation growing outside of rows, in accordance with one or more implementations.
  • FIG. 7 illustrates complete segregation of crop population from non-crop population and background clutter, in accordance with one or more implementations.
  • FIG. 8 illustrates a map of plant center positions, in accordance with one or more implementations.
  • FIG. 9 illustrates a method for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • FIG. 10 illustrates process steps performed by the system of FIG. 1 , in accordance with one or more implementations.
  • FIG. 1 illustrates a system 10 configured for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • system 10 may include one or more remote sensing devices 24.
  • system 10 may include one or more servers 12.
  • Server(s) 12 may be configured to communicate with one or more client computing platforms 18 and/or one or more remote sensing devices 24 according to a client/server architecture. Users may access system 10 via a user interface 20 of client computing platform(s) 18.
  • the one or more remote sensing devices 24 may be mounted to an overhead platform.
  • the overhead platform may include one or more of an aircraft, a spacecraft, an unmanned aerial vehicle, a drone, a tower, a vehicle, a tethered balloon, farming infrastructure such as center pivot irrigation systems or other infrastructure, and/or other overhead platforms.
  • the one or more remote sensing devices 24 may be configured to provide output signals.
  • the output signals may convey information related to one or more images of a land area where crops are grown.
  • the one or more images may include one or more spectral measurements.
  • the one or more images may include one or more of a color measurement, a multi-spectral measurement, a hyperspectral measurement, and/or other spectral measurements of a land area where crops are grown.
  • the one or more remote sensing devices 24 may record two-dimensional images of the land area where crops are grown formed on a single or multiple focal plane arrays.
  • a color or multispectral measurement may be formed through multiple spectral filters applied to individual pixels in a single focal plane array, or through spectral filters applied to entire focal plane arrays in a multiple focal plane array configuration.
  • the one or more images may be of sufficient spatial resolution to detect individual plants within the crop population. In some implementations, the one or more images may be of sufficient spectral resolution to resolve spectral differences between growing vegetation and background clutter. In some implementations, the measurements may be of sufficient resolution such that the ground resolved distance (GRD) is smaller than a characteristic dimension of one or more target plants in the land area.
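  • For illustration, this resolution requirement can be checked with the standard pinhole-camera relation for ground sample distance. The sketch below is not from the patent; the sensor and plant values are assumptions.

```python
def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
    """Ground sample distance (m/pixel) from the pinhole-camera relation:
    GSD = altitude * pixel_pitch / focal_length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Illustrative values only: a 12 mm lens with 3.75 um pixels flown at 60 m.
gsd = ground_sample_distance(altitude_m=60.0, focal_length_mm=12.0, pixel_pitch_um=3.75)
plant_diameter_m = 0.10  # assumed characteristic dimension of an emergent plant
print(f"GSD = {gsd * 100:.2f} cm/pixel; plants resolvable: {gsd < plant_diameter_m}")
```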
  • the one or more remote sensing devices 24 may provide output signals conveying information related to one or more of a time stamp, a position (e.g., latitude, longitude, and/or altitude), an attitude (e.g., roll, pitch, and/or yaw/heading), a spectral measurement of solar irradiance, calibration information specific to the device, and/or other information corresponding to individual ones of the one or more images.
  • calibration may include adjusting the one or more images for sunlight conditions, correcting systematic errors, or positioning the image onto the earth's surface for output mapping.
  • the one or more remote sensing devices 24 may provide output signals conveying information related to one or more environmental parameters, time stamp, the position, the attitude, and/or other information corresponding to individual ones of the one or more images.
  • the one or more environmental parameters may include spectral measurements of downwelling solar illuminance, temperature, relative humidity, and/or other weather or environmental conditions.
  • output signals conveying information related to one or more environmental parameters, time stamp, the position, the attitude, and/or other information may be utilized to calibrate the one or more spectral images.
  • the output signals may be synchronous to the one or more images.
  • each image may include the output signals as metadata whose time of validity corresponds to the image.
  • FIG. 2 illustrates spectral images obtained from an airborne platform, in accordance with one or more implementations.
  • an airborne platform 210 having one or more remote sensing devices may provide one or more images 220 of a land area 230 where crops are grown.
  • the server(s) 12 and/or client computing platform(s) 18 may be configured to execute machine-readable instructions 26 .
  • the machine-readable instructions 26 may include one or more of a communications component 28, an image revision component 30, a contrast adjustment component 32, a background clutter segregation component 34, a crop segregation component 36, a crop row attributes component 38, a crop density determination component 40, a presentation component 42, and/or other components.
  • Machine-readable instructions 26 may facilitate determining statistics of plant populations based on overhead optical measurements.
  • communications component 28 may receive output signals provided by one or more remote sensing devices mounted to an overhead platform.
  • the output signals may include one or more spectral images, metadata related to the one or more spectral images, and/or other information.
  • the output signals may convey information related to one or more images of a land area where crops are grown.
  • the one or more images may be spatially resolved and spectrally resolved.
  • spatially resolved images may include one or more images corresponding to crop plants, non-crop plants, a land area, and/or other locations.
  • the one or more images may include individual pixels corresponding to a spectral range.
  • the individual pixels may include intensity values corresponding to the spectral range.
  • the one or more remote sensing devices may include a first camera having a red filter thereon and a second camera having a near infrared filter thereon.
  • An image captured by the first camera may include pixel values indicating intensity in the red spectral range, and an image captured by the second camera may include pixel values indicating intensity in the near infrared spectral range.
  • the output signals may include one or more channels.
  • multiple channels may be part of a single remote sensing device.
  • multiple channels may be part of multiple remote sensing devices.
  • each image may be created by a channel.
  • each image created by a channel may be both spatially and spectrally resolved.
  • individual channels may have a similar spatial resolution.
  • different spectral ranges may be resolved in each channel.
  • a stack of images may be based on the one or more channels.
  • image revisions component 30 may be configured to correct and/or revise systematic and environmental errors common to spectral imaging systems as described, for example, in U.S. patent application Ser. No. 14/480,565, filed Sep. 8, 2014, and entitled "SYSTEM AND METHOD FOR CALIBRATING IMAGING MEASUREMENTS TAKEN FROM AERIAL VEHICLES," which is hereby incorporated into this disclosure by reference in its entirety.
  • image revisions component 30 may revise one or more intensity non-uniformities of the one or more images. The one or more intensity non-uniformities may result from characteristics of one or more collection optics.
  • image revisions component 30 may revise one or more spatial distortions of the one or more images.
  • the one or more spatial distortions may be due to one or more characteristics of the collection optics.
  • image revisions component 30 may revise one or more variations in intensity that result from changes in solar irradiance of the one or more images.
  • image revisions component 30 may utilize one or more of a collocated solar spectrometer, a solar intensity measurement, a reflectance standard, and/or other calibration device or technique to revise the one or more images for variations in solar irradiance.
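  • A minimal sketch of these revisions is shown below, assuming NumPy; the flat-field gain and irradiance inputs are hypothetical calibration products, not the patent's actual method.

```python
import numpy as np

def revise_image(raw, flat_field_gain, measured_irradiance, reference_irradiance):
    """Two illustrative corrections: divide out a per-pixel flat-field gain
    (vignetting and other optics non-uniformities), then rescale by the
    ratio of a reference solar irradiance to the irradiance measured by a
    collocated spectrometer at capture time."""
    img = raw.astype(np.float64) / flat_field_gain
    img *= reference_irradiance / measured_irradiance
    return img
```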
  • image revisions component 30 may be configured to register one or more pixels from the one or more channels to a common pixel space.
  • the first channel may correspond to a first spectral range and the second channel may correspond to a second spectral range.
  • one or more pixels of the first channel and the second channel may be registered to a common pixel space such that the corresponding pixels of each channel provide measurements of a common area of the target scene.
  • cross-channel registration may include two-dimensional cross-correlation and/or other techniques to determine the translation, rotation, scaling, and/or warping to be applied to each channel such that one or more pixels from the one or more channels are registered to a common pixel space.
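  • A registration of this kind could be sketched with phase correlation (NumPy only; the function and variable names are illustrative, and only translation is recovered here, not the rotation, scaling, or warping mentioned above):

```python
import numpy as np

def estimate_shift(reference, moving):
    """Estimate the (row, col) translation between two channels by phase
    correlation: the peak of the inverse FFT of the normalized cross-power
    spectrum lies at the offset between the images."""
    f_ref, f_mov = np.fft.fft2(reference), np.fft.fft2(moving)
    cross_power = f_ref * np.conj(f_mov)
    cross_power /= np.abs(cross_power) + 1e-12
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = [float(p) for p in peak]
    for axis in (0, 1):  # wrap shifts beyond half the image to negative offsets
        if shift[axis] > corr.shape[axis] / 2:
            shift[axis] -= corr.shape[axis]
    return shift

# registered = np.roll(moving, [int(s) for s in shift], axis=(0, 1))
```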
  • contrast adjustment component 32 may be configured to distinguish vegetation from background clutter based on the one or more channels.
  • the vegetation may include one or both of a crop population and a non-crop population.
  • the background clutter may include one or more of soil, standing water, pavement, man-made materials, dead vegetation, other detritus, and/or other background clutter.
  • contrast adjustment component 32 may numerically combine the one or more channels such that a contrast between the vegetation and the background clutter is increased.
  • contrast adjustment component 32 may combine the one or more channels in a ratio or other index such that a contrast between the vegetation and the background clutter is increased.
  • the combination may include a Difference Vegetation Index (Difference VI), a Ratio Vegetation Index (Ratio VI), a Chlorophyll Index, a Normalized Difference Vegetation Index (NDVI), a Photochemical Reflectance Index (PRI), and/or other combinations of channels.
  • contrast adjustment component 32 may amplify the contrast of one or more high spatial frequency components corresponding to the combination.
  • a two-dimensional bandpass filter may be used to suppress signals at spatial frequencies lower than those of the crop plants, or an edge sharpening filter may be used to increase the contrast of plant and non-plant boundaries in the images.
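  • A minimal sketch of such a combination and sharpening step follows, assuming NumPy/SciPy; NDVI is one of the indices named above, and the unsharp-mask parameters are illustrative.

```python
import numpy as np
from scipy import ndimage

def ndvi(nir, red, eps=1e-6):
    """Normalized Difference Vegetation Index: living vegetation reflects
    strongly in near infrared and absorbs red, so NDVI increases the
    contrast between vegetation and background clutter."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)

def sharpen(index_image, sigma=2.0, amount=1.5):
    """Unsharp mask: amplify high spatial frequencies so plant/background
    boundaries stand out before thresholding."""
    blurred = ndimage.gaussian_filter(index_image, sigma)
    return index_image + amount * (index_image - blurred)
```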
  • FIG. 3 illustrates segregation of vegetation from background clutter, in accordance with one or more implementations.
  • a false color image 310 may be converted into a high contrast image 320 which segregates growing vegetation 330 from background clutter 340 .
  • background clutter segregation component 34 may be configured to segregate image regions corresponding to the vegetation from image regions corresponding to the background clutter. In some implementations, background clutter segregation component 34 may be configured to utilize differing spectral reflectance combinations across multiple wavelength bands to segregate target types. In some implementations, background clutter segregation component 34 may be configured to determine an initial threshold value for the combination. The initial threshold value may be selected to segregate pixels containing vegetation signals from pixels containing background clutter. In some implementations, background clutter segregation component 34 may compare each pixel value in the combination to the threshold value.
  • background clutter segregation component 34 may group adjacent pixels whose values satisfy the vegetation threshold comparison into "blobs." In some implementations, background clutter segregation component 34 may count the total number of independent blobs with a "blob counting" algorithm and store the count with the value of the threshold.
  • background clutter segregation component 34 may be configured to adjust the value of the combination threshold to a new value. In some implementations, the combination threshold value adjustment may be repeated for a range of values such that a relationship may be established between the threshold and the number of blobs detected. In some implementations, background clutter segregation component 34 may establish a relationship between the ratio threshold and the number of vegetation “blobs” in the ratio image. In some implementations, background clutter segregation component 34 may be configured to determine a threshold value where detection count plateaus such that the blob count is most stable to changes in threshold. In some implementations, background clutter segregation component 34 may be configured to provide a two-dimensional matrix where each entry is a binary value indicating the presence (or absence) of vegetation within the corresponding pixel.
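  • One way to sketch the threshold sweep and plateau search is shown below; SciPy's connected-component labeling stands in for the "blob counting" algorithm, and the threshold range in the usage comment is illustrative.

```python
import numpy as np
from scipy import ndimage

def stable_threshold(index_image, thresholds):
    """Count vegetation blobs at each candidate threshold and return the
    threshold where the count changes least between neighboring steps,
    i.e., where the detection count plateaus."""
    counts = []
    for t in thresholds:
        _, n_blobs = ndimage.label(index_image > t)
        counts.append(n_blobs)
    counts = np.asarray(counts)
    most_stable = np.argmin(np.abs(np.diff(counts)))
    return thresholds[most_stable], counts

# Example usage (NDVI image assumed):
# t_best, counts = stable_threshold(ndvi_image, np.linspace(0.2, 0.8, 31))
# vegetation = ndvi_image > t_best  # binary matrix of vegetation detections
```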
  • crop segregation component 36 may be configured to segregate image regions corresponding to the crop population from image regions corresponding to the non-crop population in the image regions corresponding to the vegetation.
  • crop segregation component 36 may perform an erosion operation on the binary matrix to segregate individual plants which may be grouped together into single blobs.
  • crop segregation component 36 may determine a characteristic size of the crop population based on a statistical distribution of the vegetation size.
  • crop segregation component 36 may segregate one or more contiguous groups of vegetation pixels having a size substantially greater than the characteristic size of the crop population.
  • crop segregation component 36 may be configured to classify and segregate large rafts of weeds from the crop population by identifying blob sizes that are larger and statistically separable from the main population of crop population.
  • crop segregation component 36 may be configured to remove the large rafts of weeds (non-crop population) from the binary matrix of vegetation detections.
  • FIG. 4 illustrates segregation of large rafts of non-crop population from the crop population, in accordance with one or more implementations. As depicted in FIG. 4, successive erosion operations 1-4 are performed on the one or more images such that only the large non-crop population areas 410 remain.
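  • A sketch of the size-based segregation follows (SciPy labeling assumed); the choice of the median as the characteristic size and the raft_factor cutoff are illustrative stand-ins for the statistical separability test described above.

```python
import numpy as np
from scipy import ndimage

def remove_weed_rafts(vegetation, raft_factor=5.0):
    """Label contiguous vegetation blobs and split off those whose pixel
    area is far above the characteristic (median) blob size, treating
    them as large rafts of non-crop vegetation."""
    labels, n = ndimage.label(vegetation)
    sizes = ndimage.sum(vegetation, labels, index=np.arange(1, n + 1))
    characteristic_size = np.median(sizes)
    raft_ids = np.where(sizes > raft_factor * characteristic_size)[0] + 1
    rafts = np.isin(labels, raft_ids)
    return vegetation & ~rafts, rafts

# Erosion can first separate plants that touch within a single blob:
# separated = ndimage.binary_erosion(vegetation, iterations=2)
```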
  • crop row attributes component 38 may be configured to perform a two-dimensional Fast Fourier Transform (FFT) on the one or more images or on the numerical combination of images from the one or more channels as determined previously to determine the spatial frequencies, orientation, and curvature of peak energy with respect to the one or more images.
  • FFT Fast Fourier Transform
  • crop row attributes component 38 may identify two local maxima of peak energy corresponding to crop row spacing (the lowest frequency local maxima) and individual plant spacing along rows (the highest frequency local maxima).
  • An Inverse Fast Fourier Transform (IFFT) of the low frequency local maxima may provide the spatial separation of crop rows and their orientation relative to the one or more images.
  • IFFT Inverse Fast Fourier Transform
  • crop row attributes component 38 may perform a Hough transform to provide the location of each row in the one or more images along with individual row orientation, spacing, and curvature.
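  • The row-detection step might be sketched as follows (NumPy only); this recovers a single dominant spacing and orientation from the FFT peak and omits the Hough-transform refinement and curvature handling described above.

```python
import numpy as np

def row_spacing_and_orientation(vegetation):
    """Find the off-center peak of the 2D FFT of a row-crop mask: its
    radius is the row spatial frequency (1/spacing in pixels) and its
    angle gives the row orientation relative to the image axes."""
    m = vegetation.astype(np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(m - m.mean())))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    spectrum[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0   # suppress residual DC energy
    py, px = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    fy, fx = (py - cy) / h, (px - cx) / w          # cycles per pixel
    spacing_px = 1.0 / np.hypot(fy, fx)            # row-to-row spacing in pixels
    orientation_deg = np.degrees(np.arctan2(fy, fx))
    return spacing_px, orientation_deg
```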
  • FIG. 5 illustrates detection and characterization of crop rows, in accordance with one or more implementations.
  • As depicted in FIG. 5, crop rows 510, crop row spacing 520 in pixel coordinates, and crop row orientation 530 relative to the one or more remote sensing devices have been determined.
  • crop row attributes component 38 may determine a spacing of one or more crop rows in pixels.
  • crop row attributes component 38 may determine the pixel's Ground Sample Dimension using externally provided (e.g., by external resources 16 ) row spacing and the row spacing in pixels.
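  • A worked example of that conversion (all numbers illustrative): if the rows are known to be planted 76.2 cm (30 in) apart and appear 54 pixels apart in the image, the Ground Sample Dimension follows directly.

```python
row_spacing_cm = 76.2   # externally provided planting row spacing (assumed)
row_spacing_px = 54.0   # row spacing measured in the image
gsd_cm_per_px = row_spacing_cm / row_spacing_px   # ~1.41 cm per pixel
```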
  • crop row attributes component 38 may be configured to provide a mask to segregate vegetation belonging to the crop population from vegetation belonging to the non-crop population using the previously determined crop row information.
  • crop row attributes component 38 may be configured to characterize a reference spectral signature of vegetation within the one or more crop rows.
  • crop row attributes component 38 may be configured to accept as input a prescribed reference spectral signature from an external resource 16 or may calculate a reference spectral signature by user selection of a region of interest.
  • crop row attributes component 38 may be configured to statistically compare the spectral signature of each pixel to the reference spectral signature.
  • the statistical proximity of each pixel's spectral signature to the reference spectral signature may be used to classify the pixel as belonging to the crop population class or another class. For example, individual plant detections that were classified in the crop class but have statistically different spectral signatures from the reference spectral signature may be reclassified as non-crop plants. Similarly, the reference spectral signature may be used to classify other plant or non-plant pixels.
  • a user may make a selection of a region of interest in the one or more images.
  • crop row attributes component 38 may determine a spectral signature corresponding to the region of interest.
  • crop row attributes component 38 may determine one or more additional regions and/or pixels in the one or more images having a statistically similar spectral signature.
  • crop row attributes component 38 may classify the one or more additional regions and/or pixels as belonging to the crop population class, the non-crop population class, or another class.
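  • The statistical comparison could be sketched with a Mahalanobis distance to the reference signature (NumPy); the distance cutoff and the use of an ROI mean and covariance as the reference are assumptions, not the patent's stated test.

```python
import numpy as np

def crop_class_mask(pixels, roi_pixels, max_distance=3.0):
    """Classify pixel spectra (n_pixels x n_bands) by Mahalanobis distance
    to a reference signature, taken here as the mean and covariance of a
    user-selected region of interest; nearby spectra stay in the crop class."""
    mu = roi_pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(roi_pixels, rowvar=False))
    diff = pixels - mu
    sq_dist = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return np.sqrt(sq_dist) <= max_distance
```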
  • FIG. 6 illustrates segregation of vegetation growing within rows from vegetation growing outside of rows, in accordance with one or more implementations.
  • a crop row mask 610 is represented as a series of thick lines or curved lines, each fully encompassing one crop row.
  • mask 610 is applied to segregate vegetation growing within the rows (e.g., the crop population) from vegetation 620 growing outside of the rows (e.g., non-crop population).
  • the width of the crop row mask 610 may be determined by using a priori information about the crop and/or by dynamically determining the crop width from the image based on the statistical crop size.
  • crop row attributes component 38 may be configured to classify groups of vegetation pixels as belonging to the crop population if they are positioned statistically within the crop rows. In some implementations, crop row attributes component 38 may be configured to classify groups of vegetation pixels as belonging to the non-crop population if they are positioned statistically outside of the crop rows.
  • crop row attributes component 38 may be configured to determine a new and dynamic threshold level to improve the segregation of the crop population from the background noise by creating a histogram of pixel values only within the masked crop rows. The histogram may be utilized to determine the correct threshold to separate the plants from the background clutter.
  • background clutter may include one or more of soil, shadows, dead vegetation, weeds, standing water, farming equipment, and/or other background clutter.
  • the newly determined threshold may be applied to the whole image.
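  • One standard way to pick such a histogram-derived threshold is Otsu's method, sketched below (NumPy); the patent does not name a specific histogram technique, so Otsu is an assumption.

```python
import numpy as np

def otsu_threshold(values, n_bins=256):
    """Otsu's method on a 1-D sample: pick the bin center that maximizes
    between-class variance. The sample here would be the vegetation-index
    values of pixels inside the crop row mask only."""
    hist, edges = np.histogram(values, bins=n_bins)
    p = hist.astype(np.float64) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2.0
    w0 = np.cumsum(p)              # class probability below each threshold
    m0 = np.cumsum(p * centers)    # cumulative first moment
    with np.errstate(divide='ignore', invalid='ignore'):
        between_var = (m0[-1] * w0 - m0) ** 2 / (w0 * (1.0 - w0))
    return centers[np.nanargmax(between_var)]

# threshold = otsu_threshold(ndvi_image[row_mask])  # learned within rows
# crop = ndvi_image > threshold                     # applied to whole image
```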
  • FIG. 7 illustrates complete segregation of crop population 720 from non-crop population 730 and background clutter 710 , in accordance with one or more implementations.
  • FIG. 7 depicts the segregation of crop population 720 from non-crop population 730 resulting from the utilization of a histogram and determination of a new threshold value.
  • crop density determination component 40 may determine a crop density corresponding to the crop population and a non-crop density corresponding to the non-crop population. In order to accurately determine vegetation density per unit area, the area over which the vegetation count was conducted may need to be accurately determined. While the optical characteristics (i.e. field of view) of the one or more remote sensing devices may be accurately known, the altitude of the one or more remote sensing devices above the ground level may be more difficult to determine. Accordingly, crop density determination component 40 may convert the determined row spacing from pixels to a linear spatial dimension (e.g., centimeters). In some implementations, crop density determination component 40 may determine an area of land portrayed by the one or more images using the converted row spacing.
  • a linear spatial dimension e.g., centimeters
  • crop density determination component 40 may determine a first count corresponding to the crop population and a second count corresponding to the non-crop population per unit area for one or more of the images. In some implementations, crop density determination component 40 may determine a crop count and/or non-crop count per unit area for one or more sub-regions of the one or more images. In some implementations, crop density determination component 40 may determine an area of land portrayed by the one or more images using the number of pixels in the image and the pixel's Ground Sample Dimension.
  • crop density determination component 40 may utilize blob detection techniques and/or other algorithms to identify and count each of the crop plants within the crop row mask. In some implementations, crop density determination component 40 may determine a centroid position of each blob. In some implementations, crop density determination component 40 may provide a list of plant center position coordinates.
  • FIG. 8 illustrates a map of plant center positions, in accordance with one or more implementations. FIG. 8 depicts plant centers 810 that are located within each image to determine spacing and count per area.
  • crop density determination component 40 may determine a pixel distance between each center position using the center position coordinates. In some implementations, crop density determination component 40 may provide a histogram of center-to-center spacing that may yield a strong peak at the nominal plant spacing. In some implementations, crop density determination component 40 may combine the nominal in-row plant spacing with row-to-row spacing to generate nominal planting density (e.g., plants per acre). In some implementations, crop density determination component 40 may receive user inputs regarding plant spacing and row spacing. In some implementations, crop density determination component 40 may utilize the received user inputs to refine the results of the planting statistics.
  • crop density determination component 40 may determine a refined plant count through analysis of the length of each blob along the plant row, and/or the spacing between plant centers. In some implementations, crop density determination component 40 may utilize statistics determined through the analysis to account for two plants that have grown together and appear as a single plant or single plants whose leaf structure causes them to appear as two or more plants.
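  • A sketch of the center-finding and spacing statistics follows (SciPy); the brute-force nearest-neighbor search and the 1.8x merged-plant heuristic in the closing comment are illustrative choices.

```python
import numpy as np
from scipy import ndimage

def plant_centers_and_spacing(crop_mask):
    """Locate blob centroids, then estimate nominal in-row plant spacing
    from the peak of the histogram of nearest-neighbor center distances."""
    labels, n = ndimage.label(crop_mask)
    centers = np.array(ndimage.center_of_mass(crop_mask, labels, range(1, n + 1)))
    dists = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    nearest = dists.min(axis=1)
    hist, edges = np.histogram(nearest, bins=50)
    peak = np.argmax(hist)
    nominal_spacing_px = (edges[peak] + edges[peak + 1]) / 2.0
    return centers, nominal_spacing_px

# Blobs much longer than the nominal spacing may be two merged plants:
# refined = n + sum(length > 1.8 * nominal_spacing_px for length in blob_lengths)
```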
  • crop density determination component 40 may determine statistics of the crop population including one or more of plant count per unit area, plant size, plant health, and/or other statistics. In some implementations, crop density determination component 40 may determine the crop density by dividing the first count by the determined area of the one or more images. In some implementations, crop density determination component 40 may determine the non-crop density by dividing the second count by the determined area of the one or more images.
  • crop density determination component 40 may determine plant size statistics by determining a number of contiguous pixels which constitute individual plants.
  • crop density determination component 40 may determine plant health characteristics using one or more of spectral combination methods. For example, combinations of spectral reflectance values may be used to infer conditions of plant health. Such combinations may include Difference Vegetation Index (Difference VI), a Ratio Vegetation Index (Ratio VI), a Chlorophyll Index, a Normalized Difference Vegetation Index (NDVI), a Photochemical Reflectance Index (PRI), and/or other combinations of channels.
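  • Per-plant size and health statistics might be computed as below (SciPy); mean NDVI as a health proxy is one of the index options listed above, and the GSD conversion assumes the value derived earlier from row spacing.

```python
import numpy as np
from scipy import ndimage

def per_plant_statistics(crop_mask, ndvi_image, gsd_cm_per_px):
    """Plant size as contiguous pixel area converted to cm^2, plus a simple
    health proxy as the mean NDVI over each plant's pixels."""
    labels, n = ndimage.label(crop_mask)
    plant_ids = np.arange(1, n + 1)
    areas_cm2 = ndimage.sum(crop_mask, labels, plant_ids) * gsd_cm_per_px ** 2
    mean_ndvi = ndimage.mean(ndvi_image, labels, plant_ids)
    return areas_cm2, mean_ndvi
```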
  • operations corresponding to one or more of communications component 28, image revision component 30, contrast adjustment component 32, background clutter segregation component 34, crop segregation component 36, crop row attributes component 38, crop density determination component 40, and/or other components may be repeated for multiple overlapping spectral images that cover large farming areas.
  • presentation component 42 may be configured to effectuate presentation of one or both of a map corresponding to the crop density or a map corresponding to the non-crop density. In some implementations, presentation component 42 may be configured to interpolate and/or resample results for the multiple spectral images onto a common grid spacing for the entire survey area. In some implementations, presentation component 42 may be configured to format the map corresponding to the crop density and/or the map corresponding to the non-crop density into multiple file formats for ease of dissemination, review, and further analysis in other downstream data products.
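  • The resampling onto a common grid could be sketched with SciPy's griddata; the grid step and linear interpolation are illustrative choices.

```python
import numpy as np
from scipy.interpolate import griddata

def density_map(lons, lats, densities, grid_step_deg=1e-4):
    """Interpolate per-image density estimates (scattered points) onto a
    regular longitude/latitude grid covering the survey area."""
    lon_g, lat_g = np.meshgrid(
        np.arange(lons.min(), lons.max(), grid_step_deg),
        np.arange(lats.min(), lats.max(), grid_step_deg))
    grid = griddata((lons, lats), densities, (lon_g, lat_g), method='linear')
    return lon_g, lat_g, grid
```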
  • server(s) 12 , client computing platform(s) 18 , and/or external resources 16 may be operatively linked via one or more electronic communication links.
  • electronic communication links may be established, at least in part, via a network such as the Internet and/or other networks. It will be appreciated that this is not intended to be limiting, and that the scope of this disclosure includes implementations in which server(s) 12 , client computing platform(s) 18 , and/or external resources 16 may be operatively linked via some other communication media.
  • a given client computing platform 18 may include one or more processors configured to execute machine-readable instructions.
  • the machine-readable instructions may be configured to enable the given client computing platform 18, either automatically or through an expert or user associated with it, to interface with system 10 and/or external resources 16, and/or to provide other functionality attributed herein to client computing platform(s) 18.
  • the one or more processors may be configured to execute machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , 42 , and/or other machine-readable instruction components.
  • the given client computing platform 18 may include one or more of a desktop computer, a laptop computer, a handheld computer, a tablet computing platform, a NetBook, a Smartphone, a gaming console, and/or other computing platforms.
  • the one or more remote sensing devices 24 may include one or more processors configured to execute machine-readable instructions.
  • the machine-readable instructions may be configured to enable the one or more remote sensing devices 24, either automatically or through an expert or user associated with them, to interface with system 10 and/or external resources 16, and/or to provide other functionality attributed herein to the one or more remote sensing devices 24.
  • the one or more processors may be configured to execute machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , 42 , and/or other machine-readable instruction components.
  • the one or more remote sensing devices 24 may include processors 22 and electronic storage 14 .
  • External resources 16 may include sources of information, hosts and/or providers of digital media items outside of system 10 , external entities participating with system 10 , and/or other resources. In some implementations, some or all of the functionality attributed herein to external resources 16 may be provided by resources included in system 10 .
  • Server(s) 12 may include electronic storage 14 , one or more processors 22 , and/or other components. Server(s) 12 may include communication lines, or ports to enable the exchange of information with a network and/or other computing platforms. Illustration of server(s) 12 in FIG. 1 is not intended to be limiting. Server(s) 12 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to server(s) 12 . For example, server(s) 12 may be implemented by a cloud of computing platforms operating together as server(s) 12 .
  • Electronic storage 14 may comprise non-transitory storage media that electronically stores information.
  • the electronic storage media of electronic storage 14 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with server(s) 12 and/or removable storage that is removably connectable to server(s) 12 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.).
  • Electronic storage 14 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
  • Electronic storage 14 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources).
  • Electronic storage 14 may store software algorithms, information determined by processor(s) 22 , information received from server(s) 12 , information received from client computing platform(s) 18 , and/or other information that enables server(s) 12 to function as described herein.
  • Processor(s) 22 is configured to provide information processing capabilities in server(s) 12 .
  • processor(s) 22 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information.
  • Although processor(s) 22 is shown in FIG. 1 as a single entity, this is for illustrative purposes only.
  • processor(s) 22 may include a plurality of processing units. These processing units may be physically located within the same device, or processor(s) 22 may represent processing functionality of a plurality of devices operating in coordination.
  • the processor(s) 22 may be configured to execute machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , 42 , and/or other machine-readable instruction components.
  • the processor(s) 22 may be configured to execute machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , 42 , and/or other machine-readable instruction components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 22 .
  • Although machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and 42 are illustrated in FIG. 1 as being implemented within a single processing unit, in implementations in which processor(s) 22 includes multiple processing units, one or more of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 may be implemented remotely from the other components and/or subcomponents.
  • The description of the functionality provided by machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 herein is for illustrative purposes and is not intended to be limiting, as any of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 may provide more or less functionality than is described.
  • one or more of machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , and/or 42 may be eliminated, and some or all of its functionality may be provided by other ones of machine-readable instruction components 28 , 30 , 32 , 34 , 36 , 38 , 40 , and/or 42 .
  • processor(s) 22 may be configured to execute one or more additional machine-readable instruction components that may perform some or all of the functionality attributed herein to one of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42.
  • FIG. 9 illustrates a method 900 for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • the operations of method 900 presented below are intended to be illustrative. In some implementations, method 900 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 900 are illustrated in FIG. 9 and described below is not intended to be limiting.
  • method 900 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information).
  • the one or more processing devices may include one or more devices executing some or all of the operations of method 900 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 900 .
  • output signals provided by one or more remote sensing devices mounted to an overhead platform may be received.
  • the output signals may convey information related to one or more images of a land area where crops are grown.
  • the one or more images may be spatially resolved and spectrally resolved.
  • the output signals may include one or more channels.
  • the first channel may correspond to a first spectral range and the second channel may correspond to a second spectral range.
  • Operation 905 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to communications component 28 and image revisions component 30 (as described in connection with FIG. 1 ), in accordance with one or more implementations.
  • vegetation may be distinguished from background clutter based on the one or more channels.
  • the vegetation may include one or both of a crop population and a non-crop population.
  • the background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, and/or other background clutter.
  • Operation 910 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to contrast adjustment component 32 (as described in connection with FIG. 1 ), in accordance with one or more implementations.
  • image regions corresponding to the vegetation may be segregated from image regions corresponding to the background clutter.
  • Operation 915 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to background clutter segregation component 34 (as described in connection with FIG. 1 ), in accordance with one or more implementations.
  • image regions corresponding to the crop population may be segregated from image regions corresponding to the non-crop population in the image regions corresponding to the vegetation.
  • Operation 920 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop segregation component 36 (as described in connection with FIG. 1 ), in accordance with one or more implementations.
  • a crop density corresponding to the crop population and a non-crop density corresponding to the non-crop population may be determined. Operation 925 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop row attributes component 38 and crop density determination component 40 (as described in connection with FIG. 1 ), in accordance with one or more implementations.
  • FIG. 10 illustrates process steps 1000 performed by the system of FIG. 1 , in accordance with one or more implementations.
  • system 10 may be configured to perform process steps 1002 - 1006 with the one or more remote sensing devices.
  • the one or more remote sensing devices may record one or more spectral images, record one or more environmental parameters, and record imager position, attitude, and time corresponding to the one or more spectral images.
  • system 10 may be configured to preprocess and calibrate the one or more spectral images (e.g., process step 1008 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to image revision component 30 .
  • system 10 may calculate a numerical combination and apply image sharpening (e.g., process steps 1010 and 1012 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to contrast adjustment component 32 .
  • system 10 may set an initial numerical combination threshold, calculate a number of vegetation detections, adjust the numerical combination threshold, and determine a threshold value where a blob count plateaus (e.g., process steps 1014 , 1016 , 1018 and 1020 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to background clutter segregation component 34 .
  • system 10 may apply erosion to the one or more images, segregate crops from non-crops based on size statistics, determine a spatial frequency and an orientation of peak energy (e.g., process steps 1022 , 1024 , and 1026 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop segregation component 36 .
  • system 10 may classify crops and non-crops by crop row masking, classify crops and non-crops by spectral signature, and calculate a ground area of the one or more spectral images (e.g., process steps 1028 , 1030 , and 1032 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop row attributes component 38 .
  • system 10 may determine crop and non-crop densities (e.g., process step 1034 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop density determination component 40 .
  • system 10 may spatially interpolate the crop and non-crop densities onto a geo-grid (e.g., process step 1036 ) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to presentation component 42 .

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Investigating Or Analysing Materials By Optical Means (AREA)

Abstract

This disclosure describes a system and a method for determining statistics of plant populations based on overhead optical measurements. The system may include one or more hardware processors configured by machine-readable instructions to receive output signals provided by one or more remote sensing devices mounted to an overhead platform. The output signals may convey information related to one or more images of a land area where crops are grown. The one or more hardware processors may be configured by machine-readable instructions to distinguish vegetation from background clutter; segregate image regions corresponding to the vegetation from image regions corresponding to the background clutter; and determine a plant count per unit area.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 15/268,370 entitled SYSTEMS AND METHODS FOR DETERMINING STATISTICS OF PLANT POPULATIONS BASED ON OVERHEAD OPTICAL MEASUREMENTS, filed Sep. 16, 2016, which claims the benefit of U.S. Provisional Patent Application No. 62/220,596 entitled SYSTEMS AND METHODS DETERMINING PLANT POPULATION AND WEED GROWTH STATISTICS FROM AIRBORNE MEASUREMENTS IN ROW CROPS, filed Sep. 18, 2015, the contents of which are each herein incorporated by reference in their entirety.
  • FIELD OF THE DISCLOSURE
  • This disclosure relates to systems and methods for determining statistics of plant populations based on overhead optical measurements.
  • BACKGROUND
  • Farming practices may become more efficient by informing growers with more accurate and thorough information on the status of their crops. For example, timely and accurate knowledge of the emergent plant density and size distribution and their spatial variances across the field may enable growers and agronomists to a) determine more accurately how the emergent crop population differs from the planned population and where replanting may be necessary; b) detect poor germination areas which can then be investigated for bad seed, bad soil, or other disadvantageous conditions; c) detect malfunctioning planting implements for corrective action; d) more accurately and selectively apply inputs such as fertilizers, fungicides, herbicides, pesticides, and other inputs; e) more thoroughly understand how combinations of seed types, planting densities, soil chemistries, irrigation, fertilizers, chemicals, etc. contribute to crop populations which optimize production yields.
  • SUMMARY
  • Current solutions for estimating plant population statistics (or “stand count”) may include humans manually counting individual plants at multiple, yet sparse, locations across a field. The area surveyed by this method may be less than 1% of the total field area. An estimate for the entire field may be determined through interpolation which consequently may lead to large errors as entire portions of the field may not be surveyed and may include unplanted, misplanted, or damaged areas.
  • More recently, airborne observations have been employed. While airborne methods may benefit by replacing the sparse sampling and interpolation limitations of manual counting with 100% coverage, they have been greatly limited by their ability to a) resolve individual plants; b) discriminate individual plants of the crop species from other plants (weeds) or detritus in the field for automated counting; and c) accurately determine the area of the measured region due to inaccuracies in aircraft altitude and ground elevation measurements. These limitations have led to very large errors in population statistics.
  • Exemplary implementations of the present disclosure may employ a remote sensing system (e.g. multispectral, hyperspectral, panchromatic, and/or other sensors) mounted to an airborne or other overhead platform and automated computer vision techniques to detect, resolve, and discriminate crop plants for counting and sizing over large areas of agricultural fields.
  • Accordingly, one aspect of the disclosure relates to a system configured for determining statistics of plant populations based on overhead optical measurements. The system may comprise one or more hardware processors configured by machine-readable instructions to receive output signals provided by one or more remote sensing devices mounted to an overhead platform. The output signals may convey information related to one or more images of a land area where crops are grown. The one or more images may be spatially resolved. The output signals may include one or more channels corresponding to one or more spectral ranges. The one or more hardware processors may be configured by machine-readable instructions to distinguish vegetation from background based on the one or more channels. The vegetation may include one or both of a crop population and a non-crop population. The background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, or other detritus. The one or more hardware processors may be configured by machine-readable instructions to segregate image regions corresponding to the vegetation from image regions corresponding to the background clutter. The one or more hardware processors may be configured by machine-readable instructions to determine a plant count per unit area.
  • Another aspect of the disclosure relates to a method for determining statistics of plant populations based on overhead optical measurements. The method may be performed by one or more hardware processors configured by machine-readable instructions. The method may include receiving output signals provided by one or more remote sensing devices mounted to an overhead platform. The output signals may convey information related to one or more images of a land area where crops are grown. The one or more images may be spatially resolved. The output signals may include one or more channels corresponding to one or more spectral ranges. The method may include distinguishing vegetation from background clutter based on the one or more channels. The vegetation may include one or both of a crop population and a non-crop population. The background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, or other detritus. The method may include segregating image regions corresponding to the vegetation from image regions corresponding to the background clutter. The method may include determining a plant count per unit area.
  • These and other features, and characteristics of the present technology, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description and the appended claims with reference to the accompanying drawings, all of which form a part of this specification, wherein like reference numerals designate corresponding parts in the various figures. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the invention. As used in the specification and in the claims, the singular form of “a”, “an”, and “the” include plural referents unless the context clearly dictates otherwise.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates a system configured for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • FIG. 2 illustrates spectral images obtained from an airborne platform, in accordance with one or more implementations.
  • FIG. 3 illustrates segregation of vegetation from background clutter, in accordance with one or more implementations.
  • FIG. 4 illustrates segregation of large rafts of non-crop population from the crop population, in accordance with one or more implementations.
  • FIG. 5 illustrates detection and characterization of crop rows, in accordance with one or more implementations.
  • FIG. 6 illustrates segregation of vegetation growing within rows from vegetation growing outside of rows, in accordance with one or more implementations.
  • FIG. 7 illustrates complete segregation of crop population from non-crop population and background clutter, in accordance with one or more implementations.
  • FIG. 8 illustrates a map of plant center positions, in accordance with one or more implementations.
  • FIG. 9 illustrates a method for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations.
  • FIG. 10 illustrates process steps performed by the system of FIG. 1, in accordance with one or more implementations.
  • DETAILED DESCRIPTION
  • FIG. 1 illustrates a system 10 configured for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations. In some implementations, system 10 may include one or more remote sensing devices 24. In some implementations, system 10 may include one or more servers 12. Server(s) 12 may be configured to communicate with one or more client computing platforms 18 and/or one or more remote sensing devices 24 according to a client/server architecture. Users may access system 10 via a user interface 20 of client computing platform(s) 18.
  • The one or more remote sensing devices 24 may be mounted to an overhead platform. In some implementations, the overhead platform may include one or more of an aircraft, a spacecraft, an unmanned aerial vehicle, a drone, a tower, a vehicle, a tethered balloon, farming infrastructure such as center pivot irrigation systems or other infrastructure, and/or other overhead platforms. In some implementations, the one or more remote sensing devices 24 may be configured to provide output signals. The output signals may convey information related to one or more images of a land area where crops are grown. In some implementations, the one or more images may include one or more spectral measurements. For example, the one or more images may include one or more of a color measurement, a multi-spectral measurement, a hyperspectral measurement, and/or other spectral measurements of a land area where crops are grown. In some implementations, the one or more remote sensing devices 24 may record two-dimensional images of the land area where crops are grown, formed on a single focal plane array or on multiple focal plane arrays. For example, a color or multispectral measurement may be formed through multiple spectral filters applied to individual pixels in a single focal plane array, or through spectral filters applied to entire focal plane arrays in a multiple focal plane array configuration.
  • In some implementations, the one or more images may be of sufficient spatial resolution to detect individual plants within the crop population. In some implementations, the one or more images may be of sufficient spectral resolution to resolve spectral differences between growing vegetation and background clutter. In some implementations, the measurements may be of sufficient resolution such that the ground resolved distance (GRD) is smaller than a characteristic dimension of one or more target plants in the land area.
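  • The sufficiency condition above can be checked numerically. The following sketch estimates the ground sample distance of a nadir-pointing frame camera from altitude, focal length, and pixel pitch, and compares it against a characteristic plant dimension; all numeric values are illustrative assumptions, not parameters from this disclosure.

```python
# Minimal sketch, assuming a nadir-pointing frame camera over flat terrain.
# All numeric values below are illustrative assumptions.

def ground_sample_distance(altitude_m: float,
                           focal_length_mm: float,
                           pixel_pitch_um: float) -> float:
    """Per-pixel ground footprint in meters: GSD = H * pitch / f."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

gsd_m = ground_sample_distance(altitude_m=120.0,   # typical small-UAS altitude
                               focal_length_mm=16.0,
                               pixel_pitch_um=3.45)
plant_diameter_m = 0.10  # assumed characteristic dimension of an emergent plant
print(f"GSD = {gsd_m * 100:.1f} cm/pixel; "
      f"plants resolvable: {gsd_m < plant_diameter_m}")
```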
  • In some implementations, the one or more remote sensing devices 24 may provide output signals conveying information related to one or more of a time stamp, a position (e.g., latitude, longitude, and/or altitude), an attitude (e.g., roll, pitch, and/or yaw/heading), a spectral measurement of solar irradiance, calibration information specific to the device, and/or other information corresponding to individual ones of the one or more images. In some implementations, calibration may include adjusting the one or more images for sunlight conditions, systematic errors, or positioning the image onto the earth's surface for output mapping. In some implementations, the one or more remote sensing devices 24 may provide output signals conveying information related to one or more environmental parameters, the time stamp, the position, the attitude, and/or other information corresponding to individual ones of the one or more images. For example, the one or more environmental parameters may include spectral measurements of downwelling solar illuminance, temperature, relative humidity, and/or other weather or environmental conditions. In some implementations, output signals conveying information related to one or more environmental parameters, time stamp, the position, the attitude, and/or other information may be utilized to calibrate the one or more spectral images. In some implementations, the output signals may be synchronized with the one or more images. For example, each image may include the output signals as metadata whose time of validity corresponds to the image.
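  • As one hypothetical way to organize such synchronized per-frame metadata (the record layout and field names below are assumptions for illustration, not the disclosure's format):

```python
from dataclasses import dataclass, field
from typing import Dict

# Hypothetical per-frame metadata record; field names are assumptions.
@dataclass
class FrameMetadata:
    timestamp_s: float                 # time of validity for the frame
    latitude_deg: float
    longitude_deg: float
    altitude_m: float                  # platform altitude
    roll_deg: float
    pitch_deg: float
    yaw_deg: float
    # downwelling solar irradiance per band, e.g. {"red": 142.0, "nir": 96.5}
    irradiance_w_m2: Dict[str, float] = field(default_factory=dict)
```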
  • By way of a non-limiting example, FIG. 2 illustrates spectral images obtained from an airborne platform, in accordance with one or more implementations. As shown on FIG. 2, an airborne platform 210 having one or more remote sensing devices may provide one or more images 220 of a land area 230 where crops are grown.
  • Returning to FIG. 1, the server(s) 12 and/or client computing platform(s) 18 may be configured to execute machine-readable instructions 26. The machine-readable instructions 26 may include one or more of a communications component 28, an image revision component 30, a contrast adjustment component 32, a background clutter segregation component 34, a crop segregation component 36, a crop row attributes component 38, a crop density determination component 40, a presentation component 42, and/or other components.
  • Machine-readable instructions 26 may facilitate determining statistics of plant populations based on overhead optical measurements. In some implementations, communications component 28 may receive output signals provided by one or more remote sensing devices mounted to an overhead platform. In some implementations, the output signals may include one or more spectral images, metadata related to the one or more spectral images, and/or other information. The output signals may convey information related to one or more images of a land area where crops are grown. In some implementations, the one or more images may be spatially resolved and spectrally resolved. In some implementations, spatially resolved images may include one or more images corresponding to crop plants, non-crop plants, a land area, and/or other locations. In some implementations, the one or more images may include individual pixels corresponding to a spectral range. In some implementations, the individual pixels may include intensity values corresponding to the spectral range. For example, the one or more remote sensing devices may include a first camera having a red filter thereon and a second camera having a near infrared filter thereon. An image captured by the first camera may include pixel values indicating intensity in the red spectral range and an image captured by the second camera may include pixel values indicating intensity in the near infrared spectral range. In some implementations, the output signals may include one or more channels. In some implementations, multiple channels may be part of a single remote sensing device. In some implementations, multiple channels may be part of multiple remote sensing devices. In some implementations, each image may be created by a channel. In some implementations, each image created by a channel may be both spatially and spectrally resolved. In some implementations, individual channels may have a similar spatial resolution. In some implementations, different spectral ranges may be resolved in each channel. In some implementations, a stack of images may be based on the one or more channels.
  • In some implementations, image revision component 30 may be configured to correct and/or revise systematic and environmental errors common to spectral imaging systems as described, for example, in U.S. patent application Ser. No. 14/480,565 [Attorney Docket 023840-0431523], filed Sep. 8, 2014, and entitled “SYSTEM AND METHOD FOR CALIBRATING IMAGING MEASUREMENTS TAKEN FROM AERIAL VEHICLES,” which is hereby incorporated into this disclosure by reference in its entirety. In some implementations, image revision component 30 may revise one or more intensity non-uniformities of the one or more images. The one or more intensity non-uniformities may result from characteristics of one or more collection optics. In some implementations, image revision component 30 may revise one or more spatial distortions of the one or more images. The one or more spatial distortions may be due to one or more characteristics of the collection optics. In some implementations, image revision component 30 may revise one or more variations in intensity that result from changes in solar irradiance of the one or more images. For example, image revision component 30 may utilize one or more of a collocated solar spectrometer, a solar intensity measurement, a reflectance standard, and/or other calibration device or technique to revise the one or more images for variations in solar irradiance.
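  • A minimal sketch of two such revisions, assuming a precomputed flat-field gain map for the optics and a collocated downwelling irradiance reading; neither the variable names nor the normalization scheme is prescribed by the disclosure:

```python
import numpy as np

def revise_image(raw: np.ndarray,
                 flat_field_gain: np.ndarray,
                 irradiance: float,
                 reference_irradiance: float) -> np.ndarray:
    """Undo optics-induced intensity non-uniformity, then normalize for
    solar irradiance changes relative to a chosen reference level."""
    corrected = raw / flat_field_gain                 # vignetting correction
    corrected *= reference_irradiance / irradiance    # solar normalization
    return corrected
```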
  • In some implementations, image revision component 30 may be configured to register one or more pixels from the one or more channels to a common pixel space. For example, a first channel may correspond to a first spectral range and a second channel to a second spectral range; one or more pixels of the first channel and the second channel may be registered to a common pixel space such that the corresponding pixels of each channel provide measurements of a common area of the target scene. In some implementations, cross-channel registration may include two-dimensional cross-correlation and/or other techniques to determine the translation, rotation, scaling, and/or warping to be applied to each channel such that one or more pixels from the one or more channels are registered to a common pixel space.
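  • One concrete realization of the cross-correlation step is FFT-based phase correlation. The sketch below handles translation only; rotation, scaling, and warping would need additional machinery, and the approach is one reasonable choice rather than the disclosure's specific method:

```python
import numpy as np

def translation_offset(ref: np.ndarray, moving: np.ndarray):
    """Integer-pixel shift aligning `moving` to `ref` via phase correlation."""
    cross_power = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    cross_power /= np.abs(cross_power) + 1e-12   # keep phase information only
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2:                   # wrap large shifts negative
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

# registered = np.roll(moving, shift=(dy, dx), axis=(0, 1))
```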
  • In some implementations, contrast adjustment component 32 may be configured to distinguish vegetation from background clutter based on the one or more channels. In some implementations, the vegetation may include one or both of a crop population and a non-crop population. In some implementations, the background clutter may include one or more of soil, standing water, pavement, man-made materials, dead vegetation, other detritus, and/or other background clutter. In some implementations, contrast adjustment component 32 may numerically combine the one or more channels such that a contrast between the vegetation and the background clutter is increased. In some implementations, contrast adjustment component 32 may combine the one or more channels in a ratio or other index such that a contrast between the vegetation and the background clutter is increased. In some implementations, the combination may include a Difference Vegetation Index (Difference VI), a Ratio Vegetation Index (Ratio VI), a Chlorophyll Index, a Normalized Difference Vegetation Index (NDVI), a Photochemical Reflectance Index (PRI), and/or other combinations of channels. In some implementations, contrast adjustment component 32 may amplify the contrast of one or more high spatial frequency components corresponding to the combination. For example, a two-dimensional bandpass filter may be used to suppress signals of spatial frequencies lower than the crop plants or an edge sharpening filter may be used to increase the contrast of plant and non-plant boundaries in the images. By way of a non-limiting example, FIG. 3 illustrates segregation of vegetation from background clutter, in accordance with one or more implementations. In FIG. 3, a false color image 310 may be converted into a high contrast image 320 which segregates growing vegetation 330 from background clutter 340.
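  • As a sketch of this contrast-increasing step, NDVI is used below as the numerical combination and a difference-of-Gaussians bandpass as the high-spatial-frequency amplifier; the filter widths are assumed values, not parameters from the disclosure:

```python
import numpy as np
from scipy import ndimage

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index; growing vegetation scores high."""
    return (nir - red) / (nir + red + 1e-9)

def bandpass_sharpen(index_img: np.ndarray,
                     fine_sigma: float = 1.0,
                     coarse_sigma: float = 8.0) -> np.ndarray:
    """Difference of Gaussians: suppresses structure broader than plants
    while boosting plant/soil boundaries."""
    fine = ndimage.gaussian_filter(index_img, fine_sigma)
    coarse = ndimage.gaussian_filter(index_img, coarse_sigma)
    return fine - coarse
```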
  • Returning to FIG. 1, background clutter segregation component 34 may be configured to segregate image regions corresponding to the vegetation from image regions corresponding to the background clutter. In some implementations, background clutter segregation component 34 may be configured to utilize differing spectral reflectance combinations across multiple wavelength bands to segregate target types. In some implementations, background clutter segregation component 34 may be configured to determine an initial threshold value for the combination. The initial threshold value may be selected to segregate pixels containing vegetation signals from pixels containing background clutter. In some implementations, background clutter segregation component 34 may compare each pixel value in the combination to the threshold value. In some implementations, background clutter segregation component 34 may group adjacent pixels that compare to the threshold value corresponding to the vegetation into “blobs.” In some implementations, background clutter segregation component 34 may count a total number of independent blobs with a “blob counting” algorithm and store the count with the value of the threshold.
  • In some implementations, background clutter segregation component 34 may be configured to adjust the value of the combination threshold to a new value. In some implementations, the combination threshold adjustment may be repeated over a range of values such that a relationship may be established between the threshold and the number of blobs detected. In some implementations, background clutter segregation component 34 may establish a relationship between the ratio threshold and the number of vegetation “blobs” in the ratio image. In some implementations, background clutter segregation component 34 may be configured to determine a threshold value where the detection count plateaus, such that the blob count is most stable to changes in the threshold. In some implementations, background clutter segregation component 34 may be configured to provide a two-dimensional matrix where each entry is a binary value indicating the presence (or absence) of vegetation within the corresponding pixel.
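  • A minimal sketch of the threshold sweep, using connected-component labeling as the “blob counting” algorithm and the flattest point of the count-versus-threshold curve as the plateau; both choices are assumptions about one reasonable implementation:

```python
import numpy as np
from scipy import ndimage

def stable_threshold(index_img: np.ndarray,
                     candidates: np.ndarray) -> float:
    """Sweep thresholds, count vegetation blobs at each, and return the
    threshold where the count changes least between neighboring values."""
    counts = []
    for t in candidates:
        _, n_blobs = ndimage.label(index_img > t)
        counts.append(n_blobs)
    flattest = int(np.argmin(np.abs(np.diff(counts))))
    return float(candidates[flattest])

# e.g. t = stable_threshold(img, np.linspace(img.min(), img.max(), 50))
# veg_mask = img > t   # binary matrix of vegetation detections
```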
  • In some implementations, crop segregation component 36 may be configured to segregate image regions corresponding to the crop population from image regions corresponding to the non-crop population in the image regions corresponding to the vegetation. In some implementations, crop segregation component 36 may perform an erosion operation on the binary matrix to segregate individual plants which may be grouped together into single blobs. In some implementations, crop segregation component 36 may determine a characteristic size of the crop population based on a statistical distribution of the vegetation size. In some implementations, crop segregation component 36 may segregate one or more contiguous groups of vegetation pixels having a size substantially greater than the characteristic size of the crop population. For example, crop segregation component 36 may be configured to classify and segregate large rafts of weeds from the crop population by identifying blob sizes that are larger than and statistically separable from the main crop population. In some implementations, crop segregation component 36 may be configured to remove the large rafts of weeds (non-crop population) from the binary matrix of vegetation detections. By way of a non-limiting example, FIG. 4 illustrates segregation of large rafts of non-crop population from the crop population, in accordance with one or more implementations. As depicted in FIG. 4, successive erosion operations 1-4 are performed on the one or more images such that only the large non-crop population areas 410 remain.
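  • A sketch of this size-based segregation; the 4x-median cutoff is an assumed statistical separability criterion, not one stated in the disclosure:

```python
import numpy as np
from scipy import ndimage

def remove_large_rafts(veg_mask: np.ndarray, factor: float = 4.0) -> np.ndarray:
    """Erode to split touching plants, then drop blobs far larger than the
    characteristic crop size (taken here as the median blob size)."""
    eroded = ndimage.binary_erosion(veg_mask)
    labels, n = ndimage.label(eroded)
    if n == 0:
        return eroded
    sizes = ndimage.sum(eroded, labels, index=range(1, n + 1))
    keep = np.zeros(n + 1, dtype=bool)
    keep[1:] = sizes <= factor * np.median(sizes)   # False for weed rafts
    return keep[labels]
```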
  • Returning to FIG. 1, crop row attributes component 38 may be configured to perform a two-dimensional Fast Fourier Transform (FFT) on the one or more images, or on the numerical combination of images from the one or more channels as determined previously, to determine the spatial frequencies, orientation, and curvature of peak energy with respect to the one or more images. In some implementations, crop row attributes component 38 may identify two local maxima of peak energy corresponding to crop row spacing (the lowest frequency local maximum) and individual plant spacing along rows (the highest frequency local maximum). An Inverse Fast Fourier Transform (IFFT) of the low frequency local maximum may provide the spatial separation of crop rows and their orientation relative to the one or more images.
  • In some implementations, crop row attributes component 38 may perform a Hough transform to provide the location of each row in the one or more images along with individual row orientation, spacing, and curvature. By way of a non-limiting example, FIG. 5 illustrates detection and characterization of crop rows, in accordance with one or more implementations. In FIG. 5, crop rows 510, crop row spacing 520 in pixel coordinates, and crop row orientation 530 relative to the one or more remote sensing devices have been determined. In some implementations, crop row attributes component 38 may determine a spacing of one or more crop rows in pixels. In some implementations, crop row attributes component 38 may determine the pixel's Ground Sample Dimension using externally provided (e.g., by external resources 16) row spacing and the row spacing in pixels.
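  • The spacing-to-scale conversion can be sketched as follows. The peak picking here is deliberately naive (a single spectral maximum); a production implementation would isolate the row-frequency maximum and its orientation more carefully, and the 76 cm row spacing is an assumed external input:

```python
import numpy as np

def row_spacing_pixels(veg_index: np.ndarray) -> float:
    """Dominant spatial period (pixels) from the 2-D power spectrum."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(veg_index - veg_index.mean())))
    cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    spectrum[cy, cx] = 0.0                          # suppress the DC term
    py, px = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    freq = np.hypot((py - cy) / spectrum.shape[0],
                    (px - cx) / spectrum.shape[1])  # cycles per pixel
    return 1.0 / freq

# Ground Sample Dimension from an externally provided row spacing:
# gsd_cm = known_row_spacing_cm / row_spacing_pixels(img)   # e.g. 76.0 cm rows
```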
  • Returning to FIG. 1, crop row attributes component 38 may be configured to provide a mask to segregate vegetation belonging to the crop population from vegetation belonging to the non-crop population using the previously determined crop row information. In some implementations, crop row attributes component 38 may be configured to characterize a reference spectral signature of vegetation within the one or more crop rows. In some implementations, crop row attributes component 38 may be configured to accept as input a prescribed reference spectral signature from an external resource 16 or may calculate a reference spectral signature by user selection of a region of interest. In some implementations, crop row attributes component 38 may be configured to statistically compare the spectral signature of each pixel to the reference spectral signature. In some implementations, the statistical proximity of each pixel's spectral signature to the reference spectral signature may be used to classify the pixel as belonging to the crop population class or another class. For example, individual plant detections that were classified in the crop class but have statistically different spectral signatures from the reference spectral signature may be reclassified as non-crop plants. Similarly, the reference spectral signature may be used to classify other plant or non-plant pixels.
  • In some implementations, a user may make a selection of a region of interest in the one or more images. In some implementations, crop row attributes component 38 may determine a spectral signature corresponding to the region of interest. In some implementations, crop row attributes component 38 may determine one or more additional regions and/or pixels in the one or more images having a statistically similar spectral signature. In some implementations, crop row attributes component 38 may classify the one or more additional regions and/or pixels as belonging to the crop population class, the non-crop population class, or another class.
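  • One way to make “statistical proximity” concrete is Mahalanobis distance against samples of the reference signature; the sketch below assumes a three-sigma acceptance radius, which is an illustrative choice rather than a value from the disclosure:

```python
import numpy as np

def classify_by_signature(pixels: np.ndarray,       # shape (n_pixels, n_bands)
                          reference: np.ndarray,    # shape (n_samples, n_bands)
                          max_sigma: float = 3.0) -> np.ndarray:
    """True where a pixel's spectrum is statistically close to the
    reference signature (Mahalanobis distance within `max_sigma`)."""
    mean = reference.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(reference, rowvar=False))
    diff = pixels - mean
    d2 = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)  # squared distance
    return np.sqrt(np.maximum(d2, 0.0)) <= max_sigma    # True -> crop class
```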
  • By way of a non-limiting example, FIG. 6 illustrates segregation of vegetation growing within rows from vegetation growing outside of rows, in accordance with one or more implementations. As depicted in FIG. 6, a crop row mask 610 is represented as a series of thick lines or curved lines, each fully encompassing one crop row. In FIG. 6, once the location and orientation of the crop rows have been determined, mask 610 is applied to segregate vegetation growing within the rows (e.g., the crop population) from vegetation 620 growing outside of the rows (e.g., non-crop population). In some implementations, the width of the crop row mask 610 may be determined by using a priori information about the crop and/or by dynamically determining the crop width from the image based on the statistical crop size.
  • In some implementations, crop row attributes component 38 may be configured to classify groups of vegetation pixels as belonging to the crop population if they are positioned statistically within the crop rows. In some implementations, crop row attributes component 38 may be configured to classify groups of vegetation pixels as belonging to the non-crop population if they are positioned statistically outside of the crop rows.
  • In some implementations, crop row attributes component 38 may be configured to determine a new and dynamic threshold level to improve the segregation of the crop population from the background clutter by creating a histogram of pixel values only within the masked crop rows. The histogram may be utilized to determine the correct threshold to separate the plants from the background clutter. In some implementations, background clutter may include one or more of soil, shadows, dead vegetation, weeds, standing water, farming equipment, and/or other background clutter. In some implementations, the newly determined threshold may be applied to the whole image. By way of a non-limiting example, FIG. 7 illustrates complete segregation of crop population 720 from non-crop population 730 and background clutter 710, in accordance with one or more implementations. FIG. 7 depicts the segregation of crop population 720 from non-crop population 730 resulting from the utilization of a histogram and determination of a new threshold value.
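  • A sketch of the in-row re-thresholding; Otsu's between-class-variance criterion is used here to pick the threshold from the masked histogram, though the disclosure does not mandate a particular criterion:

```python
import numpy as np

def otsu_threshold(values: np.ndarray, bins: int = 256) -> float:
    """Threshold maximizing between-class variance of a 1-D histogram."""
    hist, edges = np.histogram(values, bins=bins)
    centers = (edges[:-1] + edges[1:]) / 2.0
    weighted = hist * centers
    w0 = np.cumsum(hist)                      # class 0: at or below each bin
    w1 = hist.sum() - w0                      # class 1: above each bin
    m0 = np.cumsum(weighted) / np.maximum(w0, 1)
    m1 = (weighted.sum() - np.cumsum(weighted)) / np.maximum(w1, 1)
    between = w0 * w1 * (m0 - m1) ** 2
    return float(centers[np.argmax(between)])

# t = otsu_threshold(index_img[row_mask])     # histogram within masked rows
# crop_mask = index_img > t                   # then applied to the whole image
```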
  • Returning to FIG. 1, crop density determination component 40 may determine a crop density corresponding to the crop population and a non-crop density corresponding to the non-crop population. In order to accurately determine vegetation density per unit area, the area over which the vegetation count was conducted may need to be accurately determined. While the optical characteristics (i.e. field of view) of the one or more remote sensing devices may be accurately known, the altitude of the one or more remote sensing devices above the ground level may be more difficult to determine. Accordingly, crop density determination component 40 may convert the determined row spacing from pixels to a linear spatial dimension (e.g., centimeters). In some implementations, crop density determination component 40 may determine an area of land portrayed by the one or more images using the converted row spacing. In some implementations, crop density determination component 40 may determine a first count corresponding to the crop population and a second count corresponding to the non-crop population per unit area for one or more of the images. In some implementations, crop density determination component 40 may determine a crop count and/or non-crop count per unit area for one or more sub-regions of the one or more images. In some implementations, crop density determination component 40 may determine an area of land portrayed by the one or more images using the number of pixels in the image and the pixel's Ground Sample Dimension.
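  • A short arithmetic sketch of the altitude-free area calculation; all numeric values, including the 76 cm row spacing and the blob counts, are illustrative assumptions:

```python
# Row spacing measured in pixels (FFT/Hough step) plus the planted row
# spacing in centimeters (external input) fix the pixel scale directly,
# with no dependence on an accurate altitude measurement.
row_spacing_px = 95.0
row_spacing_cm = 76.0                              # e.g. 30-inch rows
gsd_cm = row_spacing_cm / row_spacing_px           # cm per pixel

image_h_px, image_w_px = 3000, 4000
area_m2 = (image_h_px * gsd_cm / 100.0) * (image_w_px * gsd_cm / 100.0)
area_acres = area_m2 / 4046.86

crop_count, weed_count = 6200, 240                 # assumed blob counts
crop_density = crop_count / area_acres             # plants per acre
weed_density = weed_count / area_acres
```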
  • In some implementations, crop density determination component 40 may utilize blob detection techniques and/or other algorithms to identify and count each of the crop plants within the crop row mask. In some implementations, crop density determination component 40 may determine a centroid position of each blob. In some implementations, crop density determination component 40 may provide a list of plant center position coordinates. By way of a non-limiting example, FIG. 8 illustrates a map of plant center positions, in accordance with one or more implementations. FIG. 8 depicts plant centers 810 that are located within each image to determine spacing and count per area.
  • In some implementations, crop density determination component 40 may determine a pixel distance between each center position using the center position coordinates. In some implementations, crop density determination component 40 may provide a histogram of center to center spacing that may yield a strong peak at the nominal plant spacing. In some implementations, crop density determination component 40 may combine the nominal in-row plant spacing with row-to-row spacing to generate nominal planting density (e.g., plants per acre). In some implementations, crop density determination component 40 may receive user inputs regarding plant spacing and row spacing. In some implementations, crop density determination component 40 may utilize the received user inputs to refine the results of the planting statistics.
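  • A sketch of the centroid-spacing statistics, using brute-force nearest-neighbor distances for clarity; the histogram-peak readout and the closing plants-per-acre identity follow the description above:

```python
import numpy as np
from scipy import ndimage

def nominal_plant_spacing(crop_mask: np.ndarray, bins: int = 50) -> float:
    """Histogram peak of nearest-neighbor distances between blob
    centroids, in pixels (O(n^2) pairwise search; fine for one tile)."""
    labels, n = ndimage.label(crop_mask)
    if n < 2:
        raise ValueError("need at least two plant detections")
    centers = np.array(ndimage.center_of_mass(crop_mask, labels,
                                              range(1, n + 1)))
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    hist, edges = np.histogram(d.min(axis=1), bins=bins)
    peak = int(np.argmax(hist))
    return float((edges[peak] + edges[peak + 1]) / 2.0)

# plants_per_acre ≈ 4046.86 / (in_row_spacing_m * row_spacing_m)
```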
  • In some implementations, crop density determination component 40 may determine a refined plant count through analysis of the length of each blob along the plant row, and/or the spacing between plant centers. In some implementations, crop density determination component 40 may utilize statistics determined through the analysis to account for two plants that have grown together and appear as a single plant or single plants whose leaf structure causes them to appear as two or more plants.
  • In some implementations, crop density determination component 40 may determine statistics of the crop population including one or more of plant count per unit area, plant size, plant health, and/or other statistics. In some implementations, crop density determination component 40 may determine the crop density by dividing the first count by the determined area of the one or more images. In some implementations, crop density determination component 40 may determine the non-crop density by dividing the second count by the determined area of the one or more images.
  • In some implementations, crop density determination component 40 may determine plant size statistics by determining a number of contiguous pixels which constitute individual plants. In some implementations, crop density determination component 40 may determine plant health characteristics using one or more of spectral combination methods. For example, combinations of spectral reflectance values may be used to infer conditions of plant health. Such combinations may include Difference Vegetation Index (Difference VI), a Ratio Vegetation Index (Ratio VI), a Chlorophyll Index, a Normalized Difference Vegetation Index (NDVI), a Photochemical Reflectance Index (PRI), and/or other combinations of channels.
  • In some implementations, operations corresponding to one or more of communications component 28, image revision component 30, contrast adjustment component 32, background clutter segregation component 34, crop segregation component 36, crop row attributes component 38, crop density determination component 40, and/or other components may be repeated for multiple overlapping spectral images that cover large farming areas.
  • In some implementations, presentation component 42 may be configured to effectuate presentation of one or both of a map corresponding to the crop density or a map corresponding to the non-crop density. In some implementations, presentation component 42 may be configured to interpolate and/or resample results for the multiple spectral images onto a common grid spacing for the entire survey area. In some implementations, presentation component 42 may be configured to format the map corresponding to the crop density and/or the map corresponding to the non-crop density into multiple file formats for ease of dissemination, review, and further analysis in other downstream data products.
  • In some implementations, server(s) 12, client computing platform(s) 18, and/or external resources 16 may be operatively linked via one or more electronic communication links. For example, such electronic communication links may be established, at least in part, via a network such as the Internet and/or other networks. It will be appreciated that this is not intended to be limiting, and that the scope of this disclosure includes implementations in which server(s) 12, client computing platform(s) 18, and/or external resources 16 may be operatively linked via some other communication media.
  • A given client computing platform 18 may include one or more processors configured to execute machine-readable instructions. The machine-readable instructions may be configured to enable an expert or user associated with the given client computing platform 18 to interface with system 10 and/or external resources 16, and/or to provide other functionality attributed herein to client computing platform(s) 18. In some implementations, the one or more processors may be configured to execute machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, 42, and/or other machine-readable instruction components. By way of non-limiting example, the given client computing platform 18 may include one or more of a desktop computer, a laptop computer, a handheld computer, a tablet computing platform, a NetBook, a Smartphone, a gaming console, and/or other computing platforms.
  • In some implementations, the one or more remote sensing devices 24 may include one or more processors configured to execute machine-readable instructions. The machine-readable instructions may be configured to enable an expert or user associated with the one or more remote sensing devices 24 to interface with system 10 and/or external resources 16, and/or to provide other functionality attributed herein to the one or more remote sensing devices 24. In some implementations, the one or more processors may be configured to execute machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, 42, and/or other machine-readable instruction components. In some implementations, the one or more remote sensing devices 24 may include processors 22 and electronic storage 14.
  • External resources 16 may include sources of information, hosts and/or providers of digital media items outside of system 10, external entities participating with system 10, and/or other resources. In some implementations, some or all of the functionality attributed herein to external resources 16 may be provided by resources included in system 10.
  • Server(s) 12 may include electronic storage 14, one or more processors 22, and/or other components. Server(s) 12 may include communication lines, or ports to enable the exchange of information with a network and/or other computing platforms. Illustration of server(s) 12 in FIG. 1 is not intended to be limiting. Server(s) 12 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to server(s) 12. For example, server(s) 12 may be implemented by a cloud of computing platforms operating together as server(s) 12.
  • Electronic storage 14 may comprise non-transitory storage media that electronically stores information. The electronic storage media of electronic storage 14 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with server(s) 12 and/or removable storage that is removably connectable to server(s) 12 via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). Electronic storage 14 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. Electronic storage 14 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). Electronic storage 14 may store software algorithms, information determined by processor(s) 22, information received from server(s) 12, information received from client computing platform(s) 18, and/or other information that enables server(s) 12 to function as described herein.
  • Processor(s) 22 is configured to provide information processing capabilities in server(s) 12. As such, processor(s) 22 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information. Although processor(s) 22 is shown in FIG. 1 as a single entity, this is for illustrative purposes only. In some implementations, processor(s) 22 may include a plurality of processing units. These processing units may be physically located within the same device, or processor(s) 22 may represent processing functionality of a plurality of devices operating in coordination. The processor(s) 22 may be configured to execute machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, 42, and/or other machine-readable instruction components by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 22.
  • It should be appreciated that although machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and 42 are illustrated in FIG. 1 as being implemented within a single processing unit, in implementations in which processor(s) 22 includes multiple processing units, one or more of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 may be implemented remotely from the other components and/or subcomponents. The description of the functionality provided by the different machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 described herein is for illustrative purposes, and is not intended to be limiting, as any of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 may provide more or less functionality than is described. For example, one or more of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42 may be eliminated, and some or all of its functionality may be provided by other ones of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42. As another example, processor(s) 22 may be configured to execute one or more additional machine-readable instruction components that may perform some or all of the functionality attributed herein to one of machine-readable instruction components 28, 30, 32, 34, 36, 38, 40, and/or 42.
  • FIG. 9 illustrates a method 900 for determining statistics of plant populations based on overhead optical measurements, in accordance with one or more implementations. The operations of method 900 presented below are intended to be illustrative. In some implementations, method 900 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 900 are illustrated in FIG. 9 and described below is not intended to be limiting.
  • In some implementations, method 900 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information). The one or more processing devices may include one or more devices executing some or all of the operations of method 900 in response to instructions stored electronically on an electronic storage medium. The one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 900.
  • At an operation 905, output signals provided by one or more remote sensing devices mounted to an overhead platform may be received. In some implementations, the output signals may convey information related to one or more images of a land area where crops are grown. In some implementations, the one or more images may be spatially resolved and spectrally resolved. In some implementations, the output signals may include one or more channels. In some implementations, a first channel may correspond to a first spectral range and a second channel may correspond to a second spectral range. Operation 905 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to communications component 28 and image revision component 30 (as described in connection with FIG. 1), in accordance with one or more implementations.
  • At an operation 910, vegetation may be distinguished from background clutter based on the one or more channels. In some implementations, the vegetation may include one or both of a crop population and a non-crop population. In some implementations, the background clutter may include one or more of soil, standing water, man-made materials, dead vegetation, and/or other background clutter. Operation 910 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to contrast adjustment component 32 (as described in connection with FIG. 1), in accordance with one or more implementations.
  • At an operation 915, image regions corresponding to the vegetation may be segregated from image regions corresponding to the background clutter. Operation 915 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to background clutter segregation component 34 (as described in connection with FIG. 1), in accordance with one or more implementations.
  • At an operation 920, image regions corresponding to the crop population may be segregated from image regions corresponding to the non-crop population in the image regions corresponding to the vegetation. Operation 920 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop segregation component 36 (as described in connection with FIG. 1), in accordance with one or more implementations.
  • At an operation 925, a crop density corresponding to the crop population and a non-crop density corresponding to the non-crop population may be determined. Operation 925 may be performed by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop row attributes component 38 and crop density determination component 40 (as described in connection with FIG. 1), in accordance with one or more implementations.
  • By way of a non-limiting example, FIG. 10 illustrates process steps 1000 performed by the system of FIG. 1, in accordance with one or more implementations. As depicted in FIG. 10, system 10 may be configured to perform process steps 1002-1006 with the one or more remote sensing devices. For example, the one or more remote sensing devices may record one or more spectral images, record one or more environmental parameters, and record imager position, attitude, and time corresponding to the one or more spectral images.
  • In some implementations, system 10 may be configured to preprocess and calibrate the one or more spectral images (e.g., process step 1008) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to image revision component 30.
  • In some implementations, system 10 may calculate a numerical combination and apply image sharpening (e.g., process steps 1010 and 1012) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to contrast adjustment component 32.
  • In some implementations, system 10 may set an initial numerical combination threshold, calculate a number of vegetation detections, adjust the numerical combination threshold, and determine a threshold value where a blob count plateaus (e.g., process steps 1014, 1016, 1018 and 1020) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to background clutter segregation component 34.
  • In some implementations, system 10 may apply erosion to the one or more images, segregate crops from non-crops based on size statistics, and determine a spatial frequency and an orientation of peak energy (e.g., process steps 1022, 1024, and 1026) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop segregation component 36.
  • In some implementations, system 10 may classify crops and non-crops by crop row masking, classify crops and non-crops by spectral signature, and calculate a ground area of the one or more spectral images (e.g., process steps 1028, 1030, and 1032) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop row attributes component 38.
  • In some implementations, system 10 may determine crop and non-crop densities (e.g., process step 1034) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to crop density determination component 40.
  • In some implementations, system 10 may spatially interpolate the crop and non-crop densities onto a geo-grid (e.g., process step 1036) by one or more hardware processors configured to execute a machine-readable instruction component that is the same as or similar to presentation component 42.
  • Although the present technology has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the technology is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present technology contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.

Claims (6)

What is claimed is:
1. A system configured for classifying plants within an area, the system comprising:
one or more hardware processors configured by machine-readable instructions to:
receive output signals provided by one or more remote sensing devices mounted to an unmanned aircraft system, the output signals conveying information related to one or more images of a land area, the one or more images being spatially resolved and including one or more spectral bands within a target wavelength range;
segregate image regions corresponding to growing vegetation from image regions corresponding to background clutter, the background clutter including one or more of soil, rock, liquid, man-made materials, or non-growing vegetation, wherein segregating image regions comprises:
classifying content of image pixels as belonging to a vegetation class or to background clutter based on a threshold of spectral reflectance combinations; and
classifying groups of contiguous vegetation pixels as plants belonging to one of one or more vegetation classes responsive to characteristics of a pixel group being proximate to a description of one of one or more of the vegetation classes.
2. The system of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to count a number of plants within a vegetation class.
3. The system of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to calculate a size of plants within a vegetation class.
4. The system of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to calculate a geo-position of plant centroids of a vegetation class within the land area.
5. The system of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions to assign groups of contiguous pixels to a class responsive to characteristics of a group being proximate to characteristics of a pixel region of interest selected by a user through a graphical user interface.
6. The system of claim 1, wherein the one or more hardware processors are further configured by machine-readable instructions such that the target wavelength range is 400 nanometers to 3,000 nanometers.
US16/434,935 2015-09-18 2019-06-07 Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops Active US10803313B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/434,935 US10803313B2 (en) 2015-09-18 2019-06-07 Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201562220596P 2015-09-18 2015-09-18
US15/268,370 US10318810B2 (en) 2015-09-18 2016-09-16 Systems and methods for determining statistics plant populations based on overhead optical measurements
US16/434,935 US10803313B2 (en) 2015-09-18 2019-06-07 Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/268,370 Continuation US10318810B2 (en) 2015-09-18 2016-09-16 Systems and methods for determining statistics plant populations based on overhead optical measurements

Publications (2)

Publication Number Publication Date
US20190286905A1 true US20190286905A1 (en) 2019-09-19
US10803313B2 US10803313B2 (en) 2020-10-13

Family

ID=58282771

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/268,370 Active US10318810B2 (en) 2015-09-18 2016-09-16 Systems and methods for determining statistics plant populations based on overhead optical measurements
US16/434,935 Active US10803313B2 (en) 2015-09-18 2019-06-07 Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/268,370 Active US10318810B2 (en) 2015-09-18 2016-09-16 Systems and methods for determining statistics plant populations based on overhead optical measurements

Country Status (5)

Country Link
US (2) US10318810B2 (en)
EP (1) EP3350554A4 (en)
AU (1) AU2016324156A1 (en)
CA (1) CA2998439A1 (en)
WO (1) WO2017049204A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190340789A1 (en) * 2017-02-28 2019-11-07 Optim Corporation Abnormality detection system, abnormality detection method, and program
US10685230B2 (en) * 2018-09-06 2020-06-16 National Central University Method of top-of-atmosphere reflectance-based spatiotemporal image fusion using aerosol optical depth

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CA2998439A1 (en) 2015-09-18 2017-03-23 SlantRange, Inc. Systems and methods for determining statistics of plant populations based on overhead optical measurements
US10120543B2 (en) * 2016-02-09 2018-11-06 Deere & Company Plant emergence system
US10445877B2 (en) 2016-12-30 2019-10-15 International Business Machines Corporation Method and system for crop recognition and boundary delineation
US10586105B2 (en) * 2016-12-30 2020-03-10 International Business Machines Corporation Method and system for crop type identification using satellite observation and weather data
US10664702B2 (en) 2016-12-30 2020-05-26 International Business Machines Corporation Method and system for crop recognition and boundary delineation
AU2017399173A1 (en) * 2017-02-15 2019-08-22 Sony Corporation Information generation method, information generation device, and program
US10402942B2 (en) * 2017-09-28 2019-09-03 Sentera, Inc. Multiple georeferenced aerial image crop analysis and synthesis
CN111226261A (en) * 2017-10-26 2020-06-02 索尼公司 Information processing device, information processing method, program, and information processing system
EP3503025B1 (en) 2017-12-19 2021-11-10 Accenture Global Solutions Limited Utilizing artificial intelligence with captured images to detect agricultural failure
CN109359533B (en) * 2018-09-12 2021-06-18 浙江海洋大学 Coastline extraction method based on multiband remote sensing image
CN110163879A (en) * 2019-05-28 2019-08-23 北京林业大学 A kind of method, apparatus, equipment and medium that plant counts
US11259515B2 (en) * 2019-10-31 2022-03-01 Deere & Company Agricultural plant detection and control system
CN110779876B (en) * 2019-11-07 2022-12-02 长光禹辰信息技术与装备(青岛)有限公司 Method, device and equipment for identifying epidemic trees and computer readable storage medium
US11393182B2 (en) * 2020-05-29 2022-07-19 X Development Llc Data band selection using machine learning
CN112818747A (en) * 2020-12-31 2021-05-18 上海应用技术大学 Urban characteristic neighborhood population density estimation method and system method based on spatial big data
EP4095643A1 (en) * 2021-05-28 2022-11-30 Agco Corporation Crop row guidance systems
US11941880B2 (en) * 2021-06-02 2024-03-26 Ping An Technology (Shenzhen) Co., Ltd. System and method for image-based crop identification
ES2930548A1 (en) * 2021-06-02 2022-12-15 Consejo Superior Investigacion MAPPING OF TILLING ROWS (Machine-translation by Google Translate, not legally binding)
US20230162496A1 (en) * 2021-11-24 2023-05-25 Satsure Analytics India Private Limited System and method for assessing pixels of satellite images of agriculture land parcel using ai
WO2023149963A1 (en) 2022-02-01 2023-08-10 Landscan Llc Systems and methods for multispectral landscape mapping

Citations (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764819A (en) * 1991-10-18 1998-06-09 Dekalb Genetics Corporation Methods for classifying plants for evaluation and breeding programs by use of remote sensing and image analysis technology
US6778702B1 (en) * 2000-09-19 2004-08-17 Bae Systems Mission Solutions Inc. Method and apparatus for assessing the quality of spectral images
US20040264763A1 (en) * 2003-04-30 2004-12-30 Deere & Company System and method for detecting and analyzing features in an agricultural field for vehicle guidance
US20050111692A1 (en) * 2003-11-24 2005-05-26 Boright Arthur L. Cloud cover assessment: VNIR-SWIR
US7058197B1 (en) * 1999-11-04 2006-06-06 Board Of Trustees Of The University Of Illinois Multi-variable model for identifying crop response zones in a field
US20080059015A1 (en) * 2006-06-09 2008-03-06 Whittaker William L Software architecture for high-speed traversal of prescribed routes
US20080065287A1 (en) * 2005-01-04 2008-03-13 Shufeng Han Method and system for guiding a vehicle with vision-based adjustment
US20130235183A1 (en) * 2012-03-07 2013-09-12 Blue River Technology, Inc. Method and apparatus for automated plant necrosis
US20150015697A1 (en) * 2013-03-07 2015-01-15 Blue River Technology, Inc. Method for automatic phenotype measurement and selection
US20150254800A1 (en) * 2014-03-06 2015-09-10 F12 Solutions, Llc Nitrogen status determination in growing crops
US20150254738A1 (en) * 2014-03-05 2015-09-10 TerrAvion, LLC Systems and methods for aerial imaging and analysis
US20150278640A1 (en) * 2014-03-26 2015-10-01 FI2 Solutions, LLC Crop stand analysis
US20150324975A1 (en) * 2012-12-20 2015-11-12 Pioneer Hi-Bred International Inc. Non-destructive imaging of crop plants
US20160063639A1 (en) * 2014-08-26 2016-03-03 David P. Groeneveld System and Method to Assist Crop Loss Adjusting of Variable Impacts Across Agricultural Fields Using Remotely-Sensed Data
US20160133039A1 (en) * 2014-11-12 2016-05-12 SlantRange, Inc. Systems and methods for aggregating and facilitating the display of spatially variable geographic data acquired by airborne vehicles
US9390331B2 (en) * 2014-04-15 2016-07-12 Open Range Consulting System and method for assessing riparian habitats
US20170083747A1 (en) * 2015-09-21 2017-03-23 The Climate Corporation Ponding water detection on satellite imagery
US20170228595A1 (en) * 2014-05-09 2017-08-10 Raven Industries, Inc. Image filter based on row identification
US9741128B2 (en) * 2011-12-05 2017-08-22 Commonwealth Scientific And Industrial Research Organisation Method and system for characterizing plan phenotype
US9756844B2 (en) * 2011-05-13 2017-09-12 The Climate Corporation Method and system to map biological pests in agricultural fields using remotely-sensed data for field scouting and targeted chemical application
US20170316282A1 (en) * 2016-04-27 2017-11-02 SlantRange, Inc. Systems and methods for object classification and selective compression in aerial imagery
US20180189564A1 (en) * 2016-12-30 2018-07-05 International Business Machines Corporation Method and system for crop type identification using satellite observation and weather data

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010016053A1 (en) * 1997-10-10 2001-08-23 Monte A. Dickson Multi-spectral imaging sensor
US20020119187A1 (en) * 2000-09-29 2002-08-29 Cantor Adam S. Composition for the transdermal delivery of fentanyl
US7876927B2 (en) * 2005-01-04 2011-01-25 Deere & Company Method and system for guiding a vehicle with vision-based adjustment
US7684916B2 (en) * 2005-07-01 2010-03-23 Deere & Company Method and system for vehicular guidance using a crop image
ES2427817T3 (en) 2013-11-04 Pioneer Hi-Bred International, Inc. Sensor system, method and computer program product for measuring plant phenotype in agricultural environments
US8342383B2 (en) * 2006-07-06 2013-01-01 Praxair Technology, Inc. Method for forming sputter target assemblies having a controlled solder thickness
US20120155714A1 (en) * 2009-06-11 2012-06-21 PA LLC Vegetation indices for measuring multilayer microcrop density and growth
US20140180549A1 (en) * 2011-01-07 2014-06-26 The Arizona Board Of Regents On Behalf Of The University Of Arizona Automated machine for selective in situ manipulation of plants
BR112014009255A8 (en) 2011-10-20 2017-06-20 Monsanto Technology LLC Plant counter
JP2013159671A (en) * 2012-02-02 2013-08-19 Dow Corning Toray Co Ltd Curable silicone composition, cured product thereof, and optical semiconductor device
WO2014153057A1 (en) * 2013-03-14 2014-09-25 James Olson Improved multi-channel camera system
CA2998439A1 (en) 2015-09-18 2017-03-23 SlantRange, Inc. Systems and methods for determining statistics of plant populations based on overhead optical measurements

Patent Citations (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764819A (en) * 1991-10-18 1998-06-09 Dekalb Genetics Corporation Methods for classifying plants for evaluation and breeding programs by use of remote sensing and image analysis technology
US7058197B1 (en) * 1999-11-04 2006-06-06 Board Of Trustees Of The University Of Illinois Multi-variable model for identifying crop response zones in a field
US6778702B1 (en) * 2000-09-19 2004-08-17 Bae Systems Mission Solutions Inc. Method and apparatus for assessing the quality of spectral images
US20040264763A1 (en) * 2003-04-30 2004-12-30 Deere & Company System and method for detecting and analyzing features in an agricultural field for vehicle guidance
US20050111692A1 (en) * 2003-11-24 2005-05-26 Boright Arthur L. Cloud cover assessment: VNIR-SWIR
US20080065287A1 (en) * 2005-01-04 2008-03-13 Shufeng Han Method and system for guiding a vehicle with vision-based adjustment
US20080059015A1 (en) * 2006-06-09 2008-03-06 Whittaker William L Software architecture for high-speed traversal of prescribed routes
US9756844B2 (en) * 2011-05-13 2017-09-12 The Climate Corporation Method and system to map biological pests in agricultural fields using remotely-sensed data for field scouting and targeted chemical application
US9741128B2 (en) * 2011-12-05 2017-08-22 Commonwealth Scientific And Industrial Research Organisation Method and system for characterizing plant phenotype
US20130235183A1 (en) * 2012-03-07 2013-09-12 Blue River Technology, Inc. Method and apparatus for automated plant necrosis
US20150324975A1 (en) * 2012-12-20 2015-11-12 Pioneer Hi-Bred International, Inc. Non-destructive imaging of crop plants
US9658201B2 (en) * 2013-03-07 2017-05-23 Blue River Technology Inc. Method for automatic phenotype measurement and selection
US20150015697A1 (en) * 2013-03-07 2015-01-15 Blue River Technology, Inc. Method for automatic phenotype measurement and selection
US20150254738A1 (en) * 2014-03-05 2015-09-10 TerrAvion, LLC Systems and methods for aerial imaging and analysis
US20150254800A1 (en) * 2014-03-06 2015-09-10 F12 Solutions, LLC Nitrogen status determination in growing crops
US20150278640A1 (en) * 2014-03-26 2015-10-01 FI2 Solutions, LLC Crop stand analysis
US9390331B2 (en) * 2014-04-15 2016-07-12 Open Range Consulting System and method for assessing riparian habitats
US20170228595A1 (en) * 2014-05-09 2017-08-10 Raven Industries, Inc. Image filter based on row identification
US20160063639A1 (en) * 2014-08-26 2016-03-03 David P. Groeneveld System and Method to Assist Crop Loss Adjusting of Variable Impacts Across Agricultural Fields Using Remotely-Sensed Data
US20160133039A1 (en) * 2014-11-12 2016-05-12 SlantRange, Inc. Systems and methods for aggregating and facilitating the display of spatially variable geographic data acquired by airborne vehicles
US20170083747A1 (en) * 2015-09-21 2017-03-23 The Climate Corporation Ponding water detection on satellite imagery
US20170316282A1 (en) * 2016-04-27 2017-11-02 SlantRange, Inc. Systems and methods for object classification and selective compression in aerial imagery
US20180189564A1 (en) * 2016-12-30 2018-07-05 International Business Machines Corporation Method and system for crop type identification using satellite observation and weather data

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190340789A1 (en) * 2017-02-28 2019-11-07 Optim Corporation Abnormality detection system, abnormality detection method, and program
US10636174B2 (en) * 2017-02-28 2020-04-28 Optim Corporation Abnormality detection system, abnormality detection method, and program
US10685230B2 (en) * 2018-09-06 2020-06-16 National Central University Method of top-of-atmosphere reflectance-based spatiotemporal image fusion using aerosol optical depth

Also Published As

Publication number Publication date
CA2998439A1 (en) 2017-03-23
US20170084039A1 (en) 2017-03-23
WO2017049204A1 (en) 2017-03-23
US10803313B2 (en) 2020-10-13
EP3350554A1 (en) 2018-07-25
US10318810B2 (en) 2019-06-11
EP3350554A4 (en) 2019-06-12
AU2016324156A1 (en) 2018-04-05

Similar Documents

Publication Publication Date Title
US10803313B2 (en) Systems and methods determining plant population and weed growth statistics from airborne measurements in row crops
Gašparović et al. An automatic method for weed mapping in oat fields based on UAV imagery
Stanton et al. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment
Duan et al. Comparison of ground cover estimates from experiment plots in cotton, sorghum and sugarcane based on images and ortho-mosaics captured by UAV
Xu et al. Multispectral imaging and unmanned aerial systems for cotton plant phenotyping
Hunt Jr et al. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture?
US9734400B2 (en) System and method for field variance determination
US20230292647A1 (en) System and Method for Crop Monitoring
Chen et al. Monitoring cotton (Gossypium hirsutum L.) germination using ultrahigh-resolution UAS images
Marino et al. Detection of homogeneous wheat areas using multi-temporal UAS images and ground truth data analyzed by cluster analysis
Samseemoung et al. Application of low altitude remote sensing (LARS) platform for monitoring crop growth and weed infestation in a soybean plantation
US11087132B2 (en) Systems and methods for mapping emerged plants
US10217188B2 (en) Systems and methods for aggregating and facilitating the display of spatially variable geographic data acquired by airborne vehicles
US11631244B2 (en) Methods of yield assessment with crop photometry
Garcia-Ruiz et al. Sugar beet (Beta vulgaris L.) and thistle (Cirsium arvensis L.) discrimination based on field spectral data
Zhang et al. Estimating plant distance in maize using Unmanned Aerial Vehicle (UAV)
David et al. Vegetation indices and textures in object-based weed detection from UAV imagery
Shirzadifar et al. Mapping crop stand count and planting uniformity using high resolution imagery in a maize crop
US10817754B2 (en) Systems and methods for object classification and selective compression in aerial imagery
Borra-Serrano et al. Towards an objective evaluation of persistency of Lolium perenne swards using UAV imagery
Casuccio et al. Corn planting quality assessment in very high-resolution RGB UAV imagery using Yolov5 and Python
Yano et al. Weed identification in sugarcane plantation through images taken from remotely piloted aircraft (RPA) and KNN classifier
Raeva et al. UAV photogrammetry techniques for precision agriculture
Neeser et al. Performance and cost of field scouting for weeds and diseases using imagery obtained with an Unmanned Aerial Vehicle
Peña et al. The TOAS project: UAV technology for optimizing herbicide applications in weed-crop systems

Legal Events

Date Code Title Description
AS Assignment

Owner name: SLANTRANGE, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:RITTER, MICHAEL;MILTON, MICHAEL;MATUSOV, PETER;SIGNING DATES FROM 20160920 TO 20160921;REEL/FRAME:049407/0585

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STCF Information on status: patent grant

Free format text: PATENTED CASE

AS Assignment

Owner name: HIPHEN CORP, DELAWARE

Free format text: CHANGE OF NAME;ASSIGNOR:SLANT RANGE, INC.;REEL/FRAME:066522/0226

Effective date: 20230524