US20220138923A1 - Change Detection System - Google Patents

Change Detection System

Info

Publication number
US20220138923A1
Authority
US
United States
Prior art keywords
tile
image
pixel
change
previously analyzed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/576,363
Inventor
Ara Victor Nefian
Hrant Khachatryan
Hovnatan Karapetyan
Naira Hovakymian
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intelinair Inc
Original Assignee
Intelinair Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intelinair Inc
Priority to US17/576,363
Assigned to MCKINSEY & COMPANY, INC. UNITED STATES: SECURITY INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Intelinair, Inc.
Publication of US20220138923A1
Legal status: Abandoned

Classifications

    • G06T 7/0002 Image analysis: inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 5/002
    • G06T 5/20 Image enhancement or restoration using local operators
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 5/70 Denoising; Smoothing
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/136 Segmentation involving thresholding
    • G06T 7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06V 10/758 Image or video pattern matching involving statistics of pixels or of feature values, e.g. histogram matching
    • G06V 20/188 Terrestrial scenes: vegetation
    • G06T 2207/10024 Color image
    • G06T 2207/10032 Satellite or aerial image; Remote sensing
    • G06T 2207/20021 Dividing image into blocks, subimages or windows
    • G06T 2207/30181 Earth observation
    • G06T 2207/30188 Vegetation; Agriculture

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Multimedia (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Software Systems (AREA)

Abstract

A change detection system including an image analysis unit that analyzes an image gathered from an image gathering unit and a change detection unit that detects changes in a plurality of images taken over a predetermined time, where the change detection unit modifies the image to indicate areas where a change has occurred.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 16/245,743, which claims the benefit of and priority from U.S. Provisional Patent Application No. 62/616,163, filed Jan. 11, 2018, titled CHANGE DETECTION SYSTEM.
  • BACKGROUND OF THE INVENTION
  • The agriculture industry comprises a large portion of the world's economy. In addition, as the population of the world increases annually, more food must be produced by existing agricultural assets. In order to increase yields on existing plots of farm land, producers require a clear understanding of plant and soil conditions. However, as a single farm may encompass hundreds of acres, it is difficult to assess the conditions of the farm land.
  • Currently, farmers rely on their observations of their land along with prior experience to determine the requirements to increase the yield of their farm land. These observations may include identifying locations of weeds, identifying plant illnesses and determining levels of crop damage. However, considering the large number of acres in the average farm, these observations are not a reliable method of increasing yields. Therefore, a need exists for a system that will allow a farmer to better understand the conditions of their farm land.
  • SUMMARY OF THE INVENTION
  • Systems, methods, features, and advantages of the present invention will be or will become apparent to one with skill in the art upon examination of the following figures and detailed description. It is intended that all such additional systems, methods, features, and advantages be included within this description, be within the scope of the invention, and be protected by the accompanying claims.
  • One embodiment of the present disclosure includes a change detection system including an image analysis unit that analyzes an image gathered from an image gathering unit and a change detection unit that detects changes in a plurality of images taken over a predetermined time, where the change detection unit modifies the image to indicate areas where a change has occurred.
  • In another embodiment, the change detection unit calculates a normalized differential vegetation index for each image gathered.
  • In another embodiment, the change detection unit calculates a soil adjusted vegetation index for each image gathered.
  • In another embodiment, the change detection unit separates each image into a plurality of tiles of a predetermined size.
  • In another embodiment, an Otsu binary thresholding is applied to each pixel in each tile and a mask is applied to each tile based on the Otsu binary thresholding.
  • In another embodiment, the change detection unit performs a pixel by pixel comparison of each tile.
  • In another embodiment, the change detection unit applies a Gaussian blur to each tile.
  • In another embodiment, the change detection unit applies a local statistics method to each tile.
  • In another embodiment, each tile is a square.
  • In another embodiment, each square is 25 pixels by 25 pixels.
  • Another embodiment of the present disclosure includes a change detection unit including a processor and a memory with a program being executed in the memory, the program performing the steps of analyzing an image gathered from an image gathering unit and detecting changes in a plurality of images taken over a predetermined time via a change detection unit, where the change detection unit modifies the image to indicate areas where a change has occurred.
  • Another embodiment includes the step of calculating a normalized differential vegetation index for each image gathered.
  • Another embodiment includes the step of calculating a soil adjusted vegetation index for each image gathered.
  • Another embodiment includes the step of separating each image into a plurality of tiles of a predetermined size.
  • Another embodiment includes the step of applying an Otsu binary thresholding to each pixel in each tile and applying a mask to each tile based on the Otsu binary thresholding.
  • Another embodiment includes the step of performing a pixel by pixel comparison of each tile.
  • Another embodiment includes the step of applying a Gaussian blur to each tile.
  • Another embodiment includes the step of applying a local statistics method to each tile.
  • In another embodiment, each tile is a square.
  • In another embodiment, each square is 25 pixels by 25 pixels.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an implementation of the present invention and, together with the description, serve to explain the advantages and principles of the invention. In the drawings:
  • FIG. 1 depicts one embodiment of a change identification system consistent with the present invention;
  • FIG. 2 depicts one embodiment of a change detection unit;
  • FIG. 3 depicts one embodiment of a communication device consistent with the present invention; and
  • FIG. 4 depicts a schematic representation of a process used to identify changes in the conditions of agricultural assets.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Referring now to the drawings which depict different embodiments consistent with the present invention, wherever possible, the same reference numbers will be used throughout the drawings and the following description to refer to the same or like parts.
  • The change identification system 100 gathers medium- to low-resolution images from an aircraft flying above 1,500 feet. Each image is then partitioned into equally sized tiles. Each tile is analyzed to identify objects within the tile. Adjacent tiles are then compared to identify similar objects in adjacent tiles. The tiles are saved over time and compared with earlier saved tiles to identify changes in the objects identified in the tiles. By comparing tiles over a predetermined time, the history of a specific region can be analyzed. Further, when external information such as soil properties and seed information is incorporated into the analysis, the effectiveness of different agricultural methods can be rated and reviewed in detail.
  • FIG. 1 depicts one embodiment of a change identification system 100 consistent with the present invention. The change identification system 100 includes a change identification device 102, a first communication device 104, and a second communication device 106, each communicatively connected via a network 108. The change identification system 100 further includes an image gathering unit 110, an image analysis unit 112, a change detection unit 114 and an image generation unit 116.
  • The image gathering unit 110 and image analysis unit 112 may be embodied by one or more servers. Alternatively, each of the change detection unit 114 and image generation unit 116 may be implemented using any combination of hardware and software, whether incorporated in a single device or functionally distributed across multiple platforms and devices.
  • In one embodiment, the network 108 is a cellular network, a TCP/IP network, or any other suitable network topology. In another embodiment, the change identification device 102 may be a server, workstation, network appliance or any other suitable data storage device. In another embodiment, the communication devices 104 and 106 may be any combination of cellular phones, telephones, personal digital assistants, or any other suitable communication devices. In one embodiment, the network 108 may be any private or public communication network known to one skilled in the art, such as a local area network (“LAN”), wide area network (“WAN”), peer-to-peer network or cellular network, using standard communication protocols. The network 108 may include hardwired as well as wireless branches. The image gathering unit 110 may be a digital camera.
  • FIG. 2 depicts one embodiment of the change identification device 102. The change identification device 102 includes a network I/O device 204, a processor 202, a display 206, a secondary storage 208 running an image storage unit 210, and a memory 212 running a graphical user interface 214. The image gathering unit 110, operating in the memory 212 of the change identification device 102, is operatively configured to receive an image from the network I/O device 204. In one embodiment, the processor 202 may be a central processing unit (“CPU”), an application specific integrated circuit (“ASIC”), a microprocessor or any other suitable processing device. The memory 212 may include a hard disk, random access memory, cache, removable media drive, mass storage or any configuration suitable as storage for data, instructions, and information. In one embodiment, the memory 212 and processor 202 may be integrated. The memory may use any type of volatile or non-volatile storage techniques and mediums. The network I/O device 204 may be a network interface card, a cellular interface card, a plain old telephone service (“POTS”) interface card, an ASCII interface card, or any other suitable network interface device. The change detection unit 114 may be a compiled program running on a server, a process running on a microprocessor or any other suitable port control software.
  • FIG. 3 depicts one embodiment of a communication device 104/106 consistent with the present invention. The communication device 104/106 includes a processor 302, a network I/O unit 304, an image capture unit 306, a secondary storage unit 308 including an image storage device 310, and a memory 312 running a graphical user interface 314. In one embodiment, the processor 302 may be a central processing unit (“CPU”), an application specific integrated circuit (“ASIC”), a microprocessor or any other suitable processing device. The memory 312 may include a hard disk, random access memory, cache, removable media drive, mass storage or any configuration suitable as storage for data, instructions, and information. In one embodiment, the memory 312 and processor 302 may be integrated. The memory may use any type of volatile or non-volatile storage techniques and mediums. The network I/O device 304 may be a network interface card, a plain old telephone service (“POTS”) interface card, an ASCII interface card, or any other suitable network interface device.
  • In one embodiment, the network 108 may be any private or public communication network known to one skilled in the art such as a Local Area Network (“LAN”), Wide Area Network (“WAN”), Peer-to-Peer Network, Cellular network or any suitable network, using standard communication protocols. The network 108 may include hardwired as well as wireless branches.
  • FIG. 4 depicts a schematic representation of a process used to identify changes in the conditions of agricultural assets. In step 402, a first image is captured at a first time by the image gathering unit 110. The image may be captured using any conventional method of capturing a digital image. In one embodiment, the image is a high-resolution raw image. A period of time is allowed to elapse, and a second image of the same size and content as the first image is then gathered at a different time. In step 404, the image analysis unit 112 identifies common location markers in each image. The common location markers may be geo-location tags in the images.
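  • A minimal sketch of this pairing step is shown below, assuming each capture carries a hashable geo-location tag that can serve as a grouping key; the tuple record format is hypothetical, as the disclosure does not specify how the tags are encoded:

```python
from collections import defaultdict

def pair_by_location(captures):
    """Group captures by geo-location tag and order each group by time.

    `captures` is an iterable of (geo_tag, timestamp, image) tuples; any
    hashable geo_tag works as a grouping key.
    """
    groups = defaultdict(list)
    for geo_tag, timestamp, image in captures:
        groups[geo_tag].append((timestamp, image))
    # Sort each location's captures by time so consecutive entries can be
    # compared as (first image, second image) pairs.
    return {tag: sorted(entries, key=lambda e: e[0])
            for tag, entries in groups.items()}
```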
  • In step 406, the image analysis unit calculates the normalized differential vegetation index (NDVI) and the soil adjusted vegetation index (SAVI) for each image. The NDVI is calculated from the near infrared (NIR) channel of each image (between 800 nm and 850 nm) and the red channel (between 650 nm and 680 nm) using the following equation:
  • NDVI = (NIR - RED) / (NIR + RED)
  • The SAVI is determined using the following equation:
  • SAVI = ((NIR - RED) / (NIR + RED + L)) × (1 + L)
      • where L = 0.5.
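  • For concreteness, both indices can be computed per pixel as in the following sketch (Python with NumPy is assumed; the small epsilon guard against division by zero is an implementation detail not specified in the disclosure):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """Normalized differential vegetation index, computed per pixel."""
    return (nir - red) / (nir + red + eps)

def savi(nir: np.ndarray, red: np.ndarray, L: float = 0.5,
         eps: float = 1e-9) -> np.ndarray:
    """Soil adjusted vegetation index with soil brightness factor L = 0.5."""
    return (nir - red) / (nir + red + L + eps) * (1.0 + L)
```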
  • In step 408, each of the images is separated into fixed-size, non-overlapping tiles. In one embodiment, each tile is 25 pixels by 25 pixels. In step 410, an Otsu binary thresholding is performed on each tile in each image, with the highest value of the Otsu output representing vegetation. Using the Otsu output, a vegetation mask is generated for each pixel, with pixels not assigned to vegetation being assigned a value of 0 and pixels containing vegetation being assigned their corresponding value in the original image.
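  • A rough illustration of steps 408 and 410 follows; the 25-by-25 tile size comes from the disclosure, while the use of scikit-image's threshold_otsu and the skipping of constant tiles (where Otsu's method is undefined) are assumptions of this sketch:

```python
import numpy as np
from skimage.filters import threshold_otsu

TILE = 25  # tile edge length in pixels, per the disclosure

def vegetation_mask(index_image: np.ndarray) -> np.ndarray:
    """Tile the index image and zero out non-vegetation pixels per tile.

    Pixels above each tile's Otsu threshold are treated as vegetation and
    keep their original value; all other pixels are set to 0.
    """
    out = np.zeros_like(index_image)
    h, w = index_image.shape
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            tile = index_image[y:y + TILE, x:x + TILE]
            if tile.size == 0 or np.all(tile == tile.flat[0]):
                continue  # constant tile: Otsu's threshold is undefined
            t = threshold_otsu(tile)
            out[y:y + TILE, x:x + TILE] = np.where(tile > t, tile, 0)
    return out
```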
  • In step 412, the vegetation-segmented mask is used to determine the change in the image. In one embodiment, the change is determined by performing a pixel by pixel comparison using the following equation:

  • change(x,y) = image2(x,y) - image1(x,y)
  • In another embodiment, the change is determined by performing a pixel by pixel analysis after a Gaussian blur is applied, using the following equation:

  • change(x,y) = GaussianBlur(image2(x,y) - image1(x,y))
  • In another embodiment, a local statistics method is incorporated using the following equation:
  • change(x,y) = (image2(x,y) - μ(x,y)) / σ(x,y)
  • Under this approach, for every pixel (x,y) in the first image, the mean μ(x,y) and standard deviation σ(x,y) are first calculated in a box surrounding the pixel; the change image is then calculated according to the above equation. In one embodiment, the box size is given in a configuration file. In step 412, the changed regions are detected based on a comparison of the processed images. Once the areas where change has occurred are identified, the area of change can be marked on each image using any known method of marking a digital image.
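  • The three change computations above might be implemented as in the following sketch; the blur sigma and the default box size are placeholders standing in for values the disclosure leaves to a configuration file, and the use of SciPy's filters is an assumption of this example:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def change_diff(img1: np.ndarray, img2: np.ndarray) -> np.ndarray:
    """Pixel by pixel difference: change(x,y) = image2(x,y) - image1(x,y)."""
    return img2 - img1

def change_blurred(img1: np.ndarray, img2: np.ndarray,
                   sigma: float = 2.0) -> np.ndarray:
    """Difference smoothed with a Gaussian blur to suppress pixel-level noise."""
    return gaussian_filter(img2 - img1, sigma=sigma)

def change_local_stats(img1: np.ndarray, img2: np.ndarray, box: int = 25,
                       eps: float = 1e-9) -> np.ndarray:
    """Local statistics method.

    The mean μ(x,y) and standard deviation σ(x,y) are computed over a
    box-sized window around each pixel of the first image, and then
    change(x,y) = (image2(x,y) - μ(x,y)) / σ(x,y).
    """
    mu = uniform_filter(img1, size=box)
    var = uniform_filter(img1 ** 2, size=box) - mu ** 2
    sigma = np.sqrt(np.clip(var, 0.0, None))
    return (img2 - mu) / (sigma + eps)
```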
  • While various embodiments of the present invention have been described, it will be apparent to those of skill in the art that many more embodiments and implementations are possible that are within the scope of this invention. Accordingly, the present invention is not to be restricted except in light of the attached claims and their equivalents.

Claims (20)

What is claimed:
1. A method of detecting a change in an agricultural field including the steps of:
gathering a first image from an aerial device;
separating the first image into a plurality of tiles of a predetermined size by an image analysis unit;
determining the geo location of each tile in the image;
performing an Otsu binary thresholding on a tile in the first image;
assigning a vegetation mask having an analog value greater than 0 to each pixel in the tile based on the Otsu binary thresholding, with pixels not assigned to vegetation being assigned a value of 0;
performing, via an image analysis unit, a pixel by pixel analysis of each pixel in the tile with a vegetation mask greater than 0;
retrieving from memory at least one previously analyzed version of the tile;
comparing the analyzed tile to the at least one previously analyzed tile to determine any change in the tile; and
marking an area on the image to indicate change has been detected in the tile.
2. The method of claim 1, including the step of selecting a tile adjacent to the previously selected tile.
3. The method of claim 2, including the step of performing an Otsu binary thresholding on the adjacent tile in the first image.
4. The method of claim 3, including the step of assigning a vegetation mask having an analog value greater than 0 to each pixel in the adjacent tile based on the Otsu binary thresholding, with pixels not assigned to vegetation being assigned a value of 0.
5. The method of claim 4, including the step of performing, via an image analysis unit, a pixel by pixel analysis of each pixel in the adjacent tile with a vegetation mask greater than 0.
6. The method of claim 5, including the step of combining the tile and the adjacent tile into a single tile.
7. The method of claim 6, including the step of retrieving a previously analyzed tile and previously analyzed adjacent tile.
8. The method of claim 7, including the step of combining the previously analyzed tile and previously analyzed adjacent tile into a single tile.
9. The method of claim 8, including the step of comparing the combined tile to the combined previously analyzed tile to determine a change in the combined tile.
10. The method of claim 9, including the step of using the geo location information of the combined tile to indicate the area of change on the first image.
11. A change detection unit including a processor executing a series of steps including:
gathering a first image from an aerial device;
separating the first image into a plurality of tiles of a predetermined size by an image analysis unit;
determining the geo location of each tile in the image;
performing an Otsu binary thresholding on a tile in the first image;
assigning a vegetation mask having an analog value greater than 0 to each pixel in the tile based on the Otsu binary thresholding, with pixels not assigned to vegetation being assigned a value of 0;
performing, via an image analysis unit, a pixel by pixel analysis of each pixel in the tile with a vegetation mask greater than 0;
retrieving from memory at least one previously analyzed version of the tile;
comparing the analyzed tile to the at least one previously analyzed tile to determine any change in the tile; and
marking an area on the image to indicate change has been detected in the tile.
12. The change detection unit of claim 11, including the step of selecting a tile adjacent to the previously selected tile.
13. The change detection unit of claim 12, including the step of performing an Otsu binary thresholding on the adjacent tile in the first image.
14. The change detection unit of claim 13, including the step of assigning a vegetation mask having an analog value greater than 0 to each pixel in the adjacent tile based on the Otsu binary thresholding, with pixels not assigned to vegetation being assigned a value of 0.
15. The change detection unit of claim 14, including the step of performing, via an image analysis unit, a pixel by pixel analysis of each pixel in the adjacent tile with a vegetation mask greater than 0.
16. The change detection unit of claim 15, including the step of combining the tile and the adjacent tile into a single tile.
17. The change detection unit of claim 16, including the step of retrieving a previously analyzed tile and previously analyzed adjacent tile.
18. The change detection unit of claim 17, including the step of combining the previously analyzed tile and previously analyzed adjacent tile into a single tile.
19. The change detection unit of claim 18, including the step of comparing the combined tile to the combined previously analyzed tile to determine a change in the combined tile.
20. The change detection unit of claim 19, including the step of using the geo location information of the combined tile to indicate the area of change on the first image.
US17/576,363 2018-01-11 2022-01-14 Change Detection System Abandoned US20220138923A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/576,363 US20220138923A1 (en) 2018-01-11 2022-01-14 Change Detection System

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862616159P 2018-01-11 2018-01-11
US16/245,743 US11227382B2 (en) 2018-01-11 2019-01-11 Change detection system
US17/576,363 US20220138923A1 (en) 2018-01-11 2022-01-14 Change Detection System

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US16/245,743 Continuation US11227382B2 (en) 2018-01-11 2019-01-11 Change detection system

Publications (1)

Publication Number Publication Date
US20220138923A1 true US20220138923A1 (en) 2022-05-05

Family ID: 67139853

Family Applications (6)

Application Number Title Priority Date Filing Date
US16/245,758 Active US11094055B2 (en) 2018-01-11 2019-01-11 Anomaly detection system
US16/245,702 Abandoned US20190266401A1 (en) 2018-01-11 2019-01-11 Change Detection System
US16/245,743 Active 2039-02-09 US11227382B2 (en) 2018-01-11 2019-01-11 Change detection system
US17/401,457 Active 2039-06-13 US11721019B2 (en) 2018-01-11 2021-08-13 Anomaly detection system
US17/576,363 Abandoned US20220138923A1 (en) 2018-01-11 2022-01-14 Change Detection System
US18/230,834 Pending US20240005491A1 (en) 2018-01-11 2023-08-07 Anomaly Detection System

Family Applications Before (4)

Application Number Title Priority Date Filing Date
US16/245,758 Active US11094055B2 (en) 2018-01-11 2019-01-11 Anomaly detection system
US16/245,702 Abandoned US20190266401A1 (en) 2018-01-11 2019-01-11 Change Detection System
US16/245,743 Active 2039-02-09 US11227382B2 (en) 2018-01-11 2019-01-11 Change detection system
US17/401,457 Active 2039-06-13 US11721019B2 (en) 2018-01-11 2021-08-13 Anomaly detection system

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/230,834 Pending US20240005491A1 (en) 2018-01-11 2023-08-07 Anomaly Detection System

Country Status (1)

Country Link
US (6) US11094055B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11094055B2 (en) * 2018-01-11 2021-08-17 Intelinair, Inc. Anomaly detection system
CN110472525B (en) * 2019-07-26 2021-05-07 浙江工业大学 Noise detection method for time series remote sensing vegetation index
CN117935068B (en) * 2024-03-25 2024-05-24 中国平安财产保险股份有限公司四川分公司 Crop disease analysis method and analysis system

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120033862A1 (en) * 2010-08-06 2012-02-09 Sony Corporation Systems and methods for segmenting digital images
US20140049491A1 (en) * 2012-08-20 2014-02-20 Samsung Electronics Co., Ltd System and method for perceiving images with multimodal feedback
US20140176726A1 (en) * 2012-12-20 2014-06-26 Thales Holdings Uk Plc Image processor for processing images received from a plurality of image sensors
US20140270344A1 (en) * 2013-03-12 2014-09-18 Qualcomm Incorporated Reducing object detection time by utilizing space localization of features
US20160307599A1 (en) * 2013-06-05 2016-10-20 Snakt, Inc. Methods and Systems for Creating, Combining, and Sharing Time-Constrained Videos
US20170034986A1 (en) * 2014-04-14 2017-02-09 Precision Planting Llc Crop stand optimization systems, methods and apparatus
US20180070527A1 (en) * 2016-09-09 2018-03-15 Cibo Technologies, Inc. Systems for learning farmable zones, and related methods and apparatus
US20180357517A1 (en) * 2017-06-09 2018-12-13 Uptake Technologies, Inc. Computer System and Method for Classifying Temporal Patterns of Change in Images of an Area
US20190073534A1 (en) * 2015-11-08 2019-03-07 Agrowing Ltd. Method for aerial imagery acquisition and analysis
US20190188847A1 (en) * 2017-12-19 2019-06-20 Accenture Global Solutions Limited Utilizing artificial intelligence with captured images to detect agricultural failure
US20190213414A1 (en) * 2018-01-11 2019-07-11 Intelinair, Inc Row Detection System
US20190266401A1 (en) * 2018-01-11 2019-08-29 Intelinair, Inc Change Detection System
US20200143844A1 (en) * 2018-11-07 2020-05-07 Genetec Inc. Methods and systems for detection of anomalous motion in a video stream and for creating a video summary
US10728528B2 (en) * 2014-04-30 2020-07-28 Intel Corporation System for and method of social interaction using user-selectable novel views

Family Cites Families (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5764819A (en) * 1991-10-18 1998-06-09 Dekalb Genetics Corporation Methods for classifying plants for evaluation and breeding programs by use of remote sensing and image analysis technology
US7027072B1 (en) * 2000-10-13 2006-04-11 Silicon Graphics, Inc. Method and system for spatially compositing digital video images with a tile pattern library
US7337065B2 (en) * 2001-01-23 2008-02-26 Spectral Sciences, Inc. Methods for atmospheric correction of solar-wavelength hyperspectral imagery over land
US7184890B2 (en) * 2003-11-24 2007-02-27 The Boeing Company Cloud shadow detection: VNIR-SWIR
WO2006097911A1 (en) * 2005-03-17 2006-09-21 Algotec Systems Ltd. Bone segmentation
FR2942338B1 (en) * 2009-02-13 2011-08-26 Univ Paris Descartes ECHOGRAPHIC BROWSER
US8949913B1 (en) * 2010-09-16 2015-02-03 Pixia Corp. Method of making a video stream from a plurality of viewports within large format imagery
US10257728B2 (en) * 2013-03-15 2019-04-09 DGS Global Systems, Inc. Systems, methods, and devices for electronic spectrum management
US9042674B2 (en) * 2013-03-15 2015-05-26 Digitalglobe, Inc. Automated geospatial image mosaic generation
JP5950166B2 (en) * 2013-03-25 2016-07-13 ソニー株式会社 Information processing system, information processing method of image processing system, imaging apparatus, imaging method, and program
US10217188B2 (en) * 2014-11-12 2019-02-26 SlantRange, Inc. Systems and methods for aggregating and facilitating the display of spatially variable geographic data acquired by airborne vehicles
US9117118B1 (en) * 2015-01-26 2015-08-25 Fast Yeti, Inc. Systems and methods for capturing and processing documents
AU2016287397B2 (en) * 2015-06-30 2021-05-20 Climate Llc Systems and methods for image capture and analysis of agricultural fields
CN107735794B (en) * 2015-08-06 2021-06-04 埃森哲环球服务有限公司 Condition detection using image processing
WO2017181127A1 (en) * 2016-04-15 2017-10-19 The Regents Of The University Of California Robotic plant care systems and methods
EP3244343A1 (en) * 2016-05-12 2017-11-15 Bayer Cropscience AG Recognition of weed in a natural environment
WO2018049289A1 (en) * 2016-09-09 2018-03-15 Cibo Technologies, Inc. Systems for adjusting agronomic inputs using remote sensing, and related apparatus and methods
US9798954B1 (en) * 2016-12-15 2017-10-24 Federal Home Loan Mortgage Corporation System, device, and method for image anomaly detection
US10552692B2 (en) * 2017-09-19 2020-02-04 Ford Global Technologies, Llc Color learning
US10719744B2 (en) * 2017-12-28 2020-07-21 Intel Corporation Automated semantic inference of visual features and scenes

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120033862A1 (en) * 2010-08-06 2012-02-09 Sony Corporation Systems and methods for segmenting digital images
US20140049491A1 (en) * 2012-08-20 2014-02-20 Samsung Electronics Co., Ltd System and method for perceiving images with multimodal feedback
US20140176726A1 (en) * 2012-12-20 2014-06-26 Thales Holdings Uk Plc Image processor for processing images received from a plurality of image sensors
US20140270344A1 (en) * 2013-03-12 2014-09-18 Qualcomm Incorporated Reducing object detection time by utilizing space localization of features
US20160307599A1 (en) * 2013-06-05 2016-10-20 Snakt, Inc. Methods and Systems for Creating, Combining, and Sharing Time-Constrained Videos
US20170034986A1 (en) * 2014-04-14 2017-02-09 Precision Planting Llc Crop stand optimization systems, methods and apparatus
US10728528B2 (en) * 2014-04-30 2020-07-28 Intel Corporation System for and method of social interaction using user-selectable novel views
US20190073534A1 (en) * 2015-11-08 2019-03-07 Agrowing Ltd. Method for aerial imagery acquisition and analysis
US20180070527A1 (en) * 2016-09-09 2018-03-15 Cibo Technologies, Inc. Systems for learning farmable zones, and related methods and apparatus
US20180357517A1 (en) * 2017-06-09 2018-12-13 Uptake Technologies, Inc. Computer System and Method for Classifying Temporal Patterns of Change in Images of an Area
US20190188847A1 (en) * 2017-12-19 2019-06-20 Accenture Global Solutions Limited Utilizing artificial intelligence with captured images to detect agricultural failure
US10846843B2 (en) * 2017-12-19 2020-11-24 Accenture Global Solutions Limited Utilizing artificial intelligence with captured images to detect agricultural failure
US20190213414A1 (en) * 2018-01-11 2019-07-11 Intelinair, Inc Row Detection System
US20190266401A1 (en) * 2018-01-11 2019-08-29 Intelinair, Inc Change Detection System
US20200143844A1 (en) * 2018-11-07 2020-05-07 Genetec Inc. Methods and systems for detection of anomalous motion in a video stream and for creating a video summary

Also Published As

Publication number Publication date
US20190213727A1 (en) 2019-07-11
US20240005491A1 (en) 2024-01-04
US11721019B2 (en) 2023-08-08
US11227382B2 (en) 2022-01-18
US20190213731A1 (en) 2019-07-11
US20210374930A1 (en) 2021-12-02
US20190266401A1 (en) 2019-08-29
US11094055B2 (en) 2021-08-17

Similar Documents

Publication Publication Date Title
US20220138923A1 (en) Change Detection System
Dorj et al. An yield estimation in citrus orchards via fruit detection and counting using image processing
US10964009B2 (en) Method, medium, and system for detecting potato virus in a crop image
Zhang et al. Individual leaf identification from horticultural crop images based on the leaf skeleton
Van Evert et al. Real‐time vision‐based detection of Rumex obtusifolius in grassland
JP2014018140A (en) Method, device and program for identifying crop condition change date
CN108510490B (en) Method and device for analyzing insect pest trend and computer storage medium
Chen et al. Plant leaf segmentation for estimating phenotypic traits
CN114511820A (en) Goods shelf commodity detection method and device, computer equipment and storage medium
Wang et al. An automatic system for pest recognition and forecasting
US20190213414A1 (en) Row Detection System
CN112418050B (en) Remote sensing identification method and device for land withdrawal information
KR20210028966A (en) Method and apparatus for disease classification of plant leafs
US11580729B2 (en) Agricultural pattern analysis system
Dagobert et al. Visibility detection in time series of planetscope images
Singh et al. A novel algorithm for segmentation of diseased apple leaf images
Mahenthiran et al. Smart Pest Management: An Augmented Reality-Based Approach for an Organic Cultivation
US20240049618A1 (en) Crop yield prediction system
Sathiyamoorthi et al. An effective model for predicting agricultural crop yield on remote sensing hyper‐spectral images using adaptive logistic regression classifier
US20230146206A1 (en) Image processing device, image processing method, and program
US20220270249A1 (en) Nutrient deficiency detection and forecasting system
Tannouche et al. A fast and efficient approach for weeds identification using Haar-like features
JP6287176B2 (en) Image processing apparatus, image processing method, and image processing program
Reiner et al. An operational framework to track individual farmland trees over time at national scales using PlanetScope imagery
EP4327278A1 (en) Method of identifying a soil-borne pathogen on a target crop in an agricultural parcel

Legal Events

Date Code Title Description
AS Assignment

Owner name: MCKINSEY & COMPANY, INC. UNITED STATES, NEW YORK

Free format text: SECURITY INTEREST;ASSIGNOR:INTELINAIR, INC.;REEL/FRAME:059206/0843

Effective date: 20220302

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE