WO2023209413A1 - System and method for monitoring a right of way - Google Patents

Info

Publication number
WO2023209413A1
Authority
WO (WIPO, PCT)
Prior art keywords
location, time series, images, way, change
Application number
PCT/IB2022/000457
Other languages
French (fr)
Inventor
Lane Boyer
Sylvain CALISTI
Sébastien DROUYER
Alexandre POINSO
Eric Qian
Original Assignee
Kayrros
Application filed by Kayrros filed Critical Kayrros
Priority to PCT/IB2022/000457 priority Critical patent/WO2023209413A1/en
Publication of WO2023209413A1 publication Critical patent/WO2023209413A1/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/0104Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0108Measuring and analyzing of parameters relative to traffic conditions based on the source of data
    • G08G1/012Measuring and analyzing of parameters relative to traffic conditions based on the source of data from other sources than vehicle or roadside beacons, e.g. mobile networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects

Abstract

A method for monitoring a right of way (ROW) location comprises defining a ROW location, obtaining a time series of mobile geolocation data representative of at least one mobile device located within the ROW location, obtaining a time series of images depicting the ROW location from an overhead image acquisition device, processing the time series of mobile geolocation data to detect at least one mobile device located within the ROW location, processing the time series of images to detect at least one landscape change, and identifying at least one encroachment event based upon the detected mobile device and/or the at least one landscape change.

Description

SYSTEM AND METHOD FOR MONITORING A RIGHT OF WAY
TECHNICAL FIELD
[0001] The present disclosure, in various embodiments, relates generally to a computer-implemented method for monitoring a right of way using mobile geolocation data and/or images from an overhead image acquisition device and a system configured to perform the method.
BACKGROUND
[0002] A right of way establishes a reserved route over land for access to and placement of various assets. Such assets may be provided for purposes of transporting people, goods, and/or services. For example, right of ways may be established for roadways, footpaths, railway lines, oil and gas pipelines, and electrical power lines. To maintain the right of ways, recurring and planned surveillance is required. Such surveillance is intended to discover encroachments upon the right of way that affect the safe transportation of people, goods, and/or services and to protect land, people, and assets located near the right of way. Depending on the type of asset for which the right of way is established, such surveillance may be required by regulatory authorities and/or by law. This is the case for oil and gas right of ways.
[0003] Traditionally, surveillance is accomplished via one or more of manned or unmanned low-flying small aircraft, observation from vehicles passing over land on or adjacent to the right of way, and on foot. These traditional surveillance methods may be expensive, owing to the significant resources they require, as well as dangerous, inefficient, and sporadic or otherwise insufficiently frequent. Such surveillance methods also rely on human observations, which are subject to bias and/or error. As a result, the consistency and reliability of these traditional surveillance methods are lacking.
BRIEF SUMMARY
[0004] The present invention proposes a computer-implemented method and a system for monitoring a right of way location so as to detect activity within the right of way location that may disturb the safe operation and/or maintenance of an asset for which a right of way has been established.
[0005] A method for monitoring a right of way location comprises defining a right of way location, obtaining a time series of mobile geolocation data representative of at least one mobile device located within the right of way location, obtaining a time series of images depicting the right of way location from an overhead image acquisition device, processing the time series of mobile geolocation data to detect at least one mobile device located within the right of way location, processing the time series of images to detect at least one landscape change, and identifying at least one encroachment event based upon the detected mobile device and/or the at least one landscape change.
[0006] Certain preferred but non-limiting features of the method described above are the following, taken individually or in combination:
[0007] the time series of images comprises a time series of optical images;
[0008] processing the time series of images to detect at least one landscape change comprises detecting ground cover change in the right of way location;
[0009] detecting ground cover change comprises applying a digital image correlation algorithm to the time series of images;
[0010] processing the time series of images to detect at least one landscape change comprises detecting presence of construction equipment in the right of way location;
[0011] detecting a piece of construction equipment comprises applying a machine learning algorithm to pixels indicative of the at least one landscape change;
[0012] processing the time series of mobile geolocation data comprises at least one of determining a number of mobile devices within the right of way location, a location of at least one mobile device, and a duration of time at least one mobile device was located in the right of way location;
[0013] communicating the at least one encroachment event to a user via a graphical user interface; and/or
[0014] the graphical user interface includes a plot indicating the location of the ground cover change, the location of construction equipment, and/or the location of the mobile device within the right of way location.
[0015] Another aspect of the present disclosure relates to a system configured to perform the method for monitoring a right of way location as described herein.
[0016] The system for monitoring a right of way location comprises a processor communicatively coupled to at least one database from which a time series of mobile geolocation data representative of at least one mobile device located within the right of way location and a time series of images depicting the right of way location from an overhead image acquisition device are obtained, the processor configured to process the time series of mobile geolocation data to detect at least one mobile device located within the right of way location, process the time series of images to detect at least one landscape change, and identify at least one encroachment event based upon the detected mobile device and/or the at least one landscape change.
[0017] A further aspect of the present disclosure relates to a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method as described herein.
[0018] A yet further aspect of the present disclosure relates to a computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the method as described herein.
BRIEF DESCRIPTION OF THE DRAWINGS
[0019] FIG. 1 is a flow chart of a method for monitoring a right of way according to the present disclosure;
[0020] FIG. 2 is a block diagram of an exemplary computing system configured to perform the disclosed method;
[0021] FIG. 3 is an exemplary image obtained according to the disclosed method;
[0022] FIG. 4 is an exemplary mobile geolocation data map obtained according to the disclosed method;
[0023] FIG. 5 is a correlation image resulting from the disclosed method;
[0024] FIG. 6 is an exemplary optical image in which a ground cover change is detected according to the disclosed method;
[0025] FIG. 7 is an exemplary optical image in which a construction equipment change is detected according to the disclosed method;
[0026] FIG. 8 is an exemplary user interface; and
[0027] FIG. 9 is another exemplary user interface.
DETAILED DESCRIPTION
[0028] The illustrations presented herein are not actual views of any particular component, device, or system, but are merely idealized representations employed to describe example embodiments of the present disclosure. The following description provides specific details of embodiments of the present disclosure in order to provide a thorough description thereof. However, a person of ordinary skill in the art will understand that the embodiments of the disclosure may be practiced without employing many such specific details. Indeed, the embodiments of the disclosure may be practiced in conjunction with conventional techniques employed in the industry. In addition, only those process acts and systems necessary to understand the embodiments of the disclosure are described in detail below. Additional conventional acts and systems may be used. Also note that any drawings accompanying the application are for illustrative purposes only and are thus not drawn to scale. Additionally, elements common between figures may have corresponding numerical designations.
[0029] As used herein, the terms “comprising,” “including,” “containing,” “characterized by,” and grammatical equivalents thereof are inclusive or open-ended terms that do not exclude additional, un-recited elements or method steps, but also include the more restrictive terms “consisting of,” “consisting essentially of,” and grammatical equivalents thereof.
[0030] As used herein, the term “may” with respect to a system, structure, feature, or method act indicates that such is contemplated for use in implementation of an embodiment of the disclosure, and such term is used in preference to the more restrictive term “is” so as to avoid any implication that other compatible systems, structures, features, and methods usable in combination therewith should or must be excluded.
[0031] As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
[0032] As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0033] The present disclosure relates to a methodology to monitor (e.g., surveil) right of way locations on which assets may be located using a combination of remote-sensing data, allowing for verification, scalability, and global coverage.
[0034] FIG. 1 illustrates a flow chart 100 of a method for monitoring a right of way according to the present disclosure. The numbering of steps herein is not intended to imply a particular order unless explicitly stated otherwise. FIG. 2 is a schematic diagram of a general architecture of a system 200 configured to perform the method described herein.
[0035] The method comprises, at step 102, identifying an area of interest (AOI) to be monitored. The AOI to be monitored may be a geographic area in which at least one asset and at least one right of way associated with the asset are located. The AOI may also include some area proximate to the right of way, such as an area within a pre-determined distance adjacent to the right of way. The AOI may also be referred to herein as the “right of way location.”
[0036] The AOI may be selected from one or more database(s) 202 that provide(s) a geographic location of at least one right of way. The database 202 may also provide information related to the type of right of way granted, information related to a type of asset located on the right of way, and/or information related to a proprietor or other party responsible for the asset and/or the right of way including maintenance and/or operation thereof.
[0037] At step 102, the method comprises defining a boundary delimiting the AOI. The boundary and the area therein may be geo-referenced such that additional data as described herein may be obtained for the right of way location.
[0038] Information related to the right of way location including, but not limited to, geographic coordinates of the bounded AOI, geographic coordinates of the asset located within the AOI, a type of asset located within the right of way (e.g., transportation thoroughfare, a public utility line such as electrical power lines, oil and gas pipeline, etc.) and/or a proprietor of the asset located on the right of way may be stored in a database 204.
[0039] The method comprises, at step 104, obtaining at least one image from an overhead image acquisition device 206 depicting the right of way location. The image may be obtained from a database 208 of images from the overhead image acquisition device 206.
[0040] The at least one image from the overhead image acquisition device 206 may be an optical image.
[0041] The overhead image acquisition device 206 may be a satellite. The sensor of the overhead image acquisition device may be a three- or four-band frame imager. In some embodiments, the sensor may capture images across one or more spectral bands, such as blue (455-515 nm), green (500-590 nm), red (590-670 nm), and near infrared (780-860 nm). In other embodiments, the sensor may capture a panchromatic image, which blends the visible blue, green, and red bands (450-900 nm).
[0042] In some embodiments, the overhead image acquisition device may be a satellite operated by Planet, which provides optical images. The at least one image may be obtained from the PlanetScope database. Other satellites from which the at least one image may be obtained include the AIRBUS Defence and Space Pleiades satellite constellation, the AIRBUS Defence and Space SPOT satellites, and/or the MAXAR Worldview constellation.
[0043] The pixels of the obtained optical image(s) may have a spatial resolution in a range from 0.5 m to 5 m, inclusive.
[0044] In addition to obtaining at least one optical image, the method may also comprise obtaining at least one synthetic aperture radar (SAR) image. In some embodiments, the overhead image acquisition device may be a Sentinel-1 satellite or a satellite from the COSMO-SkyMed system.
[0045] Depending on the size of the right of way location and the spatial resolution of the overhead image acquisition device, more than one image may be obtained in order to capture the entire right of way location to be monitored.
[0046] At step 104, the method may comprise obtaining a time series of images from the overhead image acquisition device 206 depicting the right of way location from an initial time, T0, to a given time, TN. For a given time within the time series, a plurality of images may be obtained as previously discussed herein. The frequency with which images may be obtained depends upon the revisit frequency of the overhead image acquisition device 206. In some embodiments, images may be obtained by the overhead image acquisition device 206 at intervals of between 1 day and 30 days.
[0047] FIG. 3 illustrates an exemplary image 300 obtained according to the present method. The image 300 depicts a right of way 302 (represented by a dashed line) and a boundary 304 defining a right of way location.
[0048] At step 106, the method may comprise obtaining mobile geolocation data for the right of way location. From the mobile geolocation data, the presence and geographic coordinates of one or more mobile devices 210 within the right of way location may be determined.
[0049] As used herein, the term “mobile geolocation data” refers to the identification of at least one mobile device located within the right of way location, an estimated geographic location of at least one mobile device, and/or time and date information for the at least one mobile device.
[0050] As used herein, the term “mobile device” means and includes a device made for portability from which a geographic location may be derived.
[0051] The geographic location may be derived from global positioning systems, cellular networks, internet protocol addresses, or any other system usable to identify the longitudinal and latitudinal coordinates of the mobile device 210. By way of example and not limitation, mobile devices include cell phones, smartphones, PDAs, watches, wearable devices, tablets, laptops, personal computers, and fleet tracking devices.
[0052] The mobile geolocation data may be obtained from a database 212 of pseudonymised mobile device information. Pseudonymised mobile device information refers to information relating to a mobile device that cannot lead to the direct identification of a specific individual. Accordingly, the pseudonymised mobile device information obtained includes the longitudinal and latitudinal coordinates and accompanying timestamps of the mobile device(s) but does not include a name, address, and/or phone number related to the mobile device(s).
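By way of illustration only, the geofencing implied by steps 102 and 106 (retaining only pseudonymised records whose coordinates fall within the right of way boundary) can be sketched with a ray-casting point-in-polygon test. The polygon vertices, device identifiers, and timestamps below are hypothetical; the patent does not prescribe a particular point-in-polygon algorithm.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon given as (lon, lat) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a horizontal ray extending to the right of the point
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

# Hypothetical ROW corridor boundary and pseudonymised pings: (device_id, lon, lat, timestamp)
row_polygon = [(2.30, 48.85), (2.35, 48.85), (2.35, 48.87), (2.30, 48.87)]
pings = [
    ("dev-a", 2.32, 48.86, "2022-04-01T08:00:00Z"),  # inside the ROW location
    ("dev-b", 2.40, 48.86, "2022-04-01T08:05:00Z"),  # outside the ROW location
]
inside_pings = [p for p in pings if point_in_polygon(p[1], p[2], row_polygon)]
```

Only `dev-a` survives the filter, mirroring the detection of a mobile device located within the right of way location.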
[0053] At step 106, the method may comprise obtaining a time series of mobile geolocation data for the right of way location from an initial time, T0, to a given time, TN. The time period over which the mobile geolocation data is obtained may be the same as the time period over which the images depicting the right of way location are obtained. In some embodiments, a frequency at which mobile geolocation data is obtainable at step 106 may be greater than a frequency at which images from step 104 are obtainable.
[0054] FIG. 4 illustrates an exemplary mobile geolocation data map 400 depicting locations of one or more mobile devices 210 within the right of way location 402 for a given period of time. As shown in FIG. 4, areas of low activity (e.g., low number of mobile devices, short duration) are indicated by lighter colored pixels while areas of high activity (e.g., high number of mobile devices, long duration) are indicated by darker colored pixels.
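An activity map of the kind shown in FIG. 4 can be approximated by binning device locations into a regular grid over the right of way location, with higher counts corresponding to the darker, high-activity areas. The coordinates, grid size, and pings below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def activity_heatmap(pings, lon_min, lon_max, lat_min, lat_max, bins=4):
    """Count pseudonymised pings per grid cell over the ROW location."""
    grid = np.zeros((bins, bins), dtype=int)
    for lon, lat in pings:
        if lon_min <= lon < lon_max and lat_min <= lat < lat_max:
            col = int((lon - lon_min) / (lon_max - lon_min) * bins)
            row = int((lat - lat_min) / (lat_max - lat_min) * bins)
            grid[row, col] += 1
    return grid

# Hypothetical (lon, lat) pings within a 2.30-2.35 / 48.85-48.87 degree window
pings = [(2.31, 48.851), (2.312, 48.852), (2.34, 48.868)]
grid = activity_heatmap(pings, 2.30, 2.35, 48.85, 48.87)
```

Two pings fall in one corner cell and one in the opposite corner, giving the low/high activity contrast described above; duration-weighted counts would follow the same pattern.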
[0055] According to the method of the present disclosure, the obtained images and/or the mobile geolocation data may be analyzed in order to identify (e.g., detect) one or more indicators of an actual or potential encroachment event within the right of way location. The obtained images and/or the mobile geolocation data may be analyzed separately and/or in combination. The obtained images and/or the mobile geolocation data may be analyzed consecutively or concurrently. From the one or more indicators of the actual or potential encroachment event, a risk level for encroachment upon the right of way may be assessed. As used herein, the term “encroachment event” will be used to refer to any activity or occurrence, whether due to man-made or natural causes, that may alter the state of the asset located in the right of way location and/or the right of way such that proper and/or safe operation of the asset is hindered or prevented. Such activity or occurrence(s) may vary depending on the type of right of way.
[0056] In order to process images to detect one or more indicators of an actual or potential encroachment event, the method may comprise, at step 108, pre-processing images obtained in step 104.
[0057] The method of pre-processing may comprise cropping the obtained images (step 108a). The images obtained in step 104 may depict a larger area than the right of way location. Accordingly, at step 108a, the images may be cropped to remove extraneous portions of the images and focus the images on the right of way location. At step 108a, the images may also be geo-referenced.
[0058] The pre-processing step 108 may further comprise a step 108b of registering two or more images in the time series obtained in step 104. The images are registered such that the images have a common frame of reference. The step 108b may comprise using a registration algorithm having sub-pixel precision, such that each image in the time series of optical images is superimposable on the others.
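The patent does not name a specific registration algorithm; phase correlation is one common choice, and sub-pixel variants refine the correlation peak by upsampling around it. A minimal integer-pixel sketch, using a synthetic circular shift as the misregistration:

```python
import numpy as np

def phase_correlation_shift(ref, moving):
    """Return (dy, dx) such that np.roll(moving, (dy, dx), axis=(0, 1)) aligns with ref."""
    F = np.fft.fft2(ref) * np.conj(np.fft.fft2(moving))
    F /= np.maximum(np.abs(F), 1e-12)          # normalised cross-power spectrum
    corr = np.fft.ifft2(F).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size to negative offsets
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

rng = np.random.default_rng(0)
ref = rng.random((64, 64))
moving = np.roll(ref, shift=(3, -5), axis=(0, 1))   # synthetic misregistration
dy, dx = phase_correlation_shift(ref, moving)        # recovers (-3, 5)
```

Applying the recovered shift to `moving` superimposes it exactly on `ref`, which is the property step 108b relies on.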
[0059] The pre-processing step 108 may optionally comprise applying a noise estimate algorithm to the images such that a noise level in the images may be reduced.
[0060] The pre-processing step 108 may optionally comprise applying one or more algorithms to identify usable and unusable pixels within the obtained images. The geolocation of those unusable pixels and usable pixels may be saved in database 204 for later use in the method.
[0061] The pre-processing step 108 may optionally comprise applying a cloud mask to the obtained images so as to detect cloudy pixels (e.g., pixels in which land is not visible) and cloud-free pixels (e.g., pixels in which land is visible). Cloudy pixels may be classified as unusable pixels, and cloud-free pixels may be classified as usable pixels. Cloudy and cloud-free pixels may be identified using the algorithm for detecting clouds in images made of three asynchronous spectral bands as described in the article “Cloud Detection by Luminance and Interband parallax analysis for Pushbroom Satellite Imagers,” by T. Dagobert, et al., Image Processing On Line, 10 (2020), pp. 167-190, which is incorporated herein by this reference.
[0062] The pre-processing step 108 may optionally include applying a vegetation mask to the obtained images so as to detect vegetation pixels (e.g., pixels in which land is covered by heavy vegetation such as farms, forests, etc.). Vegetation pixels may be classified as unusable pixels. A remainder of the obtained image may be classified as usable pixels.
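A vegetation mask of this kind is often derived from a normalized difference vegetation index (NDVI) computed from the red and near-infrared bands, with pixels above a chosen threshold treated as heavy vegetation. The threshold of 0.6 and the reflectance values below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def vegetation_mask(red, nir, ndvi_threshold=0.6):
    """Mark pixels with NDVI above the threshold as heavy vegetation (unusable)."""
    ndvi = (nir - red) / np.maximum(nir + red, 1e-12)  # guard against division by zero
    return ndvi > ndvi_threshold

# Hypothetical per-pixel reflectances for a 2x2 scene
red = np.array([[0.10, 0.30], [0.05, 0.25]])
nir = np.array([[0.60, 0.35], [0.50, 0.28]])
mask = vegetation_mask(red, nir)   # True where heavy vegetation is inferred
```

Pixels flagged `True` would be classified as unusable; the remainder of the image is treated as usable pixels.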
[0063] The method further comprises processing the obtained images to identify a landscape change at a computing device 230 and, more particularly, at processor 234. As used herein, the term “landscape change” refers generally to a change in a particular area of activity and is used to refer collectively to ground cover changes and construction equipment detections as discussed in further detail herein.
[0064] One indicator of an actual or potential encroachment event may comprise changes to landscape of the right of way. Landscape changes may be attributable to man-made or natural activities and may include, but are not limited to, natural vegetation growth, commercial activity, earthworks, construction, natural disasters, weather, and climate. The method may further comprise processing of the images obtained in step 104 to detect at least one landscape change.
[0065] The landscape change to be detected from the images may vary as a function of the type of right of way to be monitored. By way of non-limiting example, one asset that may be monitored according to the present method is a right of way on which electrical power transmission structures are located. Electrical power lines may be located over land that has been cleared of vegetation that poses a risk of damaging or otherwise interfering with the safe operation of the power lines, such as entanglement of power lines with trees. Structures supporting the power lines, such as poles or towers, may be subject to deterioration or damage due to natural causes (e.g., weather, natural disasters, etc.), man-made activity (e.g., construction, vandalism, etc.), and/or ordinary use. By way of further non-limiting example, oil and gas pipelines may generally be buried beneath land. Such pipelines may be subject to deterioration or damage due to natural causes (e.g., natural disasters, etc.), man-made activity (e.g., construction, earthworks, development of additional pipelines, vandalism), and/or ordinary use.
[0066] At step 110, the method comprises applying a digital image correlation algorithm such that each pixel of images of the time series is assessed on a binary system as being either indicative of a change or indicative of no change.
[0067] In step 110, at least two images of the right of way location are compared and changes between the images are detected. The images compared are images obtained at different times and/or dates.
[0068] In step 110, the digital image correlation algorithm outputs a correlation image 500 as illustrated in FIG. 5. In the image 500, the digital image correlation technique identifies a change for each pixel in the images on a binary scale: change (1) or no change (0). In FIG. 5, pixels in which a change is detected are indicated by dark spots 502 (e.g., black areas) and pixels in which no change is detected are indicated by light spots 504 (e.g., white areas).
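The patent does not specify the correlation measure used at step 110; one plausible realization is windowed normalized cross-correlation between two co-registered images, with low correlation flagged as change. The window size, threshold, and synthetic images below are assumptions for illustration only.

```python
import numpy as np

def binary_change_map(img_t0, img_t1, window=5, corr_threshold=0.8):
    """Flag each pixel as changed (1) when the local windows around it decorrelate."""
    pad = window // 2
    a = np.pad(img_t0, pad, mode="reflect")
    b = np.pad(img_t1, pad, mode="reflect")
    h, w = img_t0.shape
    change = np.zeros((h, w), dtype=np.uint8)
    for i in range(h):
        for j in range(w):
            wa = a[i:i + window, j:j + window].ravel()
            wb = b[i:i + window, j:j + window].ravel()
            wa = wa - wa.mean()
            wb = wb - wb.mean()
            denom = np.linalg.norm(wa) * np.linalg.norm(wb)
            corr = (wa @ wb) / denom if denom > 0 else 1.0
            change[i, j] = 1 if corr < corr_threshold else 0
    return change

rng = np.random.default_rng(1)
before = rng.random((20, 20))
after = before.copy()
after[5:10, 5:10] = 1.0 - before[5:10, 5:10]   # synthetic ground cover change
cmap = binary_change_map(before, after)
```

Pixels inside the altered patch decorrelate and are marked 1 (the dark spots 502 of FIG. 5), while unchanged areas remain 0 (the light spots 504).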
[0069] At step 110, the method may further comprise filtering those pixels indicative of change so as to remove unusable pixels. Filtering may include removing pixels indicative of change that were identified in step 108 as unusable pixels as previously described herein. Such unusable pixels are expected to be false positive indicators of change. Pixels indicative of change that comprise usable pixels may be subject to further processing.
[0070] The method may further comprise identifying landscape changes indicative of at least one encroachment event within the right of way location from the pixels indicative of change identified in step 110, as previously described herein.
[0071] The method may comprise assessing whether pixels indicative of change identified are attributable to a ground cover change at step 112. Optionally, the method may comprise, at step 114, assessing whether pixels indicative of ground cover change at step 112 are indicative of (e.g., attributable to) a construction equipment presence.
[0072] After such assessments, the method may comprise categorizing the detected change as a ground cover change (step 116) and, potentially, further as being indicative of the presence of construction equipment (step 118).
[0073] In step 112, ground cover change may include change resulting from natural causes (e.g., deforestation, fire, flood) and/or change resulting from human activity (e.g., construction). Ground cover change detection may comprise computing a change score for each pixel of the compared images of step 110. The change score is indicative of a degree of change between a pixel in an image at time Tn and a pixel in an image at an earlier time Tn-1.
[0074] The method further comprises comparing the computed change score for each pixel to a ground cover change score threshold. The ground cover change score threshold is a predetermined threshold established such that inconsequential changes, or changes not likely to be indicative of an encroachment event within the right of way location, are identified as being indicative of no change in the binary system. Where the computed change score is greater than the predetermined threshold, the pixel is identified as indicative of a change (e.g., 1), and where the computed change score is less than or equal to the predetermined threshold, the pixel is identified as indicative of no change (e.g., 0).
[0075] In step 112, an image segmentation technique may be applied to the correlation image to draw at least one bounding box about pixels indicative of ground cover change. For each bounding box identified in step 112, the bounding box may be characterized by a geometry and geographic coordinates of a centroid of the geometry. The geometry and centroid of each bounding box are stored in a database 238 at step 116. FIG. 6 illustrates an exemplary optical image for which a ground cover change is detected, including a bounding box indicated by dashed lines.
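The bounding-box step can be realized with connected-component labeling of the binary change map: each blob of changed pixels yields a box geometry and a centroid of the kind stored at step 116. The patent does not specify the segmentation technique; below is a minimal 4-connected breadth-first-search sketch.

```python
import numpy as np
from collections import deque

def change_bounding_boxes(change_map):
    """Group 4-connected changed pixels and return one bounding box + centroid per blob."""
    visited = np.zeros(change_map.shape, dtype=bool)
    boxes = []
    h, w = change_map.shape
    for i in range(h):
        for j in range(w):
            if change_map[i, j] and not visited[i, j]:
                queue, pixels = deque([(i, j)]), []
                visited[i, j] = True
                while queue:  # flood-fill one connected blob
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and change_map[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                ys = [p[0] for p in pixels]
                xs = [p[1] for p in pixels]
                boxes.append({
                    "box": (min(ys), min(xs), max(ys), max(xs)),   # row/col bounding box
                    "centroid": (sum(ys) / len(ys), sum(xs) / len(xs)),
                })
    return boxes

cmap = np.zeros((10, 10), dtype=np.uint8)
cmap[2:5, 2:5] = 1          # one synthetic changed blob
boxes = change_bounding_boxes(cmap)
```

In practice the pixel-space box and centroid would then be geo-referenced to geographic coordinates before storage.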
[0076] Optionally, the images obtained may be further processed so as to provide information about the ground cover changes. For instance, the optical and/or SAR images may be analyzed using a normalized difference vegetation index to assess changes in the presence of live green vegetation within the AOI. Alternatively or additionally, the optical and/or SAR images may be analyzed using a normalized difference water index to assess water content in water bodies within the AOI. Alternatively or additionally, the optical and/or SAR images may be analyzed to detect fire within the AOI.
[0077] At step 124, the ground cover changes identified may be output to a user interface. Information obtainable at step 116 and provided to the user interface may include, but is not limited to, a geographic location of the detected ground cover change and an optical image depicting the ground cover change.
[0078] In step 114, construction equipment presence is detected. As used herein, “construction equipment” means and includes vehicles or other structures specifically designed for executing construction tasks such as earthwork operations or other large construction tasks. Construction equipment includes, but is not limited to, bulldozers, excavators, cranes, construction trucks, and/or tractors.
[0079] In one embodiment, detecting the presence of construction equipment may include detecting changes in the brightness of each pixel of the compared images of step 110 and comparing the computed change score for each pixel to a construction equipment change score threshold. The construction equipment change score threshold is a predetermined threshold established such that anomalously bright pixels and anomalously dark pixels are identified. Such anomalous pixels are identified as being indicative of construction equipment change in the binary system.
[0080] In step 114, detecting the presence of construction equipment may employ a vehicle detection method as described in the article entitled “Highway traffic monitoring on medium resolution satellite images,” by S. Drouyer and Carlo de Franchis, IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium, 2019, pp. 1228-1231, which is incorporated by reference in its entirety herein. To detect construction equipment, the method includes comparing an image in which a ground cover change in the AOI was detected in step 112 and an image in which construction equipment is not present in the AOI. The images may be optical images in which the construction equipment is lighter than the ground within the AOI. Subsequently, a difference image is computed and white spots are detected. The white spots are detected by computing the minimum of all directional top-hat transforms of the difference image to obtain a top-hat transform image. A gray level version of the top-hat transform image is obtained in which, for each pixel, the maximum of all RGB channels is retained. The maximum pixels are compared to a brightness threshold to remove those pixels not of sufficient brightness to be indicative of construction equipment. The remaining pixels are identified as being indicative of construction equipment presence.
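The white-spot detection above can be illustrated, in simplified form, with a flat (non-directional) white top-hat on a single-channel difference image; the actual method of Drouyer and de Franchis takes the minimum over several directional structuring elements and works on RGB channels. The structuring-element size and brightness threshold below are assumptions.

```python
import numpy as np

def erode(img, k):
    """Grayscale erosion with a flat k x k structuring element (sliding-window min)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + k, j:j + k].min()
    return out

def dilate(img, k):
    """Grayscale dilation with a flat k x k structuring element (sliding-window max)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = p[i:i + k, j:j + k].max()
    return out

def white_tophat(img, k=5):
    """White top-hat: keeps bright features smaller than the k x k structuring element."""
    return img - dilate(erode(img, k), k)  # image minus its morphological opening

diff = np.zeros((15, 15))
diff[7, 7] = 1.0                      # single bright spot: candidate equipment pixel
spots = white_tophat(diff) > 0.5      # assumed brightness threshold
```

The isolated bright pixel survives the top-hat and threshold, while any broad bright background (larger than the structuring element) would be suppressed by the opening.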
[0081] In step 114, the method may comprise detecting a change in construction equipment presence including a presence of construction equipment where construction equipment previously did not exist and/or an absence of construction equipment where construction equipment previously existed.
[0082] In another embodiment, in step 114, a detection of construction equipment may be performed using image processing and/or a machine learning algorithm as applied to the correlation image and the pixels indicative of change identified in step 110. A machine learning algorithm may be used to learn to recognize particular types of construction equipment. In step 114, the method may comprise drawing at least one bounding box about pixels in which construction equipment is identified. For each bounding box identified in step 114, the bounding box may be characterized by a geometry and geographic coordinates of a centroid of the geometry. The geometry and centroid of each bounding box are stored in a database 238 at step 118.
[0083] In step 114, an image segmentation technique may be utilized and applied to the image in which construction equipment is detected to draw at least one bounding box about pixels indicative of construction equipment using either methodology described above. For each bounding box identified in step 114, the bounding box may be characterized by a geometry and geographic coordinates of a centroid of the geometry. The geometry and centroid of each bounding box are stored in a database 238 at step 118.
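The segmentation-to-bounding-box step can be sketched with connected-component labeling. The `origin` and `pixel_size` parameters are hypothetical stand-ins for the image's georeferencing; a real system would use the raster's geotransform.

```python
import numpy as np
from scipy import ndimage

def bounding_boxes(change_mask, origin=(0.0, 0.0), pixel_size=(1.0, 1.0)):
    """Group change pixels into connected components and return, for each,
    a bounding-box geometry and the geographic coordinates of its centroid.
    The pixel-to-geographic mapping is a simplified linear transform."""
    labels, _ = ndimage.label(change_mask)
    boxes = []
    for sl in ndimage.find_objects(labels):
        rows, cols = sl
        geometry = (rows.start, cols.start, rows.stop, cols.stop)
        # Centroid of the bounding box in pixel coordinates
        centroid_px = ((rows.start + rows.stop - 1) / 2.0,
                       (cols.start + cols.stop - 1) / 2.0)
        centroid_geo = (origin[0] + centroid_px[0] * pixel_size[0],
                        origin[1] + centroid_px[1] * pixel_size[1])
        boxes.append({"geometry": geometry, "centroid": centroid_geo})
    return boxes
```

Each returned dictionary corresponds to one record that could be stored in the database of step 118.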
[0084] FIG. 7 illustrates an exemplary optical image for which an area of construction equipment change is detected including a bounding box indicated by dashed lines and the location of construction equipment is identified by triangles.
[0085] At step 124, the construction equipment detections identified may be output to a user interface. Information obtainable at step 118 and provided to the user interface may include, but is not limited to, a geographic location of an area in which construction equipment and/or ground cover has changed, a geographic location of any particular piece of construction equipment detected, a geographic location of a centroid of the bounding box drawn for the construction equipment and/or ground cover change detected, a distance of a detected change for construction equipment and/or ground cover relative to the location of an asset in the AOI, an eccentricity of the area in which a change is detected, and an optical image depicting the ground cover change.
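The eccentricity of a detected change area is not defined in the text; one common definition derives it from second-order central moments of the region's pixels (as in, e.g., scikit-image's `regionprops`). The sketch below assumes that is the intended quantity: 0 for a circular region, approaching 1 for a line-like region.

```python
import numpy as np

def region_eccentricity(mask):
    """Eccentricity of the change region in a binary mask, computed from the
    covariance (second-order central moments) of the pixel coordinates."""
    ys, xs = np.nonzero(mask)
    cov = np.cov(np.stack([ys, xs]))
    evals = np.sort(np.linalg.eigvalsh(cov))[::-1]  # major, minor variance
    if evals[0] <= 0:
        return 0.0  # degenerate (single-pixel) region
    return float(np.sqrt(1.0 - evals[1] / evals[0]))
```

An elongated change area (high eccentricity) near a linear asset such as a pipeline may be reported differently from a compact one.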
[0086] In step 120, the method may comprise analyzing the time series of mobile geolocation data to determine whether the presence of any mobile device within the AOI is indicative of at least one encroachment event. Analysis of the mobile geolocation data may comprise: determining a number of mobile devices located within the right of way location at a given time TN and/or during a particular period of time within the time series based on a unique identifier associated with each mobile device; determining a location or locations of any mobile devices within the right of way location; determining a distance of any mobile devices from the right of way and/or the asset associated with the right of way location by, for example, computing a haversine distance between the geographic coordinates of a given mobile device and the geographic coordinates of the asset within the right of way location; determining a time at which a given mobile device enters a right of way location and exits the right of way location so as to determine a duration in which the given mobile device remained within the right of way location and/or at a particular location within the right of way location; and/or tracking mobile devices over a period of time within the right of way location.

[0087] In step 122, the method may comprise filtering of the time series of mobile geolocation data to detect anomalously present mobile devices within the AOI. Anomalously present mobile devices may be indicative of an encroachment event. Depending upon the location of the right of way, the presence of mobile devices within the AOI may be expected. For example, the right of way may be located in a residential area or a commercial area in which mobile devices are expected to be present over an extended period of time.
To identify those mobile devices expected within the AOI, the method may comprise using a density-based clustering algorithm to identify mobile devices located within a given distance of other mobile devices during a given time. The mobile devices identified as present within the AOI in step 120 are compared in both time and location to previously identified expected clusters of mobile devices. Those mobile devices located beyond a predetermined distance threshold from expected mobile devices and/or located within the AOI for a duration outside of a predetermined duration threshold for expected mobile devices may be identified as anomalously present mobile devices.
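The density-based filtering above can be sketched with a deliberately simplified, DBSCAN-like neighborhood count; the `eps_m` radius, `min_neighbors` count, and the `(lat, lon)` input format are illustrative assumptions, and a production system would likely use a full clustering library.

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def anomalous_devices(positions, eps_m=100.0, min_neighbors=3):
    """Flag devices outside any dense neighborhood: a device observed near at
    least `min_neighbors` other devices is treated as part of an expected
    cluster; the rest are reported as anomalously present."""
    flagged = []
    for dev_id, p in positions.items():
        neighbors = sum(1 for other_id, q in positions.items()
                        if other_id != dev_id and haversine_m(p, q) <= eps_m)
        if neighbors < min_neighbors:
            flagged.append(dev_id)
    return flagged
```

Devices inside an expected cluster (e.g., a residential block) are suppressed, while an isolated device near the asset is surfaced for review.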
[0088] In step 122, the method may comprise filtering of the time series of mobile geolocation data so as to remove transitory mobile devices within the AOI. Depending upon the location of the right of way, the presence of mobile devices temporarily within the AOI may be expected. In step 122, transitory mobile devices are identified based on the velocity of a given mobile device present at a given time in the AOI. In step 122, at least two consecutive geolocation points (e.g., longitude and latitude coordinates) are identified for a given mobile device located within the AOI. The distance between the consecutive points and the time associated with those consecutive points is used to calculate a velocity of the given mobile device. If the velocity is greater than a pre-determined threshold velocity, the given mobile device is considered as transitory and is filtered out in step 122.
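A sketch of this velocity filter, assuming pings arrive as `(timestamp_s, lat, lon)` tuples and using an illustrative `max_speed_mps` threshold (roughly walking/working speed versus vehicle speed):

```python
import math

def haversine_m(p, q):
    """Great-circle distance in meters between (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000.0 * math.asin(math.sqrt(a))

def filter_transitory(track, max_speed_mps=8.0):
    """Return False (filter out) if any pair of consecutive pings implies a
    speed above the threshold; `track` is a list of (timestamp_s, lat, lon)."""
    for (t1, *p1), (t2, *p2) in zip(track, track[1:]):
        dt = t2 - t1
        if dt > 0 and haversine_m(tuple(p1), tuple(p2)) / dt > max_speed_mps:
            return False  # transitory, e.g., a vehicle passing through
    return True
```

A device driving through the right of way on a road is dropped, while a device lingering near the asset is retained for step 126.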
[0089] At step 124, one or more mobile geolocation device detections may be directly output to a user interface as will be described in further detail below. The location and/or duration of the presence of the mobile geolocation device detections may also be output to a user interface.
[0090] In step 126, which may be in addition to or as an alternative to steps 112-122, the method may comprise identifying at least one encroachment event from two or more of a ground change detection, a construction equipment detection, and a mobile device detection within the right of way location.
[0091] In some embodiments, individual detections may be associated together so as to identify a common encroachment event.

[0092] In other embodiments, each of the individual detections may be separately reported to the user as a potential encroachment event.
[0093] In the method described herein, analysis of images and mobile geolocation data provides enhanced context to the various detections compared to using image analysis or mobile geolocation data analysis alone. By way of non-limiting example, a detection based on mobile geolocation data close in time and location to a detection based on image analysis is indicative of an encroachment event more severe (e.g., posing a high risk to the asset) than a detection based solely on image analysis. Further, mobile geolocation data is available at a higher frequency compared to image acquisition. Therefore, detections based on mobile geolocation data provide context related to the duration of a potential encroachment event, whereas image analysis provides information only for a particular moment in time.
[0094] By way of example, at step 126, the method may comprise determining whether ground cover change detections, construction equipment detections, and/or mobile device detections are related or isolated detections. For example, a ground cover change detection in association with construction equipment detections and mobile device detections located in a concentrated area within the right of way location is indicative of an active worksite, which may pose a high risk of encroachment to the asset located in the right of way location. By way of further non-limiting example, construction equipment detections without mobile device detections may be indicative of a future, but not yet active, worksite, which poses a low risk of encroachment to the asset located in the right of way location. By way of yet further non-limiting example, the detections of one or more of construction equipment, ground cover change, and mobile devices within the right of way location may be sufficiently separated in time and/or geographic location such that each poses a distinct encroachment risk to the asset in the right of way location.
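The example associations above could be summarized as a toy rule-based classifier. The rules and risk labels are illustrative assumptions mirroring the examples in this paragraph, not the claimed method.

```python
def classify_risk(ground_change, equipment, devices):
    """Toy fusion of co-located detections into a risk label:
    all three together suggest an active worksite (high risk);
    equipment without devices suggests a future worksite (low risk);
    any isolated detection is treated as a distinct, medium risk."""
    if ground_change and equipment and devices:
        return "high"    # active worksite
    if equipment and not devices:
        return "low"     # future, not yet active, worksite
    if ground_change or equipment or devices:
        return "medium"  # isolated detection, distinct risk
    return "none"
```

A real implementation would also weigh counts, distances to the asset, and timing rather than simple booleans.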
[0095] For example, a number and/or nature of the encroachment events detected in steps 116, 118, and 122 may be aggregated to determine whether the identified encroachment events pose a low, medium, or high risk to the asset and/or the right of way location associated therewith. For example, areas in which a high number of mobile devices are identified in combination with landscape changes in substantially the same location within the right of way location may pose a higher threat than areas in which a relatively low number of mobile devices are identified in isolation from any landscape changes.

[0096] FIG. 8 illustrates an exemplary user interface according to the present disclosure. The user interface may depict a plot 800 having a horizontal axis with longitudinal coordinates and a vertical axis with latitudinal coordinates. A right of way location 802 as identified in step 102 may be provided on the plot 800. Within the right of way location 802, one or more of the ground change detections from step 116, construction equipment detections from step 118, and/or mobile device detections from step 122 may be displayed on the plot 800.
[0097] In FIG. 8, the detections are illustrated as circular icons. The detections may be color coded so as to indicate an encroachment risk level as determined in step 128. Alternatively or additionally, detections may be given some other visual indicator, such as a symbol, to categorize the detection as being either a ground change detection, a construction equipment detection, or a mobile device detection.
[0098] FIG. 9 illustrates another exemplary user interface according to the present disclosure. FIG. 9 depicts a two-dimensional geographic model of the right of way location as identified in step 102. One or more of the ground change detections provided at step 116, construction equipment detections at step 118, and/or mobile device detections at step 122 may be displayed within the right of way location. The detections are illustrated as circular icons. The detections may be color coded or otherwise given a visual indicator that is indicative of a type of detection and/or an encroachment risk level as determined in step 128.
[0099] In either of the user interfaces described herein, the user interface may include a display of an image acquired at step 104 and having the ground cover change detections, construction equipment detections, and mobile device detections displayed thereon. Other information that may be displayed on either of the foregoing user interfaces includes, but is not limited to, a location of the asset, the right of way location, the optical images obtained in step 104, date and/or time information for each detection provided on the user interface, geographic coordinates for each detection, and an indication of the distance from the ROW at which each detection is located.
[00100] By reporting ground cover change detections, construction equipment detections, and mobile device detections at a user interface, a user, such as a proprietor or other party responsible for the asset and/or the right of way, including maintenance and/or operation thereof, may determine and/or take any necessary action to ensure safety with respect to the asset and/or right of way.
[00101] Embodiments of the present disclosure may comprise a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method as described herein.
[00102] Embodiments of the present disclosure may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present disclosure also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. In particular, one or more of the processes described herein may be implemented at least in part as instructions embodied in a non-transitory computer-readable medium and executable by one or more computing devices (e.g., any of the media content access devices described herein). In general, a processor (e.g., a microprocessor) receives instructions, from a non-transitory computer-readable medium, (e.g., a memory, etc.), and executes those instructions, thereby performing one or more processes, including one or more of the processes described herein.
[00103] Accordingly, an aspect of the present disclosure includes a computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the method as described herein.
[00104] Computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are non-transitory computer-readable storage media (devices). Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the disclosure can comprise at least two distinctly different kinds of computer-readable media: non-transitory computer-readable storage media (devices) and transmission media.
[00105] Non-transitory computer-readable storage media (devices) include RAM, ROM, EEPROM, CD-ROM, solid state drives (“SSDs”) (e.g., based on RAM), Flash memory, phase-change memory (“PCM”), other types of memory, other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.

[00106] Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to non-transitory computer-readable storage media (devices) (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media (devices) at a computer system. Thus, it should be understood that non-transitory computer-readable storage media (devices) can be included in computer system components that also (or even primarily) utilize transmission media.
[00107] Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general-purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. In some embodiments, computer-executable instructions are executed on a general-purpose computer to turn the general-purpose computer into a special purpose computer implementing elements of the disclosure. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
[00108] Those skilled in the art will appreciate that the disclosure may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, tablets, pagers, routers, switches, and the like. The disclosure may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
[00109] Embodiments of the present disclosure can also be implemented in cloud computing environments. In this description, “cloud computing” is defined as a model for enabling on-demand network access to a shared pool of configurable computing resources. For example, cloud computing can be employed in the marketplace to offer ubiquitous and convenient on-demand access to the shared pool of configurable computing resources. The shared pool of configurable computing resources can be rapidly provisioned via virtualization and released with low management effort or service provider interaction, and then scaled accordingly.
[00110] A cloud-computing model can be composed of various characteristics such as, for example, on-demand self-service, broad network access, resource pooling, rapid elasticity, measured service, and so forth. A cloud-computing model can also expose various service models, such as, for example, Software as a Service (“SaaS”), Platform as a Service (“PaaS”), and Infrastructure as a Service (“IaaS”). A cloud-computing model can also be deployed using different deployment models such as private cloud, community cloud, public cloud, hybrid cloud, and so forth. In this description and in the claims, a “cloud-computing environment” is an environment in which cloud computing is employed.
[00111] FIG. 2 illustrates a block diagram of an example computing device 230 that may be configured to perform one or more of the methods described above. The computing device 230 is communicatively coupled to the databases 204, 208, and 212. The databases 204, 208, and 212 may be stored on a non-transitory computer-readable storage media (device) as previously described herein. As shown by FIG. 2, the computing device 230 can comprise a processor 234, a memory 236, a storage device 238, an I/O interface 240, and a communication interface 242, which may be communicatively coupled by way of a communication infrastructure 244. While an example computing device 230 is shown in FIG. 2, the components illustrated in FIG. 2 are not intended to be limiting. Additional or alternative components may be used in other embodiments. Furthermore, in certain embodiments, the computing device 230 can include fewer components than those shown in FIG. 2. Components of the computing device 230 shown in FIG. 2 will now be described in additional detail.
[00112] In one or more embodiments, the processor 234 includes hardware for executing instructions for performing the method described herein, such as those making up a computer program. As an example, and not by way of limitation, to execute instructions, the processor 234 may retrieve (or fetch) the instructions from an internal register, an internal cache, the memory 236, or the storage device 238 and decode and execute them. In one or more embodiments, the processor 234 may include one or more internal caches for data, instructions, or addresses. By way of non-limiting example, the processor 234 may include one or more instruction caches, one or more data caches, and one or more translation lookaside buffers (TLBs). Instructions in the instruction caches may be copies of instructions in the memory 236 or the storage device 238.
[00113] The computing device 230 includes memory 236, which is coupled to the processor 234. The memory 236 may be used for storing data, metadata, and programs for execution by the processor(s). The memory 236 may include one or more of volatile and nonvolatile memories, such as Random Access Memory (“RAM”), Read Only Memory (“ROM”), a solid state disk (“SSD”), Flash, Phase Change Memory (“PCM”), or other types of data storage. The memory 236 may be internal or distributed memory.
[00114] The computing device 230 includes a storage device 238 that includes storage for storing data or instructions. As an example and not by way of limitation, storage device 238 can comprise a non-transitory storage medium described above. The storage device 238 may include a hard disk drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. The storage device 238 may include removable or non-removable (or fixed) media, where appropriate. The storage device 238 may be internal or external to the computing device 230. In one or more embodiments, the storage device 238 is non-volatile, solid-state memory. In other embodiments, the storage device 238 includes read-only memory (ROM). Where appropriate, this ROM may be mask programmed ROM, programmable ROM (PROM), erasable PROM (EPROM), electrically erasable PROM (EEPROM), electrically alterable ROM (EAROM), or flash memory or a combination of two or more of these.
[00115] The computing device 230 also includes one or more input or output (“I/O”) devices/interfaces 240, which are provided to allow a user to provide input to, receive output from, and otherwise transfer data to and receive data from computing device 230. The I/O devices/interfaces 240 may include a mouse, a keypad or a keyboard, a touch screen, a camera, an optical scanner, network interface, modem, other known I/O devices or a combination of such I/O device/interfaces. The touch screen may be activated with a stylus or a finger.
[00116] The I/O devices/interfaces 240 may include one or more devices for presenting output to a user, including, but not limited to, a graphics engine, a display (e.g., a display screen), one or more output drivers (e.g., display drivers), one or more audio speakers, and one or more audio drivers. In certain embodiments, the I/O interface 240 is configured to provide graphical data to a display for presentation to a user. The graphical data may be representative of one or more graphical user interfaces and/or any other graphical content as may serve a particular implementation.
[00117] The computing device 230 can further include a communication interface 242. The communication interface 242 can include hardware, software, or both. The communication interface 242 can provide one or more interfaces for communication (such as, for example, packet-based communication) between the computing device 230 and one or more other computing devices or networks. As an example and not by way of limitation, the communication interface 242 may include a network interface controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. The computing device 230 can further include a bus. The bus can comprise hardware, software, or both that couples components of the computing device 230 to each other.
[00118] While the present disclosure has been described herein with respect to certain illustrated embodiments, those of ordinary skill in the art will recognize and appreciate that it is not so limited. Rather, many additions, deletions, and modifications to the illustrated embodiments may be made without departing from the scope of the invention as hereinafter claimed, including legal equivalents thereof. In addition, features from one embodiment may be combined with features of another embodiment while still being encompassed within the scope of the invention as contemplated by the inventors.


CLAIMS

What is claimed is:
1. A method (100) for monitoring a right of way location, comprising:
defining (102) a right of way location;
obtaining (106) a time series of mobile geolocation data representative of at least one mobile device located within the right of way location;
obtaining (104) a time series of images depicting the right of way location from an overhead image acquisition device;
processing (120) the time series of mobile geolocation data to detect at least one mobile device located within the right of way location and processing (110) the time series of images to detect at least one landscape change; and
identifying (126) at least one encroachment event from the at least one detected mobile device and the at least one detected landscape change.
2. The method of claim 1, wherein the time series of images comprises a time series of optical images.
3. The method of claim 1 or 2, wherein processing (110) the time series of images to detect at least one landscape change comprises detecting (112) a ground cover change in the right of way location and/or detecting (114) a presence of construction equipment in the right of way location.
4. The method of claim 3, wherein detecting (112) a ground cover change comprises applying an image segmentation technique to pixels indicative of the at least one landscape change.
5. The method of any of claims 1-4, wherein processing (110) the time series of images to detect at least one landscape change comprises applying a digital image correlation algorithm to the time series of images.
6. The method of claim 3, wherein detecting (114) a piece of construction equipment comprises applying a machine learning algorithm to pixels indicative of the at least one landscape change.
7. The method of any of claims 1-6, wherein processing (120) the time series of mobile geolocation data comprises at least one of determining a number of mobile devices within the right of way location, a location of at least one mobile device, and a duration of time at least one mobile device was located in the right of way location.
8. The method of any of claims 1-7, further comprising communicating (124) the encroachment events to a user via a graphical user interface.
9. The method of claim 8, wherein the graphical user interface includes a plot indicating the location of the ground cover change, the location of the construction equipment, and/or the location of the mobile device within the right of way location.
10. A system (200) for monitoring a right of way location, comprising:
a processor (234) communicatively coupled to at least one database (208, 212) from which a time series of mobile geolocation data representative of at least one mobile device located within the right of way location and a time series of images depicting the right of way location from an overhead image acquisition device are obtained, the processor configured to:
process the time series of mobile geolocation data to detect at least one mobile device located within the right of way location,
process the time series of images to detect at least one landscape change, and
identify at least one encroachment event based upon the detected mobile device and/or the at least one landscape change.
11. A computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the method of any of claims 1-9.
12. A computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the method of any of claims 1-9.
PCT/IB2022/000457 2022-04-28 2022-04-28 System and method for monitoring a right of way WO2023209413A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2022/000457 WO2023209413A1 (en) 2022-04-28 2022-04-28 System and method for monitoring a right of way


Publications (1)

Publication Number Publication Date
WO2023209413A1 true WO2023209413A1 (en) 2023-11-02

Family

ID=83322564

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/000457 WO2023209413A1 (en) 2022-04-28 2022-04-28 System and method for monitoring a right of way

Country Status (1)

Country Link
WO (1) WO2023209413A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018038720A1 (en) * 2016-08-24 2018-03-01 Google Inc. Change detection based imagery acquisition tasking system
US20190295408A1 (en) * 2018-03-26 2019-09-26 International Business Machines Corporation Real-time service level monitor
US20210042499A1 (en) * 2019-08-06 2021-02-11 Saudi Arabian Oil Company Method and system for land encroachment detection and surveillance
US20210232818A1 (en) * 2020-01-27 2021-07-29 AIDASH Inc. System and method of intelligent vegetation management


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
S. Drouyer and C. de Franchis, "Highway traffic monitoring on medium resolution satellite images," IGARSS 2019 - 2019 IEEE International Geoscience and Remote Sensing Symposium, 2019, pp. 1228-1231, XP033656745, DOI: 10.1109/IGARSS.2019.8899777
T. Dagobert et al., "Cloud Detection by Luminance and Interband Parallax Analysis for Pushbroom Satellite Imagers," Image Processing On Line, vol. 10, 2020, pp. 167-190
Yu Zheng et al., "Urban Computing," ACM Transactions on Intelligent Systems and Technology, Association for Computing Machinery, vol. 5, no. 3, 18 September 2014, pp. 1-55, XP058054295, ISSN: 2157-6904, DOI: 10.1145/2629592 *


Legal Events

Date Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 22770033

Country of ref document: EP

Kind code of ref document: A1