US20150130840A1 - System and method for reporting events - Google Patents



Publication number
US20150130840A1
Authority
US
United States
Prior art keywords
data
augmented
network
camera
photograph
Prior art date
Legal status
Abandoned
Application number
US14/534,803
Inventor
Tero Heinonen
Juha Hyyppa
Anttoni Jaakkola
Current Assignee
Sharper Shape Ltd
Original Assignee
Sharper Shape Ltd
Priority to US201361901489P
Priority to US201361901490P
Priority to US201361901492P
Application filed by Sharper Shape Ltd filed Critical Sharper Shape Ltd
Priority to US14/534,803
Assigned to SHARPER SHAPE OY. Assignment of assignors interest (see document for details). Assignors: HEINONEN, TERO; HYYPPA, JUHA; JAAKKOLA, ANTTONI
Publication of US20150130840A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/006 Mixed reality
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 Systems determining position data of a target
    • G01S 17/08 Systems determining position data of a target for measuring distance only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 Lidar systems specially adapted for specific applications
    • G01S 17/89 Lidar systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 10/00 Administration; Management
    • G06Q 10/06 Resources, workflows, human or project management, e.g. organising, planning, scheduling or allocating time, human or machine resources; Enterprise planning; Organisational models
    • G06Q 10/063 Operations research or analysis
    • G06Q 10/0631 Resource planning, allocation or scheduling for a business operation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/06 Electricity, gas or water supply

Abstract

Disclosed is a system for reporting changes to a network in case of an event. The system includes a survey unit adapted to be located at a site of the network using data from a positioning sensor of the survey unit. The survey unit is configured to request from a control unit an augmented view related to the location of the site of the network and displays the augmented view in a display of the survey unit on top of a current view of the site. The survey unit is adapted to capture a photograph on the display and to communicate the photograph to the control unit. The control unit is configured to determine changes in the network by comparing the current view as shown in the photograph with the augmented view, and to create an event report including a catalog of the changes to the network.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to, and the benefit of, U.S. provisional Patent Application No. 61/901,492, filed on 8 Nov. 2013; and is related to, and claims the benefit of, U.S. Patent Application Ser. No. 61/901,489 filed on 8 Nov. 2013 entitled System for Monitoring Power Lines (Docket SLSH.2649.USU2/Sharpershape001); and U.S. Patent Application Ser. No. 61/901,490 filed on 8 Nov. 2013 entitled System and Method for Allocating Resources (Docket SLSH.2650.USU2/Sharpershape002); the disclosures of which are incorporated herein by reference in their entirety.
  • TECHNICAL FIELD
  • The present disclosure generally relates to a system and a method for reporting changes to a network in case of an event, and more particularly to reporting and documenting changes concerning damage to infrastructure networks due to the event.
  • BACKGROUND
  • Infrastructure networks (such as power lines, water pipes, oil and gas pipes, etc.) are prone to damage over time. Consider, for example, power line (PL) networks, which are usually extensive and comprise several components such as conductors, insulators, pylons and other associated structures such as spacers, dead-lines, switch boxes, etc. Such PL networks are often exposed to potential threats, mainly caused by encroaching vegetation; for example, a growing tree may eventually become tall enough that, were it to fall during a storm, it would break the power line. Furthermore, in case of calamities such as a storm, a flood, an earthquake, a hurricane, or the like, substantial damage may occur to the PL network, causing massive disruption to power distribution and to the whole society dependent on electricity.
  • In all these circumstances, a quick and accurate analysis of the damage is of utmost importance for electricity transmission and distribution operators, for accurate assessment of the situation and subsequently for managing the repair work efficiently. A lack of proper and timely reporting makes it difficult to allocate personnel for repair work to the appropriate places.
  • Substantial costs are involved in monitoring, identifying, reporting, documenting and assessing damage to such networks. Traditionally this has been achieved primarily by relying on on-site manual inspection; however, sending official representatives to report damage to these infrastructure networks usually takes considerable time. Moreover, in case of severe events such as a major thunderstorm, the same event has often caused damage to access roads, or caused trees to fall onto roads, preventing outside personnel from accessing the site of damage without first clearing the roads, which can take days or weeks in the worst case.
  • Furthermore, independent reporting on these extensive networks and keeping the information up to date in a database, whether by the staff of the company or its subcontractors, is a time- and resource-consuming task. At the same time, the people residing or staying in or near the site of damage are not capable of assessing the situation (as they do not know what to look for) or of communicating their findings in a useful and understandable way to the damage assessment firm.
  • Therefore, there exists a need for a system that solves the problems associated with reporting damage to infrastructure networks, and that overcomes the above-described limitations of existing systems.
  • BRIEF SUMMARY
  • The present disclosure provides a system and a method for reporting changes to a network in case of an event. More specifically, the present disclosure relates to a system and a method for identifying changes concerning damage to an infrastructure network, and for reporting and documenting that damage in order to assign actions related to repair activities for such networks.
  • In one aspect, embodiments of the present disclosure provide a method for reporting changes to a network in case of an event. The method comprises the steps of: requesting from a control unit, after the occurrence of an event, an augmented view of a site as it was before the occurrence of the event; overlapping the augmented view with a current view, by using a first survey unit; capturing a photograph of the current view along with the overlapped augmented view; sending the photograph to the control unit; and determining changes to the network by comparing the current view as shown in the photograph with the augmented view.
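The method steps above can be illustrated with a short, self-contained Python sketch. All names here (`request_augmented_view`, `overlap_views`, `determine_changes`) are hypothetical stand-ins for the survey-unit and control-unit roles, and small NumPy arrays stand in for real camera frames; this is a sketch of the idea, not the disclosed implementation.

```python
# Hypothetical sketch of the claimed reporting flow; names and data
# are illustrative only. Small grayscale arrays stand in for views.
import numpy as np

def request_augmented_view(archive, site_id):
    """Control-unit side: return the stored pre-event view of a site."""
    return archive[site_id]

def overlap_views(augmented, current, alpha=0.5):
    """Survey-unit side: blend the pre-event view over the live camera view."""
    return (alpha * augmented + (1 - alpha) * current).astype(np.uint8)

def determine_changes(augmented, current, threshold=30):
    """Control-unit side: flag pixels that differ beyond a threshold."""
    return np.abs(augmented.astype(int) - current.astype(int)) > threshold

# Simulated 4x4 site view; one pixel "falls down" after the event.
archive = {"site-210": np.full((4, 4), 100, dtype=np.uint8)}
current = archive["site-210"].copy()
current[2, 2] = 10  # the changed component

augmented = request_augmented_view(archive, "site-210")
photograph = overlap_views(augmented, current)   # what gets captured and sent
changes = determine_changes(augmented, current)  # comparison at the control unit
print(int(changes.sum()))  # 1
```

In this toy form, the "event report" would simply be the coordinates of the flagged pixels; a real system would of course work on registered photographs rather than aligned arrays.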
  • According to an embodiment, the method further comprises updating the mission prior data based on the determined changes to the network.
  • In another aspect, embodiments of the present disclosure provide a system for reporting changes to a network in case of an event. The system comprises a first survey unit adapted to be located at a site of the network using data from a positioning sensor of the survey unit. The survey unit is further configured to request from a control unit an augmented view related to the location of the site of the network. The first survey unit is further adapted to display the augmented view in a display of the survey unit on top of a current view of the site. The first survey unit is also adapted to capture a photograph on the display and to communicate the photograph and the positioning sensor data to the control unit. The control unit is configured to receive and to store the photograph and the positioning sensor data from the first survey unit. The control unit is further configured to determine changes in the network by comparing the current view as shown in the photograph with the augmented view, and to create an event report including a catalog of the changes to the network to be accessible therefrom.
  • According to an embodiment, the control unit is further configured to update the mission prior data based on the determined changes to the network.
  • In an example, the augmented view is rendered on the display of the first survey unit by executing, with a processor, computing instructions stored in a memory of the first survey unit. The computing instructions are configured to use the data from the position sensor to determine a direction of the camera view in relation to the site, to capture an image of the camera view, and to augment a view on the display using the data from the position sensor.
  • In an embodiment, the augmented view is constructed using mission prior data. The mission prior data is collected by using a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data from drones or helicopters.
  • In an example, the current view is a camera view rendered on a display of the first survey unit.
  • Further, the photograph contains the camera view along with the overlapped augmented view.
  • In yet another aspect, embodiments of the present disclosure provide an apparatus for documenting changes to a network. The apparatus comprises a communication interface, a camera, at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to the ground level and to map coordinates, a memory for storing computing instructions, and a processor configured to execute the computing instructions. The computing instructions are configured to request an augmented view of a site, based on the location of the apparatus, use the data from the at least one position sensor to determine a direction of a camera view in relation to the site, overlap the augmented view with the camera view using the data from the at least one position sensor, capture a photograph of the camera view along with the overlapped augmented view, and communicate the photograph to a device external to the apparatus.
  • Additional aspects, advantages, features and objects of the present disclosure would be made apparent from the drawings and the detailed description of the illustrative embodiments.
  • It will be appreciated that features of the disclosure are susceptible to being combined in various combinations or further improvements without departing from the scope of the disclosure and this provisional application.
  • DESCRIPTION OF THE DRAWINGS
  • The summary above, as well as the following detailed description of illustrative embodiments, is better understood when read in conjunction with the appended drawings. For the purpose of illustrating the present disclosure, exemplary constructions of the disclosure are shown in the drawings. However, the disclosure is not limited to specific methods and instrumentalities disclosed herein. Wherever possible, like elements have been indicated by identical numbers.
  • FIG. 1 illustrates a pictorial representation of a system for reporting changes to a network in case of an event associated with an exemplary infrastructure network, in accordance with an embodiment of the present disclosure;
  • FIG. 2 illustrates a schematic diagram of an apparatus for documenting changes to a network, in accordance with embodiments of the present disclosure;
  • FIG. 3 illustrates a flow diagram for the event reporting system, in accordance with embodiments of the present disclosure; and
  • FIG. 4 is an illustration of steps of a method for reporting changes to a network in case of an event, in accordance with an embodiment of the present disclosure.
  • DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present disclosure provides a system 100 for reporting changes to a network in case of an event, hereinafter simply referred to as the system 100. The system 100 of the present disclosure is configured for reporting and documenting changes to an infrastructure network. In particular, the system 100 is configured to collect information related to damage to any component or object in a network, optionally provide an assessment of the damage based on past information about the same component, and generate and document an event report with details of the damage for further use, such as repair activities.
  • More specifically, the system 100 of the present disclosure enables local users/personnel already present at the site of damage to collect information. The system 100 may further provide means to enable said personnel to assess the damage to any component in the network based on the available information on the past condition of the same component, either already known to the personnel or provided by some other means in the system 100. In an embodiment, the system 100 may additionally be integrated with other systems involved in executing actions related to repair activities for such networks, based on the damage report generated by the present system 100. The term “personnel” can refer to any user(s) or person(s), independently of their contractual or employment status.
  • Referring now to the drawings, particularly by their reference numbers, FIG. 1 illustrates an embodiment of the present system 100 associated with an exemplary infrastructure network 200. For the purpose of the present disclosure, the system 100, as shown in FIG. 1, has been depicted as a system for damage reporting for a power line (PL) network 200. Hereinafter, the terms “PL network”, “infrastructure network” and “network” are used interchangeably. Such a network 200 typically comprises several components 202 such as, for example, poles 203, 204, 205; conductor wires 206, 207; insulators; pylons; and other associated structures such as spacers, dead-lines, switch boxes, etc. These networks 200 are usually extensive and run through various territories, including urban areas, rural areas, countryside, forests, etc. As depicted in the exemplary embodiment of FIG. 1, the PL network 200 may be installed in a site 210, such as a forest comprising various objects 212 such as trees 214, 215, 216.
  • According to an embodiment of the present disclosure, the system 100 includes a survey unit, such as a first survey unit 110 (the term “first survey unit” is used interchangeably with “survey unit” herein later). The system also includes other survey units, such as a second survey unit (not shown in FIG. 1). The survey unit 110 is configured to collect information related to damage to the network 200. The survey unit 110 is configured for a utility monitoring task, in the present case monitoring of the components 202 and objects 212, and collecting remote sensing data 112 for the same. Examples of remote sensing data 112 include: a 3D point cloud (from Light Detection And Ranging, LiDAR), a 3D point cloud (from Synthetic-Aperture Radar, SAR), a 2D image (from a thermal, infrared, or photographic camera, or SAR), or any other representation of the results of remote sensing of the components 202 and objects 212 in digital form. Typically, “LiDAR” is used to denote a LiDAR system, although the word “system” is usually omitted. In the present disclosure the term “camera” can refer, but is not limited, to: an RGB (red green blue) camera, RGBN (RGB + near infrared) camera, infrared camera, near-infrared camera, thermal camera, video camera, high frequency video camera, multispectral camera, hyperspectral camera, multispectral video camera, or hyperspectral video camera.
  • In an embodiment, the remote sensing data 112 includes mission prior data 114. The mission prior data 114 can be any available data related to the components 202 and the objects 212 before the occurrence of an event. Herein, the event could be natural, such as a flood, an earthquake, a storm, a hurricane, or the like; or man-made, such as construction activities, deforestation, etc. The mission prior data 114 is collected by using the second survey unit or other survey units present in the system 100, or it can be data which has been collected, for example, from satellites, using LiDAR in drones or helicopters, with mobile terminals, etc. Alternatively, the mission prior data 114 may be collected by using the first survey unit 110.
  • In an embodiment, the mission prior data 114 is used to construct an augmented view related to a location of the site 210 of the network 200. The augmented view depicts the site 210, associated with its location, as it was before the occurrence of the event.
  • For example, as shown, it may be contemplated that the mission prior data 114 may have positions of poles 203, 204, 205 and the conductor wires 206, 207 disposed between the poles 203, 204, 205. The view may also have information on trees 214, 215, 216 or other objects 212 present in the site 210. The augmented view accordingly includes the poles 203, 204, 205, the conductor wires 206, 207 disposed between the poles 203, 204, 205 and the trees 214, 215, 216 present in the site 210 before the event.
  • According to an embodiment, the remote sensing data 112 can also include mission current data 116, such as any available data related to the components 202 and the objects 212 after the occurrence of the event. In an embodiment, the mission current data 116 is used to construct a current view related to the location of the site 210 of the network 200 after the occurrence of the event, which is explained in greater detail herein later. The mission current data 116 may have information on the tree 215 x which has now fallen down, as shown in FIG. 1. The view may also have information on the pole 205 x which has now fallen down and/or on the conductor wire 207 x which is now cut and fallen down.
  • It may be contemplated by a person ordinarily skilled in the art that the remote sensing data 112 may be absolute (as in specific coordinates), relative (to corresponding mission data 114, 116), or structural (e.g. topology or proximity between the components 202 and the objects 212). Further, the remote sensing data 112 may be discrete, or probabilistic (in a sense of probability distribution of the components 202 and the objects 212). Further it may be understood that the mission current data 116 could be similar to or different from the mission prior data 114.
  • The collection of the remote sensing data 112 involves regular monitoring of the components 202 and the objects 212. According to one embodiment of the present disclosure, each survey unit 110 may include at least one piece of remote sensing equipment 118. The remote sensing equipment 118 may include digital remote sensing equipment and instruments such as LiDAR, SAR radar, a thermal camera, a camera or video camera, x-ray radar, etc. The remote sensing equipment 118 may be located near the target site 210 or may be located remotely from the site 210, gathering information by remote communication means. The remote sensing equipment 118 may be installed and operated from a mobile platform, for example a copter, a fixed-wing plane, an Unmanned Aerial Vehicle (UAV), an Unmanned Aerial System (UAS), a satellite, or a wheel-drive terrain vehicle such as a car, a forest machine, etc.
  • In an embodiment, the remote sensing equipment 118 includes LiDAR systems as a primary information source. LiDAR (also written LIDAR) is a remote sensing technology that measures distance by illuminating a target with a laser and analyzing the reflected light. The term “LiDAR” comes from combining the words light and radar. This emerging data acquisition tool provides an opportunity to classify a utility corridor scene more reliably and thus generate accurate 3D models of infrastructure features, owing to LiDAR's capability for highly dense, accurate, multiple-echo data acquisition, which can also provide information on the internal structure of vegetation.
  • LiDAR uses ultraviolet, visible, or near infrared light to image objects and can be used with a wide range of targets, including non-metallic objects, rocks, rain, chemical compounds, aerosols, clouds and even single molecules. LiDAR systems employ a narrow laser beam which can be used to map physical features with very high resolution. Wavelengths from about 10 micrometers to the UV (ca. 250 nm) are used to suit the target. Typically light is reflected via backscattering. Different types of scattering are used for different LiDAR applications; most common are Rayleigh scattering, Mie scattering, Raman scattering, and fluorescence. Based on different kinds of backscattering, the LiDAR can be accordingly called Rayleigh LiDAR, Mie LiDAR, Raman LiDAR, Na/Fe/K Fluorescence LiDAR, and so on. Suitable combinations of wavelengths can allow for remote mapping of atmospheric contents by looking for wavelength dependent changes in the intensity of the returned signal.
  • According to an alternative and a preferred embodiment of the present disclosure, the survey unit 110 may be constituted by personnel 120 already located at the site 210. The personnel 120, as a part of the survey unit 110, is equipped with a mobile terminal 122 such as, but not limited to, a smart phone, a laptop, a tablet, a smart camera, or some combination thereof.
  • Referring now to FIG. 2, illustrated is an exemplary embodiment of the terminal 122. The terminal 122 typically includes a display 122 a, a camera 122 b, a user interface 122 c, a communication interface 122 d, a central processing unit (CPU) 122 e, a sensor unit 122 f including a compass, an accelerometer, a magnetometer, and a global navigation satellite system (GNSS) sensor such as a global positioning system (GPS) sensor, and other components such as memories, etc. In addition, the terminal 122 also includes a power source 122 g, such as batteries, to provide electricity for the above-listed parts of the mobile terminal 122.
  • It may be understood by a person skilled in the art that the various parts of the mobile terminal 122 function together to collect and analyze remote sensing data 112 from the site 210. For example, the camera 122 b and the sensor unit 122 f are connected for making an analysis of the view and the surroundings, which is explained in greater detail herein later. The sensor unit 122 f can include an accelerometer to determine the tilting angle of the terminal 122, a magnetometer to determine the direction of the terminal 122 with respect to the magnetic field of the earth, a location sensor (GPS) to determine the longitude and latitude of the terminal 122, etc. The sensor unit 122 f can be, for example, similar to a Microsoft® Kinect sensor or a range camera; that is, the sensor unit 122 f may be a horizontal bar connected to a small base with an optionally motorized pivot, designed to be positioned lengthwise above or below the terminal 122. The sensor unit 122 f may further feature an “RGB” camera, a depth sensor and a multi-array microphone running proprietary software, which provides full-body 3D motion capture, facial recognition and voice recognition capabilities.
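As a concrete illustration of the sensor computations described above, the following sketch derives a tilt angle from accelerometer axes and a horizontal direction from the magnetometer components. The formulas are the standard ones; the function names are illustrative assumptions, not part of the disclosure, and no tilt compensation is attempted.

```python
# Illustrative only: standard tilt/direction formulas for the
# accelerometer and magnetometer described above; names are hypothetical.
import math

def tilt_angle_deg(ax, ay, az):
    """Tilt of the terminal away from flat, i.e. the angle between the
    measured gravity vector and the device z-axis (inputs in m/s^2)."""
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.acos(az / g))

def heading_deg(mx, my):
    """Direction of the terminal in the horizontal plane, in degrees
    (0..360), from the horizontal magnetometer components."""
    return math.degrees(math.atan2(my, mx)) % 360.0

# Terminal lying flat: gravity entirely on the z-axis -> zero tilt.
print(round(tilt_angle_deg(0.0, 0.0, 9.81), 1))  # 0.0
# Field entirely along the device y-axis -> 90 degree direction.
print(round(heading_deg(0.0, 1.0), 1))           # 90.0
```

A production implementation would combine both sensors (and a gyroscope) through the platform's sensor-fusion API rather than use raw readings like this.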
  • Further, the central processing unit 122 e in the terminal 122 may include related memories (non-transitory memory, flash memory, memory cards) for running the software needed for operation. The communication interface 122 d may include one or a combination of a cellular interface (2G, 3G, 4G, 4G LTE (Long Term Evolution), etc.) or a Wireless Local Area Network (WLAN) interface, generally for accessing the internet. The user interface 122 c may include a display and touch screen/buttons for the personnel 120 to operate the terminal 122.
  • Essentially, the mobile terminal 122 could be any device or combination of devices capable of collecting the mission prior data 114 indicative of the components 202 and objects 212 before the occurrence of the event at the site 210. The mission prior data 114 is most commonly in the form of a picture or views of locations of the site 210, having the components 202 and objects 212, before the event.
  • In an example, the mission prior data 114 includes pictures captured using the camera 122 b, showing the state/position of various components 202 in the network 200, possibly in some relation to the objects 212 in the site 210, before the event. Such pictures may be stored in an external device and requested as augmented views from the external device, which is explained in greater detail herein later. In an example, creating an augmented view using the terminal 122 involves using the data from the position sensor, such as the sensor unit 122 f, to determine a direction of the camera view in relation to the site 210, capturing an image of the camera view, and augmenting a view on the display 122 a using the data from the position sensor, i.e. adding tags (with positioning sensor data) for further storage and processing. Additionally, the augmented view can be constructed using the mission prior data 114 collected by using a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data from drones or helicopters.
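The tags for further storage and processing mentioned above can be pictured as a simple record attached to each captured frame. The schema and function name below are purely assumed illustrations, not a format from the disclosure.

```python
# Hypothetical tagging schema: attach positioning-sensor data to a
# captured frame so the control unit can store and process it later.
import time

def tag_photograph(image_bytes, lat, lon, heading_deg, tilt_deg):
    """Bundle a captured frame with its positioning-sensor tags."""
    return {
        "image": image_bytes,
        "tags": {
            "lat": lat,
            "lon": lon,
            "heading_deg": heading_deg,
            "tilt_deg": tilt_deg,
            "timestamp": time.time(),
        },
    }

# Placeholder bytes stand in for a real camera frame.
record = tag_photograph(b"<frame bytes>", 60.17, 24.94, 135.0, 12.5)
print(sorted(record["tags"]))  # ['heading_deg', 'lat', 'lon', 'tilt_deg', 'timestamp']
```

Keeping the tags alongside the image is what later lets the control unit regenerate the matching augmented view for comparison.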
  • Similarly, the mission current data 116 includes pictures (current views) captured by using the camera 122 b, showing the state/position of various components 202 in the network 200, possibly in some relation to the objects 212 in the site 210, after the event. For example, in the process of generating such pictures, the display 122 a of the terminal 122 is positioned relative to at least one of the various components 202 or the objects 212 present in the network 200 based on the positioning sensor data.
  • Referring again to FIG. 1, the system 100 includes a control unit 130 having a server 132 and a database 134. The control unit 130 is configured to communicate with the terminal 122 over the communication interface 122 d, for example via the Internet. In an embodiment, the server 132 may be configured to receive the mission current data 116 (current view) related to the components 202 and the objects 212 in the network 200 from the survey unit 110. Specifically, the server 132 is adapted to receive the mission current data 116 from the mobile terminal 122. The server 132 may also be adapted to receive the mission prior data 114. The received data 114, 116 may be stored in the provided database 134, from where it can be accessed by other modules of the system 100.
  • In an embodiment, the server 132 may further be configured to send the mission prior data 114 to the survey unit 110, or specifically the terminal 122 for the perusal of the personnel 120. This mission prior data 114 may be sent in the form of the augmented view representative of the original state/position of the various objects 212 in the site 210, before any change/damage.
  • In an example, the survey unit 110, or specifically the terminal 122, is adapted to be located at the site 210 of the network 200, using data from a positioning sensor of the survey unit 110. The survey unit 110 is configured to request from the control unit 130 the augmented view related to the location of the site 210 of the network 200.
  • The augmented view is rendered on the display 122 a of the terminal 122 by executing, with a processor (i.e. the central processing unit 122 e), computing instructions stored in the memory of the terminal 122. In an example, the computing instructions are configured to use the positioning sensor data for rendering the augmented view on the display 122 a. The survey unit 110, or specifically the terminal 122, is further adapted to display the augmented view in the display 122 a of the terminal 122 on top of a current view of the site 210. The current view includes information on the tree 215 x which has now fallen down, as shown in FIG. 1. The view may also have information on the pole 205 x which has now fallen down and/or on the conductor wire 207 x which is now cut and fallen down. The current view is a camera view rendered on the display 122 a of the terminal 122.
  • The survey unit 110 is also adapted to capture a photograph on the display 122 a and to communicate the photograph and the positioning sensor data to the control unit 130. The photograph contains the camera view along with the overlapped augmented view. The control unit 130 is configured to receive and to store the photograph and the positioning sensor data from the survey unit 110. The control unit 130 is further configured to determine changes in the network 200 by comparing the current view as shown in the photograph with the augmented view. The control unit 130 is further configured to create an event report including a catalog of the changes to the network 200, to be accessible therefrom. The control unit 130 is further configured to update the mission prior data based on the determined changes to the network.
  • In an example, the display 122 a may show, by the view finder of the terminal 122, roughly a same perspective view V1 (augmented view requested from the control unit 130) as would be a view V2 (current view) presently seen by the personnel 120 from his/her current position. The software in the terminal 122 achieves this by using the sensor information from the terminal 122 and communicating the same to the control unit 130, which generates the view V1 in consideration of this sensor information. In an embodiment, the view V1 may be shown as a dashed line figure or with some transparent means for the perusal of the personnel 120.
  • The personnel 120 then tries to direct the terminal's camera 122 b so that the dashed-line figure of the view V1 overlaps the current view V2, that is, tries to align the transparent recorded image with reality. The terminal 122 may be configured to determine differences between the two views, indicative of changes/damages to the PL network 200. This could be done automatically by the CPU 122 e by performing comparative analysis of the views V1 and V2. Alternatively, the terminal 122 may include the option to allow the personnel 120 to indicate the changes by means of a provided user interface. The personnel 120 could, for example, use a touch screen of the terminal 122 to circle one or more objects which have changed from the view V1 received from the database 134. This information is sent back to the database 134 as an event report indicative of the damages/changes in the power line network 200.
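The automatic comparative analysis mentioned above could, in its simplest form, be a per-pixel difference between the two views. The sketch below assumes grayscale images represented as lists of rows and is only a stand-in for whatever analysis the CPU 122 e actually performs.

```python
def changed_regions(view_v1, view_v2, threshold=30):
    """Compare two equally sized grayscale images (lists of rows of
    0-255 ints) and return the set of (row, col) pixels whose
    difference exceeds the threshold -- a crude stand-in for the
    comparative analysis of views V1 and V2."""
    changed = set()
    for r, (row1, row2) in enumerate(zip(view_v1, view_v2)):
        for c, (p1, p2) in enumerate(zip(row1, row2)):
            if abs(p1 - p2) > threshold:
                changed.add((r, c))
    return changed

v1 = [[100, 100], [100, 100]]   # stored (pre-event) view
v2 = [[100, 100], [100, 200]]   # current view: one region changed
diff = changed_regions(v1, v2)
```

A real implementation would first register the two views (compensating for small camera misalignment) and then group changed pixels into object-level regions, but the principle is the same.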
  • Still alternatively, the overlapped views V1 and V2 may be shared with the control unit 130. For example, the terminal 122 is adapted to capture a photograph (overlapped views V1 and V2) on the display 122 a and to communicate the photograph to the control unit 130. The control unit 130 may include the requisite software/service to perform the comparative analysis of the views (photograph) and identify the differences. Further, the event report is generated which could be accessible from the database 134 for the perusal of some operators or agencies responsible for repair activities to mitigate the damages to the network 200.
  • In an embodiment, the data transfer between the terminal 122 and the server 132 may follow either of two scenarios. In the first scenario, the raw data, including location, video stream, depth map, point clouds, figures, and the direction in which the visual data was captured, is transferred from the terminal 122, and the analysis of the data is performed in the server 132. In the second scenario, some of the analysis is done locally and only the results are sent to the server 132. This way the amount of data to be transferred, for example via a cellular network, can be made smaller than in the first scenario.
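The choice between the two scenarios can be sketched as a simple payload decision. The function name, field names, and threshold logic below are hypothetical; the disclosure does not prescribe how the scenario is selected.

```python
def choose_transfer(raw_bytes, result_bytes, link_budget_bytes):
    """Pick between the two transfer scenarios described above:
    send the raw data for server-side analysis when it fits the
    link budget, otherwise analyse locally and send only the
    (much smaller) results."""
    if raw_bytes <= link_budget_bytes:
        return ("raw", raw_bytes)            # scenario 1: full fidelity
    return ("local-results", result_bytes)   # scenario 2: save bandwidth

# A 50 MB point cloud over a 5 MB cellular budget falls back to
# local analysis with a 20 kB result payload.
mode, payload = choose_transfer(raw_bytes=50_000_000,
                                result_bytes=20_000,
                                link_budget_bytes=5_000_000)
```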
  • In accordance with an additional embodiment of the present disclosure, the control unit 130 may be configured to send the views to multiple users/personnel 120 equipped with the terminal 122. These users may be present in the same locality, such as the locality at the site of damage, or be spread over some geographical area. Each user may be enabled to analyze, comment on, or rate the images (formed by the overlapping of the two views) in order to identify the differences. This crowdsourcing of the analysis provides a larger resource pool and therefore results in better damage assessment for the event report.
  • Moving on, FIG. 3 provides an architecture related to the system 100 of the present disclosure. In step S2.1, the personnel 120 sends a position of the terminal 122 and a direction of the view finder, i.e. the camera 122 b of the terminal 122. The direction may be deduced from the accompanying sensor information, such as compass direction (say, 56 degrees from north), tilting angle of the camera with respect to the ground (say, 10 degrees in relation to horizontal), latitude, longitude, etc. The terminal 122 can also send a current view, such as the view V2 (i.e. a camera view appearing on the display 122 a of the camera 122 b of the terminal 122), to the server 132.
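The compass direction and tilt angle mentioned in step S2.1 fully determine a viewing direction. A minimal sketch of converting those two readings into a unit direction vector (east, north, up) might look as follows; the names are illustrative.

```python
import math

def view_direction(compass_deg, tilt_deg):
    """Turn the sensor readings of step S2.1 (compass direction in
    degrees from north, camera tilt in degrees above horizontal)
    into a unit vector (east, north, up)."""
    c = math.radians(compass_deg)
    t = math.radians(tilt_deg)
    return (math.cos(t) * math.sin(c),   # east component
            math.cos(t) * math.cos(c),   # north component
            math.sin(t))                 # upward component

# The example values from the text: 56 degrees from north,
# tilted 10 degrees above horizontal.
d = view_direction(56.0, 10.0)
```

The server can use this vector, together with the latitude/longitude position, to decide which part of the stored site model falls within the camera's view.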
  • In step S2.2, the server 132 scans the database 134 to generate/find/determine viewable reference data V1 (such as an image, model, shape, drawing, or outline) related to the present view V2 in the viewfinder. The view V1 is the augmented view stored in the database 134 of the control unit. Based on an embodiment, the view V1 is calculated from a set of 3D point data which has been previously measured from the said position of the terminal 122.
  • According to an embodiment, the view V1 may be generated by the terminal 122 (for being stored in the database 134 of the control unit 130). Alternatively, the view V1 may be provided by a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.
  • In step S2.3, the view V1 is communicated back to the terminal 122. The view V1 can consist of one or more views. The view V1 can be an image (digital photograph), a depth map (an image showing the distance to different parts of the view), a point cloud, a thermal image, or a generated image, illustration, model, shape, drawing, or outline, depending on the capabilities of the terminal 122.
  • In step S2.4, the personnel 120 aims to align the received view V1 with the view V2. The personnel 120 can indicate, with a touch screen or other user interface means, objects which have changed in comparison to the received view. Specifically, the changes in the components 202 and objects 212 can be deduced from the comparison of the views V1 and V2. The personnel 120 can, for example, indicate which of the power line poles are missing or have fallen down. For example, as shown in FIG. 1, the broken pole 205 x and the broken conductor wire 207 x can be indicated by the personnel 120. In an embodiment, a menu is provided to the personnel 120 upon touching an object or a component appearing in the display 122 a of the terminal 122. The menu can have, for example, symbols such as fallen down, broken, disappeared, tilted, no changes, etc., or related texts. In an embodiment, the menu may also have free text input fields for the personnel 120 to provide notes related to the damage. Further, the menu information may be used to create an event report including the catalog of the changes to the network.
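The menu selections of step S2.4 could be collected into the catalog of changes as simple records. The sketch below uses hypothetical names together with the example statuses and free-text note from the text.

```python
# The menu choices listed in the text for a touched object.
STATUS_MENU = ("fallen down", "broken", "disappeared", "tilted", "no changes")

def make_catalog_entry(object_id, status, note=""):
    """Record one touch-menu selection as a catalog entry for the
    event report; statuses outside the menu are rejected."""
    if status not in STATUS_MENU:
        raise ValueError("unknown status: " + status)
    return {"object": object_id, "status": status, "note": note}

# The two damaged components from FIG. 1, as the personnel might
# mark them on the touch screen.
catalog = [
    make_catalog_entry("pole 205x", "broken"),
    make_catalog_entry("conductor wire 207x", "fallen down", "cut near pole"),
]
```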
  • In step S2.5, the generated information (related to the status of the components 202 and the objects 212) is sent to the service in the server 132. Further, in step S2.6, this information is stored in the database 134 as updated object information. In an embodiment, the update can be accepted automatically, or it can be subject to verification by a service provider. The next time the personnel 120 accesses the database 134, the updates/changes are reflected in the received view V1 at the terminal 122.
  • According to embodiments, in step S2.7 the server 132 analyses the differences between the views and creates an event report including a catalog of changes. More specifically, a photograph of the view V1 overlapped on the view V2 is captured by the terminal 122, which thereafter sends the photograph to the server 132. The server 132 analyses the photograph to create the event report including the catalog of the changes to the network, which can be determined from the comparison of the view V1 overlapped on the view V2.
  • In step S2.8, in case the event report indicates damage which needs to be corrected, such as the broken pole 205 x and the broken conductor wire 207 x, the information is communicated to some third-party system, such as repair agencies. The third party may receive the information as a message, a push message, or information accessible via an Internet connection. The third party may subsequently analyze the updated information and plan possible corrective/repair actions at the site 210.
  • Referring now to FIG. 4, illustrated is a method 400 for reporting changes to a network in case of an event, in accordance with an embodiment of the present disclosure.
  • At step 402, an augmented view is requested from a control unit after the occurrence of an event. The augmented view is associated with the site as it was before the occurrence of the event. According to an embodiment, the augmented view is constructed using mission prior data. The mission prior data is collected by using at least one of a second survey unit, satellite unit data, or Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.
  • At step 404, the augmented view is overlapped with a current view, by using a first survey unit. In an example, the current view is a camera view rendered on a display of the first survey unit. The first survey unit is adapted to be located at the site of the network, using data from a positioning sensor of the survey unit. The survey unit is configured to request from the control unit the augmented view related to the location of the site of the network.
  • At step 406, a photograph of the current view along with the overlapped augmented view is captured. In an example, the photograph contains the camera view along with the overlapped augmented view.
  • At step 408, the photograph is sent to the control unit.
  • At step 410, changes to the network are determined by comparing the current view as shown in the photograph with the augmented view.
  • The steps 402 to 410 are only illustrative and other alternatives can also be provided where one or more steps are added, one or more steps are removed, or one or more steps are provided in a different sequence without departing from the scope of the claims herein.
  • For example, the method 400 further includes updating the mission prior data based on the determined changes to the network. Further, the method includes creating an event report including a catalog of the changes to the network.
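Steps 402 to 410, together with these extensions, can be sketched as one round trip between a survey unit and a control unit. The objects below are minimal stand-ins (all names hypothetical), intended only to show the data flow of method 400.

```python
def report_changes(control_unit, survey_unit):
    """One pass through steps 402-410 of method 400, with
    hypothetical control_unit / survey_unit objects standing in
    for the real devices."""
    augmented = control_unit.get_augmented_view(survey_unit.location)  # step 402
    overlay = survey_unit.overlap(augmented)                           # step 404
    photo = survey_unit.capture(overlay)                               # step 406
    control_unit.receive(photo)                                        # step 408
    return control_unit.determine_changes(photo, augmented)            # step 410

class FakeUnit:
    """Minimal stand-in so the sketch runs; a real system would talk
    to a camera, a positioning sensor, and a database instead."""
    location = (60.2, 24.9)
    def get_augmented_view(self, loc): return {"view": "V1", "at": loc}
    def overlap(self, aug): return {"overlay": aug}
    def capture(self, overlay): return {"photo": overlay}
    def receive(self, photo): self.photo = photo
    def determine_changes(self, photo, aug): return ["pole 205x fallen"]

changes = report_changes(FakeUnit(), FakeUnit())
```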
  • In yet another aspect, embodiments of the present disclosure provide an apparatus for documenting changes to a network. The apparatus includes a communication interface, a camera, at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to ground level and to map coordinates, a memory for storing computing instructions, and a processor configured to execute the computing instructions. The computing instructions are configured to request an augmented view of a site based on the location of the apparatus, to use the data from the at least one location sensor to determine a direction of a camera view in relation to the site, to overlap the augmented view with the camera view using the data from the at least one location sensor, to capture a photograph of the camera view along with the overlapped augmented view, and to communicate the photograph to a device external to the apparatus. The external device can be a control unit as explained above.
  • In a further embodiment, the present disclosure utilizes an apparatus, such as the terminal 122. In an example, the apparatus can be a smart phone with a camera and a QR (quick response) code reading application. Poles and other objects in power lines could carry a QR code (or another identifier, such as a Radio Frequency Identifier (RFID), which can be read with a smart phone). When a person (in practice, any person) sees a damaged object such as a fallen power line, the person scans the identifier with the phone. In the case of a QR code, the camera of the phone is pointed at the QR code. The application in the phone connects to a service and forms an event report. The application can form a data connection, send a short message service (SMS) message, an email, a multimedia messaging service (MMS) message, etc. The application can be a dedicated reporting application or, for example, a browser in the phone. In the latter case the QR code would be used to connect to a reporting web site and, at the same time, to post a unique identification of the object to the system. The application in the phone can be further configured to provide, or to allow the user to provide, the location coordinates where the report is made. In an example embodiment the application would send the identification read from the QR code on the pole and the GPS location of the phone at the time of scanning the code. Additionally, the application or the web site can include a form for the user to make an event report, or use menus to select a type of incident and to add an image/photo taken at the place. The system could be configured to give a reward, such as money or other credits, to users who report damages. In addition to reporting damages, users could also report possible future problems. The power line companies could give incentives for event reports depending on the possible cost savings resulting from the event report.
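The report such an application might send, combining the object identification read from the QR code with the phone's GPS fix at scan time, could be as simple as a small JSON message. All field names below are illustrative; the disclosure does not define a message format.

```python
import json

def qr_event_report(qr_payload, gps, incident, note=""):
    """Compose the message a reporting app might send after scanning
    a pole's QR code: the object identification read from the code
    plus the phone's GPS fix at the time of scanning."""
    return json.dumps({
        "object_id": qr_payload,     # identification read from the QR code
        "lat": gps[0],               # phone GPS latitude at scan time
        "lon": gps[1],               # phone GPS longitude at scan time
        "incident": incident,        # type selected from the app's menu
        "note": note,                # optional free-text description
    }, sort_keys=True)

# A hypothetical report for a fallen line on pole 205x.
msg = qr_event_report("POLE-205X", (60.1699, 24.9384), "fallen power line")
```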
  • According to another embodiment, the system 100 is configured to form an overview of the situation at the site 210 of the network 200. For example, the control unit 130 might be configured to receive data from a plurality of sources, such as a satellite, a drone, helicopters, a LiDAR system, multiple users with mobile terminals, etc., to form an overview of the situation using multiple data sources. For example, in some areas the data might include only one picture taken with a mobile phone, while in other areas there might be video coverage, satellite images, and photos.
  • It is intended that all matter contained in the above description or shown in the accompanying drawings shall be interpreted as illustrative only and not limiting of the scope of the disclosure. Expressions such as “including”, “comprising”, “incorporating”, “consisting of”, “have”, “is” used to describe the present disclosure are intended to be construed in a non-exclusive manner, namely allowing for items, components or elements not explicitly described also to be present. Reference to the singular is also to be construed to relate to the plural.
  • While certain exemplary embodiments have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad present disclosure, and that this present disclosure is not limited to the specific constructions and arrangements shown and described, since various other modifications and/or adaptations may occur to those of ordinary skill in the art. It is to be understood that individual features shown or described for one embodiment may be combined with individual features shown or described for another embodiment.

Claims (11)

1. A method for reporting changes to a network in case of an event, the method comprising steps of:
requesting from a control unit, after the occurrence of an event, an augmented view of a site as before the occurrence of the event;
overlapping the augmented view with a current view, by using a first survey unit;
capturing a photograph of the current view along with the overlapped augmented view;
sending the photograph to the control unit; and
determining changes to the network by comparing the current view as shown in the photograph with the augmented view.
2. A method according to claim 1, wherein the augmented view is constructed using mission prior data, the mission prior data being collected by using
a second survey unit,
satellite unit data, or
Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.
3. A method according to claim 1, wherein the current view is a camera view rendered on a display of the first survey unit.
4. A method according to claim 3, wherein the photograph contains the camera view along with the overlapped augmented view.
5. A method according to claim 2, wherein the method further comprises updating the mission prior data based on the determined changes to the network.
6. A system for reporting changes to a network in case of an event, the system comprising
a first survey unit adapted to be located at a site of the network, using data from a positioning sensor of the survey unit, the survey unit being configured to request from a control unit an augmented view related to the location of the site of the network,
the first survey unit being further adapted to display the augmented view in a display of the survey unit on top of a current view of the site,
the first survey unit being also adapted to capture a photograph on the display and to communicate the photograph and the positioning sensor data to the control unit,
the control unit being configured to receive and to store the photograph and the positioning sensor data from the first survey unit, the control unit being further configured to determine changes in the network by comparing the current view as shown in the photograph with the augmented view, and to create an event report including a catalog of the changes to the network to be accessible therefrom.
7. A system according to claim 6, wherein the augmented view is constructed using mission prior data, the mission prior data being collected by using
a second survey unit,
satellite unit data, or
Light Detection And Ranging (LiDAR) equipment data in drones or helicopters.
8. A system according to claim 6, wherein the current view is a camera view rendered on a display of the first survey unit.
9. A system according to claim 7, wherein the control unit is further configured to update the mission prior data based on the determined changes to the network.
10. A system according to claim 8, wherein the augmented view is rendered on the display of the first survey unit by executing with a processor computing instructions stored in a memory of the first survey unit, the computing instructions being configured to
use the data from the position sensor to determine a direction of the camera view in relation to the site,
capture an image of the camera view, and
augment a view on the display using the data from the position sensor.
11. An apparatus for documenting changes to a network, the apparatus comprising:
a communication interface;
a camera;
at least one location sensor for determining a location of the apparatus and a rotation of the apparatus relative to ground level and to map coordinates;
a memory for storing computing instructions; and
a processor configured to execute the computing instructions to
request an augmented view of a site, based on the location of the apparatus,
use the data from the at least one position sensor to determine a direction of a camera view in relation to the site,
overlap the augmented view with the camera view using the data from the at least one position sensor,
capture a photograph of the camera view along with the overlapped augmented view, and
communicate the photograph to a device external to the apparatus.
US14/534,803 2013-11-08 2014-11-06 System and method for reporting events Abandoned US20150130840A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US201361901489P true 2013-11-08 2013-11-08
US201361901490P true 2013-11-08 2013-11-08
US201361901492P true 2013-11-08 2013-11-08
US14/534,803 US20150130840A1 (en) 2013-11-08 2014-11-06 System and method for reporting events

Publications (1)

Publication Number Publication Date
US20150130840A1 true US20150130840A1 (en) 2015-05-14

Family

ID=53043441

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/534,803 Abandoned US20150130840A1 (en) 2013-11-08 2014-11-06 System and method for reporting events
US14/534,778 Abandoned US20150134384A1 (en) 2013-11-08 2014-11-06 System and method for allocating resources
US14/534,728 Active 2036-01-09 US9784836B2 (en) 2013-11-08 2014-11-06 System for monitoring power lines

Also Published As

Publication number Publication date
US20150131079A1 (en) 2015-05-14
US9784836B2 (en) 2017-10-10
US20150134384A1 (en) 2015-05-14

Legal Events

Date Code Title Description
AS Assignment

Owner name: SHARPER SHAPE OY, FINLAND

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEINONEN, TERO;HYYPPA, JUHA;JAAKKOLA, ANTTONI;REEL/FRAME:034182/0180

Effective date: 20141103

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION