WO2022162433A1 - An augmented reality platform for a geographic information system - Google Patents


Info

Publication number
WO2022162433A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
interface
location
information
real
Application number
PCT/IB2021/052187
Other languages
French (fr)
Inventor
Navid NIKNIA
Payam TOHIDI SHAKERSEFAT
Original Assignee
Niknia Navid
Tohidi Shakersefat Payam
Application filed by Niknia Navid and Tohidi Shakersefat Payam
Publication of WO2022162433A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/907Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/909Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using geographical or spatial information, e.g. location
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Definitions

  • FIG. 1 illustrates an exemplary architecture for providing information related to positioning of subsurface features in a location, in accordance with some embodiments of the present disclosure.
  • FIG. 2 shows an exemplary block diagram of an Augmented Reality (AR) generator configured for providing information related to positioning of subsurface features in a location, in accordance with some embodiments of the present disclosure.
  • FIG. 3 shows an exemplary flowchart illustrating a method of collecting and processing feature data for observing a desired subsurface feature in a location, in accordance with some embodiments of the present disclosure.
  • FIG. 4 shows an exemplary flowchart illustrating a method of providing information related to positioning of subsurface features in a location, in accordance with some embodiments of the present disclosure.
  • FIG. 5 shows an exemplary block diagram of an exemplary computer system, in accordance with some embodiments of the present disclosure.
  • FIG. 1 illustrates an exemplary architecture for a method of providing information related to positioning of subsurface features in a location, in accordance with some embodiments of the present disclosure.
  • the architecture may include, without limiting to, a user 101, a user device 103 associated with the user 101, one or more subsurface features, subsurface feature 1 105-1, ..., subsurface feature N 105-N (collectively referred to as subsurface features 105), a central server 109, a database 111 associated with the server 109, and a network 107 connecting the user device 103 with the server 109 and the database 111.
  • the user 101 may be, without limitation, a field operator, a surveying officer or an excavator operator, who needs to study, track and analyse various subsurface features 105 in a particular location.
  • one of the objectives of the proposed method is to provide an easy-to-use tool or application for the users, as stated above, to help them easily analyse the at least one subsurface feature 105 in the location.
  • the at least one subsurface feature 105 may include, without limiting to, distribution pipelines, underground gas connections, connective fittings, cable networks, telephone lines, optic fibre lines, subsurface pumps, or any other underground objects.
  • the at least one subsurface feature 105 may not be visible to the user 101 from the surface of the ground.
  • the user device 103 associated with the user 101 may include, without limiting to, a Personal Digital Assistant (PDA), a smartphone, a tablet computer or even a laptop.
  • the user device 103 may be any computing device equipped with an image/video capturing module (i.e., camera), an Internet connectivity and the one or more sensors including, without any limitation, a Global Positioning System (GPS) sensor, a gyroscope, and a compass.
  • the user device 103 and the Internet connectivity may be the minimal prerequisites for successful implementation of the proposed method.
  • the server 109 may be a remote server or a central server, which may be configured for receiving and processing the user real-time location information from the user device 103 and provide a desired feature to the user 101 through the user device 103.
  • the server 109 and the user device 103 may be communicatively interfaced utilizing a predetermined communication network (generally represented as network 107 in FIG. 1).
  • the network may be a wired network such as, without limiting to, Local Area Network (LAN), Ethernet, fibre optics or a wireless network such as, without limiting to, Internet, Wi-Fi, and wireless LAN.
  • the database 111 associated with the server 109 may be utilized for storing the user real-time location information and the feature data corresponding to the at least one subsurface feature 105 in the location.
  • the database 111 may be updated at regular intervals for including any additional information or rectified information received from the user 101.
  • the feature data corresponding to that location may be retrieved from the database 111 and sent to the user device 103 for displaying on the application.
  • the user 101 may launch the AR software application installed on the user device 103 for seeing the position information of the at least one subsurface feature 105 in the location.
  • the user real-time location information of the user 101 may be collected and transmitted to the server 109 utilizing the one or more sensors in the user device 103.
  • the server 109 may process the user real-time location information provided by the user device 103 and query the database 111 for retrieving the feature data corresponding to the real-time location of the user 101.
  • the database 111 may share the requested feature data with the server 109.
  • the server 109 may combine the feature data with the current location of the user 101 and may send it back to the user device 103 through the network.
  • the AR application may process the feature data received from the server 109 and may generate the 2D interface and the 3D interface.
  • the 2D interface may comprise a base map, such as Google® Maps or other open-source maps, and indicate the current location of the user 101 along with certain descriptive features of the at least one subsurface feature 105.
  • the user 101 may switch to the 3D interface from the 2D interface by simply clicking on a designed UI button on the 2D interface.
  • the 3D interface may launch the camera application in the user device 103 and show the location and position information of the at least one subsurface feature 105 hidden beneath the ground surface of the location.
  • the AR application allows the user 101 to view and study the position and configurations of the at least one subsurface feature 105 utilizing the user device 103, as if these subsurface features 105 are present on the ground/surface of the location.
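  • As a non-limiting illustration of this round trip, the following Python sketch shows how the application might transmit the user real-time location and receive the corresponding feature data. The endpoint URL, query parameters, and response layout are assumptions made for illustration; the disclosure specifies only that the location is sent to the central server and the feature data is returned.

```python
# Illustrative sketch only: the endpoint, parameters, and response schema
# below are assumptions, not details disclosed by this document.
import requests

SERVER_URL = "https://example-gis-server.net/api/features"  # hypothetical

def fetch_features(lat: float, lon: float, radius_m: float = 50.0) -> list:
    """Send the user's GPS fix to the central server and receive the
    feature data (location + descriptive attributes) near that point."""
    response = requests.get(
        SERVER_URL,
        params={"lat": lat, "lon": lon, "radius": radius_m},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["features"]  # assumed response layout

# features = fetch_features(35.6892, 51.3890)  # ready for 2D/3D rendering
```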
  • the application may also provide several other tools to the user 101.
  • the application may provide a search tool, utilizing which the desired features can be searched and viewed. The search may be based on the type of the subsurface feature or on any pre-requested descriptive data already stored in the database 111.
  • the application may provide a filter tool, which applies limitations on the features shown, such that only one or a required number of the desired features are retrieved and shown to the user 101.
  • the application may also provide a new feature recording tool.
  • This tool helps the user 101 in recording new features and in removing or correcting the features previously shown on the application.
  • the corrections suggested by the user 101 may be uploaded on the database 111 after approval by an office expert.
  • the user 101 may need only to stand at the desired feature, record the current location utilizing the GPS sensor, and then insert the descriptive data about the update (a sketch of this step follows). This is one of the prominent features of the proposed AR application and considerably aids in correcting and updating the base maps of the subsurface features 105 in the location.
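  • A minimal sketch of that recording step follows, assuming a hypothetical upload endpoint and field names; only the workflow itself (a GPS fix plus descriptive data, held pending expert approval) comes from the disclosure.

```python
# Hypothetical sketch of the new-feature recording tool. Endpoint and
# field names are assumptions; the workflow (GPS fix + descriptive data,
# pending office-expert approval) follows the description above.
import requests

def record_feature_update(lat: float, lon: float, attributes: dict) -> None:
    payload = {
        "lat": lat,                # GPS fix taken while standing on the feature
        "lon": lon,
        "attributes": attributes,  # e.g. {"type": "valve", "depth_m": 1.2}
        "status": "pending",       # held until an office expert approves it
    }
    requests.post("https://example-gis-server.net/api/updates",
                  json=payload, timeout=10).raise_for_status()
```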
  • FIG. 2 shows an exemplary detailed block diagram of the Augmented Reality (AR) generator 201 configured for providing information related to positioning of subsurface features in the location, in accordance with some embodiments of the present disclosure.
  • the AR generator 201 comprises an I/O interface 202, the processor 203, the memory 205 and one or more modules 207.
  • the one or more modules 207 may comprise, without limiting to, a data creation module 209, an information collection module 211, a processing module 213 and a positioning module 215.
  • the AR generator 201 may be installed or configured on the user device. In an alternative implementation, the AR generator 201 may be the same as the AR application running on the user device.
  • the I/O interface may be utilized for interfacing the user device with the server through the network. Further, the I/O interface may be responsible for interfacing the one or more external sensors with the processor of the AR generator 201. In an exemplary embodiment, the processor may be configured for performing each function of the AR generator 201. In an exemplary embodiment, the memory may be utilized for locally storing the user real-time location information and the feature data corresponding to the at least one subsurface feature in the current location of the user.
  • the data creation module may be configured for creating the feature data comprising the location information and the corresponding descriptive data related to at least one subsurface feature in the location.
  • the data creation module may use predetermined mapping methods such as GIS science to create the feature data.
  • the information collection module may be configured for collecting the user real-time location information, utilizing the one or more sensors configured in the user device associated with the user.
  • the processing module may be configured for processing the user real-time location information and providing the two-dimensional (2D) interface and the three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user.
  • the positioning module may be configured for providing the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface. Also, the positioning module may allow the user to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
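  • The module split described above can be pictured with the following Python skeleton. The class and method names are illustrative assumptions; the disclosure names only the four modules 209-215 and their responsibilities.

```python
# Structural sketch of the AR generator 201. Names are assumptions; the
# four modules and their duties come from the description above. Sensor
# objects are injected and assumed to expose a read() method.
class ARGenerator:
    def __init__(self, feature_store: dict):
        self.feature_store = feature_store  # stands in for memory 205

    def create_feature_data(self, fid: str, location: tuple, attrs: dict):
        # data creation module 209: combine location + descriptive data
        self.feature_store[fid] = {"location": location, **attrs}

    def collect_user_location(self, gps, gyroscope, compass) -> dict:
        # information collection module 211: read the device sensors
        return {"fix": gps.read(), "tilt": gyroscope.read(),
                "heading": compass.read()}

    def build_interfaces(self, user_location: dict) -> dict:
        # processing module 213: retrieve features near the user and
        # prepare the 2D (map) and 3D (AR) representations
        nearby = list(self.feature_store.values())  # retrieval stub
        return {"2D": nearby, "3D": nearby}

    def position_features(self, interfaces: dict, mode: str = "2D"):
        # positioning module 215: present feature positions on the chosen
        # interface; the user may switch between "2D" and "3D" at any time
        return interfaces[mode]
```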
  • FIG. 3 shows a method of collecting and processing feature data for observing a desired subsurface feature in the location, in accordance with some embodiments of the present disclosure.
  • the feature data is collected.
  • the feature data may include the location and descriptive data.
  • the location information may be prepared by the field operation forces utilizing different mapping methods.
  • the descriptive data may be collected by performing several scans on the features. Combining these two groups of data may be carried out through GIS science, as shown in step 303.
  • the GIS may show the location and descriptive data of different features including, without limitation, point, line, or polygon features in a coherent viewing environment, so that the features can be analysed.
  • the GIS experts may investigate and record the feature data utilizing the different software applications available. The data may be recorded as a predetermined database and uploaded onto a GIS server, as shown in step 305.
  • the database may be created on a server of the software application.
  • the database of the software may be created on a Linux server utilizing MySQL.
  • the location and descriptive data of the features may be retrieved by connecting the server of the software to the GIS server.
  • the data may be added singularly or as a group utilizing the designed administrative panel of the software.
  • the administrative panel may be placed in a web environment, as shown in step 307.
  • the start and finish coordinates of the line features, the coordinates of the point features, and all descriptive data may be included and/or represented in the designed administrative panel (a sketch of one possible schema follows).
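  • One possible shape for that database is sketched below. The table and column names are assumptions; only the facts that the database runs under MySQL on a Linux server, that line features carry start and finish coordinates, and that point features carry a single coordinate plus descriptive data come from the description above.

```python
# Hypothetical MySQL schema for the feature database. Table/column names
# are assumptions; the coordinate layout follows the description above.
import mysql.connector  # pip install mysql-connector-python

SCHEMA = """
CREATE TABLE IF NOT EXISTS subsurface_features (
    id           INT AUTO_INCREMENT PRIMARY KEY,
    feature_type ENUM('point', 'line') NOT NULL,
    start_lat    DOUBLE NOT NULL,   -- point coordinate, or line start
    start_lon    DOUBLE NOT NULL,
    end_lat      DOUBLE NULL,       -- line finish (NULL for point features)
    end_lon      DOUBLE NULL,
    depth_m      DOUBLE NULL,       -- burial depth used by the 3D view
    attributes   JSON               -- descriptive data from the GIS
)
"""

conn = mysql.connector.connect(host="localhost", user="gis",
                               password="***", database="gis_ar")  # hypothetical
conn.cursor().execute(SCHEMA)
```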
  • a drawn model is created from the features using the software.
  • the features may be drawn as a 2D model (step 311) or as a 3D model (step 313), which may be carried out according to the real coordinates requested from the database.
  • the real coordinates may be shown at the real location and at actual scale (see the coordinate-conversion sketch below).
  • different coding languages such as, without limiting to, PHP, Java, C++, and C# may be utilized.
  • the features and the drawn models may be loaded on a program memory of the user device utilizing an Internet connection.
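  • A common way to place the drawn models at real coordinates and actual scale is to convert geographic coordinates into a local east/north frame in metres, as in the sketch below. The flat-earth approximation is an assumption that holds over the few tens of metres an AR view typically covers; it is not a technique stated in the disclosure.

```python
# Sketch: convert latitude/longitude to local east/north offsets (metres)
# around the user, so features can be drawn at actual scale. Flat-earth
# approximation assumed; valid only over short AR viewing distances.
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def to_local_metres(lat, lon, origin_lat, origin_lon):
    """Return (east, north) offsets in metres from the origin point."""
    d_lat = math.radians(lat - origin_lat)
    d_lon = math.radians(lon - origin_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(origin_lat))
    return east, north

# A pipe joint roughly 11 m north and 9 m east of the user:
# to_local_metres(35.68930, 51.38910, 35.68920, 51.38900)
```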
  • different sensors of the user device may detect and collect information related to the current location of the user.
  • the sensors of the user device may comprise a GPS sensor, a gyroscope sensor, and a compass sensor.
  • the GPS sensor may be responsible for receiving the latest location of the user device utilizing a satellite radio connection.
  • the gyroscope sensor may be utilized to detect an angular change of the user device in any of the three rotational directions.
  • the compass sensor may be utilized for reporting an orientation of the user device with respect to true north in the world coordinate system (the sketch below shows how these sensor readings together locate a feature relative to the camera).
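  • How these sensor readings cooperate can be illustrated with standard great-circle geometry, which is an assumption rather than a formula given in the disclosure: the GPS fix and a feature's coordinates yield a distance and bearing, and comparing that bearing with the compass heading tells the application where the feature sits relative to the camera's centre line.

```python
# Sketch: distance (m) and bearing (degrees from true north) from the
# user's GPS fix to a feature. Standard haversine/bearing formulas.
import math

def distance_and_bearing(user_lat, user_lon, feat_lat, feat_lon):
    R = 6_371_000.0  # mean Earth radius in metres
    p1, p2 = math.radians(user_lat), math.radians(feat_lat)
    d_lon = math.radians(feat_lon - user_lon)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(d_lon / 2) ** 2)
    dist = 2 * R * math.asin(math.sqrt(a))
    bearing = math.degrees(math.atan2(
        math.sin(d_lon) * math.cos(p2),
        math.cos(p1) * math.sin(p2)
        - math.sin(p1) * math.cos(p2) * math.cos(d_lon)))
    return dist, bearing % 360.0

# If the compass reports heading 40° and the feature bears 55°, the
# feature should be drawn about 15° right of the camera's centre line.
```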
  • the software application may create a connection between the drawn maps resulting from the 2D or 3D models and the current location of the user after receiving the above-mentioned feature data.
  • a desired underground feature may be observed utilizing the AR environment on the application, which is the prominent feature of the application.
  • the AR based 3D model may show the 3D features utilizing a plurality of elements or attributes such as, without limiting to, the dimension, perception depth, perspective views and perception angle of the at least one subsurface feature. This resembles the user observing the features with the naked eye. That is, in the AR based 3D environment, the camera of the user device may be utilized instead of the base map in the background, and the user may be allowed to observe a plurality of images taken from the camera, like what the user can see with her/his eyes.
  • the images show the 3D model of the desired features, such that the 3D model enables observation of desired features that are buried underground and cannot be observed under normal circumstances.
  • the observation may comprise a real location of the feature with respect to a plurality of elements such as a perspective view and a perception angle of the desired feature.
  • the 3D model of the desired feature may be shown from different angles by moving the user device through different angles. All the sensors of the user device may be engaged and utilized in the 3D model.
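  • A minimal sketch of how a buried point could end up at a particular pixel follows, using a pinhole-camera model. The eye height, field of view, screen size, and level-camera simplification are assumptions; the disclosure states only that dimension, depth, perspective views and perception angle are conveyed.

```python
# Pinhole-projection sketch for the AR overlay: project a buried point
# (local east/north offsets, burial depth) to screen pixels for a camera
# held level at eye height. Eye height, FOV, and screen size are assumed.
import math

def project_to_screen(east, north, depth_m, heading_deg,
                      eye_height_m=1.5, screen_w=1080, screen_h=1920,
                      fov_deg=60.0):
    h = math.radians(heading_deg)
    forward = north * math.cos(h) + east * math.sin(h)  # metres ahead
    right = east * math.cos(h) - north * math.sin(h)    # metres to the right
    down = eye_height_m + depth_m                       # metres below lens
    if forward < 0.1:
        return None                                     # behind / too close
    f = (screen_w / 2) / math.tan(math.radians(fov_deg) / 2)  # focal (px)
    return (screen_w / 2 + f * right / forward,
            screen_h / 2 + f * down / forward)
```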
  • the developed application and/or the Azimuth software may show a plurality of subsurface feature information and facilities in an actual location of the feature to the user and may provide a better visual profile of the status of invisible subsurface features for the user at their precise location.
  • the proposed application is also very facile, and it is enough for the user to run the software and point the camera of the user device to observe the invisible subsurface features, as well as the descriptive data associated with the visible features for a better analysis of the geospatial status of the invisible subsurface features.
  • the present disclosure combines the geospatial and descriptive data (GIS) with AR technology and improves the geospatial precision to the 1-2 cm range.
  • the present disclosure may be also applicable in sloped regions.
  • the proposed disclosure may be utilized for analysing and exhibiting the pipe valves for water flow disconnection and reconnection related to damaged water pipes.
  • the water flow should be shut off for performing the replacement and repair operations on the desired pipe.
  • the first pipe valve related to that pipe may be closed so that water would not flow in that part of the network.
  • however, when that valve cannot be located, the preceding pipe valve, nearer the main valve of the network, is turned off. Therefore, on top of shutting off the water flow in the damaged pipe, the flow in the rest of the network is also shut off.
  • This problem causes water outage for the subscribers out of the range of the damaged pipe as well.
  • for this, a geospatial analysis according to the water flow direction in the network and the correlation of the pipeline to the network may be considered.
  • This network analysis is carried out online and in-site in the software environment.
  • a list of pipe valves related to this pipe may be shown, ordered from the sub-main to the main pipe.
  • the screen then moves to that valve and shows the location of the valve to the user, thereby solving the above problem (a sketch of the underlying valve search follows).
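  • The underlying analysis can be pictured as a walk upstream through the pipe network, collecting each valve between the damaged pipe and the network main. The tiny network below is an illustrative assumption; the disclosure specifies only the ordered sub-main-to-main valve list.

```python
# Sketch of the valve-isolation analysis: walk upstream from the damaged
# pipe toward the network main, collecting the valves in closing order
# (sub-main first). The example network layout is assumed.

# upstream[pipe] = (feeding pipe, valve on that connection); None at the main.
upstream = {
    "service_7": ("submain_3", "valve_12"),
    "submain_3": ("main_1", "valve_4"),
    "main_1": None,
}

def valves_to_close(damaged_pipe: str) -> list:
    """Valves between the damaged pipe and the main, nearest first."""
    valves, current = [], damaged_pipe
    while upstream.get(current):
        feeder, valve = upstream[current]
        valves.append(valve)
        current = feeder
    return valves

print(valves_to_close("service_7"))  # ['valve_12', 'valve_4']
```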
  • FIG. 4 shows a flowchart illustrating a method of providing information related to positioning of subsurface features in the location, in accordance with some embodiments of the present disclosure.
  • the method 400 may include one or more blocks illustrating a method of providing information related to positioning of at least one subsurface feature in the location utilizing the user device associated with the user, as illustrated in FIG. 4.
  • the method 400 may be described in the general context of computer executable instructions.
  • computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
  • the method 400 includes creating, by the processor of the user device, feature data comprising the location information and the corresponding descriptive data related to the at least one subsurface feature in the location utilizing predetermined mapping methods.
  • the location information and the corresponding descriptive data may be stored in the server associated with the processor.
  • the at least one subsurface feature may comprise, without limitation, at least one of distribution pipelines, connective fittings, cable networks, subsurface pumps, or other underground objects.
  • creating the feature data may further comprise collecting the location information associated with the at least one subsurface feature by performing field operations on the location utilizing mapping methods. Once the location information is collected, the descriptive data for the at least one subsurface feature may be generated by repetitively scanning the at least one subsurface feature. Further, the location information and the descriptive data may be combined utilizing Geographic Information System (GIS) science for creating a coherent viewing environment. Finally, the combined data may be stored on the server as a predetermined database and tagged with the user real-time location information for subsequent retrieval operations.
  • the method 400 includes collecting, by the processor of the user device, the user real-time location information, utilizing one or more sensors configured in the user device associated with the user.
  • the one or more sensors utilized for collecting the user real-time location information comprise, without limiting to, at least one of the Global Positioning System (GPS) sensor, the gyroscope, and the compass.
  • collecting the user real-time location information may comprise receiving, utilizing the GPS sensor, the latest user real-time location information over a satellite network.
  • the method may use the gyroscope for detecting angular changes of the user device in at least one of the three rotational directions. Additionally, the method may use the compass for detecting an orientation of the user device with reference to the geographical north.
  • the user device may be the smartphone, the tablet computer or any other handheld computing device associated with the user.
  • the user device may be configurable for connecting to the server for transmitting the user real-time location information and for receiving the 2D interface and the 3D interface data from the server, through a wired or wireless communication channel including the Internet.
  • the method 400 includes processing, by the processor of the user device, the user real-time location information and then providing the two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user.
  • the 2D interface and the 3D interface may be created utilizing real-time coordinates of the location of the user comprised in the user real-time location information.
  • the 2D environment or the 2D interface may comprise a base map of the location that is executed as a background application on the user device. Further, the 2D interface may represent the real-time location of the user, change in the real-time location of the user and a User Interface (UI) symbol for activating or switching to the 3D interface. In addition, the 2D interface may also represent the descriptive information of the at least one subsurface feature on a distinct window over the 2D interface. In some implementations, information from only the GPS sensor and the compass sensor may be utilized for creating the 2D interface (i.e., information collected from the gyroscope may not be utilized for creating the 2D interface).
  • the 3D interface may be created by processing the feature data utilizing a predetermined Augmented Reality (AR) technique.
  • the 3D interface may represent at least one of, without limitation, a dimension of the at least one subsurface feature, perception depth, perspective views and perception angle of the at least one subsurface feature.
  • an image/video capturing module (i.e., camera) of the user device may be utilized for rendering the 3D interface.
  • the method 400 includes providing, by the processor of the user device, the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface, wherein the user can switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
  • the method may provide the search tool for enabling the user to search for a required one of the at least one subsurface feature or pre-requested descriptive data from the database.
  • the filter tool may be provided for filtering a required one of the at least one subsurface feature or the pre-requested descriptive data from the database.
  • the method may comprise allowing the user to add missing information or additional information to the data associated with the location. Additionally, the user may also be provided with options for rectifying incorrect information present on at least one of the 2D interface or the 3D interface of the location. In an implementation, each of the one or more updates made by the user while adding missing or additional information, or rectifying the existing information, may be stored on the database associated with the central server.
  • the computer system 500 may be the Augmented Reality (AR) generator or the user device disclosed in the claimed invention.
  • one or more computer systems 500 perform one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 500 provide functionality described or illustrated herein.
  • software running on one or more computer systems 500 performs one or more steps of one or more methods described or illustrated herein or provides functionality described or illustrated herein.
  • Embodiments include one or more portions of one or more computer systems 500.
  • reference to the computer system may encompass a computing device, and vice versa, where appropriate.
  • reference to the computer system may encompass one or more computer systems, where appropriate.
  • computer system 500 may be an embedded computer system, a System-On-Chip (SOC), a Single-Board Computer System (SBC) (such as, for example, a Computer-On-Module (COM) or System-On-Module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a Personal Digital Assistant (PDA), a server, a tablet computer system, or a combination of two or more of these.
  • computer system 500 may include one or more computer systems 500, be unitary or distributed, span multiple locations, span multiple machines, span multiple data centres, or reside in a cloud, which may include one or more cloud components in one or more networks.
  • one or more computer systems 500 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein.
  • one or more computer systems 500 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein.
  • One or more computer systems 500 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
  • computer system 500 includes the processor 502, memory 504, storage 506, an Input / Output (I/O) interface 508, a communication interface 510, and a bus 512.
  • this disclosure describes and illustrates a particular computer system having a particular number of components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
  • processor 502 includes hardware for executing instructions, such as those making up a computer program.
  • processor 502 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 504, or storage 506; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 504, or storage 506.
  • processor 502 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 502 including any suitable number of any suitable internal caches, where appropriate. As an example, and not by way of limitation, processor 502 may include one or more instruction caches, one or more data caches, and one or more Translation Lookaside Buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 504 or storage 506, and the instruction caches may speed up retrieval of those instructions by processor 502.
  • Data in the data caches may be copies of data in memory 504 or storage 506 for instructions executing at processor 502 to operate on; the results of previous instructions executed at processor 502 for access by subsequent instructions executing at processor 502 or for writing to memory 504 or storage 506; or other suitable data.
  • the data caches may speed up read or write operations by processor 502.
  • the TLBs may speed up virtual-address translation for processor 502.
  • the processor 502 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 502 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 502 may include one or more Arithmetic Logic Units (ALUs), be a multi-core processor, or include one or more processors 502. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
  • the memory 504 includes main memory for storing instructions for processor 502 to execute or data for processor 502 to operate on.
  • computer system 500 may load instructions from storage 506 or another source (such as, for example, another computer system 500) to memory 504.
  • Processor 502 may then load the instructions from memory 504 to an internal register or internal cache.
  • processor 502 may retrieve the instructions from the internal register or internal cache and decode them.
  • processor 502 may write one or more results (which may be intermediate or final results) to the internal register or internal cache.
  • Processor 502 may then write one or more of those results to memory 504.
  • processor 502 executes only instructions in one or more internal registers or internal caches or in memory 504 (as opposed to storage 506 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 504 (as opposed to storage 506 or elsewhere).
  • One or more memory buses (which may each include an address bus and a data bus) may couple processor 502 to memory 504.
  • Bus 512 may include one or more memory buses, as described below.
  • memory 504 includes Random Access Memory (RAM).
  • This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be Dynamic RAM (DRAM) or Static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM.
  • Memory 504 may include one or more memories 504, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
  • storage 506 includes mass storage for data or instructions.
  • storage 506 may include a Hard Disk Drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these.
  • Storage 506 may include removable or non-removable (or fixed) media, where appropriate.
  • Storage 506 may be internal or external to computer system 500, where appropriate.
  • storage 506 is non-volatile, solid-state memory.
  • storage 506 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, Programmable ROM (PROM), Erasable PROM (EPROM), electrically erasable PROM (EEPROM), Electrically Alterable ROM (EAROM), or flash memory or a combination of two or more of these.
  • This disclosure contemplates mass storage 506 taking any suitable physical form.
  • Storage 506 may include one or more storage control units facilitating communication between processor 502 and storage 506, where appropriate. Where appropriate, storage 506 may include one or more storages 506. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
  • I/O interface 508 includes hardware, software, or both, providing one or more interfaces for communication between computer system 500 and one or more I/O devices.
  • Computer system 500 may include one or more of these I/O devices, where appropriate.
  • One or more of these I/O devices may enable communication between a person and computer system 500.
  • an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these.
  • An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 508 for them.
  • I/O interface 508 may include one or more device or software drivers enabling processor 502 to drive one or more of these I/O devices.
  • I/O interface 508 may include one or more I/O interfaces 508, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
  • communication interface 510 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 500 and one or more other computer systems 500 or one or more networks.
  • communication interface 510 may include a Network Interface Controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a Wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network.
  • computer system 500 may communicate with an ad hoc network, a Personal Area Network (PAN), a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), or one or more portions of the Internet or a combination of two or more of these.
  • computer system 500 may communicate with a Wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these.
  • bus 512 includes hardware, software, or both coupling components of computer system 500 to each other.
  • bus 512 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front-Side Bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these.
  • Bus 512 may include one or more buses 512, where appropriate.
  • a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other Integrated Circuits (ICs) (such as, for example, Field-Programmable Gate Arrays (FPGAs) or Application-Specific ICs (ASICs)), hard disk drives (HDDs), Hybrid Hard Drives (HHDs), optical discs, Optical Disc Drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, Floppy Disk Drives (FDDs), magnetic tapes, Solid-State Drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate.
  • the present disclosure provides an easy-to-use augmented reality software application that helps in positioning underground and subsurface features in a location.
  • the method of the present disclosure also helps in tracking, locating and illustrating different underground/subsurface features, which are not easily reachable otherwise.
  • the method of the present disclosure presents users with a plurality of pieces of invisible information and facilities at the actual location of a feature, and thereby provides a better visual profile of the status of an invisible feature for the user at its precise location.
  • the AR software application proposed in the present disclosure is very facile, in that it requires the user only to run the software and point the camera of the user device at a desired invisible, subsurface feature.
  • the method and the storage platform disclosed in the present disclosure may be utilized to overcome various technical problems related to finding, tracking, and accessing subsurface devices in a location.
  • the abovesaid technical advancements and practical applications of the disclosed method, the AR based software application and the AR generator 201 may be attributed to the aspects of a) collecting user real-time location information, utilizing one or more sensors configured in a user device associated with the user and b) processing the user real-time location information and providing a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user.
  • the term "an embodiment" means "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.

Abstract

Disclosed herein are a method and an Augmented Reality (AR) software application for providing information related to positioning of subsurface features in a location. In an exemplary embodiment, the present disclosure provides an easy-to-use AR software or application for positioning underground features. The application comprises a first environment and a second environment, such that the first environment comprises a 2D interface and the second environment comprises an AR based 3D interface. The application uses the 2D interface and the AR based 3D interface for tracking, locating, and illustrating different underground or subsurface features of a location, which may not be easy to reach otherwise.

Description

AN AUGMENTED REALITY PLATFORM FOR A GEOGRAPHIC INFORMATION SYSTEM

CROSS-REFERENCE TO RELATED APPLICATIONS
The present application claims priority from pending U.S. Provisional Patent Application Serial No. 63/143,036, filed on January 29, 2021, entitled “GIS AR Platform”, which is incorporated by reference herein in its entirety.
The present subject matter is, in general, related to Geographic Information System (GIS) and more particularly, but not exclusively, to an Augmented Reality (AR) based platform for realizing the GIS and a method of creating the said platform.
In the past, descriptive and geospatial information was converted into digital maps using graphical software such as Auto Computer-Aided Design (AutoCAD) applications. After being collected by field forces, this information was subsequently transferred into geospatial software such as ArcGIS, primarily in work offices. Also, the maps of a desired region used to be printed on paper and consulted while performing field operations at that location or in case of unforeseen events.
However, in most cases, it was challenging to find the underground features of a location. Consequently, this task was carried out with high error rates, and it also incurred significant losses in terms of energy, time, and costs. The issue became even more challenging in the case of printed maps carrying incorrect information. Also, there was no efficient mechanism for quickly rectifying incorrect pieces of information on the printed maps.
As a result, developing an easy-to-use Augmented Reality (AR) based software/application for all users with different skillsets and capabilities is essential. Such a software/application is intended to replace the printed maps either at the workplace or at accident locations. Also, the AR based software/application is intended to enable users to look at all desired underground features of the location, accompanied by their descriptive information. Besides, the AR based software/application is also intended for use in correcting old, incorrect data and loading new and/or more accurate data in case of variances in the location.
One of the existing methods by Wei Li et al (DOI:10.3390/ijgi7010032) suggests a real-time location-based rendering of urban underground pipelines. The said method analyses the potential of augmented reality for managing subsurface pipes based on the BeiDou® satellite navigation system. Further, for enhancing the spatial precision of tracing pipes, the method uses differential corrections, received from the augmented reality, for calculating the precise coordinates of users in real-time. In addition, in order to virtually express and precisely determine the position of subsurface pipes, the method proposes using the 3D structures of the location.
Another existing method by Adel Fridhi et al (DOI: 10.22115/SCCE.2020.212254.1148) discusses embedding all 3D data in GIS utilizing augmented reality. Here, the augmented reality in a 3D model embedded in GIS makes accessing hidden sites such as subsurface infrastructures feasible. In addition, the document suggests carrying out subsurface feature positioning utilizing the Azimuth augmented reality in a 3D model. The disclosed software determines the precise location of the feature by receiving the data from the GIS database and phone sensors such as the gyroscope and GPS, as well as by analysing the information in a 3D environment considering the depth and slope of the feature and the location, plus azimuth calculations.
Yet another existing method, disclosed in CN104331423A, suggests a method and device for positioning based on electronic maps. In this document, the user's location and one or several other aid goals are considered as the reference, and the navigation is carried out based on azimuth data calculation. The sensors utilized in this idea include a GPS, a gyroscopic compass, and an accelerometer. On similar lines, reference KR101450133B1 discloses a system and method for presenting the cross-sectional data of subsurface features utilizing augmented reality, so that the data related to the features recorded in the database and their precise locations are accessible in the augmented reality. The software disclosed in this reference document traces the precise location of the feature or records the location of new features in 2D and 3D environments utilizing data received from GPS, compass, and gyroscope sensors as well as azimuth calculations.
Although the existing methods suggest combining the geospatial and descriptive data with augmented reality, there is still tremendous scope for improving the geospatial precision to the range of 1-2 centimetres. Also, the existing methods lack an efficient mechanism that allows administrators and field operators to correct or insert the feature data utilizing a simple, easy-to-use administrative panel in real-time, while the administrator or the field operator is present in a desired location.
The information disclosed in this background of the disclosure section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Disclosed herein is an Augmented Reality (AR) based software application for positioning underground or subsurface features in a location. The application may comprise a first environment and a second environment. The first environment and the second environment may comprise a two-dimensional (2D) environment and an AR based three-dimensional (3D) environment, respectively (these environments are alternatively referred to as the 2D interface and the 3D interface, respectively, throughout this document). When a user launches the application installed on a computing device (also referred to as the user device), the application contacts a central server and requests the feature data from a Geographic Information System (GIS). The feature data may be related to the at least one subsurface feature present in the location. The feature data received from the GIS may be shown in the first and the second environments of the application. Here, the precise location of the user may be identified utilizing a plurality of sensors that exist on the user device. As an example, the plurality of sensors may include a Global Positioning System (GPS) sensor, a gyroscope and a compass sensor. In fact, the application may recognize a location of the user and may change or update it utilizing the data collected from the plurality of sensors. Subsequently, once the feature data has been received, the application may connect a created map, obtained from the 2D or the 3D model, with a current location of the user. Thereafter, the user can look at all the desired underground features in the location, along with their descriptive information. In addition, the application enables the user to rectify the old feature data or upload new data, in cases where the old data is faulty or at variance with the location.
Accordingly, the present disclosure relates to a method for providing information related to positioning of subsurface features in the location. The method comprises creating, by a processor, feature data comprising the location information and corresponding descriptive data related to at least one subsurface feature in the location utilizing a predetermined mapping method. The location information and corresponding descriptive data are stored in a server associated with the processor. Further, the method comprises collecting user real-time location information, utilizing one or more sensors configured in the user device associated with the user. Upon collecting the user real-time location information, the method comprises processing the user real-time location information and providing a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user. Finally, the method comprises providing the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface. The user is allowed to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
Further, the present disclosure relates to an Augmented Reality (AR) generator for providing information related to positioning of subsurface features in the location. The AR generator comprises the processor and a memory. The memory is communicatively coupled to the processor and stores processor-executable instructions, which on execution, cause the processor to create feature data comprising the location information and corresponding descriptive data related to at least one subsurface feature in the location utilizing the predetermined mapping method. The location information and corresponding descriptive data are stored in the server associated with the processor. Further, the instructions cause the processor to collect the user real-time location information, utilizing one or more sensors configured in the user device associated with the user. Thereafter, the instructions cause the processor to process the user real-time location information and provide the two-dimensional (2D) interface and the three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user. Finally, the instructions cause the processor to provide the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface. The user is allowed to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
Furthermore, the present disclosure relates to a method for detecting municipality features in the location. The method comprises receiving municipality feature data and information related to the location of one or more pipes and fittings in a GIS data format. Further, the method comprises converting the received municipality feature data into respective 2D interface and 3D interface formats. Thereafter, the method comprises matching the converted data with the user real-time location information received from the location of the user through a GPS sensor.
Furthermore, the present disclosure relates to a software application installed on the user device and configured to provide information related to positioning of subsurface features in the location utilizing some of or each of the limitations stated hereinabove.
Furthermore, the present disclosure relates to the user device or a computing device, which is configured for providing information related to positioning of subsurface features in the location utilizing the AR generator, in accordance with some of or each of the limitations stated hereinabove.
The foregoing summary is illustrative only and is not intended to be in any way limiting. In addition to the illustrative aspects, embodiments, and features described above, further aspects, embodiments, and features will become apparent by reference to the drawings and the following detailed description.
The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate exemplary embodiments and, together with the description, explain the disclosed principles. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The same numbers are utilized throughout the figures to reference like features and components. Some embodiments of systems and/or methods in accordance with embodiments of the present subject matter are now described, by way of example only, and with reference to the accompanying figures, in which:
Figure 1 illustrates an exemplary architecture illustrating a method of providing information related to positioning of subsurface features in a location in accordance with some embodiments of the present disclosure.
Figure 2 shows an exemplary block diagram of an Augmented Reality (AR) generator configured for providing information related to positioning of subsurface features in a location, in accordance with some embodiments of the present disclosure.
Figure 3 shows an exemplary flowchart illustrating a method of collecting and processing feature data for observing a desired subsurface feature in a location in accordance with some embodiments of the present disclosure.
Figure 4 shows an exemplary flowchart illustrating a method of providing information related to positioning of subsurface features in a location in accordance with some embodiments of the present disclosure.
Figure 5 shows an exemplary block diagram of an exemplary computer system in accordance with some embodiments of the present disclosure.
It should be appreciated by those skilled in the art that any block diagrams herein represent conceptual views of illustrative systems embodying the principles of the present subject matter. Similarly, it will be appreciated that any flow charts, flow diagrams, state transition diagrams, pseudo code, and the like represent various processes which may be substantially represented in a computer readable medium and executed by a computer or processor, whether or not such computer or processor is explicitly shown.
In the present document, the word "exemplary" is utilized herein to mean "serving as an example, instance, or illustration." Any embodiment or implementation of the present subject matter described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
While the disclosure is susceptible to various modifications and alternative forms, specific embodiments thereof have been shown by way of example in the drawings and will be described in detail below. It should be understood, however, that it is not intended to limit the disclosure to the specific forms disclosed, but on the contrary, the disclosure is to cover all modifications, equivalents, and alternatives falling within the scope of the disclosure.
The terms “comprises”, “comprising”, “includes”, or any other variations thereof, are intended to cover a non-exclusive inclusion, such that a setup, device, or method that comprises a list of components or steps does not include only those components or steps but may include other components or steps not expressly listed or inherent to such setup or device or method. In other words, one or more elements in a system or apparatus preceded by “comprises… a” does not, without more constraints, preclude the existence of other elements or additional elements in the system or method.
In the following detailed description of the embodiments of the disclosure, reference is made to the accompanying drawings that form a part hereof, and in which are shown by way of illustration specific embodiments in which the disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the disclosure, and it is to be understood that other embodiments may be utilized and that changes may be made without departing from the scope of the present disclosure. The following description is, therefore, not to be taken in a limiting sense.
Figure 1 illustrates an exemplary architecture illustrating a method of providing information related to positioning of subsurface features in a location in accordance with some embodiments of the present disclosure.
In an exemplary embodiment, as shown in Figure 1, the architecture may include, without limiting to, a user 101, a user device 103 associated with the user 101, one or more subsurface features, subsurface feature 1 1051, …, subsurface feature N 105N (collectively referred to as subsurface features 105), a central server 109, a database 111 associated with the server 109 and a network 107 connecting the user device 103 with the server 109 and the database 111.
In an exemplary embodiment, the user 101 may be, without limitation, a field operator, a surveying officer or an excavator operator, who needs to study, track and analyse various subsurface features 105 in a particular location. Thus, one of the objectives of the proposed method is to provide an easy-to-use tool or application that helps the users, as stated above, in easily analysing the at least one subsurface feature 105 in the location. As an example, the at least one subsurface feature 105 may include, without limiting to, distribution pipelines, underground gas connections, connective fittings, cable networks, telephone lines, optic fibre lines, subsurface pumps, or any other underground objects. As the name suggests, the at least one subsurface feature 105 may not be visible to the user 101 from the surface of the ground.
In an exemplary embodiment, the user device 103 associated with the user 101 may include, without limiting to, a Personal Digital Assistant (PDA), a smartphone, a tablet computer or even a laptop. In an implementation, the user device 103 may be any computing device equipped with an image/video capturing module (i.e., camera), Internet connectivity and the one or more sensors including, without any limitation, a Global Positioning System (GPS) sensor, a gyroscope, and a compass. In an exemplary embodiment, the user device 103 and the Internet connectivity may be the minimal prerequisites for successful implementation of the proposed method.
In an exemplary embodiment, the server 109 may be a remote server or a central server, which may be configured for receiving and processing the user real-time location information from the user device 103 and providing a desired feature to the user 101 through the user device 103. In an exemplary embodiment, the server 109 and the user device 103 may be communicatively interfaced utilizing a predetermined communication network (generally represented as network 107 in Figure 1). In an implementation, the network may be a wired network such as, without limiting to, a Local Area Network (LAN), Ethernet or fibre optics, or a wireless network such as, without limiting to, the Internet, Wi-Fi, and wireless LAN.
In an exemplary embodiment, the database 111 associated with the server 109 may be utilized for storing the user real-time location information and the feature data corresponding to the at least one subsurface feature 105 in the location. The database 111 may be updated at regular intervals for including any additional information or rectified information received from the user 101. In an exemplary embodiment, after detecting the real-time location of the user 101 in the location, the feature data corresponding to that location may be retrieved from the database 111 and sent to the user device 103 for displaying on the application. In an exemplary embodiment, depending on the number of users, the number of requests and the amount of data being exchanged, there may be a plurality of servers and associated databases for this purpose, instead of the single server 109 and database 111 shown in Figure 1.
In an exemplary embodiment, after moving to the desired location, the user 101 may launch the AR software application installed on the user device 103 for seeing the position information of the at least one subsurface feature 105 in the location. Once the AR application is launched on the user device 103, the user real-time location information of the user 101 may be collected and transmitted to the server 109 utilizing the one or more sensors in the user device 103. Thereafter, the server 109 may process the user real-time location information provided by the user device 103 and query the database 111 for retrieving the feature data corresponding to the real-time location of the user 101. The database 111 may share the requested feature data with the server 109. Thereafter, the server 109 may combine the feature data with the current location of the user 101 and may send it back to the user device 103 through the network. At the user device 103, the AR application may process the feature data received from the server 109 and may generate the 2D interface and the 3D interface.
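As a non-limiting illustration, the client-side portion of this exchange may be sketched in Python as follows. The endpoint URL, the payload fields and the GeoJSON-style response are assumptions introduced purely for illustration; the present disclosure does not fix a particular protocol between the user device 103 and the server 109.

```python
import json
import urllib.request

# Hypothetical endpoint name for illustration only; the disclosure does
# not specify the wire protocol between the application and server 109.
SERVER_URL = "https://example-gis-server.local/api/features"

def fetch_feature_data(lat: float, lon: float, radius_m: float = 50.0) -> dict:
    """Send the user's real-time location and receive nearby feature data."""
    payload = json.dumps({"lat": lat, "lon": lon, "radius_m": radius_m}).encode()
    request = urllib.request.Request(
        SERVER_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)  # e.g. a GeoJSON-like FeatureCollection

# Example: features within 50 m of the device's last GPS fix.
# features = fetch_feature_data(35.6892, 51.3890)
```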
In an exemplary embodiment, the 2D interface may comprise a base map such as, Google® maps or other open source maps, and indicate the current location of the user 101 along with certain descriptive features of the at least one subsurface feature 105. The user 101 may switch to the 3D interface from the 2D interface by simply clicking on a designed UI button on the 2D interface, as sketched below. In an exemplary embodiment, the 3D interface may launch the camera application in the user device 103 and show the location and position information of the at least one subsurface feature 105 hidden beneath the surface of the location. Thus, the AR application allows the user 101 to view and study the position and configurations of the at least one subsurface feature 105 utilizing the user device 103, as if these subsurface features 105 were present on the ground/surface of the location.
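A minimal sketch of the interface switch, assuming a single mode flag toggled by the UI button; the state model is an assumption, as the disclosure only requires that the user 101 can switch between the two views.

```python
class InterfaceState:
    """Tracks which environment of the AR application is active."""

    def __init__(self) -> None:
        self.mode = "2D"  # the base-map view is shown first

    def toggle(self) -> str:
        # Flip between the 2D base map and the AR based 3D camera view.
        self.mode = "3D" if self.mode == "2D" else "2D"
        return self.mode
```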
In an exemplary embodiment, in addition to the aforesaid feature of showing the subsurface features 105 in the 2D and 3D environments, the application may also provide several other tools to the user 101. As an example, the application may provide a search tool, utilizing which the desired features can be searched and viewed. This search tool may operate according to the type of the subsurface feature or any pre-requested descriptive data already stored on the database 111. Further, the application may provide a filter tool, which restricts the features being shown, such that only one or a required number of the desired features are retrieved and shown to the user 101.
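The two tools may be pictured as simple predicates over the retrieved feature records, as in the following sketch; the record layout and the attribute names ("type", "properties") are assumptions made for illustration, since the actual descriptive fields live in the database 111.

```python
import json

def search_features(features: list, text: str) -> list:
    """Search tool: keep features whose descriptive data mentions the text."""
    text = text.lower()
    return [f for f in features
            if text in json.dumps(f.get("properties", {})).lower()]

def filter_features(features: list, feature_type: str) -> list:
    """Filter tool: keep only features of one requested type."""
    return [f for f in features if f.get("type") == feature_type]

features = [
    {"type": "water_pipe", "properties": {"material": "PE", "depth_m": 1.4}},
    {"type": "telecom_duct", "properties": {"owner": "telco"}},
]
print(filter_features(features, "water_pipe"))  # only the water pipe
print(search_features(features, "telco"))       # only the telecom duct
```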
In addition, the application may also provide a new feature recording tool. This tool helps the user 101 record new features and remove or correct features that are already shown in the application. Here, the corrections suggested by the user 101 may be uploaded to the database 111 after approval by an office expert. The user 101 needs only to stand at the desired feature, record the current location utilizing the GPS sensor, and then insert the descriptive data about the update. This is one of the prominent features of the proposed AR application and it considerably aids in correcting and updating the base maps of the subsurface features 105 in the location.
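One possible shape for such a correction record is sketched below; the field names and the "pending approval" status are assumptions, since the disclosure states only that corrections are uploaded after approval by an office expert.

```python
import time
from dataclasses import dataclass, field

@dataclass
class FeatureCorrection:
    lat: float                 # GPS fix recorded while standing on the feature
    lon: float
    description: str           # descriptive data inserted by the user 101
    status: str = "pending_approval"
    recorded_at: float = field(default_factory=time.time)

correction = FeatureCorrection(
    lat=35.68921, lon=51.38901,
    description="Valve chamber is 0.4 m east of the mapped position",
)
# The correction would be uploaded to the database 111 and merged into
# the base map only once an office expert approves it.
```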
Figure 2 shows an exemplary detailed block diagram of the Augmented Reality (AR) generator 201 configured for providing information related to positioning of subsurface features in the location, in accordance with some embodiments of the present disclosure.
As shown in Figure 2, the AR generator 201 comprises an I/O interface 202, the processor 203, the memory 205 and one or more modules 207. The one or more modules 207 may comprise, without limiting to, a data creation module 209, an information collection module 211, a processing module 213 and a positioning module 215. In an implementation, the AR generator 201 may be installed or configured on the user device. In an alternative implementation, the AR generator 201 may be the same as the AR application running on the user device.
In an exemplary embodiment, the I/O interface may be utilized for interfacing the user device with the server through the network. Further, the I/O interface may be responsible for interfacing the one or more external sensors with the processor of the AR generator 201. In an exemplary embodiment, the processor may be configured for performing each function of the AR generator 201. In an exemplary embodiment, the memory may be utilized for locally storing the user real-time location information and the feature data corresponding to the at least one subsurface feature in the current location of the user.
In an exemplary embodiment, the data creation module may be configured for creating the feature data comprising the location information and the corresponding descriptive data related to at least one subsurface feature in the location. In an exemplary embodiment, the data creation module may use predetermined mapping methods such as GIS science to create the feature data.
In an exemplary embodiment, the information collection module may be configured for collecting the user real-time location information, utilizing the one or more sensors configured in the user device associated with the user.
In an exemplary embodiment, the processing module may be configured for processing the user real-time location information and providing the two-dimensional (2D) interface and the three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user.
In an exemplary embodiment, the positioning module may be configured for providing the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface. Also, the positioning module may allow the user to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
Figure 3 shows a method of collecting and processing feature data for observing a desired subsurface feature in the location in accordance with some embodiments of the present disclosure.
In an exemplary embodiment, at step 301, the feature data is collected. The feature data may include the location and descriptive data. Generally, the location information may be prepared by the field operation forces utilizing different mapping methods. Also, the descriptive data may be collected by performing several scans on the features. Combining these two groups of data may be carried out through GIS science, as shown in step 303. In an exemplary embodiment, the GIS may show the location and descriptive data of different features including, without limitation, point, linear, or polygon features, in a coherent viewing environment, so that the features can be analysed. In an exemplary embodiment, the GIS experts may investigate and record the feature data utilizing the different software applications available. The recording of data may be carried out as a predetermined database and may be uploaded on a GIS server, as shown in step 305.
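One way to picture the combination of the two data groups is a GeoJSON-like record joining a geometry with the scanned descriptive data, as sketched below; the exact field names are assumptions and not part of the disclosure.

```python
def make_feature(geometry_type: str, coordinates, descriptive: dict) -> dict:
    """Join one geometry (point, line or polygon) with its descriptive data."""
    return {
        "geometry": {"type": geometry_type, "coordinates": coordinates},
        "properties": descriptive,  # results of the repeated feature scans
    }

pump = make_feature(
    "Point", [51.3890, 35.6892],
    {"feature": "subsurface pump", "depth_m": 1.2, "owner": "water utility"},
)
```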
Subsequently, at step 307, the database may be created on a server of the software application. In an implementation, the database of the software may be created on a Linux server utilizing MySQL. The location and descriptive data of the features may be called through connecting the server of the software to the GIS server. In cases where the data cannot be called, or when manually inserting the data is needed, the data may be added individually or as a group utilizing the designed administrative panel of the software. The administrative panel may be placed in a web environment, as shown in step 307. In an exemplary embodiment, the start and finish coordinates of the line features, the coordinates of the point features, and all descriptive data may be included and/or represented in the designed administrative panel.
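A minimal sketch of what such tables could look like is given below, reflecting the start and finish coordinates of line features and the single coordinate pair of point features mentioned above; the actual schema on the Linux/MySQL server is not specified in the disclosure.

```python
# Illustrative MySQL DDL held as Python strings; column names and types
# are assumptions for this sketch only.
LINE_FEATURES_DDL = """
CREATE TABLE line_features (
    id          INT PRIMARY KEY AUTO_INCREMENT,
    start_lat   DOUBLE NOT NULL,
    start_lon   DOUBLE NOT NULL,
    finish_lat  DOUBLE NOT NULL,
    finish_lon  DOUBLE NOT NULL,
    depth_m     DOUBLE,
    descriptive JSON
);
"""

POINT_FEATURES_DDL = """
CREATE TABLE point_features (
    id          INT PRIMARY KEY AUTO_INCREMENT,
    lat         DOUBLE NOT NULL,
    lon         DOUBLE NOT NULL,
    depth_m     DOUBLE,
    descriptive JSON
);
"""
```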
Further, at step 309, a drawn model is created from the features utilizing the software. In an exemplary embodiment, to create the drawn model of the features, the features may be drawn as a 2D model (step 311) or as a 3D model (step 313), which may be carried out according to real coordinates requested from the database. As a result, the features may be shown at their real location and at an actual scale. In an implementation, for coding the software, different coding languages such as, without limiting to, PHP, Java, C++, and C# may be utilized. Further, by running the software application on the user device, the features and the drawn models may be loaded into a program memory of the user device utilizing an Internet connection.
In an exemplary embodiment, at step 315, different sensors of the user device may detect and collect information related to the current location of the user. The sensors of the user device may comprise a GPS sensor, a gyroscope sensor, and a compass sensor. The GPS sensor may be responsible for receiving the latest location of the user device utilizing a satellite radio connection. The gyroscope sensor may be utilized to detect an angular change of the user device in any of the three rotational directions. The compass sensor may be utilized for reporting an orientation of the user device with reference to true north in the world coordinate system. In an exemplary embodiment, the software application may create a connection between the drawn maps resulting from the 2D or 3D models and the current location of the user after receiving the above-mentioned feature data.
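The roles of the three sensors may be sketched as follows; the pose fields and the complementary-filter constant are assumptions, since the disclosure names only the sensors and what each of them reports.

```python
from dataclasses import dataclass

@dataclass
class DevicePose:
    lat: float           # latest fix from the GPS sensor
    lon: float
    heading_deg: float   # orientation with respect to true north (compass)
    pitch_deg: float     # angular changes reported by the gyroscope
    roll_deg: float

def fuse_heading(gyro_deg: float, compass_deg: float, alpha: float = 0.98) -> float:
    """Blend the smooth short-term gyro heading with the drift-free compass.

    The wrap-around handling keeps the correction on the short arc, so
    359 degrees and 1 degree differ by 2 degrees, not 358.
    """
    diff = (compass_deg - gyro_deg + 180.0) % 360.0 - 180.0
    return (gyro_deg + (1.0 - alpha) * diff) % 360.0

print(fuse_heading(359.0, 1.0))  # ~359.04, nudged toward the compass reading
```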
Finally, at step 317, a desired underground feature may be observed utilizing the AR environment of the application, which is the prominent feature of the application. In an exemplary embodiment, the AR based 3D model may show the 3D features utilizing a plurality of elements or attributes such as, without limiting to, the dimension, perception depth, perspective views and perception angle of the at least one subsurface feature. This may resemble the user observing the features through the naked eye. That is, in the AR based 3D environment, the camera of the user device may be utilized instead of the base map in the background, and the user may be allowed to observe a plurality of images that are taken from the camera, like what the user can see with her/his eyes. In other words, the images show the 3D model of the desired features, such that the 3D model provides the observation of the desired features that are buried underground and cannot be observed under normal circumstances. The observation may comprise a real location of the feature with respect to a plurality of elements such as a perspective view and a perception angle of the desired feature. Furthermore, the 3D model of the desired feature may be shown from different angles utilizing the movement of the user device in different directions. All the sensors of the user device may be engaged and utilized in the 3D model.
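At the heart of such an AR overlay is projecting a feature's geographic coordinates and burial depth into the camera image. The following is a minimal sketch under simplifying assumptions (locally flat Earth, a level device held at roughly 1.5 m eye height, an ideal pinhole camera); none of the constants come from the disclosure.

```python
import math

EARTH_R = 6_378_137.0  # equatorial Earth radius in metres

def geo_to_enu(lat, lon, depth_m, user_lat, user_lon):
    """East/North/Up offsets of a buried feature relative to the user."""
    north = math.radians(lat - user_lat) * EARTH_R
    east = math.radians(lon - user_lon) * EARTH_R * math.cos(math.radians(user_lat))
    up = -depth_m - 1.5  # below ground, viewed from ~1.5 m eye height
    return east, north, up

def project(east, north, up, heading_deg, f_px=1000.0, cx=540.0, cy=960.0):
    """Pinhole projection into camera pixels for a device facing heading_deg."""
    h = math.radians(heading_deg)
    forward = north * math.cos(h) + east * math.sin(h)  # distance ahead
    right = east * math.cos(h) - north * math.sin(h)    # offset to the right
    if forward <= 0:
        return None  # behind the camera, not drawn
    return cx + f_px * right / forward, cy - f_px * up / forward

# A pipe about 10 m north of the user, buried 1.2 m, seen facing north:
e, n, u = geo_to_enu(35.68930, 51.38900, 1.2, 35.68921, 51.38900)
print(project(e, n, u, heading_deg=0.0))  # lands below the image centre
```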
In an exemplary embodiment, the developed application and/or the Azimuth software may show a variety of subsurface feature information and facilities at the actual location of the feature and may provide a better visual profile of the status of invisible subsurface features for the user at their precise location. The proposed application is also very facile, and it is enough for the user to run the software and point the camera of the user device to observe the invisible subsurface features, as well as the descriptive data associated with the visible features, for a better analysis of the geospatial status of the invisible subsurface features.
Thus, in summary, the present disclosure combines the geospatial and descriptive data (GIS) with AR technology and improves the geospatial precision to the 1 cm - 2 cm range. The present disclosure may also be applicable in sloped regions.
In an exemplary embodiment, the proposed disclosure may be utilized for analysing and exhibiting the water flow disconnection and reconnection pipe valves related to damaged water pipes. Generally, in case of burst pipes or municipal water distribution failure, the water flow should be shut off for performing the replacement and repair operations on the desired pipe. For this purpose, the first pipe valve related to that pipe may be closed so that water does not flow in that part of the network. In many cases, since these valves are buried and not accessible, or because the pipe valve related to the damaged pipe is not known, a preceding pipe valve closer to the main valve of the network is turned off. Therefore, in addition to shutting off the water flow in the damaged pipe, the flow in the rest of the network is also shut off. This problem causes water outages for subscribers outside the range of the damaged pipe as well. For solving this problem, a geospatial analysis according to the water flow direction in the network, and the correlation of the pipeline to the network, may be considered. This network analysis is carried out online and on-site in the software environment. By choosing the damaged pipe in the environment of the proposed AR based software application, a list of pipe valves related to this pipe may be shown, ordered from the sub-main to the main pipe. By clicking on each of these valves, the screen moves to that valve and shows the location of the valve to the user, thereby solving the problem.
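A sketch of that upstream walk is given below, assuming the pipe network is stored as parent links with an optional valve per pipe; this encoding is chosen for illustration only, the disclosure specifying merely the resulting valve list ordered from the sub-main to the main pipe.

```python
def valves_upstream(pipe_id, upstream_of, valve_on):
    """Walk against the flow direction and collect the valves encountered."""
    valves = []
    current = pipe_id
    while current is not None:
        if current in valve_on:
            valves.append(valve_on[current])
        current = upstream_of.get(current)  # move one pipe upstream
    return valves

# Toy network: water flows main -> p1 -> p2 -> p3, and p3 is damaged.
upstream_of = {"p3": "p2", "p2": "p1", "p1": None}
valve_on = {"p2": "v_submain", "p1": "v_main"}
print(valves_upstream("p3", upstream_of, valve_on))  # ['v_submain', 'v_main']
```

Closing "v_submain" first isolates only the damaged branch, so subscribers upstream of it keep their water supply.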
Figure 4 shows a flowchart illustrating a method of providing information related to positioning of subsurface features in the location in accordance with some embodiments of the present disclosure.
As illustrated in Figure 4, the method 400 may include one or more blocks illustrating a method of providing information related to positioning of at least one subsurface feature in the location utilizing the user device associated with the user, as illustrated in Figure 1. The method 400 may be described in the general context of computer executable instructions. Generally, computer executable instructions can include routines, programs, objects, components, data structures, procedures, modules, and functions, which perform specific functions or implement specific abstract data types.
The order in which the method 400 is described is not intended to be construed as a limitation, and any number of the described method blocks can be combined in any order to implement the method. Additionally, individual blocks may be deleted from the methods without departing from the scope of the subject matter described herein. Furthermore, the method can be implemented in any suitable hardware, software, firmware, or combination thereof.
At block 401, the method 400 includes creating, by the processor of the user device, feature data comprising the location information and the corresponding descriptive data related to the at least one subsurface feature in the location utilizing predetermined mapping methods. In an exemplary embodiment, the location information and the corresponding descriptive data may be stored in the server associated with the processor. As an example, the at least one subsurface feature may comprise, without limitation, at least one of distribution pipelines, connective fittings, cable networks, subsurface pumps, or other underground objects.
In an exemplary embodiment, creating the feature data may further comprise collecting the location information associated with the at least one subsurface feature by performing field operations on the location utilizing mapping methods. Once the location information is collected, the descriptive data for the at least one subsurface feature may be generated by repetitively scanning the at least one subsurface feature. Further, the location information and the descriptive data may be combined utilizing Geographic Information System (GIS) science for creating a coherent viewing environment. Finally, the combined data may be stored on the server as a predetermined database and tagged with the user real-time location information for subsequent retrieval operations.
At block 403, the method 400 includes collecting, by the processor of the user device, the user real-time location information, utilizing one or more sensors configured in the user device associated with the user. As an example, the one or more sensors utilized for collecting the user real-time location information comprise, without limiting to, at least one of the Global Positioning System (GPS) sensor, the gyroscope, and the compass. In an exemplary embodiment, collecting the user real-time location information may comprise receiving, utilizing the GPS sensor, the latest user real-time location information utilizing a satellite network. Further, the method may use the gyroscope for detecting angular changes of the user device in at least one of the three rotational directions. Additionally, the method may use the compass for detecting an orientation of the user device with reference to the geographical north.
In an exemplary embodiment, the user device may be the smartphone, the tablet computer or any other handheld computing device associated with the user. In an implementation, the user device may be configurable for connecting to the server for transmitting the user real-time location information and for receiving the 2D interface and the 3D interface data from the server, through a wired or wireless communication channel including the Internet.
At block 405, the method 400 includes processing, by the processor of the user device, the user real-time location information and then providing the two-dimensional (2D) interface and the three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user. In an exemplary embodiment, the 2D interface and the 3D interface may be created utilizing real-time coordinates of the location of the user comprised in the user real-time location information.
In an exemplary embodiment, the 2D environment or the 2D interface may comprise a base map of the location that is executed as a background application on the user device. Further, the 2D interface may represent the real-time location of the user, change in the real-time location of the user and a User Interface (UI) symbol for activating or switching to the 3D interface. In addition, the 2D interface may also represent the descriptive information of the at least one subsurface feature on a distinct window over the 2D interface. In some implementations, information from only the GPS sensor and the compass sensor may be utilized for creating the 2D interface (i.e., information collected from the gyroscope may not be utilized for creating the 2D interface).
In an exemplary embodiment, the 3D interface may be created by processing the feature data utilizing a predetermined Augmented Reality (AR) technique. As an example, the 3D interface may represent at least one of, without limitation, a dimension of the at least one subsurface feature, perception depth, perspective views and perception angle of the at least one subsurface feature. In an exemplary embodiment, upon activation of the 3D interface, an image/video capturing module (i.e., camera) in the user device captures real-time images of the location for generating a 3D model of the at least one subsurface feature in the location.
At block 407, the method 400 includes providing, by the processor of the user device, the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface, wherein the user can switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
In an exemplary embodiment, in addition to providing the information related to the positioning of the at least one subsurface feature, the method may provide the search tool for enabling the user to search for at least one of a required one of at least one subsurface feature or a pre-requested descriptive data from the database. Further, the filter tool may be provided for filtering a required one of the at least one subsurface feature or the pre-requested descriptive data from the database.
Furthermore, the method may comprise allowing the user to add missing information or additional information to the data associated with the location. Additionally, the user may also be provided with options for rectifying incorrect information present on at least one of the 2D interface or the 3D interface of the location. In an implementation, each of the one or more updates made by the user while updating missing information, adding additional information, or rectifying the existing information may be stored on the database associated with the central server.
Exemplary computer system
Figure 5 illustrates an exemplary computer system 500 according to embodiments of the present disclosure. As an example, the computer system 500 may be the Augmented Reality (AR) generator or the user device disclosed in the claimed invention. In some exemplary embodiments, one or more computer systems 500 perform one or more steps of one or more methods described or illustrated herein. In some exemplary embodiments, one or more computer systems 500 provide functionality described or illustrated herein. In some exemplary embodiments, software running on one or more computer systems 500 performs one or more steps of one or more methods described or illustrated herein or provides functionality described or illustrated herein. Embodiments include one or more portions of one or more computer systems 500. Herein, reference to the computer system may encompass a computing device, and vice versa, where appropriate. Moreover, reference to the computer system may encompass one or more computer systems, where appropriate.
This disclosure contemplates any suitable number of computer systems 500. This disclosure contemplates computer system 500 taking any suitable physical form. As an example and not by way of limitation, computer system 500 may be an embedded computer system, a System-On-Chip (SOC), a Single-Board Computer System (SBC) (such as, for example, a Computer-On-Module (COM) or System-On-Module (SOM)), a desktop computer system, a laptop or notebook computer system, an interactive kiosk, a mainframe, a mesh of computer systems, a mobile telephone, a Personal Digital Assistant (PDA), a server, a tablet computer system, or a combination of two or more of these. Where appropriate, computer system 500 may include one or more computer systems 500, be unitary or distributed, span multiple locations, span multiple machines, span multiple data centres, or reside in a cloud, which may include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 500 may perform without substantial spatial or temporal limitation one or more steps of one or more methods described or illustrated herein. As an example, and not by way of limitation, one or more computer systems 500 may perform in real time or in batch mode one or more steps of one or more methods described or illustrated herein. One or more computer systems 500 may perform at different times or at different locations one or more steps of one or more methods described or illustrated herein, where appropriate.
In some exemplary embodiments, computer system 500 includes the processor 502, memory 504, storage 506, an Input / Output (I/O) interface 508, a communication interface 510, and a bus 512. Although this disclosure describes and illustrates a particular computer system having a particular number of components in a particular arrangement, this disclosure contemplates any suitable computer system having any suitable number of any suitable components in any suitable arrangement.
In some exemplary embodiments, processor 502 includes hardware for executing instructions, such as those making up a computer program. As an example, and not by way of limitation, to execute instructions, processor 502 may retrieve (or fetch) the instructions from an internal register, an internal cache, memory 504, or storage 506; decode and execute them; and then write one or more results to an internal register, an internal cache, memory 504, or storage 506.
In some exemplary embodiments, processor 502 may include one or more internal caches for data, instructions, or addresses. This disclosure contemplates processor 502 including any suitable number of any suitable internal caches, where appropriate. As an example, and not by way of limitation, processor 502 may include one or more instruction caches, one or more data caches, and one or more Translation Lookaside Buffers (TLBs). Instructions in the instruction caches may be copies of instructions in memory 504 or storage 506, and the instruction caches may speed up retrieval of those instructions by processor 502. Data in the data caches may be copies of data in memory 504 or storage 506 for instructions executing at processor 502 to operate on; the results of previous instructions executed at processor 502 for access by subsequent instructions executing at processor 502 or for writing to memory 504 or storage 506; or other suitable data. The data caches may speed up read or write operations by processor 502. The TLBs may speed up virtual-address translation for processor 502.
In some exemplary embodiments, the processor 502 may include one or more internal registers for data, instructions, or addresses. This disclosure contemplates processor 502 including any suitable number of any suitable internal registers, where appropriate. Where appropriate, processor 502 may include one or more Arithmetic Logic Units (ALUs), a multi-core processor or include one or more processors 502. Although this disclosure describes and illustrates a particular processor, this disclosure contemplates any suitable processor.
In some exemplary embodiments, the memory 504 includes main memory for storing instructions for processor 502 to execute or data for processor 502 to operate on. As an example, and not by way of limitation, computer system 500 may load instructions from storage 506 or another source (such as, for example, another computer system 500) to memory 504. Processor 502 may then load the instructions from memory 504 to an internal register or internal cache. To execute the instructions, processor 502 may retrieve the instructions from the internal register or internal cache and decode them. During or after execution of the instructions, processor 502 may write one or more results (which may be intermediate or final results) to the internal register or internal cache. Processor 502 may then write one or more of those results to memory 504. In particular exemplary embodiments, processor 502 executes only instructions in one or more internal registers or internal caches or in memory 504 (as opposed to storage 506 or elsewhere) and operates only on data in one or more internal registers or internal caches or in memory 504 (as opposed to storage 506 or elsewhere). One or more memory buses (which may each include an address bus and a data bus) may couple processor 502 to memory 504. Bus 512 may include one or more memory buses, as described below.
In some exemplary embodiments, one or more Memory Management Units (MMUs) reside between processor 502 and memory 504 and facilitate accesses to memory 504 requested by processor 502. In some exemplary embodiments, memory 504 includes Random Access Memory (RAM). This RAM may be volatile memory, where appropriate. Where appropriate, this RAM may be Dynamic RAM (DRAM) or Static RAM (SRAM). Moreover, where appropriate, this RAM may be single-ported or multi-ported RAM. This disclosure contemplates any suitable RAM. Memory 504 may include one or more memories 504, where appropriate. Although this disclosure describes and illustrates particular memory, this disclosure contemplates any suitable memory.
In some exemplary embodiments, storage 506 includes mass storage for data or instructions. As an example, and not by way of limitation, storage 506 may include a Hard Disk Drive (HDD), a floppy disk drive, flash memory, an optical disc, a magneto-optical disc, magnetic tape, or a Universal Serial Bus (USB) drive or a combination of two or more of these. Storage 506 may include removable or non-removable (or fixed) media, where appropriate. Storage 506 may be internal or external to computer system 500, where appropriate.
In some exemplary embodiments, storage 506 is non-volatile, solid-state memory. In some exemplary embodiments, storage 506 includes read-only memory (ROM). Where appropriate, this ROM may be mask-programmed ROM, Programmable ROM (PROM), Erasable PROM (EPROM), electrically erasable PROM (EEPROM), Electrically Alterable ROM (EAROM), or flash memory or a combination of two or more of these. This disclosure contemplates mass storage 506 taking any suitable physical form. Storage 506 may include one or more storage control units facilitating communication between processor 502 and storage 506, where appropriate. Where appropriate, storage 506 may include one or more storages 506. Although this disclosure describes and illustrates particular storage, this disclosure contemplates any suitable storage.
In some exemplary embodiments, I/O interface 508 includes hardware, software, or both, providing one or more interfaces for communication between computer system 500 and one or more I/O devices. Computer system 500 may include one or more of these I/O devices, where appropriate. One or more of these I/O devices may enable communication between a person and computer system 500. As an example, and not by way of limitation, an I/O device may include a keyboard, keypad, microphone, monitor, mouse, printer, scanner, speaker, still camera, stylus, tablet, touch screen, trackball, video camera, another suitable I/O device or a combination of two or more of these. An I/O device may include one or more sensors. This disclosure contemplates any suitable I/O devices and any suitable I/O interfaces 508 for them. Where appropriate, I/O interface 508 may include one or more device or software drivers enabling processor 502 to drive one or more of these I/O devices. I/O interface 508 may include one or more I/O interfaces 508, where appropriate. Although this disclosure describes and illustrates a particular I/O interface, this disclosure contemplates any suitable I/O interface.
In some exemplary embodiments, communication interface 510 includes hardware, software, or both providing one or more interfaces for communication (such as, for example, packet-based communication) between computer system 500 and one or more other computer systems 500 or one or more networks. As an example, and not by way of limitation, communication interface 510 may include a Network Interface Controller (NIC) or network adapter for communicating with an Ethernet or other wire-based network or a Wireless NIC (WNIC) or wireless adapter for communicating with a wireless network, such as a WI-FI network. This disclosure contemplates any suitable network and any suitable communication interface 510 for it. As an example, and not by way of limitation, computer system 500 may communicate with an ad hoc network, a Personal Area Network (PAN), a Local Area Network (LAN), a Wide Area Network (WAN), a Metropolitan Area Network (MAN), or one or more portions of the Internet or a combination of two or more of these. One or more portions of one or more of these networks may be wired or wireless. As an example, computer system 500 may communicate with a Wireless PAN (WPAN) (such as, for example, a BLUETOOTH WPAN), a WI-FI network, a WI-MAX network, a cellular telephone network (such as, for example, a Global System for Mobile Communications (GSM) network), or other suitable wireless network or a combination of two or more of these. Computer system 500 may include any suitable communication interface 510 for any of these networks, where appropriate. Communication interface 510 may include one or more communication interfaces 510, where appropriate. Although this disclosure describes and illustrates a particular communication interface, this disclosure contemplates any suitable communication interface.
In some exemplary embodiments, bus 512 includes hardware, software, or both coupling components of computer system 500 to each other. As an example and not by way of limitation, bus 512 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Enhanced Industry Standard Architecture (EISA) bus, a Front-Side Bus (FSB), a HYPERTRANSPORT (HT) interconnect, an Industry Standard Architecture (ISA) bus, an INFINIBAND interconnect, a low-pin-count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCIe) bus, a serial advanced technology attachment (SATA) bus, a Video Electronics Standards Association local (VLB) bus, or another suitable bus or a combination of two or more of these. Bus 512 may include one or more buses 512, where appropriate. Although this disclosure describes and illustrates a particular bus, this disclosure contemplates any suitable bus or interconnect.
Herein, a computer-readable non-transitory storage medium or media may include one or more semiconductor-based or other Integrated Circuits (ICs) (such, as for example, Field-Programmable Gate Arrays (FPGAs) or Application-Specific ICs (ASICs)), hard disk drives (HDDs), Hybrid Hard Drives (HHDs), optical discs, Optical Disc Drives (ODDs), magneto-optical discs, magneto-optical drives, floppy diskettes, Floppy Disk Drives (FDDs), magnetic tapes, Solid-State Drives (SSDs), RAM-drives, SECURE DIGITAL cards or drives, any other suitable computer-readable non-transitory storage media, or any suitable combination of two or more of these, where appropriate. A computer-readable non-transitory storage medium may be volatile, non-volatile, or a combination of volatile and non-volatile, where appropriate.
Advantages of the exemplary embodiments
The present disclosure provides an easy-to-use augmented reality software application that helps in positioning underground and subsurface features in a location.
The method of the present disclosure also helps in tracking, locating and illustrating different underground/subsurface features, which are not easily reachable otherwise.
The method of the present disclosure presents users with a plurality of pieces of invisible information and facilities at the actual location of a feature, and thereby provides a better visual profile of the status of an invisible feature for the user at its precise location.
In an exemplary embodiment, the AR software application proposed in the present disclosure is very facile, in that it requires the user only to run the software and point the camera of the user device at a desired invisible, subsurface feature.
The method and the platform disclosed in the present disclosure may be utilized to overcome various technical problems related to finding, tracking, and accessing subsurface devices in a location. The abovesaid technical advancements and practical applications of the disclosed method, the AR based software application and the AR generator 201 may be attributed to the aspects of a) collecting a user real-time location information, utilizing one or more sensors configured in a user device associated with the user and b) processing the user real-time location information and providing a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to the real-time location of the user. These aspects have been recited in steps 2 and 3 of independent claim 1 of the present disclosure.
In light of the technical advancements provided by the proposed disclosure, it shall be noted that the claimed steps, as discussed above, are not routine, conventional, or well-known aspects in the art, as the claimed steps provide the aforesaid solutions to the technical problems existing in the conventional technologies. Further, the claimed steps clearly bring an improvement in the functioning of the system itself, as the claimed steps provide a technical solution to a technical problem.
The terms "an embodiment", "embodiment", "embodiments", “exemplary embodiment”, "the embodiment", "the embodiments", "one or more embodiments", "some embodiments", and "one embodiment" mean "one or more (but not all) embodiments of the invention(s)" unless expressly specified otherwise.
The terms "including", "comprising", “having” and variations thereof mean "including but not limited to", unless expressly specified otherwise.
The enumerated listing of items does not imply that any or all the items are mutually exclusive, unless expressly specified otherwise. The terms "a", "an" and "the" mean "one or more", unless expressly specified otherwise.
Herein, "or" is inclusive and not exclusive, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, "A or B" means "A, B, or both," unless expressly indicated otherwise or indicated otherwise by context. Moreover, "and" is both joint and several, unless expressly indicated otherwise or indicated otherwise by context. Therefore, herein, "A and B" means "A and B, jointly or severally," unless expressly indicated otherwise or indicated otherwise by context.
A description of an exemplary embodiment with several components in communication with each other does not imply that all such components are required. On the contrary, a variety of optional components are described to illustrate the wide variety of possible embodiments of the invention.
When a single device or article is described herein, it will be clear that more than one device/article (whether or not they cooperate) may be utilized in place of a single device/article. Similarly, where more than one device/article is described herein (whether or not they cooperate), it will be clear that a single device/article may be utilized in place of the more than one device/article, or a different number of devices/articles may be utilized instead of the shown number of devices or programs. The functionality and/or features of a device may be alternatively embodied by one or more other devices which are not explicitly described as having such functionality/features. Thus, other embodiments of the invention need not include the device itself.
Finally, the language utilized in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the inventive subject matter. It is therefore intended that the scope of the invention be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the exemplary embodiments of the present invention are intended to be illustrative, but not limiting, of the scope of the invention, which is set forth in the following claims.
While various aspects and embodiments have been disclosed herein, other aspects and exemplary embodiments will be apparent to those skilled in the art. The various aspects and exemplary embodiments disclosed herein are for purposes of illustration and are not intended to be limiting, with the true spirit being indicated by the following claims.
Reference numbers:
Reference Number Description
101 User
103 User device
1051 – 105N Subsurface features
107 Network
109 Server
111 Database
201 Augmented Reality (AR) generator
202 I/O Interface
203 Processor
205 Memory
207 Modules
209 Data creation module
211 Information collection module
213 Processing module
215 Positioning module
500 Exemplary computer system
502 Processor
504 Memory
506 Storage
508 I/O Interface of the computer system
510 Communication interface

Claims (38)

  1. A method for providing information related to positioning of subsurface features in a location, the method comprising:
    creating, by a processor, feature data comprising the location information and corresponding descriptive data related to at least one subsurface feature in the location utilizing a predetermined mapping method, wherein the location information and the corresponding descriptive data are stored in a server associated with the processor;
    collecting, by the processor, a user real-time location information, utilizing one or more sensors configured in a user device associated with the user;
    processing, by the processor, the user real-time location information and providing a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to a real-time location of the user; and
    providing, by the processor, the information related to a positioning of the at least one subsurface feature to the user through at least one of the 2D interface and the 3D interface, wherein the user is allowed to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
  2. The method of claim 1, wherein the at least one subsurface feature comprises at least one distribution pipeline network, connective fitting, cable network, subsurface pump, or other underground objects.
  3. The method of claim 1, wherein creating the feature data comprises:
    collecting the location information associated with at least one subsurface feature by performing field operations on the location utilizing the predetermined mapping method;
    generating the descriptive data for the at least one subsurface feature by repetitively scanning the at least one subsurface feature;
    combining the location information and the descriptive data utilizing Geographic Information System (GIS) science for creating a coherent viewing environment; and
    storing the combined data on the server as a predetermined database and tagging the predetermined database with the user real-time location information for subsequent retrieval operations.
  4. The method of claim 1, wherein the predetermined mapping method comprises at least one of Geographical Information System (GIS).
  5. The method of claim 1, wherein the 2D interface and the 3D interface are created utilizing real-time coordinates of the location of the user comprised in the user real-time location information.
  6. The method of claim 1, wherein the one or more sensors utilized for collecting the user real-time location information comprises at least one of a Global Positioning System (GPS) sensor, a gyroscope, and a compass.
  7. The method of claim 6, wherein collecting the user real-time location information comprises:
    receiving, utilizing the GPS sensor, latest location information of the user device utilizing a satellite network;
    detecting, utilizing the gyroscope, angular changes of the user device in at least one of the three rotational directions; and
    detecting, utilizing the compass, orientation of the user device with reference to the geographical north.
  8. The method of claim 1, wherein the user device comprises a smartphone or other computing device associated with the user, and wherein the user device is configurable for:
    connecting to the server for transmitting the user real-time location information of the user and for receiving the 2D interface and the 3D interface data from the server, through a predetermined communication channel.
  9. The method of claim 1, wherein the 2D interface comprises a base map of the location that is executed as a background application on the user device.
  10. The method of claim 1, wherein the 2D interface represents the real-time location of the user, changes in the real-time location of the user and an option for activating or switching to the 3D interface.
  11. The method of claim 10, further comprising representing the descriptive information of the at least one subsurface feature on a distinct window over the 2D interface.
  12. The method of claim 11, wherein information from the GPS sensor and the compass sensor are utilized for creating the 2D interface.
  13. The method of claim 1, wherein the 3D interface is created by processing the feature data utilizing a predetermined Augmented Reality (AR) technique.
  14. The method of claim 13, wherein the 3D interface represents at least one of a dimension of the at least one subsurface feature, perception depth, perspective views and perception angle of the at least one subsurface feature.
  15. The method of claim 14, wherein upon activation of the 3D interface, an image/video capturing module in the user device captures real-time images of the location for generating a 3D model of the at least one subsurface feature in the location.
  16. The method of claim 1, further comprising:
    providing a search tool for enabling the user to search for at least one of a required one of at least one subsurface feature or a pre-requested descriptive data from the database; and
    providing a filter tool for filtering a required one of the at least one subsurface feature or the pre-requested descriptive data from the database.
  17. The method of claim 1, further comprising allowing the user to perform at least one of:
    adding a missing or an additional information associated with the location; and
    rectifying an incorrect information present on at least one of the 2D interface or the 3D interface of the location.
  18. The method of claim 17, further comprising storing, in the database, the one or more updates made by the user while adding a missing information or an additional information, or rectifying the existing information.
  19. A method for detecting municipality features in a location, the method comprising:
    receiving municipality feature data and information related to location of one or more pipes and fittings in a GIS data format;
    converting the received municipality feature data into respective 2D interface and 3D interface formats; and
    matching the converted data with user real-time location information received from the location of the user through a GPS sensor.
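The claim-19 matching step could, for example, keep only converted features within a fixed radius of the GPS fix; both the 50 m radius and the per-feature lat/lon fields below are illustrative assumptions.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 coordinates."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi, dlam = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2.0 * r * math.asin(math.sqrt(a))

def match_nearby(features, fix, radius_m=50.0):
    """Keep converted municipality features near the user's real-time GPS fix."""
    return [f for f in features
            if haversine_m(f["lat"], f["lon"], fix.latitude, fix.longitude) <= radius_m]
```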
  20. An Augmented Reality (AR) generator for providing information related to positioning of subsurface features in a location, the AR generator comprising:
    one or more processors; and
    a memory, communicatively coupled to the one or more processors, wherein the memory stores processor-executable instructions, which on execution, cause the one or more processors to:
    create feature data comprising location information and corresponding descriptive data related to at least one subsurface feature in the location utilizing predetermined mapping methods, wherein the location information and the corresponding descriptive data are stored in a server associated with the AR generator;
    collect the user real-time location information, utilizing one or more sensors configured in a user device associated with the user;
    process the user real-time location information and provide a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to a real-time location of the user; and
    provide the information related to a positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface, wherein the user is allowed to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
  21. The AR generator of claim 20, wherein the at least one subsurface feature comprises at least one distribution pipeline network, connective fitting, cable network, subsurface pump, or other underground object.
  22. The AR generator of claim 20, wherein the one or more processors are further configured to create the feature data by:
    collecting the location information associated with the at least one subsurface feature by performing field operations on the location utilizing the predetermined mapping methods;
    generating the descriptive data for the at least one subsurface feature by repetitively scanning the at least one subsurface feature;
    combining the location information and the descriptive data utilizing Geographic Information System (GIS) science for creating a coherent viewing environment; and
    storing the combined data on the server as a predetermined database and tagging the predetermined database with the user real-time location information for subsequent retrieval operations.
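Claim 22 requires the stored database to be tagged with location for later retrieval; the grid-cell key below is merely one assumed indexing scheme (roughly 100 m cells at a 0.001° resolution), since the claims do not recite a tagging method.

```python
def grid_tag(lat: float, lon: float, cell_deg: float = 0.001):
    """Coarse geographic key used to tag stored feature data for retrieval."""
    return (round(lat / cell_deg), round(lon / cell_deg))

def store_feature(db: dict, location, descriptive_data):
    """Combine location information and descriptive data, then index by grid tag."""
    record = {"location": location, "data": descriptive_data}
    db.setdefault(grid_tag(*location), []).append(record)  # location: assumed (lat, lon) pair
```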
  23. The AR generator of claim 20, wherein the predetermined mapping methods comprise at least a Geographic Information System (GIS) based mapping method.
  24. The AR generator of claim 20, wherein the one or more processors are configured to create the 2D interface and the 3D interface utilizing the real-time coordinates of the location of the user comprised in the user real-time location information.
  25. The AR generator of claim 20, wherein the one or more sensors comprise at least one of a Global Positioning System (GPS) sensor, a gyroscope, and a compass.
  26. The AR generator of claim 25, wherein the one or more processors collect the user real-time location information by:
    receiving, utilizing the GPS sensor, latest location information of the user device via a satellite network;
    detecting, utilizing the gyroscope, angular changes of the user device in at least one of three rotational directions; and
    detecting, utilizing the compass, orientation of the user device with reference to geographic north.
  27. The AR generator of claim 20, wherein the user device comprises a smartphone or other computing device associated with the user, and wherein the one or more processors configure the user device to:
    connect to the server for transmitting the user real-time location information and for receiving the 2D interface and the 3D interface data from the server, through a predetermined communication channel.
  28. The AR generator of claim 20, wherein the 2D interface comprises a base map of the location that is executed as a background application on the user device.
  29. The AR generator of claim 20, wherein the 2D interface represents the real-time location of the user, changes in the real-time location of the user and an option for activating or switching to the 3D interface.
  30. The AR generator of claim 29, wherein the one or more processors are further configured to represent the descriptive information of the at least one subsurface feature on a distinct window over the 2D interface.
  31. The AR generator of claim 30, wherein the one or more processors utilize information from the GPS sensor and the compass sensor for creating the 2D interface.
  32. The AR generator of claim 20, wherein the one or more processors are configured to create the 3D interface by processing the feature data utilizing a predetermined Augmented Reality (AR) technique.
  33. The AR generator of claim 32, wherein the 3D interface represents at least one of dimensions, perception depth, perspective views, and perception angle of the at least one subsurface feature.
  34. The AR generator of claim 33, wherein upon activation of the 3D interface, an image/video capturing module in the user device captures real-time images of the location for generating a 3D model of the at least one subsurface feature in the location.
  35. The AR generator of claim 20, wherein the one or more processors are further configured to:
    provide a search tool for enabling the user to search for a required one of the at least one subsurface feature or pre-requested descriptive data from the database; and
    provide a filter tool for filtering a required one of the at least one subsurface feature or the pre-requested descriptive data from the database.
  36. The AR generator of claim 20, wherein the one or more processors are further configured to allow the user to perform at least one of:
    adding missing or additional information associated with the location; and
    rectifying incorrect information present on at least one of the 2D interface or the 3D interface of the location.
  37. The AR generator of claim 36, wherein the one or more processors are configured to store, in the database, one or more updates made by the user while adding missing or additional information or rectifying existing information.
  38. A user device for providing information related to positioning of subsurface features in a location, the user device comprising:
    one or more processors;
    a memory, communicatively coupled to the one or more processors; and
    an Augmented Reality (AR) generator configured to perform operations comprising:
    creating feature data comprising location information and corresponding descriptive data related to at least one subsurface feature in the location utilizing predetermined mapping methods, wherein the location information and the corresponding descriptive data are stored in a server associated with the AR generator;
    collecting the user real-time location information, utilizing one or more sensors configured in the user device associated with the user;
    processing the user real-time location information and providing a two-dimensional (2D) interface and a three-dimensional (3D) interface to the user by retrieving the feature data corresponding to a real-time location of the user; and
    providing the information related to the positioning of the at least one subsurface feature in the location to the user through at least one of the 2D interface and the 3D interface, wherein the user is allowed to switch between the 2D interface and the 3D interface during an inspection of the at least one subsurface feature in the location.
PCT/IB2021/052187 2021-01-29 2021-03-16 An augmented reality platform for a geographic information system WO2022162433A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163143036P 2021-01-29 2021-01-29
US63/143,036 2021-01-29

Publications (1)

Publication Number Publication Date
WO2022162433A1 2022-08-04

Family

ID=82654258

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2021/052187 WO2022162433A1 (en) 2021-01-29 2021-03-16 An augmented reality platform for a geographic information system

Country Status (1)

Country Link
WO (1) WO2022162433A1 (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20190114696A (en) * 2018-03-30 2019-10-10 주식회사 지오멕스소프트 An augmented reality representation method for managing underground pipeline data with vertical drop and the recording medium thereof

Similar Documents

Publication Publication Date Title
KR100997084B1 A method and system for providing real time information of underground object, and a server and method for providing information of the same, and recording medium storing a program thereof
US9424371B2 (en) Click to accept as built modeling
US9359880B2 (en) Methods and systems for managing underground assets
KR20180132183A (en) Mobile terminal, management server for underground facility and system for managing underground facility based on 3d spatial information
US11725946B2 (en) Operating modes of magnetic navigation devices
US10997785B2 (en) System and method for collecting geospatial object data with mediated reality
US20230297616A1 (en) Contextual augmentation of map information using overlays
KR20210022343A (en) Method and system for providing mixed reality contents related to underground facilities
US11680802B2 (en) Correlating overlapping magnetic measurement data from multiple magnetic navigation devices and updating a geomagnetic map with that data
WO2022162433A1 (en) An augmented reality platform for a geographic information system
US9338361B2 (en) Visualizing pinpoint attraction objects in three-dimensional space
Stylianidis et al. LBS augmented reality assistive system for utilities infrastructure management through Galileo and EGNOS
Fenais Developing an augmented reality solution for mapping underground infrastructure
CA3056834C (en) System and method for collecting geospatial object data with mediated reality
Utepov et al. Value of augmented reality for construction planning
KR20220091716A (en) Underground utility maintenance system using mixed reality technique
JP2024519424A Correlating overlapping magnetic measurement data from multiple magnetic navigation devices and updating a geomagnetic map with this data
AU2014101564A4 (en) A system, method, computer program and data signal for the provision of information
Dadras et al. Using Mobile Phone Sensors to Identify Underground Pipes: A Narrative Review of Modern Mobile Phone Applications
KR20230100432A (en) Mixed reality-based underground buried facility control system centered on field workers
TW202043982A (en) A system of combining augmented reality to dynamical rerender features and method thereof
EP2869093A9 (en) Detection of incursion of proposed excavation zones into buried assets

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21922721

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21922721

Country of ref document: EP

Kind code of ref document: A1