US20170316033A1 - Streaming representation of moving objects and shapes in a geographic information service - Google Patents

Streaming representation of moving objects and shapes in a geographic information service

Info

Publication number
US20170316033A1
US20170316033A1 (application US15/652,993)
Authority
US
United States
Prior art keywords
video
objects
region
map
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US15/652,993
Inventor
Naphtali David Rishe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Florida International University FIU
Original Assignee
Florida International University FIU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to US201361792985P
Priority to US14/215,484 (now U.S. Pat. No. 9,734,161)
Application filed by Florida International University FIU
Priority to US15/652,993
Assigned to THE FLORIDA INTERNATIONAL UNIVERSITY BOARD OF TRUSTEES. Assignors: RISHE, NAPHTALI DAVID
Publication of US20170316033A1
Legal status: Pending

Classifications

    • G06F17/30241
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

A geographical information system (GIS) is described that enables querying, analysis and visualization of real-time streaming data pertaining to at least one moving object or entity (e.g., vehicles, people, sensors, weather phenomena, etc.) in conjunction with relatively static multi-temporal geospatial data. An application programming interface is provided to present the GIS functionality for handling dynamically moving objects or entities to clients.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • The present application is a divisional application of U.S. application Ser. No. 14/215,484, filed Mar. 17, 2014, which claims the benefit of U.S. Provisional Application Ser. No. 61/792,985, filed Mar. 15, 2013, the disclosures of each of which are hereby incorporated by reference in their entirety, including all figures, tables, and drawings.
  • GOVERNMENT SUPPORT
  • This invention was made with government support under Award Number HRD-0833093 awarded by the National Science Foundation. The government has certain rights in the invention.
  • BACKGROUND
  • A Geographic Information System (GIS) captures, stores, analyzes, manages, and presents data linked to geographic locations. Example GISs include Google Earth™, ArcGIS® from ESRI, and commercial fleet management services.
  • In general, a GIS presents tools for users to query and make decisions based on geographic information. The geographic information may be spatial and temporal, is often heterogeneous and drawn from divergent sources, and may contain structured and unstructured data. As a result, processing this data becomes complex and involves numerous challenges. For example, multiple, disparate tools are often necessary to process and analyze geospatial data in real time. These tools are often expensive and can require specialized skills and training to use. In addition, each tool may require the data to be in a different format, increasing the difficulty of combining heterogeneous types of data.
  • Another challenge associated with current commercial systems is that much of the information currently stored in these systems is either historical or static in nature. While this is acceptable for visualizing data such as road-maps, and even handling a single moving object, such as in global positioning system (GPS) navigation where a moving vehicle is the only dynamic object represented, there exists a gap in presenting and handling the dynamic information associated with moving objects in the surrounding environment having different locations, speeds, shapes and trajectories.
  • BRIEF SUMMARY
  • This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
  • Certain embodiments provide a GIS system that leverages video streaming sensor nodes and presents the dynamic information associated with moving objects in a manner that can be used across a variety of clients. In addition, techniques are discussed for minimizing bandwidth issues.
  • An application programming interface (API) is provided in which clients may access and use dynamic information associated with moving objects in a surrounding environment.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 shows an operating environment in which an embodiment of the invention may be implemented.
  • FIG. 2 is a flowchart illustrating an example operation performed by a GIS server according to an embodiment of the invention.
  • DETAILED DISCLOSURE
  • An application programming interface (API) for a Geographic Information System (GIS) is disclosed. The API enables clients to perform visual querying and rendering.
  • Querying, analysis, and visualization of real-time data pertaining to at least two moving objects in conjunction with relatively static multi-temporal geospatial data can be facilitated on client devices through the presentation of the API.
  • A GIS server, that may provide a GIS service including the API, can incorporate data from mobile video sensors and streaming technologies in order to present streaming and/or video data.
  • In certain embodiments, the GIS server can perform the step of processing and analyzing geographic, spatial and/or temporal data to provide visual representation of the trajectories of relevant objects, which may be known to the server in advance or transmitted in real time, sampled, and have uncertainty aspects.
  • Geographic data exploration can be enhanced through incorporation of moving objects. For example, at the client, moving objects of a specific area of interest may be viewed overlaid on geospatial data. According to embodiments of the invention, dynamic movement of objects within the geographic area and at specific resolutions of interest can be presented. This translates to the user as a real-world experience with objects moving across their screen or “zooming” by them.
  • User scenarios that may be supported include, but are not limited to: (1) the user is on board a moving object whose trajectory (e.g., location as a function of time) is known a priori to the server; (2) the user is on board a moving object whose trajectory is generated in real-time and received by the server; or (3) the user is located at a fixed point.
  • Likewise, the trajectories of relevant objects may be known to the server in advance or received by the server in real time and sampled. Visual querying and rendering can be provided to the client.
  • FIG. 1 shows an operating environment in which an embodiment of the invention may be implemented. Referring to FIG. 1, a client device 105 can communicate with a GIS server 110 over a network 115.
  • The network 115 can include, but is not limited to, a cellular network (e.g., wireless phone), a point-to-point dial up connection, a satellite network, the Internet, a local area network (LAN), a wide area network (WAN), a WiFi network, an ad hoc network or a combination thereof. Such networks are widely used to connect various types of network elements, such as hubs, bridges, routers, switches, servers, and gateways. The network 115 may include one or more connected networks (e.g., a multi-network environment) including public networks, such as the Internet, and/or private networks such as a secure enterprise private network. Access to the network 115 may be provided via one or more wired or wireless access networks as will be understood by those skilled in the art.
  • The client device 105 may be, but is not limited to, a personal computer (e.g. desktop computer), laptop, personal digital assistant (PDA), mobile phone (or smart phone), tablet, slate, terminal, or set-top box.
  • The GIS server 110 may include distributed servers that execute real-time continuous queries to facilitate rendering and collaborating with vehicular ad-hoc networks (VANets) 120 and other video streaming sources. For example, a mobile sensor (e.g., on a drone aircraft or even a smart phone) with geo-positioning and communication capabilities and a camera can capture vehicles and pedestrians that are in line of sight, and/or a stationary sensor with communication capabilities and a camera can be installed at geographically distributed locations such as at a traffic light. These videos and/or images can be communicated over a network and ultimately collected by the GIS server 110.
  • The GIS server 110 may include one or more computing devices. For example, the GIS server 110 can include one or more blade server devices, standalone server devices, personal computers, routers, hubs, switches, bridges, firewall devices, intrusion detection devices, mainframe computers, network-attached storage devices, and other types of computing devices.
  • In embodiments where the GIS server 110 includes multiple computing devices, the server can include one or more communications networks that facilitate communication among the computing devices.
  • For example, the one or more communications networks can include a local or wide area network that facilitates communication among the computing devices. One or more direct communication links can be included between the computing devices. In addition, in some cases, the computing devices can be installed at geographically distributed locations. In other cases, the multiple computing devices can be installed at a single geographic location, such as a server farm or an office. Certain embodiments of the invention can be practiced in distributed-computing environments where tasks are performed by remote-processing devices that are linked through a communications network. In a distributed-computing environment, program modules can be located in both local and remote computer-readable storage media.
  • The GIS server 110 can also access multiple geographic information sources 130 for static and/or dynamic geographic information. Once the results from the information sources are received by the GIS server 110, the GIS server can perform steps of filtering and formatting the results, storing the results in a database, transforming the results into a visual representation, and/or transmitting the results in a suitable form to client devices.
  • The client device 105 can communicate with the GIS server 110 to obtain visual data and information on relevant moving objects based on criteria submitted by a user of the client device 105.
  • To allow ergonomic formulation of queries, a user interface can be provided in a web browser of the client device 105 in which the end-user will be provided the ability to graphically manipulate objects and navigation examples. The presented data about the object can include the object's existing and computed properties, including location, description, context, and personalization. This query pattern adds information retrieval and database selection-like constraints to traditional spatio-temporal queries. Such constraints give users the flexibility to execute free-text searches (information-retrieval style) on unstructured data, or refined attribute-based predicate retrieval (SQL style) on structured data. For example: “visualize the trajectories of large green trucks near the current point” is a query where “green” is a keyword, “truck” is a category, and “large” translates into a numerical predicate.
  • Certain embodiments provide support for requests to the GIS server 110 from the client that include functions for opening historic movement trajectories, getting live movement trajectories from the server, getting a video player (or a view of a player) to play videos captured on a given trajectory, and getting movement trajectory inside a search window that a user is interested in.
  • An API method for performing these functions is disclosed. The GIS services API involves a set of request messages available to a client 105 (or server), along with a definition of the structure of response messages sent to the client (or server). The response messages from the GIS server 110 to the client 105 may be in a format such as Extensible Markup Language (XML) or JavaScript Object Notation (JSON). The GIS services provided by GIS server 110 can interact programmatically over the network through industry-standard Web protocols, such as, but not limited to, XML, JSON, Hypertext Transfer Protocol (HTTP), Representational State Transfer (REST), and Simple Object Access Protocol (SOAP).
  • According to certain embodiments of the invention, API functions that may be called by the client 105 include:
      • “getHistoricTraces”: to get all the historic movement trajectories from server;
      • “getLiveTraces”: to get all the live movement trajectories from server;
      • “openPlayer”: to ask server to open a player to play the videos captured on the given trajectory; and
      • “openSearchWindowWithPolygon”: to get the movement trajectory inside a search window.
  • The search window is described by a series of latitude/longitude points bounding the area that the user is interested in.
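  • As a non-limiting illustration (not the disclosed implementation), the search-window test can be sketched as a standard ray-casting point-in-polygon check over latitude/longitude points; the function and variable names below are assumptions for illustration:

```python
def point_in_polygon(lat, lon, polygon):
    """Ray-casting test: is (lat, lon) inside the polygon of (lat, lon) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        lat1, lon1 = polygon[i]
        lat2, lon2 = polygon[(i + 1) % n]
        # Count crossings of a ray cast from the point against each edge
        if (lon1 > lon) != (lon2 > lon):
            intersect_lat = lat1 + (lon - lon1) / (lon2 - lon1) * (lat2 - lat1)
            if lat < intersect_lat:
                inside = not inside
    return inside

def traces_in_window(traces, polygon):
    """Keep only trajectory points that fall inside the search window."""
    return {name: [p for p in points if point_in_polygon(p[0], p[1], polygon)]
            for name, points in traces.items()}
```

For example, with a square window from (0, 0) to (10, 10), a trace point at (5, 5) is retained and one at (15, 5) is filtered out.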
  • FIG. 2 is a flowchart illustrating an example operation performed by a GIS server according to an embodiment of the invention. An incoming message from the client may be received by an API server of the GIS server. The incoming message may be received as a SOAP protocol message.
  • When the API/GIS server receives the incoming message, the API/GIS server determines whether the incoming message includes a historic traces request “getHistoricTraces” (202). If the incoming message includes the historic traces request (“Yes” of 202), the API/GIS server may get some or all historic traces and send a response back to the client (204). The server may get this information from a database.
  • The API/GIS server also determines whether the incoming message includes a live traces request “getLiveTraces” (206). If the incoming message includes the live traces request (“Yes” of 206), the API/GIS server may get some or all live traces and send a response back to the client (208). The server may get this information from at least one sensor having a camera.
  • The API/GIS server also determines whether the incoming message includes a video player request “openPlayer” (210). If the incoming message includes the open player request (“Yes” of 210), the API/GIS server may open the video player in the portal (e.g., user interface of client browser) and may stream video to client (212).
  • The API/GIS server also determines whether the incoming message includes a get movement trajectory request “openSearchWindowWithPolygon” (214). If the incoming message includes the get movement trajectory request (“Yes” of 214), the API/GIS server may get the traces located within the search window and send them back to the client (216). Information about the specific trajectory may be obtained from at least one sensor having a camera within a geographic region associated with the search window or from information associated with the search window that is stored in a database.
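  • The dispatch performed in FIG. 2 might be sketched as follows; the handler bodies and the data-source names (db, sensors) are illustrative stand-ins rather than the actual server implementation, and the polygon filter is simplified to a bounding box:

```python
def handle_request(name, args, db, sensors):
    """Dispatch an incoming API message to the matching handler (cf. FIG. 2)."""
    if name == "getHistoricTraces":
        return db["historic_traces"]               # from the database (204)
    if name == "getLiveTraces":
        return sensors["live_traces"]              # from camera-equipped sensors (208)
    if name == "openPlayer":
        channel = args[0]
        return {"action": "open_player", "stream": channel}  # stream video (212)
    if name == "openSearchWindowWithPolygon":
        polygon = args[0]                          # list of (lat, lon) vertices
        lats = [p[0] for p in polygon]
        lons = [p[1] for p in polygon]

        def inside(pt):
            # bounding-box approximation of the full point-in-polygon test
            return min(lats) <= pt[0] <= max(lats) and min(lons) <= pt[1] <= max(lons)

        # traces clipped to the search window (216)
        return {trace: [p for p in pts if inside(p)]
                for trace, pts in db["historic_traces"].items()}
    raise ValueError(f"unknown request: {name}")
```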
  • An example of a SOAP request for “getHistoricTraces” is as follows:
  • <SOAP-ENV:Envelope xmlns:SOAP-ENV=". . ." xmlns:SOAP-ENC=". . ." xmlns:xsi=". . ." xmlns:xsd=". . ."> <SOAP-ENV:Body> <m:getHistoricTraces xmlns:m="http://map_proxy.gis.cms.ibm.com/"/> </SOAP-ENV:Body> </SOAP-ENV:Envelope>
  • An example of a SOAP request for “openPlayer”, which shows an argument for a streaming video channel, is as follows:
  • <SOAP-ENV:Envelope xmlns:SOAP-ENV=". . ." xmlns:SOAP-ENC=". . ." xmlns:xsi=". . ." xmlns:xsd=". . ."> <SOAP-ENV:Body> <m:openPlayer xmlns:m="http://map_proxy.gis.cms.ibm.com/"> <arg0>Channel Live2</arg0> </m:openPlayer> </SOAP-ENV:Body> </SOAP-ENV:Envelope>
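  • For illustration only, a client could assemble such an "openPlayer" envelope programmatically. The builder below is an assumption, not the disclosed implementation; the envelope namespace is elided (". . .") in the patent's example and is therefore left as a placeholder parameter:

```python
def build_open_player_request(channel, soap_env_ns="..."):
    """Assemble the openPlayer SOAP envelope as a string.

    The SOAP envelope namespace is elided in the patent's example, so it is
    kept as a placeholder here rather than guessed."""
    return (
        f'<SOAP-ENV:Envelope xmlns:SOAP-ENV="{soap_env_ns}">'
        '<SOAP-ENV:Body>'
        '<m:openPlayer xmlns:m="http://map_proxy.gis.cms.ibm.com/">'
        f'<arg0>{channel}</arg0>'
        '</m:openPlayer>'
        '</SOAP-ENV:Body>'
        '</SOAP-ENV:Envelope>'
    )
```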
  • A user interface can be provided in which a map is displayed as part of a geographical visualization view of a region. When a user selects an object (such as one of the moving objects) displayed in the geographical visualization view in order to see its video, the user interface can request the video from the GIS server using, for example, the openPlayer request. Once the client receives the streaming video captured by the sensor corresponding to the selected object, the user interface can present a video player in the portal to show the video stream. While watching the video stream, the user can interact with the video player, resulting in changes to the geographical visualization view of the region. For example, the user may manually shift a time pointer in the video stream being watched in the player. In response to receiving this input, the interface can reposition the moving object on the map. In some embodiments, a polygonal projection of the sensor's view to earth surface can be provided as part of the visualization and synchronized with the playback of the video player.
  • In some implementations, the client can request “getHistoricTraces” and “getLiveTraces” at regular intervals, for example, every second. For each request, the server may return an XML-formatted file that contains the names of the movement trajectories and the coordinates of the points of the trajectories. The client may draw these trajectories on the map within a user interface and list them in a Trajectory Control panel provided to the client. In one implementation, related streaming videos that lie on the trajectory path can be displayed in response to receiving an input command, such as a right click by a mouse connected to the client device or by a touch or other gesture of a touchscreen of the client device, on the trajectory path itself. A pop-up window can be displayed with the option of “Open Player” that, if selected, would proceed to send a request for the streaming video to the server and open a window to view that video.
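  • The schema of the XML trace file is not specified in the disclosure. Assuming a simple layout of trajectory elements containing point elements with lat/lon attributes (element and attribute names are assumptions), client-side parsing might look like:

```python
import xml.etree.ElementTree as ET

def parse_traces(xml_text):
    """Parse an assumed trace-file schema into {trajectory name: [(lat, lon), ...]}."""
    root = ET.fromstring(xml_text)
    traces = {}
    for traj in root.findall("trajectory"):
        points = [(float(p.get("lat")), float(p.get("lon")))
                  for p in traj.findall("point")]
        traces[traj.get("name")] = points
    return traces
```

The client would call this once per polling interval and redraw the returned trajectories on the map.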
  • In some implementations, trajectory paths may be drawn in a manner to minimize unnecessarily obscuring other elements on the screen, for example as a thin red line. This, however, can make it more difficult for users to be able to select the trajectory path itself. To mitigate this and help users to select on the trajectory paths, an invisible buffer can be placed around the trajectory path lines that when an input indicating a selection is received on the invisible buffer, it is understood to be a selection of the trajectory path itself. Thus, the trajectory paths can be displayed with minimal disruption to the geographic visualization while still maintaining the ability to select the line for a second action. The user interface can include markers to tag the movement trajectories on the map in case the trajectories are too small to be seen from global view. The Trajectory Path Control panel can list available movement trajectories returned from the server in response to one of the requests for getting trajectories. When a trajectory is selected in the Trajectory Path Control panel, the view of the trajectory path from the perspective of an object along the trajectory path can be provided (e.g., representing a scenario where a user is on board the moving object).
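  • The invisible-buffer hit test can be sketched as a point-to-polyline distance check in screen coordinates; the 8-pixel buffer width below is an arbitrary illustrative value, as the disclosure does not fix one:

```python
import math

def dist_point_to_segment(p, a, b):
    """Distance from point p to segment ab, in screen coordinates."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project the click onto the segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def hit_test(click, polyline, buffer_px=8.0):
    """True if the click lands within the invisible buffer around the polyline."""
    return any(dist_point_to_segment(click, polyline[i], polyline[i + 1]) <= buffer_px
               for i in range(len(polyline) - 1))
```

A click a few pixels off the thin line thus still selects the trajectory, while the rendered line itself stays narrow.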
  • In certain embodiments, client-server bandwidth can be optimized. In one embodiment, the client can periodically consult the server for moving objects instead of using a constant stream of data. The rate of checking for moving objects and the number of moving objects will influence the bandwidth required. Algorithms that deal with time- and speed-based stream predictions and collision detection can be used to minimize the number of checks to the server and the number of objects consulted. To optimize client-server bandwidth, objects that are moving slowly in the viewable window (indicating that they are far away), can be checked for updates less often. Objects that are close or moving at a fast pace are checked more frequently. Also, objects that are not travelling on a collision course with the client's viewable window can be ignored altogether. Of course, if a change of direction is detected, the collision course can be re-evaluated.
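  • A minimal sketch of this speed- and collision-based scheduling follows; the numeric thresholds are illustrative assumptions, since the disclosure does not fix concrete rates:

```python
def update_interval(speed, on_collision_course,
                    slow_interval=5.0, fast_interval=0.5, speed_cutoff=10.0):
    """Seconds between server checks for one object, per the scheme above.

    Objects not on a collision course with the viewable window are ignored
    (None); slow/far objects are checked rarely; close/fast ones often."""
    if not on_collision_course:
        return None
    if speed < speed_cutoff:
        return slow_interval
    return fast_interval
```

If a change of direction is later detected, the caller would re-evaluate the collision course and call this function again.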
  • In certain embodiments, the future location of a moving object is predicted for obtaining visual results of moving objects (positions) to optimize client-server bandwidth. Location prediction refers to statistical methods that derive patterns or mathematical formulas whose purpose is, given the recent trajectory of a moving object, to predict its future location. In one related embodiment, sensor data streams are queried, wherein each update from a sensor is associated with a function allowing prediction of future values of that sensor. The sensor commits to update its value whenever the difference between the observed value and the value estimated using the prediction function exceeds a certain threshold. Location prediction enables selective transfer of moving objects' data from the server to the client. More specifically, moving objects whose locations are predicted to be viewable will be transferred, whereas other moving objects' data will not, to optimize client-server bandwidth.
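  • Assuming a constant-velocity prediction function (the disclosure leaves the choice of prediction function open), the threshold-based update commitment can be sketched as:

```python
def predict(last_pos, velocity, dt):
    """Linear dead-reckoning prediction of position after dt time units."""
    return (last_pos[0] + velocity[0] * dt, last_pos[1] + velocity[1] * dt)

def needs_update(observed, last_pos, velocity, dt, threshold):
    """The sensor commits to send an update only when the observed position
    drifts from the predicted one by more than the threshold."""
    px, py = predict(last_pos, velocity, dt)
    drift = ((observed[0] - px) ** 2 + (observed[1] - py) ** 2) ** 0.5
    return drift > threshold
</antml```

Between updates the server (and client) extrapolate with predict(), so only objects whose predicted locations fall in the viewable window need be transferred.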
  • The subject systems and methods can be used in a wide variety of applications and settings including, but not limited to, weather monitoring, troop dispatching, endangered species tracking, disaster mitigation, general aviation monitoring, fleet management, transportation and highway patrol problems, traffic analysis and visualization, commanding and controlling mobile sensors; commanding and controlling operations (e.g., homeland security, law enforcement and disaster response).
  • In one embodiment, the system and method of the invention enables situational monitoring by law enforcement (e.g., notice is provided to law enforcement regarding a hit-and-run accident). In a specific embodiment, video surveillance recordings, which are used in specific locations, are accessed in real time and integrated with other forms of critical information (e.g., airborne and vehicle-borne sensors). By way of example, law enforcement officers would be able to use the invention to quickly pinpoint the geographic location, view streaming media of the current location to quickly assess the situation, and, through the use of additional sensors, track the offender's vehicle.
  • Certain techniques set forth herein may be described or implemented in the general context of computer-executable instructions, such as program modules, executed by one or more computing devices. Generally, program modules include routines, programs, objects, components, and data structures that perform particular tasks or implement particular abstract data types.
  • Embodiments may be implemented as a computer process, a computing system, or as an article of manufacture, such as a computer program product or computer-readable medium. Certain methods and processes described herein can be embodied as code and/or data, which may be stored on one or more computer-readable media. Certain embodiments of the invention contemplate the use of a machine in the form of a computer system within which a set of instructions, when executed, can cause the system to perform any one or more of the methodologies discussed above. Certain computer program products may be one or more computer-readable storage media readable by a computer system and encoding a computer program of instructions for executing a computer process.
  • Computer-readable media can be any available computer-readable storage media or communication media that can be accessed by the computer system.
  • Communication media include the mechanisms by which a communication signal containing, for example, computer-readable instructions, data structures, program modules, or other data, is transmitted from one system to another system. The communication media can include guided transmission media, such as cables and wires (e.g., fiber optic, coaxial, and the like), and wireless (unguided transmission) media, such as acoustic, electromagnetic, RF, microwave and infrared, that can propagate energy waves. Communication media, particularly carrier waves and other propagating signals that may contain data usable by a computer system, are not included as computer-readable storage media.
  • By way of example, and not limitation, computer-readable storage media may include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. For example, a computer-readable storage medium includes, but is not limited to, volatile memory such as random access memories (RAM, DRAM, SRAM); and non-volatile memory such as flash memory, various read-only-memories (ROM, PROM, EPROM, EEPROM), magnetic and ferromagnetic/ferroelectric memories (MRAM, FeRAM), and magnetic and optical storage devices (hard drives, magnetic tape, CDs, DVDs); or other media now known or later developed that is capable of storing computer-readable information/data for use by a computer system. “Computer-readable storage media” do not consist of carrier waves or propagating signals.
  • In addition, the methods and processes described herein can be implemented in hardware modules. For example, the hardware modules can include, but are not limited to, application-specific integrated circuit (ASIC) chips, field programmable gate arrays (FPGAs), and other programmable logic devices now known or later developed. When the hardware modules are activated, the hardware modules perform the methods and processes included within the hardware modules.
  • Any reference in this specification to “one embodiment,” “an embodiment,” “example embodiment,” etc., means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the invention. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment. In addition, any elements or limitations of any invention or embodiment thereof disclosed herein can be combined with any and/or all other elements or limitations (individually or in any combination) or any other invention or embodiment thereof disclosed herein, and all such combinations are contemplated within the scope of the invention without limitation thereto.
  • It should be understood that the examples and embodiments described herein are for illustrative purposes only and that various modifications or changes in light thereof will be suggested to persons skilled in the art and are to be included within the spirit and purview of this application.

Claims (16)

What is claimed is:
1. A method of rendering a user interface to a geographical information system (GIS), comprising:
presenting a geographical visualization view of a region in the user interface to the GIS, the geographical visualization view comprising a map of the region and one or more objects in the region that may move over time; and
in response to receiving a video request selection of a particular video sensor of the one or more objects in the geographical visualization view of the region, invoking a video request for the particular video sensor and displaying a video stream of streaming video from the particular video sensor that is received in response to the video request.
2. The method of claim 1, wherein displaying the video stream comprises presenting a video player in a sub-window in the user interface to the GIS.
3. The method of claim 2, further comprising:
in response to receiving an indication of a user manually shifting a time pointer of the video in the video player, displaying a particular object corresponding to the particular video sensor in a different position on the map in the geographical visualization view, the position corresponding to an actual location or an estimated location of the particular object at a time indicated by the time pointer in the video player.
4. The method of claim 3, further comprising in response to receiving the indication of the user manually shifting the time pointer of the video stream in the video player, rendering on the map a polygonal projection onto the Earth surface of the particular video sensor's observed view, the polygonal projection having a movement across the map synchronized with the playback of the video stream.
5. The method of claim 1, further comprising:
presenting a trajectory path control panel listing available movement trajectories sent from the GIS in response to a historical trajectory request for historical trajectories of any of the one or more objects in the region.
6. The method of claim 5, further comprising:
in response to receiving a selection of a particular trajectory of a particular object of the one or more objects in the region listed in the trajectory path control panel, animating the map and the trajectory path from a perspective of the particular object along a path of the particular trajectory.
7. The method of claim 5, wherein the historical trajectory request is invoked at regular intervals.
8. The method of claim 1, wherein the video request selection comprises an indication of a selection command on a trajectory path of a particular object of the one or more objects presented on the map.
9. The method of claim 1, further comprising:
requesting information on the one or more moving objects within the geographical visualization view of the region at a first rate for any objects of the one or more moving objects moving at a first range of speed and at a second rate for any objects of the one or more moving objects moving at a second range of speed, wherein the first rate is lower than the second rate and the first range of speed is slower than the second range of speed.
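Claim 9 describes polling slow-moving objects less often than fast-moving ones. A minimal sketch of that rate assignment follows; the speed threshold and interval values are illustrative assumptions, not values from the specification:

```python
def schedule_requests(objects, slow_max=5.0, slow_interval=10.0, fast_interval=1.0):
    """Assign each object a polling interval (seconds) by speed range:
    objects moving at or below slow_max (m/s) fall in the slower range
    and get the lower request rate (longer interval); faster objects
    get the higher request rate (shorter interval)."""
    return {
        obj_id: (slow_interval if speed <= slow_max else fast_interval)
        for obj_id, speed in objects.items()
    }
```

A client loop would then issue a position request for each object whenever its assigned interval elapses, so bandwidth scales with how quickly each object's position actually changes.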
10. The method of claim 9, wherein the first rate and the second rate are determined by the GIS.
11. A method of rendering a user interface to a geographical information system (GIS), comprising:
presenting a geographical visualization view of a region in the user interface to the GIS, the geographical visualization view comprising a map of the region and trajectories of one or more objects in the region that may move over time; and
allowing a user to select an object of the one or more objects in the region and/or a point in time by an interaction with the map at or within a buffer around a rendered polyline of a particular trajectory of the object of the one or more objects in the region, whereby a selected object is determined as the object of the nearest trajectory to a point or region of the interaction and a selected point in time is determined as the point in time at which the object was located at a point on the particular trajectory closest to the point or region of the interaction,
wherein selection of the object and/or the point in time at least indicates an interest in a visualization of the map in space and/or time.
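Claim 11 selects both an object and a point in time from a single interaction near a rendered trajectory polyline. As an illustrative sketch (not the claimed method itself: planar coordinates, the data layout, and all names are assumptions), the hit test can find the nearest polyline segment within the buffer and interpolate the timestamp at the closest point:

```python
import math

def _nearest_on_segment(px, py, ax, ay, bx, by):
    """Closest point on segment AB to point P, as (distance, fraction along AB)."""
    dx, dy = bx - ax, by - ay
    seg2 = dx * dx + dy * dy
    f = 0.0 if seg2 == 0 else max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg2))
    cx, cy = ax + f * dx, ay + f * dy
    return math.hypot(px - cx, py - cy), f

def pick_trajectory(click, trajectories, buffer_dist):
    """Hit-test a click against trajectory polylines.
    trajectories: {obj_id: [(t, x, y), ...]} with points in time order.
    Returns (obj_id, time) for the nearest trajectory point lying
    within buffer_dist of the click, or None if nothing is that close."""
    px, py = click
    best = None  # (distance, obj_id, time)
    for obj_id, pts in trajectories.items():
        for (t0, ax, ay), (t1, bx, by) in zip(pts, pts[1:]):
            d, f = _nearest_on_segment(px, py, ax, ay, bx, by)
            if d <= buffer_dist and (best is None or d < best[0]):
                best = (d, obj_id, t0 + f * (t1 - t0))
    return None if best is None else (best[1], best[2])
```

The returned pair corresponds directly to the claim's "selected object" (nearest trajectory) and "selected point in time" (when the object was at the closest point on that trajectory).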
12. The method of claim 11, further comprising:
in response to receiving an indication of the selected object and/or the selected point of time, rendering on the map polygonal projections onto the Earth surface of any relevant video sensors' observed views.
13. The method of claim 12, wherein the polygonal projections have movement across the map when the map is animated in time.
14. The method of claim 12, wherein the relevant video sensors comprise at least one moving video sensor, at least one stationary video sensor, or a combination thereof.
15. The method of claim 12, further comprising, in response to receiving an indication of a selection of a polygon representing the polygonal projection onto the Earth surface of a particular video sensor's observed view, invoking a video request for the particular video sensor and displaying a video stream of streaming video from the particular video sensor that is received in response to the video request.
16. The method of claim 12, further comprising, in response to receiving an indication of a selection of a polygon representing the polygonal projection onto the Earth surface of a particular video sensor's observed view at a particular time, invoking a video player showing playback of video for the particular video sensor relevant to the particular time.
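Claims 15 and 16 involve selecting a polygon that represents a sensor's projected field of view on the map. A standard ray-casting point-in-polygon test, shown below, is one conventional way to implement that hit test; it is an illustrative implementation choice, not one dictated by the claims:

```python
def point_in_polygon(pt, poly):
    """Ray-casting test: count crossings of a horizontal ray from pt
    with the polygon's edges; an odd count means pt is inside.
    poly is a list of (x, y) vertices in order."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's y level
            xcross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xcross:
                inside = not inside
    return inside
```

On selection, the client would run this test against each rendered view polygon and, on a hit, issue the video request (claim 15) or open the player at the relevant time (claim 16).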
US15/652,993 2013-03-15 2017-07-18 Streaming representation of moving objects and shapes in a geographic information service Pending US20170316033A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US201361792985P 2013-03-15 2013-03-15
US14/215,484 US9734161B2 (en) 2013-03-15 2014-03-17 Streaming representation of moving objects and shapes in a geographic information service
US15/652,993 US20170316033A1 (en) 2013-03-15 2017-07-18 Streaming representation of moving objects and shapes in a geographic information service

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/215,484 Division US9734161B2 (en) 2013-03-15 2014-03-17 Streaming representation of moving objects and shapes in a geographic information service

Publications (1)

Publication Number Publication Date
US20170316033A1 2017-11-02

Family

ID=51533259

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/215,484 Active 2035-01-22 US9734161B2 (en) 2013-03-15 2014-03-17 Streaming representation of moving objects and shapes in a geographic information service
US15/652,993 Pending US20170316033A1 (en) 2013-03-15 2017-07-18 Streaming representation of moving objects and shapes in a geographic information service

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/215,484 Active 2035-01-22 US9734161B2 (en) 2013-03-15 2014-03-17 Streaming representation of moving objects and shapes in a geographic information service

Country Status (1)

Country Link
US (2) US9734161B2 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016153790A1 (en) * 2015-03-23 2016-09-29 Oracle International Corporation Knowledge-intensive data processing system
US9693386B2 (en) 2014-05-20 2017-06-27 Allied Telesis Holdings Kabushiki Kaisha Time chart for sensor based detection system
PH12013000136A1 (en) * 2013-05-23 2015-01-21 De Antoni Ferdinand Evert Karoly A domain agnostic method and system for the capture, storage, and analysis of sensor readings
US10084871B2 (en) 2013-05-23 2018-09-25 Allied Telesis Holdings Kabushiki Kaisha Graphical user interface and video frames for a sensor based detection system
US20150338447A1 (en) 2014-05-20 2015-11-26 Allied Telesis Holdings Kabushiki Kaisha Sensor based detection system
US9779183B2 (en) 2014-05-20 2017-10-03 Allied Telesis Holdings Kabushiki Kaisha Sensor management and sensor analytics system
CN107409064B (en) * 2015-10-23 2020-06-05 NEC Laboratories Europe GmbH Method and system for supporting detection of irregularities in a network
JP6343316B2 (en) * 2016-09-16 2018-06-13 パナソニック株式会社 Terminal device, communication system, and communication control method
CN106777271A (en) * 2016-12-29 2017-05-31 广东南方数码科技股份有限公司 Method for automatically building a system based on a service resource pool
KR20180131789A (en) * 2017-06-01 2018-12-11 현대자동차주식회사 System and method for providing forward traffic information during stop
CN109902138B (en) * 2019-03-07 2021-01-08 中国水利水电科学研究院 Urban one-dimensional hydrodynamic simulation basic data topological relation construction and encoding method based on GIS
US20200338983A1 (en) * 2019-04-25 2020-10-29 Aptiv Technologies Limited Graphical user interface for display of autonomous vehicle behaviors

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020067412A1 (en) * 1994-11-28 2002-06-06 Tomoaki Kawai Camera controller
US20050073438A1 (en) * 2003-09-23 2005-04-07 Rodgers Charles E. System and method for providing pedestrian alerts
US20070168090A1 (en) * 2006-01-19 2007-07-19 Lockheed Martin Corporation System for maintaining communication between teams of vehicles
US20120280087A1 (en) * 2011-05-03 2012-11-08 Raytheon Company Unmanned Aerial Vehicle Control Using a Gamepad
US20120316782A1 (en) * 2011-06-09 2012-12-13 Research In Motion Limited Map Magnifier
US20130225180A1 (en) * 2012-02-29 2013-08-29 Lg Electronics Inc. Method and Apparatus for Performing Handover Using Path Information in Wireless Communication System
US20140068439A1 (en) * 2012-09-06 2014-03-06 Alberto Daniel Lacaze Method and System for Visualization Enhancement for Situational Awareness

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6697103B1 (en) * 1998-03-19 2004-02-24 Dennis Sunga Fernandez Integrated network for monitoring remote objects
US6801850B1 (en) * 2000-10-30 2004-10-05 University of Illinois - Chicago Method and system for tracking moving objects
US7892078B2 (en) * 2005-12-30 2011-02-22 Microsoft Corporation Racing line optimization
US7830276B2 (en) * 2007-06-18 2010-11-09 Honeywell International Inc. System and method for displaying required navigational performance corridor on aircraft map display
US8869038B2 (en) * 2010-10-06 2014-10-21 Vistracks, Inc. Platform and method for analyzing real-time position and movement data
US9171079B2 (en) * 2011-01-28 2015-10-27 Cisco Technology, Inc. Searching sensor data
US9013352B2 (en) * 2011-04-25 2015-04-21 Saudi Arabian Oil Company Method, system, and machine to track and anticipate the movement of fluid spills when moving with water flow
US9076259B2 (en) * 2011-09-14 2015-07-07 Imagine Communications Corp Geospatial multiviewer
US8762048B2 (en) * 2011-10-28 2014-06-24 At&T Mobility Ii Llc Automatic travel time and routing determinations in a wireless network
WO2013184528A2 (en) * 2012-06-05 2013-12-12 Apple Inc. Interactive map

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160091977A1 (en) * 2014-09-26 2016-03-31 The Florida International University Board Of Trustees Gesture discernment and processing system
US9886190B2 (en) * 2014-09-26 2018-02-06 The Florida International University Board Of Trustees Gesture discernment and processing system
US10231085B1 (en) 2017-09-30 2019-03-12 Oracle International Corporation Scaling out moving objects for geo-fence proximity determination
US10349210B2 (en) 2017-09-30 2019-07-09 Oracle International Corporation Scaling out moving objects for geo-fence proximity determination
CN108319716A (en) * 2018-02-09 2018-07-24 北京天元创新科技有限公司 Method and system for automatically generating path segments from a collection of communication line resources

Also Published As

Publication number Publication date
US9734161B2 (en) 2017-08-15
US20140280319A1 (en) 2014-09-18

Similar Documents

Publication Publication Date Title
US20180188793A1 (en) Location-Based Facial Recognition on Online Social Networks
AU2017204181B2 (en) Video camera scene translation
US20170293847A1 (en) Crime risk forecasting
Luque-Ayala et al. The maintenance of urban circulation: An operational logic of infrastructural control
US9586314B2 (en) Graphical rendition of multi-modal data
EP2911410B1 (en) Method and apparatus for providing selection and prioritization of sensor data
Cheng et al. Event detection using Twitter: a spatio-temporal approach
US10061486B2 (en) Area monitoring system implementing a virtual environment
US10460183B2 (en) Method and system for providing behavior of vehicle operator using virtuous cycle
US9363489B2 (en) Video analytics configuration
EP3134829B1 (en) Selecting time-distributed panoramic images for display
US10335677B2 (en) Augmented reality system with agent device for viewing persistent content and method of operation thereof
EP2801069B1 (en) System and method for displaying information local to a selected area
US10594777B2 (en) Methods, systems, and media for controlling information used to present content on a public display device
US20160240087A1 (en) System and method of preventing and remedying restricted area intrusions by unmanned aerial vehicles
Chen et al. Dynamic urban surveillance video stream processing using fog computing
CA2848215C (en) Geospatial multiviewer
US9916122B2 (en) Methods, systems, and media for launching a mobile application using a public display device
US9797740B2 (en) Method of determining trajectories through one or more junctions of a transportation network
US10810695B2 (en) Methods and systems for security tracking and generating alerts
KR101627700B1 (en) Pushing suggested search queries to mobile devices
US8146009B2 (en) Real time map rendering with data clustering and expansion and overlay
US20190037179A1 (en) Systems and methods for video analysis rules based on map data
US8812990B2 (en) Method and apparatus for presenting a first person world view of content
Kazemitabar et al. Geospatial stream query processing using Microsoft SQL Server StreamInsight

Legal Events

Date Code Title Description
AS Assignment

Owner name: THE FLORIDA INTERNATIONAL UNIVERSITY BOARD OF TRUSTEES

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:RISHE, NAPHTALI DAVID;REEL/FRAME:043035/0232

Effective date: 20140310

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STCB Information on status: application discontinuation

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED