GB2516155A - Interactive geospatial map - Google Patents

Interactive geospatial map

Info

Publication number
GB2516155A
Authority
GB
United Kingdom
Prior art keywords
objects
features
map
user
metadata
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1408025.3A
Other versions
GB201408025D0 (en)
GB2516155B (en)
Inventor
Dan Cervelli
Cai Gogwilt
Bobby Prochnow
Current Assignee
Palantir Technologies Inc
Original Assignee
Palantir Technologies Inc
Priority date
Filing date
Publication date
Priority to US201361820608P priority Critical
Priority to US13/917,571 priority patent/US8799799B1/en
Application filed by Palantir Technologies Inc filed Critical Palantir Technologies Inc
Publication of GB201408025D0 publication Critical patent/GB201408025D0/en
Publication of GB2516155A publication Critical patent/GB2516155A/en
Application granted granted Critical
Publication of GB2516155B publication Critical patent/GB2516155B/en
Application status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
    • G06F3/0485 Scrolling or panning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04812 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment: interaction techniques based on cursor appearance or behaviour being affected by the presence of displayed objects, e.g. visual feedback during interaction with elements of a graphical user interface through change in cursor appearance, constraint movement or attraction/repulsion with respect to a displayed object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation, e.g. computer aided management of electronic mail or groupware; Time management, e.g. calendars, reminders, meetings or time accounting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00 Commerce, e.g. shopping or e-commerce
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06Q DATA PROCESSING SYSTEMS OR METHODS, SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL, SUPERVISORY OR FORECASTING PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04806 Zoom, i.e. interaction techniques or interactors for controlling the zooming operation

Abstract

Disclosed is a method of generating a graphical user interface for interactive maps. The maps have features and objects displayed on them, and histograms overlaid on the map. The GUI accesses a database that stores the features and objects with associated metadata; the features shown on the map are selectable by the user. In response to a user input selecting some of the features, the system determines the metadata associated with the selected features and the metadata categories associated with that metadata. For each metadata category, the system generates a histogram showing the metadata values or value ranges associated with the selected features, each histogram including a visual indication of the quantity of selected features on the map having that metadata value. The features or objects may be roads, terrain, lakes, rivers, vegetation, utilities, street lights, railways, hotels, schools, hospitals, regions, transportation objects, events, and/or documents.

Description

Interactive Geospatial Map

TECHNICAL FIELD

The present disclosure relates to systems and techniques for geographical data integration, analysis, and visualization. More specifically, the present disclosure relates to interactive maps including data objects.

BACKGROUND

Interactive geographical maps, such as web-based mapping service applications and Geographical Information Systems (GIS), are available from a number of providers.

Such maps generally comprise satellite images or generic base layers overlaid by roads.

Users of such systems may generally search for and view locations of a small number of landmarks, and determine directions from one location to another. In some interactive graphical maps, 3D terrain and/or 3D buildings may be visible in the interface.

SUMMARY

The systems, methods, and devices described herein each have several aspects, no single one of which is solely responsible for its desirable attributes. Without limiting the scope of this disclosure, several non-limiting features will now be discussed briefly.

The systems, methods, and devices of the present disclosure may provide, among other features, high-performance, interactive geospatial and/or data object map capabilities in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. In various embodiments, an interactive geospatial map system (also referred to as an interactive data object map system) may enable rapid and deep analysis of various objects, features, and/or metadata by the user. In some embodiments, a layer ontology may be displayed to the user. In various embodiments, when the user rolls a selection cursor over an object/feature an outline of the object/feature is displayed.

Selection of an object/feature may cause display of metadata associated with that object/feature. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user. Various aspects of the present disclosure may enable the user to perform geosearches, generate heatmaps, and/or perform keyword searches, among other actions.

In an embodiment, a computer system is disclosed comprising an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on an electronic display of the computer system; include on the interactive map one or more features or objects, wherein the features or objects are selectable by a user of the computer system, and wherein the features or objects are accessed from the electronic data structure; receive a first input from the user selecting one or more of the included features or objects; and in response to the first input, access, from the electronic data structure, the metadata associated with each of the selected features or objects; determine one or more metadata categories based on the accessed metadata; organize the selected features or objects into one or more histograms based on the determined metadata categories and the accessed metadata; and display the one or more histograms on the electronic display.

According to an aspect, the features or objects may comprise vector data.

According to another aspect, the features or objects may comprise at least one of roads, terrain, lakes, rivers, vegetation, utilities, street lights, railroads, hotels or motels, schools, hospitals, buildings or structures, regions, transportation objects, entities, events, or documents.

According to yet another aspect, the metadata associated with the features or objects may comprise at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.

According to another aspect, the features or objects may be selectable by a user using a mouse and/or a touch interface.

According to yet another aspect, each histogram of the one or more histograms may be specific to a particular metadata category.

According to another aspect, each histogram of the one or more histograms may comprise a list of items of metadata specific to the particular metadata category of the histogram, wherein the list of items is organized in descending order from an item having the largest number of related objects or features to an item having the smallest number of related objects or features.
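The histogram flow described above (determine the metadata categories present in a selection, count the selected features carrying each value, and order each list from largest to smallest count) can be sketched in Python. The feature records, field names, and values below are illustrative assumptions, not taken from the patent:

```python
from collections import Counter

# Hypothetical selected features: each carries a metadata dict, as in the claim.
features = [
    {"name": "Lincoln Elementary", "metadata": {"type": "school", "grade level": "K-6", "county": "Arlington"}},
    {"name": "Jefferson Middle",   "metadata": {"type": "school", "grade level": "6-8", "county": "Arlington"}},
    {"name": "Route 50",           "metadata": {"type": "road",   "speed": "25 mph",   "county": "Fairfax"}},
]

def build_histograms(selected):
    """Group selected features into one histogram per metadata category."""
    # 1. Determine the metadata categories present across the selection.
    categories = {key for f in selected for key in f["metadata"]}
    # 2. For each category, count how many selected features carry each value.
    histograms = {}
    for category in categories:
        counts = Counter(
            f["metadata"][category] for f in selected if category in f["metadata"]
        )
        # 3. Order items from the largest to the smallest number of features.
        histograms[category] = counts.most_common()
    return histograms

hists = build_histograms(features)
print(hists["county"])  # [('Arlington', 2), ('Fairfax', 1)]
```

Each returned list corresponds to one displayed histogram, with the count serving as the visual quantity indication.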

According to yet another aspect, the one or more histograms displayed on the electronic display may be displayed so as to partially overlay the displayed interactive map.

According to another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a second input from the user selecting a second one or more features or objects from the one or more histograms; and in response to the second input, update the interactive map to display the second one or more features or objects on the display; and highlight the second one or more features or objects on the interactive map.

According to yet another aspect, updating the interactive map may comprise panning and/or zooming.

According to another aspect, highlighting the second one or more features may comprise at least one of outlining, changing color, bolding, or changing contrast.

According to yet another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a third input from the user selecting a drill-down group of features or objects from the one or more histograms; and in response to the third input, drill down on the selected drill-down group of features or objects by: accessing the metadata associated with each of the features or objects of the selected drill-down group; determining one or more drill-down metadata categories based on the accessed metadata associated with each of the features or objects of the selected drill-down group; organizing the features or objects of the selected drill-down group into one or more drill-down histograms based on the determined drill-down metadata categories and the accessed metadata associated with each of the features or objects of the selected drill-down group; and displaying on the interactive map the one or more drill-down histograms.

According to another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to enable the user to further drill down into the one or more drill-down histograms.
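One way to read the drill-down aspect: narrow the selection to the features behind a chosen histogram bucket, then rebuild the histograms over that narrowed group (and repeat to drill further). A minimal sketch, with hypothetical records:

```python
from collections import Counter

# Hypothetical features; the "metadata" keys play the role of metadata categories.
features = [
    {"id": 1, "metadata": {"type": "school", "county": "Arlington"}},
    {"id": 2, "metadata": {"type": "school", "county": "Fairfax"}},
    {"id": 3, "metadata": {"type": "road",   "county": "Arlington"}},
]

def histograms_for(selected):
    """One histogram (value -> count, descending) per metadata category."""
    categories = {k for f in selected for k in f["metadata"]}
    return {
        c: Counter(f["metadata"][c] for f in selected if c in f["metadata"]).most_common()
        for c in categories
    }

def drill_down(selected, category, value):
    """Narrow the selection to features matching one histogram bucket,
    then rebuild histograms over the narrowed group."""
    group = [f for f in selected if f["metadata"].get(category) == value]
    return group, histograms_for(group)

# Drill down on the "school" bucket of the "type" histogram.
group, hists = drill_down(features, "type", "school")
print([f["id"] for f in group])  # [1, 2]
```

Because `drill_down` returns an ordinary feature list, the same call can be applied again to its output, matching the "further drill down" aspect.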

According to yet another aspect, the one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object hover over input from the user; and in response to receiving the hover over input, highlight, on the electronic display, metadata associated with the particular hovered over feature or object to the user.

According to another aspect, one or more hardware processors may be further configured to execute the user interface module in order to: receive a feature or object selection input from the user; and in response to receiving the selection input, display, on the electronic display, metadata associated with the particular selected feature or object to the user.

In another embodiment, a computer system is disclosed comprising: an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map tiles accessed from the electronic data structure, the map tiles each comprising an image composed of one or more vector layers; include on the interactive map a plurality of features or objects accessed from the electronic data structure, the features or objects being selectable by a user, each of the features or objects including associated metadata; receive an input from a user including at least one of a zoom action, a pan action, a feature or object selection, a layer selection, a geosearch, a heatmap, and a keyword search; and in response to the input from the user: request, from a server, updated map tiles, the updated map tiles being updated according to the input from the user; receive the updated map tiles from the server; and update the interactive map with the updated map tiles.

According to an aspect, the one or more vector layers may comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.

According to an aspect, each of the one or more vector layers may be comprised of one or more sub-vector layers.
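The request/receive/update loop for map tiles in this embodiment might look like the following sketch. The `render_tile` stub stands in for the server-side components; the class, cache key, and tile-naming scheme are assumptions for illustration, not details from the patent:

```python
def render_tile(z, x, y, layers):
    """Stand-in for server-side composition: returns a fake tile 'image'.
    A real client would request this over HTTP from a tile endpoint."""
    return f"tile({z}/{x}/{y})[{'+'.join(sorted(layers))}]"

class TileClient:
    def __init__(self):
        self.cache = {}   # (z, x, y, layers) -> composed tile
        self.view = {}    # (x, y) -> tile currently shown on the map

    def update(self, z, visible, layers):
        """After a zoom, pan, or layer-selection input, request any tiles
        not already held locally and swap them into the interactive map."""
        key_layers = tuple(sorted(layers))
        for (x, y) in visible:
            key = (z, x, y, key_layers)
            if key not in self.cache:               # request updated tile
                self.cache[key] = render_tile(z, x, y, layers)
            self.view[(x, y)] = self.cache[key]     # update the map

client = TileClient()
client.update(z=12, visible=[(0, 0), (0, 1)], layers={"base", "transportation"})
print(client.view[(0, 0)])  # tile(12/0/0)[base+transportation]
```

Keying the cache on zoom, position, and the selected layer set means a layer-selection input naturally triggers fresh tile requests while pans back over old ground are served locally.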

In yet another embodiment, a computer system is disclosed comprising: one or more hardware processors in communication with the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map layers; determine a list of available map layers; organize the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; and display on the interactive map the hierarchical layer ontology, wherein the user may select one or more of the displayed layers, and wherein each of the available map layers is associated with one or more feature or object types.

According to an aspect, the map layers may comprise at least one of vector layers and base layers.
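Organizing a flat list of available layers into a hierarchical ontology in which like map layers are grouped together might be sketched as follows; the layer names and groupings are illustrative, not taken from the patent:

```python
# Each available layer is paired with the group it belongs to.
available = [
    ("Roads", "Transportation"),
    ("Railroads", "Transportation"),
    ("Lakes", "Terrain"),
    ("Rivers", "Terrain"),
    ("Street Lights", "Utilities/Infrastructure"),
]

def build_layer_ontology(layers):
    """Group like layers together into a two-level hierarchy
    suitable for display as a selectable layer tree."""
    ontology = {}
    for name, group in layers:
        ontology.setdefault(group, []).append(name)
    return ontology

ontology = build_layer_ontology(available)
print(ontology["Terrain"])  # ['Lakes', 'Rivers']
```

A deeper ontology (sub-vector layers under each vector layer, as mentioned above) would simply nest this grouping another level.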

BRIEF DESCRIPTION OF THE DRAWINGS

The following aspects of the disclosure will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings.

Figure 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 2A illustrates a sample user interface of the interactive data object map system in which map layers are displayed to a user, according to an embodiment of the present disclosure.

Figure 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure.

Figure 2C illustrates a sample user interface of the interactive data object map system in which various objects are displayed, according to an embodiment of the present disclosure.

Figure 3A illustrates a sample user interface of the interactive data object map system in which objects are selected, according to an embodiment of the present disclosure.

Figures 3B-3G illustrate sample user interfaces of the interactive data object map system in which objects are selected and a histogram is displayed, according to embodiments of the present disclosure.

Figures 3H-3I illustrate sample user interfaces of the interactive data object map system in which objects are selected and a list of objects is displayed, according to embodiments of the present disclosure.

Figures 3J-3K illustrate sample user interfaces of the interactive data object map system in which objects are outlined when hovered over, according to embodiments of the present disclosure.

Figures 4A-4D illustrate sample user interfaces of the interactive data object map system in which a radius geosearch is displayed, according to embodiments of the present disclosure.

Figures 5A-5D illustrate sample user interfaces of the interactive data object map system in which a heatmap is displayed, according to embodiments of the present disclosure.

Figures 5E-5F illustrate sample user interfaces of the interactive data object map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure.

Figure 5G illustrates a sample user interface of the interactive data object map system in which a keyword object search is displayed, according to an embodiment of the present disclosure.

Figure 5H illustrates an example of a UTF grid of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 6A shows a flow diagram depicting illustrative client-side operations of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 7A shows a flow diagram depicting illustrative server-side operations of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 7B shows a flow diagram depicting illustrative server-side layer composition of the interactive data object map system, according to an embodiment of the present disclosure.

Figure 8A illustrates one embodiment of a database system using an ontology.

Figure 8B illustrates one embodiment of a system for creating data in a data store using a dynamic ontology.

Figure 8C illustrates a sample user interface using relationships described in a data store using a dynamic ontology.

Figure 8D illustrates a computer system with which certain methods discussed herein may be implemented.

DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS

Overview

In general, a high-performance, interactive data object map system (or "map system") is disclosed in which large amounts of geographical, geospatial, and other types of data, geodata, objects, features, and/or metadata are efficiently presented to a user on a map interface. The interactive data object map system allows for rapid and deep analysis of various objects, features, and/or metadata by the user. For example, millions of data objects and/or features may be simultaneously viewed and selected by the user on the map interface. A layer ontology may be displayed to the user that allows the user to select and view particular layers. In various embodiments, when the user rolls a selection cursor over an object/feature (and/or otherwise selects the object/feature) an outline of the object/feature is displayed. Selection of an object/feature may cause display of metadata associated with that object/feature.

In an embodiment, the user may rapidly zoom in and out and/or move and pan around the map interface to variously see more or less detail, and more or fewer objects. In various embodiments, the interactive data object map system may automatically generate feature/object lists and/or histograms based on selections made by the user.

In various embodiments, the user may perform geosearches (based on any selections and/or drawn shapes), generate heatmaps, and/or perform keyword searches, among other actions as described below.

In an embodiment, the interactive data object map system includes server-side computer components and/or client-side computer components. The client-side components may implement, for example, displaying map tiles, showing object outlines, allowing the user to draw shapes, and/or allowing the user to select objects/features, among other actions. The server-side components may implement, for example, composition of layers into map tiles, caching of composed map tiles and/or layers, and/or providing object/feature metadata, among other actions. Such functions may be distributed in any other manner. In an embodiment, object/feature outlines and/or highlighting are accomplished on the client side through the use of a UTF grid.
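The patent does not spell out the UTF grid format, but in the commonly used UTFGrid encoding (assumed here), each grid cell holds a character whose codepoint maps back to a key index, letting the client resolve which feature sits under the cursor without a server round trip:

```python
def decode_utfgrid_char(ch):
    """Map a UTFGrid cell character back to its key index.
    Follows the common UTFGrid encoding (id + 32, skipping the
    codepoints for '"' and '\\'); this scheme is an assumption,
    not something specified in the patent text."""
    code = ord(ch)
    if code >= 93:
        code -= 1
    if code >= 35:
        code -= 1
    return code - 32

# A tiny 4x4 grid: each character identifies the feature under that
# pixel block, so hover/outline lookups are purely client-side.
grid = ["    ", " !! ", " !! ", "    "]
keys = ["", "building-42"]   # index 0 = no feature under the cursor

def feature_at(col, row):
    return keys[decode_utfgrid_char(grid[row][col])]

print(feature_at(1, 1))  # building-42
```

Because the grid ships alongside the tile, rolling the cursor over an object can trigger an outline immediately, with metadata fetched only on selection.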

Definitions

In order to facilitate an understanding of the systems and methods discussed herein, a number of terms are defined below. The terms defined below, as well as other terms used herein, should be construed to include the provided definitions, the ordinary and customary meaning of the terms, and/or any other implied meaning for the respective terms. Thus, the definitions below do not limit the meaning of these terms, but only provide exemplary definitions.

Ontology: A hierarchical arrangement and/or grouping of data according to similarities and differences. The present disclosure describes two ontologies. The first relates to the arrangement of vector layers consisting of map and object data as used by the interactive data object map system (as described below with reference to Figures 2A-2B). The second relates to the storage and arrangement of data objects in one or more databases (as described below with reference to Figures 8A-8C). For example, the stored data may comprise definitions for object types and property types for data in a database, and how objects and properties may be related.

Database: A broad term for any data structure for storing and/or organizing data, including, but not limited to, relational databases (Oracle database, mySQL database, etc.), spreadsheets, XML files, and text files, among others.

Data Object, Object, or Feature: A data container for information representing specific things in the world that have a number of definable properties. For example, a data object can represent an entity such as a person, a place, an organization, a market instrument, or other noun. A data object can represent an event that happens at a point in time or for a duration. A data object can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article.

Each data object may be associated with a unique identifier that uniquely identifies the data object. The object's attributes (e.g. metadata about the object) may be represented in one or more properties. For the purposes of the present disclosure, the terms "feature," "data object," and "object" may be used interchangeably to refer to items displayed on the map interface of the interactive data object map system, and/or otherwise accessible to the user through the interactive data object map system.

Features/objects may generally include, but are not limited to, roads, terrain (such as hills, mountains, rivers, and vegetation, among others), street lights (which may be represented by a streetlight icon), railroads, hotels/motels (which may be represented by a bed icon), schools (which may be represented by a parent-child icon), hospitals, other types of buildings or structures, regions, transportation objects, and other types of entities, events, and documents, among others. Objects displayed on the map interface generally comprise vector data, although other types of data may also be displayed. Objects generally have associated metadata and/or properties.

Object Type: Type of a data object (e.g., Person, Event, or Document). Object types may be defined by an ontology and may be modified or updated to include additional object types. An object definition (e.g., in an ontology) may include how the object is related to other objects, such as being a sub-object type of another object type (e.g. an agent may be a sub-object type of a person object type), and the properties the object type may have.

Properties: Also referred to as "metadata," includes attributes of a data object/feature.

At a minimum, each property/metadata of a data object has a type (such as a property type) and a value or values. Properties/metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school object may include an address (for example, 123 S. Orange Street), a district (for example, 5o9c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In another example, metadata associated with a road object may include a speed (for example, 25 mph), a width (for example, 2 lanes), and/or a county (for example, Arlington), among other items of metadata.

Property Type: The data type of a property, such as a string, an integer, or a double.

Property types may include complex property types, such as a series of data values associated with timed ticks (e.g. a time series), etc.

Property Value: The value associated with a property, which is of the type indicated in the property type associated with the property. A property may have multiple values.
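The object/property model defined in these terms might be represented as follows; the class names are illustrative, and the school metadata simply echoes the examples given above:

```python
from dataclasses import dataclass, field

@dataclass
class Property:
    """A property has a type and one or more values, per the definitions above."""
    type: str
    values: list

@dataclass
class DataObject:
    """A data object: unique identifier, object type, and named properties."""
    id: str
    object_type: str
    properties: dict = field(default_factory=dict)

school = DataObject(
    id="obj-1",
    object_type="School",
    properties={
        "address":     Property("string", ["123 S. Orange Street"]),
        "grade level": Property("string", ["K-6"]),
        "phone":       Property("string", ["800-0000"]),
    },
)
print(school.properties["grade level"].values[0])  # K-6
```

Allowing `values` to be a list accommodates the multi-valued properties and complex property types (such as time series) mentioned in the definitions.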

Link: A connection between two data objects, based on, for example, a relationship, an event, and/or matching properties. Links may be directional, such as one representing a payment from person A to B, or bidirectional.

Link Set: Set of multiple links that are shared between two or more data objects.

Description of the Figures

Embodiments of the disclosure will now be described with reference to the accompanying Figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the disclosure.

Furthermore, embodiments of the disclosure may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the embodiments of the disclosure herein described.

Figure 1 illustrates a sample user interface of the interactive data object map system, according to an embodiment of the present disclosure. The user interface includes a map interface 100, a selection button/icon 102, a shape button/icon 104, a layers button/icon 106, a geosearch button/icon 108, a heat map button/icon 110, a search box 112, a feature information box 114, a coordinates information box 116, map scale information 118, zoom selectors 120, and highlighted features 122. The functionality of the interactive data object map system may be implemented in one or more computer modules and/or processors, as is described below with reference to Figure 8D.

The map interface 100 of Figure 1 is composed of multiple map tiles. The map tiles are generally composed of multiple layers of geographical, vector, and/or other types of data. Vector data layers (also referred to as vector layers) may include associated and/or linked data objects/features. In an embodiment, vector layers are composed of data objects/features. The various data objects and/or features associated with a particular vector layer may be displayed to the user when that particular vector layer is activated. For example, a transportation vector layer may include road, railroad, and bike path objects and/or features that may be displayed to the user when the transportation layer is selected. The layers used to compose the map tiles and the map interface 100 may vary based on, for example, whether a user has selected features displayed in the map interface 100, and/or the particular layers a user has selected for display. In an embodiment, composition of map tiles is accomplished by server-side components of the interactive data object map system. In an embodiment, composed map tiles may be cached by the server-side components to speed up map tile delivery to client-side components. The map tiles may then be transmitted to the client-side components of the interactive data object map system where they are composed into the map interface 100.
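The server-side composition and caching described above can be sketched as follows. The cache key and the stand-in "composition" (string concatenation rather than rasterizing and flattening layer images) are assumptions for illustration:

```python
tile_cache = {}

def compose_tile(z, x, y, selected_layers):
    """Compose the selected layers into one map tile, caching the result
    so repeated requests for the same tile/layer set are served instantly."""
    key = (z, x, y, tuple(sorted(selected_layers)))
    if key in tile_cache:                 # cached composition speeds delivery
        return tile_cache[key]
    rendered = [f"{name}@{z}/{x}/{y}" for name in sorted(selected_layers)]
    tile = " | ".join(rendered)           # stand-in for image flattening
    tile_cache[key] = tile
    return tile

first = compose_tile(10, 163, 395, {"base map", "transportation"})
again = compose_tile(10, 163, 395, {"transportation", "base map"})
assert first is again                     # second request served from cache
print(first)  # base map@10/163/395 | transportation@10/163/395
```

Keying on the sorted layer set means the cache varies with the user's layer selection, consistent with the text's note that composed tiles depend on which layers are selected for display.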

In general, the user interface of Figure 1 is displayed on an electronic display viewable by a user of the interactive data object map system. The user of the interactive data object map system may interact with the user interface of Figure 1 by, for example, touching the display when the display is touch-enabled and/or using a mouse pointer to click on the various elements of the user interface.

The map interface 100 includes various highlighted features 122 and feature icons. For example, the map interface 100 includes roads, buildings and structures, utilities, lakes, rivers, vegetation, and railroads, among other features. The user may interact with the map interface 100 by, for example, rolling over and/or clicking on various features. In one embodiment, rolling over and/or placing the mouse pointer over a feature causes the feature to be outlined and/or otherwise highlighted. Additionally, the name of the feature and/or other information about the feature may be shown in the feature information box 114.

The user of the map system may interact with the user interface of Figure 1 by scrolling or panning up, down, and/or side to side; zooming in or out; selecting features; drawing shapes; selecting layers; performing a geosearch; generating a heat map; and/or performing a keyword search; among other actions as are described below.

Various user actions may reveal more or less map detail, and/or more or fewer features/objects.

Figure 2A illustrates a sample user interface of the map system in which map layers are displayed to a user, according to an embodiment of the present disclosure. In the user interface of Figure 2A, the user has selected the layers button 106, revealing the layers window 202. The layers window 202 includes a list of base layers, vector layers, and user layers. The base layers include, for example, overhead imagery, topographic, blank (Mercator), base map, aviation, and blank (unprojected). The vector layers include general categories such as, for example, regions, buildings/structures, terrain, transportation, and utilities/infrastructure. While no user layers are included in the user interface of Figure 2A, user layers may be added by the user of the map system, as is described below.

In an embodiment, the user may select one or more of the base layers which may be used during composition of the map tiles. For example, selection of the overhead imagery base layer will produce map tiles in which the underlying map tile imagery is made up of recent aerial imagery. Similarly, selection of the topographic base layer will produce map tiles in which the underlying map tile imagery includes topographic map imagery.

Further, in an embodiment, the user may select one or more of the vector layers which may be used during composition of the map tiles. For example, selecting the transportation layer results in transportation-related objects and/or features being displayed on the map tiles. Transportation-related features may include, for example, roads, railroads, street signs, and/or street lights, among others. Examples of transportation-related features may be seen in the user interface of Figure 2A where various roads, railroads, and street light icons are displayed.

In an embodiment, the user of the map system may create and save map layers. These saved map layers may be listed as user layers in the layers window 202.

Figure 2B illustrates an example map layer ontology, according to an embodiment of the present disclosure. As mentioned above with reference to Figure 2A, the list of vector layers in the layers window 202 may include general categories/layers such as regions, buildings/structures, terrain, transportation, and utilities/infrastructure. The vector layers available in the map system may be further organized into an ontology, or hierarchical arrangement. For example, as shown in the vector layers window 206, the buildings/structures category 208 may be further subdivided into layers including structures, government, medical, education, and commercial. The terrain category 210 may include vegetation and/or water/hydrography layers. The utilities/infrastructure category may include fire and/or storage/draining.

In an embodiment, the user of the map system may select one or more of the layers and/or sub-layers of the layer ontology. As shown in Figure 2B, the user has deselected the vegetation sub-layer, and all of the utilities/infrastructure layers. Selecting and deselecting vector layers, or toggling vector layers on and off, may cause the vector objects and/or features associated with those layers to be displayed or not displayed in the map interface. For example, when the user selects the transportation category/layer, road objects associated with the transportation layer may be displayed on the map interface. Likewise, when a user deselects the transportation category/layer, road objects associated with the transportation layer may be removed from the map interface.

In an embodiment, additional hierarchical levels of layers may be displayed to the user. For example, the vector layers window 206 may include sub-sub-layers (for example, the education sub-layer may be divided into elementary schools, secondary schools, and post-secondary schools). Alternatively, fewer hierarchical levels may be displayed to the user.

In an embodiment, each of the vector layers shown in the vector layers window 206 may be made up of many layers of map vector data. In this embodiment, the map system may advantageously generate a simplified layer ontology, such as the one shown in 206. The simplified layer ontology allows the user to easily select layers of interest from a reduced number of layers, rather than a large number of discrete layers. As described above, vector layers may contain data regarding associated features and/or objects. Thus, features visible in the map interface correspond to the currently active/selected layers. In an embodiment, the layer ontology may have an arbitrary depth.
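The hierarchical layer ontology described above can be sketched as a tree of selectable nodes, where toggling a category toggles every sub-layer beneath it and only selected leaf layers are rendered into map tiles. The class and field names below are illustrative assumptions, not the patent's actual data model.

```python
# Minimal sketch of a hierarchical layer ontology with selectable nodes.
# Names and the tree representation are illustrative assumptions.

class LayerNode:
    def __init__(self, name, children=None):
        self.name = name
        self.selected = True
        self.children = children or []

    def set_selected(self, selected):
        # Toggling a category cascades to every sub-layer beneath it.
        self.selected = selected
        for child in self.children:
            child.set_selected(selected)

    def active_leaves(self):
        # Leaf layers are the ones actually rendered into map tiles.
        if not self.children:
            return [self.name] if self.selected else []
        leaves = []
        for child in self.children:
            leaves.extend(child.active_leaves())
        return leaves

ontology = LayerNode("vector", [
    LayerNode("terrain", [LayerNode("vegetation"), LayerNode("water/hydrography")]),
    LayerNode("transportation", [LayerNode("roads"), LayerNode("railroads")]),
])

ontology.children[0].children[0].set_selected(False)  # deselect vegetation
print(ontology.active_leaves())  # → ['water/hydrography', 'roads', 'railroads']
```

Because `active_leaves` recurses to arbitrary depth, the same structure supports the sub-sub-layer case (for example, education subdivided into school levels) without change.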

Figure 2C illustrates a sample user interface of the map system in which various objects are displayed, according to an embodiment of the present disclosure. The user interface of Figure 2C includes a map interface 214, an outlined feature 216, and feature information box 114 indicating that the outlined feature 216 is called "Union Park." Various features/objects may be seen in the map interface 214 including, for example, roads, buildings, terrain, street lights (represented by a streetlight icon), railroads, hotels/motels (represented by a bed icon), and schools (represented by a parent-child icon), among other features.

Figure 3A illustrates a sample user interface of the map system in which objects are selected, according to an embodiment of the present disclosure. The user interface of Figure 3A includes a highlighted user selection rectangle 302. The highlighted user selection rectangle 302 illustrates the user actively selecting a particular region of the map interface so as to select the features/objects that fall within the bounds of that rectangle. In an embodiment, visible features may be selected by the user, while features that are not currently visible are not selectable. For example, features related to layers that are not currently active are not selected when the user performs a selection. In another embodiment, even features that are not visible in a selected area may be selected.

Figures 3B-3C illustrate sample user interfaces of the map system in which objects are selected and a feature histogram 304 is displayed in a selection window, according to embodiments of the present disclosure. The selected objects/features of Figure 3B (including roads 310 and other features 312) may have been selected via the highlighted user selection rectangle 302 of Figure 3A. Selected features are indicated by highlighting and/or altered colors on the map tiles making up the map interface.

Feature histogram 304 is shown in a selection window included in the user interface of Figure 3B. The histogram 304 shows a categorized histogram of all objects/features selected by the user in the map interface. The histogram divides the features into common buckets and/or categories based on related metadata (also referred to as metadata categories). For example, at 306, "Belongs to Layer" indicates that the following histogram includes all selected features organized by layer category. In this example there are over 70,000 selected buildings/structures features, over 40,000 selected facility features, and over 6,000 selected road features, among others. Further, the feature histogram 304 includes histograms of the selected objects organized by account and acreage. In various embodiments, the map system may select histogram categories and/or metadata categories based on, for example, the features selected and/or types of features selected, among others. Any other categorization of selected features may be displayed in the histograms of the feature histogram 304.

In an embodiment, the user of the map system may select a subset of the selected features for further analysis and/or histogram generation. For example, the user may select a subset comprising selected objects belonging to the road category by, for example, clicking on the roads item 308. This selection may result in "drilling down" to histograms of that subset of features, as shown in Figure 3C. Thus, a drill-down group of features/objects (for example, the subset of features/objects) may be used by the map system to determine new drill-down metadata categories, or buckets of related metadata. At 314 in Figure 3C, the arrow icon indicates that of the originally selected 124,172 features, the feature histogram now shows an analysis of the 6,724 features belonging to the road category (see item 316). The feature histogram window of Figure 3C thus shows a new set of histograms organized by layer, address, addressed, and agency, among others. The user may thus "drill down" and "drill up" through the selected features via the displayed histograms.
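The bucketing and drill-down behavior described above can be sketched as two small operations: counting selected features per metadata bucket, and narrowing the selection to one bucket before recounting. The dict shapes and field names are illustrative assumptions, not the patent's actual data model.

```python
from collections import Counter

def feature_histograms(features, categories):
    # For each metadata category (e.g. "Belongs to Layer"), count how many
    # selected features fall into each bucket of that category.
    return {
        category: Counter(
            f["metadata"][category] for f in features if category in f["metadata"]
        )
        for category in categories
    }

def drill_down(features, category, bucket):
    # Clicking a histogram item narrows the selection to that subset; new
    # histograms are then computed over the subset alone.
    return [f for f in features if f["metadata"].get(category) == bucket]

selected = [
    {"name": "Main St", "metadata": {"layer": "roads", "speed_limit": 25}},
    {"name": "Oak Av", "metadata": {"layer": "roads", "speed_limit": 35}},
    {"name": "City Hall", "metadata": {"layer": "buildings/structures"}},
]
print(feature_histograms(selected, ["layer"])["layer"]["roads"])  # → 2
roads = drill_down(selected, "layer", "roads")
print(feature_histograms(roads, ["speed_limit"]))
```

"Drilling up" is simply re-running `feature_histograms` over the previous, wider selection.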

In an embodiment, items selected in the feature histogram are correspondingly highlighted in the map interface of the map system. For example, in the map interface of Figure 3B, the user has selected the roads in the histogram at 308. Corresponding features (in this example, roads) are thus highlighted in the map interface (as shown at 310).

Figures 3D-3G illustrate additional example user interfaces of the map system in which objects are selected from a histogram and correspondingly highlighted in the map interface, according to embodiments of the present disclosure. In Figures 3D-3F, in the selection window, the user is viewing a histogram of all selected roads organized in a histogram according to the road speed limit. In Figure 3D, the user has selected (at 318) roads with speed limits of 55 and 65. The corresponding road features are highlighted in the map interface at, for example, 320. In Figure 3E, the user has selected (at 322) roads with speed limits of 35, 45, 40, 55, and 65. The corresponding road features are highlighted in the map interface at, for example, 324. In Figure 3F, the user has selected (at 326) roads with speed limits of 25. The corresponding road features are highlighted in the map interface at, for example, 328. In Figure 3G, the user may "drill down" into the histogram by, for example, right clicking on an item and selecting "Remove other objects in histogram" (330).

Figures 3H and 3I illustrate sample user interfaces of the map system in which objects are selected and a list of selected objects 332 is displayed in the selection window, according to embodiments of the present disclosure. With reference to Figure 3H, the list of features 332 indicates that the user has drilled down further into the selected features of Figure 3G by selecting a subset of selected features consisting of only roads with speed limits of 20. Thus, the subset of the example of Figure 3H includes the 163 features that are roads with speed limits of 20. The user has additionally selected to view the list of features 332 in the selection window (rather than the feature histogram). The list of features 332 lists each individual feature that is included in the currently selected subset. For example, the list includes S Central Av 334, among others.

In Figure 3I, the user has selected feature Hamilton St at 336. In an embodiment, when a feature is selected from the list of features, the map interface automatically zooms to the location of that feature. The user may select the feature from the list of features by clicking on the name of the feature and/or the displayed thumbnail. In an embodiment, the map interface only zooms to the feature when the user clicks on, and/or selects, the thumbnail associated with the feature. In the example of Figure 3I, the map interface is automatically zoomed to the location of the selected Hamilton St, and the selected feature is highlighted (338). Additionally, the name of the selected feature is shown in the feature information box 114. In an embodiment, the name of the selected feature is shown in the feature information box 114 when the user hovers the cursor over the thumbnail associated with the feature in the list of features. In an embodiment, the selected feature may be any other type of object, and may be outlined or otherwise highlighted when selected.

In various embodiments, the user of the map system may select either the list of features, or the feature histogram, of the selection window to view information about the selected features.

Figures 3J-3K illustrate sample user interfaces of the map system in which objects are outlined when hovered over, according to embodiments of the present disclosure. In Figure 3J, the user is hovering over a building feature with the mouse cursor. The feature being hovered over is automatically outlined (340). Additionally, the name of the feature is displayed in the feature information box 114. In Figure 3K, the user is hovering over a shelter feature with the mouse cursor. The feature being hovered over is automatically outlined (342), and the name of the feature is displayed in the feature information box 114. The user of the map system may, at any time, highlight and/or outline any feature/object by rolling over, hovering over, selecting, and/or touching that feature/object in the map interface.

In various embodiments, the user may select a feature in order to view a feature information window. The feature information window may include, for example, metadata associated with the selected feature. For example, the user may select a building feature, resulting in a display of information associated with that building feature such as the building size, the building name, and/or the building address or location, among others. Metadata associated with features/objects may include any information relevant to that feature/object. For example, metadata associated with a school may include an address (for example, 123 S. Orange Street), a district (for example, 509c), a grade level (for example, K-6), and/or a phone number (for example, 800-0000), among other items of metadata. In an embodiment, a history of the object, changes made to the object, and/or user notes related to the object, among other items, may be displayed. In an embodiment, a user may edit metadata associated with a selected feature.

Figures 4A-4D illustrate sample user interfaces of the map system in which a radius geosearch is displayed, according to embodiments of the present disclosure. In Figure 4A, the user has selected the shape button 104 and is drawing a circle selection 404 on the map interface by first selecting a center and then a radius. Shape window 402 indicates the coordinates of the center of the circle selection, as well as the radius of the circle selection. In various embodiments, any type of polygon or other shape may be drawn on the map interface to select features.

In Figure 4B, the user has selected the geosearch button 108 so as to perform a geosearch within the selection circle 408. In an embodiment, a geosearch comprises a search through one or more databases of data objects, and metadata associated with those data objects, for any objects that meet the criteria of the geosearch. For example, a geosearch may search for any objects with geographic metadata and/or properties that indicate the object may be geographically within, for example, selection circle 408.

A geosearch within a selected circle may be referred to as a radius search. Geosearch window 406 indicates various items of information related to the radius search, and includes various parameters that may be adjusted by the user. For example, the geosearch window 406 includes a search area slider that the user may slide to increase or decrease the radius of the selection circle 408. The user may also indicate a time range for the geosearch. In an embodiment, objects/features shown and/or searchable in the map system may include a time component and/or time metadata. Thus, for example, the user of the map system may specify a date or time period, resulting in the display of any objects/features with associated time metadata, for example, falling within the specified time period. In various embodiments, associated time metadata may indicate, for example, a time the feature was created, a time the feature was added to a database of features, a time the feature was previously added to a vector layer, a time the feature was last accessed by the map system and/or a user, a time the feature was built, and/or any combination of the foregoing. Alternatively, the user may select and/or search for objects/features within particular time periods, as shown in Figure 4B. The geosearch window 406 also allows the user to specify the types of objects to be searched, for example, entities, events, and/or documents, among others.
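A radius search with a time-range filter, as described above, amounts to keeping only objects within a great-circle distance of the center whose time metadata falls inside the range. The sketch below uses the haversine formula; the field names ("lat", "lon", "timestamp") are illustrative assumptions.

```python
import math
from datetime import datetime

def radius_geosearch(objects, center, radius_m, time_range=None):
    # Keep objects within radius_m meters of center (lat, lon), optionally
    # restricted to a (start, end) time range on their time metadata.
    def haversine_m(a, b):
        lat1, lon1, lat2, lon2 = map(math.radians, (a[0], a[1], b[0], b[1]))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * math.asin(math.sqrt(h))  # Earth mean radius

    hits = []
    for obj in objects:
        if haversine_m(center, (obj["lat"], obj["lon"])) > radius_m:
            continue
        if time_range and not (time_range[0] <= obj["timestamp"] <= time_range[1]):
            continue
        hits.append(obj)
    return hits

events = [
    {"name": "event A", "lat": 40.7130, "lon": -74.0060, "timestamp": datetime(2013, 5, 1)},
    {"name": "event B", "lat": 40.8000, "lon": -74.0060, "timestamp": datetime(2013, 5, 2)},
]
near = radius_geosearch(events, (40.7128, -74.0060), 1000,
                        (datetime(2013, 1, 1), datetime(2013, 12, 31)))
print([o["name"] for o in near])  # event A is ~22 m away; event B is ~9.7 km away
```

A production system would push this predicate into a spatially indexed database rather than scanning objects in memory.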

In an embodiment, the user of the map system may perform a search by clicking and/or touching a search button. The map system may then perform a search of an object database for any objects matching the criteria specified in the geosearch. For example, in the example of Figure 4B the map system will search for any objects with associated location information that falls within the selection circle 408. Objects searched by the map system may include objects other than those shown on the map interface. For example, in an embodiment the map system may access one or more databases of objects (and object metadata) that may be unrelated to the features currently shown in the map interface, or features related to the currently selected vector layers. The databases accessed may include databases external to any database storing data associated with the map system. Any objects found in the geosearch may then be made available to the user (as shown in Figure 4B), and the user may be given the option of adding the objects to a new layer in the map interface (as shown in the geosearch information window 406).

Figure 4C shows objects added to the map interface following the geosearch in Figure 4B. The search results are also shown in the feature histogram 410. In this example the returned objects include various entities and events. Figure 4D shows the user has selected, in the feature histogram, all search result objects with related metadata indicating a drug law violation. Those selected objects are additionally highlighted in the map interface of Figure 4D. In another example, geosearch may be used to determine, for example, that many crimes are concentrated in a downtown area of a city, while DUIs are more common in areas with slow roads.

Figures 5A-5D illustrate sample user interfaces of the map system in which a heatmap is displayed, according to embodiments of the present disclosure. In Figure 5A, the user has selected the heatmap button 110 so as to create a heatmap 504 based on the objects selected in Figure 4D. A heatmap information window 502 is displayed in which the user may specify various parameters related to the generation of the heatmap. For example, referring now to Figure 5B, the user may adjust a radius (506) of the circular heatmap related to each selected object, an opacity (508) of the heatmap, a scale of the heatmap, and an auto scale setting. In Figure 5B, the user has decreased the opacity of the generated heatmap and zoomed in on the map interface so as to more clearly view various objects and the underlying map tiles.

Figure 5C shows the user selecting various objects and/or features while the heatmap is displayed using the rectangle selection tool, such as to view information regarding the features in a histogram. Figure 5D shows the selected objects, selected in Figure 5C, now highlighted (512).

In the map system a heatmap may be generated on any object type, and/or on multiple object types. In an embodiment, different heatmap radiuses may be set for different object types. For example, the user may generate a heatmap in which streetlights have a 5 m radius, while hospitals have a 500 m radius. In an embodiment, the heatmap may be generated based on arbitrary shapes. For example, rather than a circular-based heatmap, the heatmap may be rectangular-based or ellipse-based. In an embodiment, the heatmap may be generated based on error ellipses and/or tolerance ellipses. A heatmap based on error ellipses may be advantageous when the relevant objects have associated error regions. For example, when a location of an object is uncertain, or multiple datapoints associated with an object are available, an error ellipse may help the user determine the actual location of the object.
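A per-object-type radius, as described above, can be sketched as a kernel accumulation over a raster grid, where the kernel radius is looked up by object type. The linear falloff kernel, grid representation, and field names are illustrative assumptions; the patent does not specify a kernel.

```python
import math

def heatmap_grid(objects, radius_by_type, width, height, cell_m):
    # Accumulate a simple linear falloff kernel per object; each object type
    # may carry its own radius (in meters), converted to grid cells.
    grid = [[0.0] * width for _ in range(height)]
    for obj in objects:
        r_cells = radius_by_type[obj["type"]] / cell_m
        cx, cy = obj["x"], obj["y"]
        for y in range(height):
            for x in range(width):
                d = math.hypot(x - cx, y - cy)
                if d <= r_cells:
                    grid[y][x] += 1.0 - d / r_cells
    return grid

objs = [{"type": "streetlight", "x": 2, "y": 2}, {"type": "hospital", "x": 5, "y": 5}]
grid = heatmap_grid(objs, {"streetlight": 10.0, "hospital": 500.0}, 8, 8, 10.0)
print(round(grid[2][2], 2))  # → 1.92 (streetlight peak plus the wide hospital kernel)
```

An ellipse-based variant would replace the distance test with an ellipse membership test (for example, against an error ellipse's axes and orientation).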

Figures 5E-5F illustrate sample user interfaces of the map system in which a shape-based geosearch is displayed, according to embodiments of the present disclosure. In Figure 5E, the user has selected the shape button 104, and a shape information window 514 is shown. In the user interface of Figure 5E the user has drawn lines 518, however any shapes may be drawn on the map interface. Information related to the drawn lines 518 is displayed in the shape information window 514. For example, at 516 the starting points, distance, and azimuth related to each line are displayed. Further, a total distance from the start to the end of the line is shown.

Figure 5F shows a geosearch performed on the line shape drawn in Figure 5E.

Geosearch information window 520 indicates a search area 522, a time range 524, and an object type 526 as described above with reference to Figure 4B. The search area is indicated on the map interface by the highlighted area 528 along the drawn line. The geosearch may be performed, and results may be shown, in a manner similar to that described above with reference to Figures 4B-4D. For example, geosearch along a path may be used to determine points of interest along that path.

Figure 5G illustrates a sample user interface of the map system in which a keyword object search is displayed, according to an embodiment of the present disclosure. The user may type words, keywords, numbers, and/or geographic coordinates, among others, into the search box 112. In Figure 5G, the user has typed Bank (530). As the user types, the map system automatically searches for objects and/or features that match the information typed. Matching may be performed based on object data and/or metadata. Search results are displayed as shown at 532 in Figure 5G. In the example, a list of banks (bank features) is shown. The user may then select from the list shown, at which point the map system automatically zooms to the selected feature and indicates the selected feature with an arrow 534. In various embodiments, the selected feature may be indicated by highlighting, outlining, and/or any other type of indicator. In an embodiment, the search box 112 may be linked to a gazetteer so as to enable simple word searches for particular geographic locations. For example, a search for a city name, New York, may be linked with the geographic coordinates of the city, taking the user directly to that location on the map interface.
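The incremental keyword matching described above can be sketched as a scan over object names and metadata values. Case-insensitive substring matching is an assumption here; the patent does not specify the matching rule.

```python
def keyword_search(objects, query):
    # Match typed text against object names and metadata values.
    # (Case-insensitive substring match is an illustrative assumption.)
    q = query.lower()
    hits = []
    for obj in objects:
        haystack = [obj["name"]] + [str(v) for v in obj.get("metadata", {}).values()]
        if any(q in s.lower() for s in haystack):
            hits.append(obj["name"])
    return hits

features = [
    {"name": "First National Bank", "metadata": {"type": "bank"}},
    {"name": "Union Park", "metadata": {"type": "park"}},
    {"name": "Riverside Bank", "metadata": {"type": "bank"}},
]
print(keyword_search(features, "Bank"))  # → ['First National Bank', 'Riverside Bank']
```

The gazetteer lookup mentioned above would be a second branch of the same search: a dictionary from place names to coordinates consulted before or alongside the object scan.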

Figure 5H illustrates an example of a UTF grid of the map system, according to an embodiment of the present disclosure. In an embodiment, the UTF grid enables feature outlining and/or highlighting of many objects with client-side components. In one embodiment, each map tile (or image) of the map interface includes an associated textual UTF (UCS Transformation Format) grid. In Figure 5H, an example map tile 536 is shown next to an associated example UTF grid 538. In this example, the map tile and associated UTF grid are generated by the server-side components and sent to the client-side components. In the UTF grid, each character represents a pixel in the map tile image, and each character indicates what feature is associated with the pixel. Each character in the UTF grid may additionally be associated with a feature identifier which may be used to request metadata associated with that feature.

Contiguous regions of characters in the UTF grid indicate the bounds of a particular feature, and may be used by the client-side components to provide the feature highlighting and/or outlining. For example, when a user hovers a mouse pointer over a feature on a map tile, the map system determines the character and portion of the UTF grid associated with the pixel hovered over, draws a feature outline based on the UTF grid, and may additionally access metadata associated with the feature based on the feature identifier associated with the feature. In an embodiment, the UTF grid is sent to the client-side components in a JSON (JavaScript Object Notation) format.
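The hover lookup described above amounts to indexing the character at the hovered cell and decoding it into a feature identifier. The sketch below follows the open UTFGrid convention for character decoding (skipping the `"` and `\` codepoints); the patent's exact encoding may differ, and the payload shape is an assumption.

```python
import json

def feature_at_pixel(utf_grid_json, px, py):
    # Resolve a hovered cell to a feature identifier using a UTFGrid-style
    # JSON payload: a "grid" of strings (one character per cell) plus a
    # "keys" list of feature identifiers.
    grid = json.loads(utf_grid_json)
    code = ord(grid["grid"][py][px])
    # Undo the UTFGrid encoding, which skips '"' (34) and '\' (92).
    if code >= 93:
        code -= 1
    if code >= 35:
        code -= 1
    return grid["keys"][code - 32]

# Tiny 2x2 example: ' ' (codepoint 32) decodes to key index 0 (no feature),
# '!' (33) to index 1. The key "feature-123" is a hypothetical identifier.
payload = json.dumps({"grid": ["!!", " !"], "keys": ["", "feature-123"]})
print(feature_at_pixel(payload, 0, 0))  # → feature-123
```

Outlining then follows by flood-filling the contiguous region of cells that decode to the same key and drawing its boundary; the decoded key is also what the client sends to the server to request the feature's metadata.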

Figure 6A shows a flow diagram depicting illustrative client-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 6A. In an embodiment, one or more blocks in Figure 6A may be performed by client-side components of the map system, for example, computer system 800 (described below in reference to Figure 8D).

At block 602, the map system provides a user interface (for example, the user interface of Figure 1) to the user. As described above and below, the user interface may be provided to the user through any electronic device, such as a desktop computer, a laptop computer, a mobile smartphone, and/or a tablet, among others. At block 604, an input is received from the user of the map system. For example, the user may use a mouse to roll over and/or click on an item of the user interface, or the user may touch the display of the interface (in the example of a touch screen device).

[0101] Inputs received from the user may include, for example, hovering over, rolling over, and/or touching an object in the user interface (606); filling out a text field (614); drawing a shape in the user interface (608); and/or drawing a selection box and/or shape in the user interface (610); among other actions or inputs as described above.

At block 612, any of inputs 606, 614, 608, and 610 may cause the map system to perform client-side actions to update the user interface. For example, hovering over an object (606) may cause the client-side components of the map system to access the UTF grid, determine the boundaries of the object, and draw an outline around the hovered-over object. In another example, filling out a text field (614) may include the user inputting data into the map system. In this example, the user may input geographic coordinates, metadata, and/or other types of data to the map system. These actions may result in, for example, the client-side components of the map system storing the inputted data and/or taking an action based on the inputted data. For example, the user inputting coordinates may result in the map interface being updated to display the inputted information, such as an inputted name overlaying a particular object. In yet another example, the actions/inputs of drawing a shape (608) and/or drawing a selection (610) may cause the client-side components of the map system to update the user interface with colored and/or highlighted shapes (see, for example, Figure 3A).

In an embodiment, one or more blocks in Figure 6A may be performed by server-side components of the map system, for example, server 830 (described below in reference to Figure 8D).

Figure 6B shows a flow diagram depicting illustrative client-side metadata retrieval of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 6B. In an embodiment, one or more blocks in Figure 6B may be performed by client-side components of the map system, for example, computer system 800.

At block 620, the client-side components of the map system detect that the user is hovering over and/or touching an object in the user interface. At block 622, and as described above, the client-side components may access the UTF grid to determine the feature identifier and object boundaries associated with the hovered-over object. Then, at block 624, the client-side components may render the feature shape on the image or map interface. The feature shape may be rendered as an outline and/or other highlighting.

At block 626, the client-side components detect whether the user has selected the object. Objects may be selected, for example, if the user clicks on the object and/or touches the object. If the user has selected the object, then at block 628, the client-side components query the server-side components to retrieve metadata associated with the selected object. In an embodiment, querying of the server-side components may include transmitting the feature identifier associated with the selected object to the server, the server retrieving from a database the relevant metadata, and the server transmitting the retrieved metadata back to the client-side components.

At block 630, the metadata is received by the client-side components and displayed to the user. For example, the metadata associated with the selected object may be displayed to the user in the user interface in a dedicated metadata window, among other possibilities.

In an embodiment, one or more blocks in Figure 6B may be performed by server-side components of the map system, for example, server 830.

[0109] Figure 7A shows a flow diagram depicting illustrative server-side operations of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 7A. In an embodiment, one or more blocks in Figure 7A may be performed by server-side components of the map system, for example, server 830.

Server-side operations of the map system may include composing and updating the map tiles that make up the map interface. For example, when the user changes the selection of the base layer and/or one or more of the vector layers, the map tiles are re-composed and updated in the map interface to reflect the user's selection. Selection of objects resulting in highlighting of those objects may also involve re-composition of the map tiles. Further, UTF grids may be generated by the server-side components for each map tile composed.

At block 702, the user interface is provided to the user. At block 704, an input from the user is received. Inputs received from the user that may result in server-side operations may include, for example, an object selection (706), a change in layer selection (708), a geosearch (710), generating a heatmap (712), searching from the search box (714), and/or panning or zooming the map interface, among others.

At block 716, the client-side components of the map system may query the server-side components in response to any of inputs 706, 708, 710, 712, and 714 from the user. The server-side components then update and re-compose the map tiles and UTF grids of the map interface in accordance with the user input (as described below in reference to Figure 7B), and transmit those updated map tiles and UTF grids back to the client-side components.

At block 718, the client-side components receive the updated map tile information from the server, and at block 720 the user interface is updated with the received information.

[0114] In an embodiment, additional information and/or data, in addition to updated map tiles, may be transmitted to the client-side components from the server-side components. For example, object metadata may be transmitted in response to a user selecting an object.

In an embodiment, one or more blocks in Figure 7A may be performed by client-side components of the map system, for example, computer system 800.

Figure 7B shows a flow diagram depicting illustrative server-side layer composition of the map system, according to an embodiment of the present disclosure. In various embodiments, fewer blocks or additional blocks may be included in the process, or various blocks may be performed in an order different from that shown in Figure 7B. In an embodiment, one or more blocks in Figure 7B may be performed by server-side components of the map system, for example, server 830.

At block 730, a query is received by the server-side components from the client-side components. Such a query may originate, for example, at block 716 of Figure 7A. At block 732, the server-side components determine the map tile composition based on the query. For example, if the user has selected an object or group of objects, the map tiles containing those objects may be updated to include highlighted objects. In another example, if the user has changed the layer selection, the map tiles may be updated to include only those layers that are currently selected. In the example of Figure 7B, the layers currently selected are determined, and the layers are composed and/or rendered into the map tiles. In another example, if the user has performed a geosearch and selected to add the search result objects to the map interface, the map tiles are updated to include those search result objects. In yet another example, when the user has generated a heatmap, the map tiles are updated to show the generated heatmap. In another example, if the user searches via the search box, the selected objects may be highlighted in the re-composed map tiles. In another example, when the user pans and/or zooms in the map interface, the map tiles are updated to reflect the new view selected by the user. In all cases, an updated UTF grid may also be generated for each composed map tile.
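The mapping at block 732 from query type to tile contents can be sketched as a simple dispatch. This is a hedged illustration only: the query shape, field names, and recipe structure below are assumptions, not details given in the patent.

```python
# Illustrative sketch of block 732: given an incoming query, decide which
# layers, highlights, and heatmap (if any) must be composed into each tile.

def determine_tile_composition(query, current_layers, selected_ids=()):
    """Return a recipe describing what the updated tiles must contain."""
    recipe = {"layers": list(current_layers), "highlight": [], "heatmap": None}
    kind = query.get("type")
    if kind in ("object_selection", "search_box"):
        recipe["highlight"] = list(selected_ids)        # re-draw with highlight
    elif kind == "layer_change":
        recipe["layers"] = list(query["layers"])        # only selected layers
    elif kind == "geosearch":
        recipe["highlight"] = list(query.get("result_ids", []))
    elif kind == "heatmap":
        recipe["heatmap"] = query.get("source_layer")
    # pan/zoom keeps the same recipe; only the tile extent changes
    return recipe
```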

At block 734, the map system determines whether the layers necessary to compose the requested map tiles are cached. For example, when a layer is selected by the user, that layer may be composed by the map system and placed in a memory of the server-side components for future retrieval. Caching of composed layers may obviate the need for recomposing those layers later, which advantageously may save time and/or processing power.

If the required layers are cached, then at block 740 the layers are composed into the requested map tiles and, at block 742, transmitted to the client-side components.

[0120] When the required layers are not cached, at block 736, the server-side components calculate and/or compose the requested layer and/or layers, and may then, at block 738, optionally cache the newly composed layers for future retrieval. Then, at blocks 740 and 742, the layers are composed into map tiles and provided to the client-side components.
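The caching flow of blocks 734-742 can be sketched compactly. The cache keying and function names below are assumptions made for illustration; the patent specifies only that composed layers may be cached and later composited into tiles.

```python
# Hedged sketch of blocks 734-742: compose a layer only on a cache miss, then
# composite the (cached) layers into the requested tile.

_layer_cache = {}

def get_layer(layer_id, zoom, render_fn):
    """Return a rendered layer, composing and caching it on a miss."""
    key = (layer_id, zoom)
    if key not in _layer_cache:                       # block 734: miss
        _layer_cache[key] = render_fn(layer_id, zoom) # blocks 736/738: compose, cache
    return _layer_cache[key]

def compose_tile(layer_ids, zoom, render_fn):
    """Block 740: draw the selected layers on top of one another."""
    return [get_layer(lid, zoom, render_fn) for lid in layer_ids]
```

Serving repeat requests from the cache is what "may save time and/or processing power" refers to: the expensive `render_fn` runs once per layer/zoom pair.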

In an embodiment, entire map tiles may be cached by the server-side components. In an embodiment, the size and/or quality of the map tiles that make up the map interface may be selected and/or dynamically selected based on at least one of: the bandwidth available for transmitting the map tiles to the client-side components, the size of the map interface, and/or the complexity of the layer composition, among other factors. In an embodiment, the map tiles comprise images, for example, in one or more of the following formats: PNG, GIF, JPEG, TIFF, BMP, and/or any other type of appropriate image format.
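The patent lists bandwidth, interface size, and composition complexity as inputs to dynamic tile size/quality selection but gives no formula; the heuristic below is purely an assumed illustration of how such a selection might look.

```python
# Illustrative heuristic only (thresholds are assumptions): pick tile size and
# image quality from available bandwidth, viewport size, and layer count.

def select_tile_params(bandwidth_kbps, viewport_px, layer_count):
    size = 512 if bandwidth_kbps > 5000 and viewport_px >= 1024 else 256
    # heavier compositions trade image quality for transfer time
    quality = max(50, 90 - 10 * max(0, layer_count - 2))
    return {"size": size, "format": "PNG", "quality": quality}
```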

In an embodiment, the layer and object data composed into layers and map tiles comprises vector data. The vector data (for example, object data) may include associated metadata, as described above. In an embodiment, the vector, layer, and/or object data and associated metadata may originate from one or more databases and/or electronic data stores.

In an embodiment, one or more blocks in Figure 7B may be performed by client-side components of the map system, for example, computer system 800.

In an embodiment, the map system may display more than 50 million selectable features to a user simultaneously. In an embodiment, the map system may support tens or hundreds of concurrent users accessing the same map and object data. In an embodiment, map and object data used by the map system may be mirrored and/or spread across multiple computers, servers, and/or server-side components.

In an embodiment, rather than updating the map tiles to reflect a selection by the user of one or more objects, the map system may show an approximation of the selection to the user based on client-side processing.

In an embodiment, a user may drag and drop files, for example, vector data and/or vector layers, onto the user interface of the map system, causing the map system to automatically render the file in the map interface.

In an embodiment, icons and/or styles associated with various objects in the map interface may be updated and/or changed by the user. For example, the styles of the various objects may be specified in or by a style data file. The style data file may be formatted according to a particular format or standard readable by the map system. In an embodiment, the style data file is formatted according to the JSON format standard.

The user may thus change the look of the objects and shapes rendered in the map interface of the map system by changing the style data file. The style data file may further define the looks for objects and terrain (among other items and data) at various zoom levels.
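The patent states that the style data file is JSON but does not give its schema; the fields below (`color`, `width`, `minZoom`) are an assumed, minimal shape used only to illustrate zoom-dependent styling.

```python
# Hedged sketch: load a JSON style data file and select the styles applicable
# at a given zoom level. The schema is an assumption, not from the patent.
import json

STYLE_JSON = """
{
  "roads":     {"color": "#888888", "width": 2, "minZoom": 8},
  "buildings": {"color": "#c0a080", "width": 1, "minZoom": 14}
}
"""

def styles_for_zoom(style_text, zoom):
    """Return only the styles that apply at the given zoom level."""
    styles = json.loads(style_text)
    return {name: s for name, s in styles.items() if zoom >= s.get("minZoom", 0)}
```

Editing the JSON (and re-rendering) is all that is needed to change the look of objects, which matches the paragraph above.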

In an embodiment, objects, notes, metadata, and/or other types of data may be added to the map system by the user through the user interface. In an embodiment, user-added information may be shared between multiple users of the map system. In an embodiment, a user of the map system may add annotations and shapes to the map interface that may be saved and shared with other users. In an embodiment, a user of the map system may share a selection of objects with one or more other users.

In an embodiment, the user interface of the map system may include a timeline window. The timeline window may enable the user to view objects and layers specific to particular moments in time and/or time periods. In an embodiment, the user may view tolerance ellipses overlaid on the map interface indicating the likely position of an object across a particular time period.

In an embodiment, the map system may include elevation profiling. Elevation profiling may allow a user of the system to determine the elevation along a path on the map interface, to perform a viewshed analysis (determine objects and/or terrain viewable from a particular location), and to perform a reverse-viewshed analysis (for a particular location, determine objects and/or terrain that may view the location), among others.

In an embodiment, vector data, object data, metadata, and/or other types of data may be prepared before it is entered into or accessed by the map system. For example, the data may be converted from one format to another, may be crawled for common items of metadata, and/or may be prepared for application of a style file or style information, among other actions. In an embodiment, a layer ontology may be automatically generated based on a group of data. In an embodiment, the map system may access common data sources available on the Internet, for example, road data available from openstreetmap.org.

In an embodiment, roads shown in the map interface are labeled with their names, and buildings are rendered in faux-3D to indicate the building heights. In an embodiment, Blue Force Tracking may be integrated into the map system as a layer with the characteristics of both a static vector layer and a dynamic selection layer. A Blue Force layer may enable the use of the map system for live operational analysis. In an embodiment, the map system may quickly render detailed choropleths or heatmaps with minimal data transfer. For example, the system may render a choropleth with a property value on the individual shapes of the properties themselves, rather than aggregating this information at a county or zip code level.

Advantageously, the map system displays many items of data, objects, features, and/or layers in a single map interface. A user may easily interact with features on the map and gather information by hovering over or selecting features, even though those features may not be labeled. The user may select features, may "drill down" on a particular type of feature (for example, roads), may view features through histograms, may use histograms to determine common characteristics (for example, determine the most common speed limit), and/or may determine correlations among features (for example, see that slower speed limit areas are centered around schools). Further, the map system may be useful in many different situations. For example, the system may be useful to operational planners and/or disaster relief personnel.

Additionally, the map system accomplishes at least three core ideas: providing a robust and fast back-end (server-side) renderer, keeping data on the back-end, and only transferring the data necessary to have interactivity. In one embodiment, the primary function of the server-side components is rendering map tiles. The server is capable of drawing very detailed maps with a variety of styles that can be based on vector metadata. Rendered map tiles for a vector layer are cached, and several of these layer tiles are drawn on top of one another to produce the final tile that is sent to the client-side browser. Map tile rendering is fast enough for displaying dynamic tiles for selection and highlight to the user. Server-side operations allow for dynamic selections of very large numbers of features, calculation of the histogram, determining the number of items shown and/or selected, and drawing the selection, for example.

Further, the heatmap may include large numbers of points without incurring the cost of transferring those points to the client-side browser. Additionally, transferring only as much data as necessary to have interactivity enables quick server rendering of dynamic selections and vector layers. On the other hand, highlighting hovered-over features may be performed client-side nearly instantaneously, and provides useful feedback that enhances the interactivity of the map system. In an embodiment, to avoid transferring too much geometric data, the geometries of objects (in the map tiles and UTF grid) are down-sampled depending on how far the user is zoomed in to the map interface. Thus, map tiles may be rendered and presented to a user of the map system in a dynamic and usable manner.
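The patent names zoom-dependent down-sampling but not the algorithm; the vertex decimation below is an assumed, minimal illustration (real systems often use simplification such as Douglas-Peucker instead).

```python
# Illustrative sketch: keep fewer vertices of a geometry the further out the
# user is zoomed, so less geometric data is rendered and transferred.

def downsample_geometry(points, zoom, max_zoom=18):
    """Keep more vertices the closer the user is zoomed in."""
    if len(points) <= 2:
        return list(points)
    step = max(1, 2 ** ((max_zoom - zoom) // 4))
    kept = points[::step]
    if kept[-1] != points[-1]:       # always preserve the end vertex
        kept.append(points[-1])
    return kept
```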

Object Centric Data Model

To provide a framework for the following discussion of specific systems and methods described above and below, an example database system 1210 using an ontology 1205 will now be described. This description is provided for the purpose of providing an example and is not intended to limit the techniques to the example data model, the example database system, or the example database system's use of an ontology to represent information.

In one embodiment, a body of data is conceptually structured according to an object-centric data model represented by ontology 1205. The conceptual data model is independent of any particular database used for durably storing one or more database(s) 1209 based on the ontology 1205. For example, each object of the conceptual data model may correspond to one or more rows in a relational database or an entry in a Lightweight Directory Access Protocol (LDAP) database, or any combination of one or more databases.

Figure 8A illustrates an object-centric conceptual data model according to an embodiment. An ontology 1205, as noted above, may include stored information providing a data model for storage of data in the database 1209. The ontology 1205 may be defined by one or more object types, which may each be associated with one or more property types. At the highest level of abstraction, data object 1201 is a container for information representing things in the world. For example, data object 1201 can represent an entity such as a person, a place, an organization, a market instrument, or other noun. Data object 1201 can represent an event that happens at a point in time or for a duration. Data object 1201 can represent a document or other unstructured data source such as an e-mail message, a news report, or a written paper or article. Each data object 1201 is associated with a unique identifier that uniquely identifies the data object within the database system.

Different types of data objects may have different property types. For example, a "Person" data object might have an "Eye Color" property type and an "Event" data object might have a "Date" property type. Each property 1203 as represented by data in the database system 1210 may have a property type defined by the ontology 1205 used by the database 1209.

Objects may be instantiated in the database 1209 in accordance with the corresponding object definition for the particular object in the ontology 1205. For example, a specific monetary payment (e.g., an object of type "event") of US$30.00 (e.g., a property of type "currency") taking place on 3/27/2009 (e.g., a property of type "date") may be stored in the database 1209 as an event object with associated currency and date properties as defined within the ontology 1205.
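The instantiation described above (a payment stored as an "Event" object with currency and date properties defined by the ontology) can be sketched as follows. The class and function names are illustrative assumptions; the patent describes the data model, not an implementation.

```python
# Minimal sketch of instantiating a data object against an ontology definition:
# the monetary-payment example becomes an "Event" object with "Currency" and
# "Date" properties, plus a unique identifier.
from dataclasses import dataclass, field
import uuid

ONTOLOGY = {"Event": {"Currency", "Date"}}   # object type -> allowed property types

@dataclass
class DataObject:
    object_type: str
    properties: dict = field(default_factory=dict)
    object_id: str = field(default_factory=lambda: uuid.uuid4().hex)

def instantiate(object_type, **props):
    """Create a data object, rejecting properties the ontology does not define."""
    allowed = ONTOLOGY[object_type]
    unknown = set(props) - allowed
    if unknown:
        raise ValueError(f"properties not in ontology: {unknown}")
    return DataObject(object_type, dict(props))

payment = instantiate("Event", Currency="US$30.00", Date="3/27/2009")
```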

The data objects defined in the ontology 1205 may support property multiplicity. In particular, a data object 1201 may be allowed to have more than one property 1203 of the same property type. For example, a "Person" data object might have multiple "Address" properties or multiple "Name" properties.

Each link 1202 represents a connection between two data objects 1201. In one embodiment, the connection is either through a relationship, an event, or through matching properties. A relationship connection may be asymmetrical or symmetrical.

For example, "Person" data object A may be connected to "Person" data object B by a "Child Of' relationship (where "Person" data object B has an asymmetric "Parent Of' r&ationship to "Person" data object A), a "Kin Of" symmetric relationship to "Person" data object C, and an asymmetric "Member Of' relationship to "Organization" data object X. The type of relationship between two data obj ects may vary depending on the types of the data objects. For example, "Person" data object A may have an "Appears In" relationship with "Document" data object Y or have a "Participate In" relationship with "Event" data object E. As an example of an event connection, two "Person" data o objects may be connected by an "Airline Flight" data object representing a particular airline flight if they traveled together on that flight, or by a "Meeting" data object representing a particular meeting if they both attended that meeting. In one embodiment, when two data objects are connected by an event, they are also connected by relationships, in which each data object has a specific relationship to the event, such as, for examp'e, an "Appears Tn" relationship.

As an example of a matching properties connection, two "Person" data objects representing a brother and a sister may both have an "Address" property that indicates where they live. If the brother and the sister live in the same home, then their "Address" properties likely contain similar, if not identical, property values. In one embodiment, a link between two data objects may be established based on similar or matching properties (e.g., property types and/or property values) of the data objects. These are just some examples of the types of connections that may be represented by a link and other types of connections may be represented; embodiments are not limited to any particular types of connections between data objects. For example, a document might contain references to two different objects. For example, a document may contain a reference to a payment (one object), and a person (a second object). A link between these two objects may represent a connection between these two entities through their co-occurrence within the same document.

Each data object 1201 can have multiple links with another data object 1201 to form a link set 1204. For example, two "Person" data objects representing a husband and a wife could be linked through a "Spouse Of' relationship, a matching "Address" property, and one or more matching "Event" properties (e.g., a wedding). Each link 1202 as represented by data in a database may have a link type defined by the database ontology used by the database.
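Deriving a matching-properties link (the brother/sister "Address" example above) can be sketched as a comparison over shared property types. The data shapes and function name are assumptions made for illustration.

```python
# Illustrative sketch: create a link descriptor for every property type whose
# values match between two data objects (e.g., a shared "Address").

def matching_property_links(obj_a, obj_b):
    """Return link descriptors for each property type with matching values."""
    links = []
    for ptype, value in obj_a["properties"].items():
        if obj_b["properties"].get(ptype) == value:
            links.append({"type": "matched_property",
                          "property": ptype,
                          "objects": (obj_a["id"], obj_b["id"])})
    return links
```

Several such links between the same pair of objects (a relationship, a matching property, a shared event) together form a link set like link set 1204.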

Figure 8B is a block diagram illustrating exemplary components and data that may be used in identifying and storing data according to an ontology. In this example, the ontology may be configured, and data in the data model populated, by a system of parsers and ontology configuration tools. In the embodiment of Figure 8B, input data 1300 is provided to parser 1302. The input data may comprise data from one or more sources. For example, an institution may have one or more databases with information on credit card transactions, rental cars, and people. The databases may contain a variety of related information and attributes about each type of data, such as a "date" for a credit card transaction, an address for a person, and a date for when a rental car is rented. The parser 1302 is able to read a variety of source input data types and determine which type of data it is reading.

In accordance with the discussion above, the example ontology 1205 comprises stored information providing the data model of data stored in database 1209, and the ontology is defined by one or more object types 1310, one or more property types 1316, and one or more link types 1330. Based on information determined by the parser 1302 or other mapping of source input information to object type, one or more data objects 1201 may be instantiated in the database 1209 based on respective determined object types 1310, and each of the objects 1201 has one or more properties 1203 that are instantiated based on property types 1316. Two data objects 1201 may be connected by one or more links 1202 that may be instantiated based on link types 1330. The property types 1316 each may comprise one or more data types 1318, such as a string, number, etc. Property types 1316 may be instantiated based on a base property type 1320. For example, a base property type 1320 may be "Locations" and a property type 1316 may be "Home."

[0146] In an embodiment, a user of the system uses an object type editor 1324 to create and/or modify the object types 1310 and define attributes of the object types. In an embodiment, a user of the system uses a property type editor 1326 to create and/or modify the property types 1316 and define attributes of the property types. In an embodiment, a user of the system uses link type editor 1328 to create the link types 1330. Alternatively, other programs, processes, or programmatic controls may be used to create link types and property types and define attributes, and using editors is not required.

In an embodiment, creating a property type 1316 using the property type editor 1326 involves defining at least one parser definition using a parser editor 1322. A parser definition comprises metadata that informs parser 1302 how to parse input data 1300 to determine whether values in the input data can be assigned to the property type 1316 that is associated with the parser definition. In an embodiment, each parser definition may comprise a regular expression parser 1304A or a code module parser 1304B. In other embodiments, other kinds of parser definitions may be provided using scripts or other programmatic elements. Once defined, both a regular expression parser 1304A and a code module parser 1304B can provide input to parser 1302 to control parsing of input data 1300.

Using the data types defined in the ontology, input data 1300 may be parsed by the parser 1302 to determine which object type 1310 should receive data from a record created from the input data, and which property types 1316 should be assigned to data from individual field values in the input data. Based on the object-property mapping 1301, the parser 1302 selects one of the parser definitions that is associated with a property type in the input data. The parser parses an input data field using the selected parser definition, resulting in creating new or modified data 1303. The new or modified data 1303 is added to the database 1209 according to ontology 1205 by storing values of the new or modified data in a property of the specified property type. As a result, input data 1300 having varying format or syntax can be created in database 1209. The ontology 1205 may be modified at any time using object type editor 1324, property type editor 1326, and link type editor 1328, or under program control without human use of an editor. Parser editor 1322 enables creating multiple parser definitions that can successfully parse input data 1300 having varying format or syntax and determine which property types should be used to transform input data 1300 into new or modified input data 1303.
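A regular expression parser definition (1304A) can be sketched as metadata pairing a property type with a pattern; the parser tries definitions until one matches a field value. The schema below is an assumption made for illustration; the patent does not specify it.

```python
# Hedged sketch of regular-expression parser definitions: metadata telling the
# parser which property type an input field value can be assigned to.
import re

PARSER_DEFINITIONS = [
    {"property_type": "Date",     "pattern": r"^\d{1,2}/\d{1,2}/\d{4}$"},
    {"property_type": "Currency", "pattern": r"^US\$\d+(\.\d{2})?$"},
]

def assign_property_type(field_value):
    """Return the first property type whose parser definition matches."""
    for definition in PARSER_DEFINITIONS:
        if re.match(definition["pattern"], field_value):
            return definition["property_type"]
    return None
```

Adding a new definition to the list is the sketch's analogue of using the parser editor 1322 to handle input data with a new format or syntax.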

The properties, objects, and links (e.g. relationships) between the objects can be visualized using a graphical user interface (GUI). For example, Figure 8C displays a user interface showing a graph representation 1403 of relationships (including relationships and/or links 1404, 1405, 1406, 1407, 1408, 1409, 1410, 1411, 1412, and 1413) between the data objects (including data objects 1421, 1422, 1423, 1424, 1425, 1426, 1427, 1428, and 1429) that are represented as nodes in the example of Figure 8C.

In this embodiment, the data objects include person objects 1421, 1422, 1423, 1424, 1425, and 1426; a flight object 1427; a financial account 1428; and a computer object 1429. In this example, each person node (associated with person data objects), flight node (associated with flight data objects), financial account node (associated with financial account data objects), and computer node (associated with computer data objects) may have relationships and/or links with any of the other nodes through, for example, other objects such as payment objects.

For example, in Figure 8C, relationship 1404 is based on a payment associated with the individuals indicated in person data objects 1421 and 1423. The link 1404 represents these shared payments (for example, the individual associated with data object 1421 may have paid the individual associated with data object 1423 on three occasions). The relationship is further indicated by the common relationship between person data objects 1421 and 1423 and financial account data object 1428. For example, link 1411 indicates that person data object 1421 transferred money into financial account data object 1428, while person data object 1423 transferred money out of financial account data object 1428. In another example, the relationships between person data objects 1424 and 1425 and flight data object 1427 are indicated by links 1406, 1409, and 1410.

In this example, person data objects 1424 and 1425 have a common address and were passengers on the same flight data object 1427. In an embodiment, further details related to the relationships between the various objects may be displayed. For example, links 1411 and 1412 may, in some embodiments, indicate the timing of the respective money transfers. In another example, the time of the flight associated with the flight data object 1427 may be shown.

Relationships between data objects may be stored as links, or in some embodiments, as properties, where a relationship may be detected between the properties. In some cases, as stated above, the links may be directional. For example, a payment link may have a direction associated with the payment, where one person object is the receiver of a payment, and another person object is the payer of the payment.

[0152] In various embodiments, data objects may further include geographical metadata and/or links. Such geographical metadata may be accessed by the interactive data object map system for displaying objects and features on the map interface (as described above).

In addition to visually showing relationships between the data objects, the user interface may allow various other manipulations. For example, the objects within database 1108 may be searched using a search interface 1450 (e.g., text string matching of object properties), inspected (e.g., properties and associated data viewed), filtered (e.g., narrowing the universe of objects into sets and subsets by properties or relationships), and statistically aggregated (e.g., numerically summarized based on summarization criteria), among other operations and visualizations. Additionally, as described above, objects within database 1108 may be searched, accessed, and implemented in the map interface of the interactive data object map system via, for example, a geosearch and/or radius search.

Implementation Mechanisms

According to an embodiment, the interactive data object map system and other methods and techniques described herein are implemented by one or more special-purpose computing devices. The special-purpose computing devices may be hard-wired to perform the techniques, or may include digital electronic devices such as one or more application-specific integrated circuits (ASICs) or field programmable gate arrays (FPGAs) that are persistently programmed to perform the techniques, or may include one or more general purpose hardware processors programmed to perform the techniques pursuant to program instructions in firmware, memory, other storage, or a combination. Such special-purpose computing devices may also combine custom hard-wired logic, ASICs, or FPGAs with custom programming to accomplish the techniques.

The special-purpose computing devices may be desktop computer systems, server computer systems, portable computer systems, handheld devices, networking devices or any other device or combination of devices that incorporate hard-wired and/or program logic to implement the techniques.

Computing device(s) are generally controlled and coordinated by operating system software, such as iOS, Android, Chrome OS, Windows XP, Windows Vista, Windows 7, Windows 8, Windows Server, Windows CE, Unix, Linux, SunOS, Solaris, iOS, Blackberry OS, VxWorks, or other compatible operating systems. In other embodiments, the computing device may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide user interface functionality, such as a graphical user interface ("GUI"), among other things.

For example, Figure 8D is a block diagram that illustrates a computer system 800 upon which the various systems and methods discussed herein may be implemented.

Computer system 800 includes a bus 802 or other communication mechanism for communicating information, and a hardware processor, or multiple processors, 804 coupled with bus 802 for processing information. Hardware processor(s) 804 may be, for example, one or more general purpose microprocessors.

Computer system 800 also includes a main memory 806, such as a random access memory (RAM), cache and/or other dynamic storage devices, coupled to bus 802 for storing information and instructions to be executed by processor 804. Main memory 806 also may be used for storing temporary variables or other intermediate information during execution of instructions to be executed by processor 804. Such instructions, when stored in storage media accessible to processor 804, render computer system 800 into a special-purpose machine that is customized to perform the operations specified in the instructions.

Computer system 800 further includes a read only memory (ROM) 808 or other static storage device coupled to bus 802 for storing static information and instructions for processor 804. A storage device 810, such as a magnetic disk, optical disk, or USB thumb drive (flash drive), etc., is provided and coupled to bus 802 for storing information and instructions.

Computer system 800 may be coupled via bus 802 to a display 812, such as a cathode ray tube (CRT), LCD display, or touch screen display, for displaying information to a computer user and/or receiving input from the user. An input device 814, including alphanumeric and other keys, is coupled to bus 802 for communicating information and command selections to processor 804. Another type of user input device is cursor control 816, such as a mouse, a trackball, or cursor direction keys for communicating direction information and command selections to processor 804 and for controlling cursor movement on display 812. This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane. In some embodiments, the same direction information and command selections as cursor control may be implemented via receiving touches on a touch screen without a cursor.

Computing system 800 may include a user interface module, and/or various other types of modules to implement a GUI, a map interface, and the various other aspects of the interactive data object map system. The modules may be stored in a mass storage device as executable software codes that are executed by the computing device(s). This and other modules may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables.

In general, the word "module," as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Lua, C or C++.

A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules configured for execution on computing devices may be provided on a computer readable medium, such as a compact disc, digital video disc, flash drive, magnetic disc, or any other tangible medium, or as a digital download (and may be originally stored in a compressed or installable format that requires installation, decompression or decryption prior to execution). Such software code may be stored, partially or fully, on a memory device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules or computing device functionality described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.

Computer system 800 may implement the techniques described herein using customized hard-wired logic, one or more ASICs or FPGAs, firmware and/or program logic which in combination with the computer system causes or programs computer system 800 to be a special-purpose machine.
According to one embodiment, the techniques herein are performed by computer system 800 in response to processor(s) 804 executing one or more sequences of one or more modules and/or instructions contained in main memory 806. Such instructions may be read into main memory 806 from another storage medium, such as storage device 810. Execution of the sequences of instructions contained in main memory 806 causes processor(s) 804 to perform the process steps described herein. In alternative embodiments, hard-wired circuitry may be used in place of or in combination with software instructions.

The term "non-transitory media," and similar terms, as used herein refers to any media that store data and/or instructions that cause a machine to operate in a specific fashion.

Such non-transitory media may comprise non-volatile media and/or volatile media.

Non-volatile media includes, for example, optical or magnetic disks, such as storage device 810. Volatile media includes dynamic memory, such as main memory 806.

Common forms of non-transitory media include, for example, a floppy disk, a flexible disk, hard disk, solid state drive, magnetic tape, or any other magnetic data storage medium, a CD-ROM, any other optical data storage medium, any physical medium with patterns of holes, a RAM, a PROM, an EPROM, a FLASH-EPROM, NVRAM, any other memory chip or cartridge, and networked versions of the same.

[0164] Non-transitory media is distinct from but may be used in conjunction with transmission media. Transmission media participates in transferring information between non-transitory media. For example, transmission media includes coaxial cables, copper wire and fiber optics, including the wires that comprise bus 802.

Transmission media can also take the form of acoustic or light waves, such as those generated during radio-wave and infra-red data communications.

[0165] Various forms of media may be involved in carrying one or more sequences of one or more instructions to processor 804 for execution. For example, the instructions may initially be carried on a magnetic disk or solid state drive of a remote computer.

The remote computer can load the instructions and/or modules into its dynamic memory and send the instructions over a telephone line using a modem. A modem local to computer system 800 can receive the data on the telephone line and use an infra-red transmitter to convert the data to an infra-red signal. An infra-red detector can receive the data carried in the infra-red signal and appropriate circuitry can place the data on bus 802. Bus 802 carries the data to main memory 806, from which processor 804 retrieves and executes the instructions. The instructions received by main memory 806 may optionally be stored on storage device 810 either before or after execution by processor 804.

Computer system 800 also includes a communication interface 818 coupled to bus 802.

Communication interface 818 provides a two-way data communication coupling to a network link 820 that is connected to a local network 822. For example,

communication interface 818 may be an integrated services digital network (ISDN) card, cable modem, satellite modem, or a modem to provide a data communication connection to a corresponding type of telephone line. As another example, communication interface 818 may be a local area network (LAN) card to provide a data communication connection to a compatible LAN (or WAN component to communicate with a WAN). Wireless links may also be implemented. In any such implementation, communication interface 818 sends and receives electrical, electromagnetic or optical signals that carry digital data streams representing various types of information.

Network link 820 typically provides data communication through one or more networks to other data devices. For example, network link 820 may provide a connection through local network 822 to a host computer 824 or to data equipment operated by an Internet Service Provider (ISP) 826. ISP 826 in turn provides data communication services through the world wide packet data communication network now commonly referred to as the "Internet" 828. Local network 822 and Internet 828 both use electrical, electromagnetic or optical signals that carry digital data streams.

The signals through the various networks and the signals on network link 820 and through communication interface 818, which carry the digital data to and from computer system 800, are example forms of transmission media.

Computer system 800 can send messages and receive data, including program code, through the network(s), network link 820 and communication interface 818. In the Internet example, a server 830 might transmit a requested code for an application program through Internet 828, ISP 826, local network 822 and communication interface 818. Server-side components of the interactive data object map system described above (for example, with reference to Figures 7A and 7B) may be implemented in the server 830. For example, the server 830 may compose map layers and tiles, and transmit those map tiles to the computer system 800.

The computer system 800, on the other hand, may implement the client-side components of the map system as described above (for example, with reference to Figures 6A and 6B). For example, the computer system may receive map tiles and/or other code that may be executed by processor 804 as it is received, and/or stored in storage device 810, or other non-volatile storage for later execution. The computer system 800 may further compose the map interface from the map tiles, display the map interface to the user, generate object outlines and other functionality, and/or receive input from the user.
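The client-side step of composing the map interface from individually received tiles can be sketched with the standard Web Mercator ("slippy map") tile-addressing scheme. The scheme and the function names below are assumptions for illustration; the specification does not commit to any particular tile arithmetic.

```python
import math

def deg2tile(lat, lon, zoom):
    """Map a latitude/longitude to Web Mercator tile indices at a zoom level.
    This is the conventional slippy-map formula, assumed here for illustration."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_rad) + 1.0 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

def tiles_for_viewport(lat_min, lat_max, lon_min, lon_max, zoom):
    """Enumerate the tile grid a client would stitch together into the map interface."""
    x0, y0 = deg2tile(lat_max, lon_min, zoom)  # top-left corner of the viewport
    x1, y1 = deg2tile(lat_min, lon_max, zoom)  # bottom-right corner of the viewport
    return [(x, y) for y in range(y0, y1 + 1) for x in range(x0, x1 + 1)]
```

A small viewport straddling the equator and prime meridian at zoom 1, for instance, spans all four tiles of that zoom level's 2x2 grid.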

In an embodiment, the map system may be accessible by the user through a web-based viewer, such as a web browser. In this embodiment, the map interface may be generated by the server 830 and/or the computer system 800 and transmitted to the web browser of the user. The user may then interact with the map interface through the web browser. In an embodiment, the computer system 800 may comprise a mobile electronic device, such as a cell phone, smartphone, and/or tablet. The map system may be accessible by the user through such a mobile electronic device, among other types of electronic devices.

Each of the processes, methods, and algorithms described in the preceding sections may be embodied in, and fully or partially automated by, code modules executed by one or more computer systems or computer processors comprising computer hardware.

The processes and algorithms may be implemented partially or wholly in application-specific circuitry.

The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed example embodiments. The example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed example embodiments.

Conditional language, such as, among others, "can," "could," "might," or "may," unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain embodiments include, while other embodiments do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that features, elements and/or steps are in any way required for one or more embodiments or that one or more embodiments necessarily include logic for deciding, with or without user input or prompting, whether these features, elements and/or steps are included or are to be performed in any particular embodiment.

Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached Figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.

It should be emphasized that many variations and modifications may be made to the above-described embodiments, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure. The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated. The scope of the invention should therefore be construed in accordance with the appended claims and any equivalents thereof.

Various example embodiments of the disclosure can be described with respect to the following clauses: Clause 1. A computer system comprising: an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on an electronic display of the computer system; include on the interactive map one or more features or objects, wherein the features or objects are selectable by a user of the computer system, and wherein the features or objects are accessed from the electronic data structure; receive a first input from the user selecting one or more of the included features or objects; and in response to the first input, access, from the electronic data structure, the metadata associated with each of the selected features or objects; determine one or more metadata categories based on the accessed metadata; organize the selected features or objects into one or more histograms based on the determined metadata categories and the accessed metadata; and display the one or more histograms on the electronic display.
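As a concrete illustration of the histogram-building steps in Clause 1 (access metadata, determine categories, organize selected objects into per-category histograms), the following sketch groups the metadata of the selected features or objects by category and orders each histogram by descending count, the ordering later described in Clause 7. Representing each object as a flat dict of metadata is an assumption made for this sketch, not a structure taken from the specification.

```python
from collections import Counter

def build_histograms(selected_objects):
    """Organize selected objects into per-category histograms.

    selected_objects: iterable of metadata dicts, e.g. {"city": "Oslo", "speed": "50"}.
    Returns {category: [(value, count), ...]} with each list sorted in
    descending order of the number of related objects.
    """
    categories = {}
    for obj in selected_objects:
        for category, value in obj.items():
            categories.setdefault(category, Counter())[value] += 1
    # Counter.most_common() yields (value, count) pairs, largest count first
    return {cat: counts.most_common() for cat, counts in categories.items()}
```

Selecting three objects, two of which share a city, yields a "city" histogram whose first entry is the shared city with a count of two.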

Clause 2. The computer system of Clause 1, wherein the features or objects comprise vector data.

Clause 3. The computer system of Clause 1, wherein the features or objects comprise at least one of roads, terrain, lakes, rivers, vegetation, utilities, street lights, railroads, hotels or motels, schools, hospitals, buildings or structures, regions, transportation objects, entities, events, or documents.

Clause 4. The computer system of Clause 1, wherein the metadata associated with the features or objects comprise at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.

Clause 5. The computer system of Clause 1, wherein the features or objects are selectable by a user using a mouse and/or a touch interface.

Clause 6. The computer system of Clause 1, wherein each histogram of the one or more histograms is specific to a particular metadata category.

Clause 7. The computer system of Clause 6, wherein each histogram of the one or more histograms comprises a list of items of metadata specific to the particular metadata category of the histogram, wherein the list of items is organized in descending order from an item having the largest number of related objects or features to an item having the smallest number of related objects or features.

Clause 8. The computer system of Clause 1, wherein the one or more histograms displayed on the electronic display are displayed so as to partially overlay the displayed interactive map.

Clause 9. The computer system of Clause 1, wherein the one or more hardware processors are further configured to execute the user interface module in order to: receive a second input from the user selecting a second one or more features or objects from the one or more histograms; and in response to the second input, update the interactive map to display the second one or more features or objects on the display; and highlight the second one or more features or objects on the interactive map.

Clause 10. The computer system of Clause 9, wherein updating the interactive map comprises panning and/or zooming.

Clause 11. The computer system of Clause 9, wherein highlighting the second one or more features comprises at least one of outlining, changing color, bolding, or changing contrast.

Clause 12. The computer system of Clause 9, wherein the one or more hardware processors are further configured to execute the user interface module in order to: receive a third input from the user selecting a drill-down group of features or objects from the one or more histograms; and in response to the third input, drill-down on the selected drill-down group of features or objects by: accessing the metadata associated with each of the features or objects of the selected drill-down group; determining one or more drill-down metadata categories based on the accessed metadata associated with each of the features or objects of the selected drill-down group; organizing the features or objects of the selected drill-down group into one or more drill-down histograms based on the determined drill-down metadata categories and the accessed metadata associated with each of the features or objects of the selected drill-down group; and displaying on the interactive map the one or more drill-down histograms.

Clause 13. The computer system of Clause 12, wherein the one or more hardware processors are further configured to execute the user interface module in order to enable the user to further drill down into the one or more drill-down histograms.

Clause 14. The computer system of Clause 1, wherein the one or more hardware processors are further configured to execute the user interface module in order to: receive a feature or object hover over input from the user; and in response to receiving the hover over input, highlight, on the electronic display, metadata associated with the particular hovered over feature or object to the user.

Clause 15. The computer system of Clause 1, wherein the one or more hardware processors are further configured to execute the user interface module in order to: receive a feature or object selection input from the user; and in response to receiving the selection input, display, on the electronic display, metadata associated with the particular selected feature or object to the user.

Clause 16. A computer system comprising: an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; a computer readable medium storing software modules including computer executable instructions; one or more hardware processors in communication with the electronic data structure and the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map tiles accessed from the electronic data structure, the map tiles each comprising an image composed of one or more vector layers; include on the interactive map a plurality of features or objects accessed from the electronic data structure, the features or objects being selectable by a user, each of the features or objects including associated metadata; receive an input from a user including at least one of a zoom action, a pan action, a feature or object selection, a layer selection, a geosearch, a heatmap, and a keyword search; and in response to the input from the user: request, from a server, updated map tiles, the updated map tiles being updated according to the input from the user; receive the updated map tiles from the server; and update the interactive map with the updated map tiles.
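The request/receive/update cycle of Clause 16 can be sketched as a client-side tile cache that, on each pan or zoom input, asks the server only for the tiles the input made newly visible. The `fetch_tile` callable stands in for the server round-trip and is a hypothetical name, as is the `(zoom, x, y)` key structure.

```python
def update_viewport(cache, fetch_tile, zoom, x_range, y_range):
    """Update the interactive map after a user input.

    cache:      dict mapping (zoom, x, y) -> tile image, mutated in place
    fetch_tile: callable (zoom, x, y) -> tile image; stands in for the server
    Returns the dict of tiles composing the updated viewport.
    """
    needed = {(zoom, x, y) for x in x_range for y in y_range}
    for key in sorted(needed - cache.keys()):
        cache[key] = fetch_tile(*key)  # server round-trip only for missing tiles
    return {key: cache[key] for key in needed}
```

Panning one column to the right over a cached 2x2 viewport, for example, triggers requests for only the two tiles in the newly exposed column.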

Clause 17. The computer system of Clause 16, wherein the one or more vector layers comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.

Clause 18. The computer system of Clause 16, wherein each of the one or more vector layers is comprised of one or more sub-vector layers.

Clause 19. A computer system comprising: one or more hardware processors in communication with the computer readable medium, and configured to execute a user interface module of the software modules in order to: display an interactive map on a display of the computer system, the interactive map comprising a plurality of map layers; determine a list of available map layers; organize the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; and display on the interactive map the hierarchical layer ontology, wherein the user may select one or more of the displayed layers, and wherein each of the available map layers is associated with one or more feature or object types.
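The grouping step of Clause 19 can be sketched as folding a flat list of available layers into a nested tree keyed by category path, so that like layers end up grouped together. The `(path, name)` pair representation and the `"_layers"` key are assumptions for this sketch; the specification does not prescribe a data structure for the ontology.

```python
def build_layer_ontology(layers):
    """Organize a flat list of available map layers into a hierarchical ontology.

    layers: iterable of (category_path, layer_name) pairs, where category_path
            is a tuple of category names from root to leaf.
    Returns a nested dict; each node stores its directly attached layers
    under the (hypothetical) "_layers" key.
    """
    root = {}
    for path, name in layers:
        node = root
        for part in path:
            node = node.setdefault(part, {})  # descend, creating groups as needed
        node.setdefault("_layers", []).append(name)
    return root
```

Two transportation layers and one utilities layer, for instance, fold into a tree with a single "Transportation" group holding both transportation layers.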

Clause 20. The computer system of Clause 19, wherein the map layers comprise at least one of vector layers and base layers.

Claims (22)

  1. Claims 1. A method comprising: generating a graphical user interface including an interactive map, a plurality of features or objects displayed on the interactive map, and one or more histograms overlaid on an area of the interactive map, wherein generating the graphical user interface comprises: accessing an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; including at least some of the accessed plurality of features or objects on the interactive map, the features or objects being selectable by a user; and in response to a first input from the user selecting a plurality of the included features or objects: determining metadata associated with respective selected features or objects; determining one or more metadata categories associated with at least one of the determined metadata; for each of the determined metadata categories: generating one or more histograms including metadata values or value ranges associated with respective selected features or objects, each of the histograms including a visual indicator indicating a quantity of the respective selected plurality of features or objects included on the interactive map having the respective metadata value or value range; and overlaying the one or more histograms on the area of the interactive map; and displaying, at a computing device, the graphical user interface to the user.
  2. 2. The method of claim 1, wherein the features or objects comprise vector data.
  3. 3. The method of claim 1 or claim 2, wherein the features or objects comprise at least one of roads, terrain, lakes, rivers, vegetation, utilities, streetlights, railroads, hotels or motels, schools, hospitals, buildings or structures, regions, transportation objects, entities, events, or documents.
  4. 4. The method of any of claims 1-3, wherein the metadata associated with the features or objects comprise at least one of a location, a city, a county, a state, a country, an address, a district, a grade level, a phone number, a speed, a width, or other related attributes.
  5. 5. The method of any of claims 1-4, wherein the features or objects are selectable by the user using a mouse and/or a touch interface.
  6. 6. The method of any of claims 1-5, wherein each histogram of the one or more histograms is specific to a particular metadata category.
  7. 7. The method of claim 6, wherein each histogram of the one or more histograms comprises a list of items of metadata specific to the particular metadata category of the histogram, wherein the list of items is organized in descending order from an item having the largest number of related objects or features to an item having the smallest number of related objects or features.
  8. 8. The method of any of claims 1-7, wherein the one or more histograms displayed on the electronic display are displayed so as to partially overlay the displayed interactive map.
  9. 9. The method of any of claims 1-8, wherein generating the graphical user interface further comprises: in response to a second input from the user selecting a second one or more features or objects from the one or more histograms: updating the user interface to display the second one or more features or objects on the interactive map; and highlighting the second one or more features or objects on the interactive map.
  10. 10. The method of claim 9, wherein updating the user interface comprises panning and/or zooming the interactive map.
  11. 11. The method of claim 9 or claim 10, wherein highlighting the second one or more features comprises at least one of outlining, changing color, bolding, or changing contrast.
  12. 12. The method of any of claims 9-11, wherein generating the graphical user interface further comprises: in response to a third input from the user selecting a drill-down group of features or objects from the one or more histograms, drill-down on the selected drill-down group of features or objects by: determining metadata associated with respective features or objects of the selected drill-down group; determining one or more drill-down metadata categories associated with at least one of the accessed metadata associated with each of the features or objects of the selected drill-down group; for each of the determined drill-down metadata categories, generating one or more drill-down histograms including drill-down metadata values or value ranges associated with respective features or objects of the selected drill-down group, each of the drill-down histograms including a visual indicator indicating a quantity of the respective features or objects of the selected drill-down group having the respective drill-down metadata value or value range; and overlaying the one or more drill-down histograms on the area of the interactive map.
  13. 13. The method of claim 12 further comprising allowing the user to further drill down into the one or more drill-down histograms.
  14. 14. The method of any of claims 1-13, wherein generating the graphical user interface further comprises: receiving a feature or object hover over input from the user; and in response to receiving the hover over input, highlighting metadata associated with the particular hovered over feature or object to the user.
  15. 15. The method of any of claims 1-14, wherein generating the graphical user interface further comprises: rccciving a fcaturc or objcct sclcction input from thc uscr; and in response to receiving the selection input, displaying metadata associated with the particular selected feature or object to the user.
  16. 16. A method comprising: generating a graphical user interface including an interactive map and a plurality of features or objects displayed on the interactive map, the interactive map comprising a plurality of map tiles, the map tiles each comprising an image composed of one or more vector layers, wherein generating the graphical user interface comprises: accessing an electronic data structure configured to store a plurality of features or objects, wherein each of the features or objects is associated with metadata; including at least some of the accessed plurality of features or objects on the interactive map, the features or objects being selectable by a user; and in response to a first input from the user including at least one of a zoom action, a pan action, a feature or object selection, a layer selection, a geosearch, a heatmap, or a keyword search: requesting, from a server, updated map tiles, the updated map tiles being updated according to the input from the user; receiving the updated map tiles from the server; and updating the interactive map with the updated map tiles; and displaying, at a computing device, the graphical user interface to the user.
  17. 17. The method of claim 16, wherein the one or more vector layers comprise at least one of a regions layer, a buildings/structures layer, a terrain layer, a transportation layer, or a utilities/infrastructure layer.
  18. 18. The method of claim 16 or claim 17, wherein each of the one or more vector layers is comprised of one or more sub-vector layers.
  19. 19. A method comprising: generating a graphical user interface including an interactive map comprised of a plurality of map layers, wherein generating the graphical user interface comprises: determining a list of available map layers; organizing the list of available map layers according to a hierarchical layer ontology, wherein like map layers are grouped together; and displaying on the interactive map the hierarchical layer ontology, wherein a user may select one or more of the displayed layers, and wherein each of the available map layers is associated with one or more feature or object types; and displaying, at a computing device, the graphical user interface to the user.
  20. 20. The method of claim 19, wherein the map layers comprise at least one of vector layers and base layers.
  21. 21. A computer system including one or more computer processors configured with computer-executable instructions so as to perform the method of any of claims 1-20.
  22. 22. A computer-readable storage medium storing software instructions that, when executed by one or more processors, cause the one or more processors to carry out the method of any of claims 1-20.
GB1408025.3A 2013-05-07 2014-05-07 Interactive geospatial map Active GB2516155B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201361820608P true 2013-05-07 2013-05-07
US13/917,571 US8799799B1 (en) 2013-05-07 2013-06-13 Interactive geospatial map

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1620827.4A GB2542517B (en) 2013-05-07 2014-05-07 Interactive Geospatial map

Publications (3)

Publication Number Publication Date
GB201408025D0 GB201408025D0 (en) 2014-06-18
GB2516155A true GB2516155A (en) 2015-01-14
GB2516155B GB2516155B (en) 2017-01-18

Family

ID=50980703

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1408025.3A Active GB2516155B (en) 2013-05-07 2014-05-07 Interactive geospatial map
GB1620827.4A Active GB2542517B (en) 2013-05-07 2014-05-07 Interactive Geospatial map

Family Applications After (1)

Application Number Title Priority Date Filing Date
GB1620827.4A Active GB2542517B (en) 2013-05-07 2014-05-07 Interactive Geospatial map

Country Status (1)

Country Link
GB (2) GB2516155B (en)

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US9129219B1 (en) 2014-06-30 2015-09-08 Palantir Technologies, Inc. Crime risk forecasting
US9223773B2 (en) 2013-08-08 2015-12-29 Palantir Technologies Inc. Template system for custom document generation
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9383911B2 (en) 2008-09-15 2016-07-05 Palantir Technologies, Inc. Modal-less interface enhancements
US9449035B2 (en) 2014-05-02 2016-09-20 Palantir Technologies Inc. Systems and methods for active column filtering
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9460175B1 (en) 2015-06-03 2016-10-04 Palantir Technologies Inc. Server implemented geographic information system with graphical interface
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US9639580B1 (en) 2015-09-04 2017-05-02 Palantir Technologies, Inc. Computer-implemented systems and methods for data management and visualization
US9646396B2 (en) 2013-03-15 2017-05-09 Palantir Technologies Inc. Generating object time series and data objects
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
US9852195B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. System and method for generating event visualizations
US9852205B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. Time-sensitive cube
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9864493B2 (en) 2013-10-07 2018-01-09 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9886467B2 (en) 2015-03-19 2018-02-06 Palantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9921734B2 (en) 2013-08-09 2018-03-20 Palantir Technologies Inc. Context-sensitive views
US9923925B2 (en) 2014-02-20 2018-03-20 Palantir Technologies Inc. Cyber security sharing and identification system
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US9953445B2 (en) 2013-05-07 2018-04-24 Palantir Technologies Inc. Interactive data object map
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US9998485B2 (en) 2014-07-03 2018-06-12 Palantir Technologies, Inc. Network intrusion data item clustering and analysis
US9996229B2 (en) 2013-10-03 2018-06-12 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
US10102369B2 (en) 2015-08-19 2018-10-16 Palantir Technologies Inc. Checkout system executable code monitoring, and user account compromise determination system
US10109094B2 (en) 2015-12-21 2018-10-23 Palantir Technologies Inc. Interface to index and display geospatial data
WO2019009935A1 (en) * 2017-07-03 2019-01-10 Google Llc Semantic vector tiles
US10180977B2 (en) 2014-03-18 2019-01-15 Palantir Technologies Inc. Determining and extracting changed data from a data source
US10180929B1 (en) 2014-06-30 2019-01-15 Palantir Technologies, Inc. Systems and methods for identifying key phrase clusters within documents
US10198515B1 (en) 2013-12-10 2019-02-05 Palantir Technologies Inc. System and method for aggregating data from a plurality of data sources
US10216801B2 (en) 2013-03-15 2019-02-26 Palantir Technologies Inc. Generating data clusters
US10229284B2 (en) 2007-02-21 2019-03-12 Palantir Technologies Inc. Providing unique views of data based on changes or rules
US10230746B2 (en) 2014-01-03 2019-03-12 Palantir Technologies Inc. System and method for evaluating network threats and usage
US10262047B1 (en) 2013-11-04 2019-04-16 Palantir Technologies Inc. Interactive vehicle information map

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8924872B1 (en) 2013-10-18 2014-12-30 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US9552615B2 (en) 2013-12-20 2017-01-24 Palantir Technologies Inc. Automated database analysis to detect malfeasance
US9600146B2 (en) 2015-08-17 2017-03-21 Palantir Technologies Inc. Interactive geospatial map

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000009529A2 (en) * 1998-08-14 2000-02-24 I2 Technologies, Inc. System and method for visually representing a supply chain
US20090172511A1 (en) * 2007-12-26 2009-07-02 Alexander Decherd Analysis of time-based geospatial mashups using ad hoc visual queries
US20110161096A1 (en) * 2009-12-28 2011-06-30 General Electric Company Methods and systems for mapping healthcare services analytics for volume and trends
US20110270705A1 (en) * 2010-04-29 2011-11-03 Cheryl Parker System and Method for Geographic Based Data Visualization and Extraction
US20120173985A1 (en) * 2010-12-29 2012-07-05 Tyler Peppel Multi-dimensional visualization of temporal information

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10229284B2 (en) 2007-02-21 2019-03-12 Palantir Technologies Inc. Providing unique views of data based on changes or rules
US9383911B2 (en) 2008-09-15 2016-07-05 Palantir Technologies, Inc. Modal-less interface enhancements
US10248294B2 (en) 2008-09-15 2019-04-02 Palantir Technologies, Inc. Modal-less interface enhancements
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US9380431B1 (en) 2013-01-31 2016-06-28 Palantir Technologies, Inc. Use of teams in a mobile application
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
US9646396B2 (en) 2013-03-15 2017-05-09 Palantir Technologies Inc. Generating object time series and data objects
US10216801B2 (en) 2013-03-15 2019-02-26 Palantir Technologies Inc. Generating data clusters
US10264014B2 (en) 2013-03-15 2019-04-16 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation based on automatic clustering of related data in various data structures
US9852205B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. Time-sensitive cube
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9852195B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. System and method for generating event visualizations
US9779525B2 (en) 2013-03-15 2017-10-03 Palantir Technologies Inc. Generating object time series from data objects
US9953445B2 (en) 2013-05-07 2018-04-24 Palantir Technologies Inc. Interactive data object map
US9223773B2 (en) 2013-08-08 2015-12-29 Palantir Technologies Inc. Template system for custom document generation
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US9921734B2 (en) 2013-08-09 2018-03-20 Palantir Technologies Inc. Context-sensitive views
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US9996229B2 (en) 2013-10-03 2018-06-12 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US9864493B2 (en) 2013-10-07 2018-01-09 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US9514200B2 (en) 2013-10-18 2016-12-06 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US10262047B1 (en) 2013-11-04 2019-04-16 Palantir Technologies Inc. Interactive vehicle information map
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US10198515B1 (en) 2013-12-10 2019-02-05 Palantir Technologies Inc. System and method for aggregating data from a plurality of data sources
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US9734217B2 (en) 2013-12-16 2017-08-15 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US10025834B2 (en) 2013-12-16 2018-07-17 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US10230746B2 (en) 2014-01-03 2019-03-12 Palantir Technologies Inc. System and method for evaluating network threats and usage
US10120545B2 (en) 2014-01-03 2018-11-06 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US9923925B2 (en) 2014-02-20 2018-03-20 Palantir Technologies Inc. Cyber security sharing and identification system
US10180977B2 (en) 2014-03-18 2019-01-15 Palantir Technologies Inc. Determining and extracting changed data from a data source
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9449035B2 (en) 2014-05-02 2016-09-20 Palantir Technologies Inc. Systems and methods for active column filtering
US9129219B1 (en) 2014-06-30 2015-09-08 Palantir Technologies, Inc. Crime risk forecasting
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US9836694B2 (en) 2014-06-30 2017-12-05 Palantir Technologies, Inc. Crime risk forecasting
US10162887B2 (en) 2014-06-30 2018-12-25 Palantir Technologies Inc. Systems and methods for key phrase characterization of documents
US10180929B1 (en) 2014-06-30 2019-01-15 Palantir Technologies, Inc. Systems and methods for identifying key phrase clusters within documents
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US9298678B2 (en) 2014-07-03 2016-03-29 Palantir Technologies Inc. System and method for news events detection and visualization
US9998485B2 (en) 2014-07-03 2018-06-12 Palantir Technologies, Inc. Network intrusion data item clustering and analysis
US9880696B2 (en) 2014-09-03 2018-01-30 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US10191926B2 (en) 2014-11-05 2019-01-29 Palantir Technologies, Inc. Universal data pipeline
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US10135863B2 (en) 2014-11-06 2018-11-20 Palantir Technologies Inc. Malicious software detection in a computing system
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US10127021B1 (en) 2014-12-29 2018-11-13 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US10157200B2 (en) 2014-12-29 2018-12-18 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9870389B2 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9886467B2 (en) 2015-03-19 2018-02-06 Palantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9460175B1 (en) 2015-06-03 2016-10-04 Palantir Technologies Inc. Server implemented geographic information system with graphical interface
US10223748B2 (en) 2015-07-30 2019-03-05 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US10102369B2 (en) 2015-08-19 2018-10-16 Palantir Technologies Inc. Checkout system executable code monitoring, and user account compromise determination system
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9639580B1 (en) 2015-09-04 2017-05-02 Palantir Technologies, Inc. Computer-implemented systems and methods for data management and visualization
US9996553B1 (en) 2015-09-04 2018-06-12 Palantir Technologies Inc. Computer-implemented systems and methods for data management and visualization
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US10109094B2 (en) 2015-12-21 2018-10-23 Palantir Technologies Inc. Interface to index and display geospatial data
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
WO2019009935A1 (en) * 2017-07-03 2019-01-10 Google Llc Semantic vector tiles

Also Published As

Publication number Publication date
GB2542517A (en) 2017-03-22
GB201408025D0 (en) 2014-06-18
GB2542517B (en) 2018-01-24
GB2516155B (en) 2017-01-18
GB201620827D0 (en) 2017-01-18

Similar Documents

Publication Publication Date Title
Craglia et al. Next-generation digital earth
Moscovich et al. Topology-aware navigation in large networks
US6751620B2 (en) Apparatus for viewing information in virtual space using multiple templates
NL2009649B1 (en) System and method for displaying information local to a selected area.
US8745162B2 (en) Method and system for presenting information with multiple views
US7142205B2 (en) Single gesture map navigation graphical user interface for a personal digital assistant
US9043696B1 (en) Systems and methods for visual definition of data associations
US8244743B2 (en) Scalable rendering of large spatial databases
US20040056883A1 (en) Interactive video tour system editor
US20070171716A1 (en) System and method for visualizing configurable analytical spaces in time for diagrammatic context representations
Becker et al. DBpedia Mobile: A Location-Enabled Linked Data Browser.
US9304837B2 (en) Cellular user interface
US7889888B2 (en) System and method for grouping and visualizing data
US6426761B1 (en) Information presentation system for a graphical user interface
US20090254867A1 (en) Zoom for annotatable margins
US7643673B2 (en) Markup language for interactive geographic information system
US7428705B2 (en) Web map tool
US8281238B2 (en) System, method and computer program for creating and manipulating data structures using an interactive graphical interface
US20070132767A1 (en) System and method for generating stories in time and space and for analysis of story patterns in an integrated visual representation on a user interface
US7925982B2 (en) System and method of overlaying and integrating data with geographic mapping applications
US9411828B2 (en) Method and system for navigating in a database of a computer system
US20020089550A1 (en) Method and apparatus for organizing hierarchical screens in virtual space
US20180032571A1 (en) Search around visual queries
US8966398B2 (en) System and method for visualizing connected temporal and spatial information as an integrated visual representation on a user interface
US20150338233A1 (en) Geotagging Structured Data