US20080077597A1 - Systems and methods for photograph mapping - Google Patents

Systems and methods for photograph mapping

Info

Publication number
US20080077597A1
Authority
US
United States
Prior art keywords
user
spot
images
image
users
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/844,203
Inventor
Lance Butler
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to US11/844,203 (US20080077597A1)
Priority to US12/438,360 (US20100235350A1)
Priority to PCT/US2007/076718 (WO2008024949A2)
Publication of US20080077597A1
Priority to US13/196,044 (US8990239B2)
Priority to US14/624,095 (US9881093B2)
Priority to US15/841,190 (US10776442B2)
Priority to US17/021,326 (US20210073305A1)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/9537 Spatial or temporal dependent retrieval, e.g. spatiotemporal queries
    • G06F 16/24578 Query processing with adaptation to user needs using ranking
    • G06F 16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F 16/5866 Retrieval using information manually generated, e.g. tags, keywords, comments, manually generated location and time information
    • G06F 16/587 Retrieval using geographical or spatial information, e.g. location

Definitions

  • a first digital image and at least one user-generated datum are received from at least one user.
  • the first image is geographically organized according to the at least one datum.
  • the first image is associated with at least one location and at least one direction.
  • the first image is provided to a user from a first-person perspective in response to a request.
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented
  • FIG. 2 is a functional block diagram of an exemplary operating environment in which an embodiment of the invention can be implemented
  • FIG. 3 shows a “Find Places” GUI for finding locations and views
  • FIG. 4 shows a “Walk Around” GUI in an embodiment
  • FIG. 5 shows an “Upload” GUI in an embodiment
  • FIG. 6A shows a sort GUI in an embodiment
  • FIG. 6B shows an alternate embodiment of a Sort GUI
  • FIG. 7 shows a Locate/Link GUI in an embodiment
  • FIGS. 8A-E show a Save Locale button in multiple embodiments.
  • FIG. 9 shows an example of a restaurant in one embodiment.
  • An embodiment of the current invention provides tools that allow anyone in the world to take photos, upload them to a database, and/or put them together to create their own navigable locales. It is just as easy to include indoor data as outdoor data. Users can seamlessly navigate between outdoor and indoor locations.
  • An embodiment of the current invention provides tools to make it relatively quick and easy to build navigable locales. Because one embodiment of the tools works within a standard browser, the tools can be made available to the general public anywhere in the world. Other embodiments can be built as standalone or downloadable applications to provide other functionality.
  • An embodiment of the invention works with still images, and any device capable of capturing digital images and transferring the images to a computer-readable memory is compatible with the invention.
  • Various embodiments require no special hardware or software, no plug-ins or downloaded applications; a standard web browser is all that is needed.
  • Embodiments give users the ability to create and view geo-located, navigationally linked images of real-world locations, and allow anyone to blanket an area with photos in an organized way so others can use their web browser to visually “walk around” that area with a first-person perspective. This applies to any type of place, anywhere in the world: businesses, homes, parks, travel destinations, etc.
  • Embodiments are designed to allow tight physical density for this type of visual navigation, but also allow for arbitrary physical density, where users are free to pack spots as tightly or loosely as they wish.
  • Innovations of the current invention include an easy-to-use system for managing large amounts of photographic images and/or linking them together, as well as an easy way to navigate these photographs.
  • Users of the current invention can interconnect locales made by many people to completely “blanket” a region with images, and eventually the entire world.
  • embodiments enable users to create their own virtual communities oriented around very specific places and/or activities or interests. Users don't even need to live in or near these places to contribute.
  • users can create “tours” (or “trails” or “threads”) highlighting spots or views of interest through any existing locales, including locales of other users.
  • the tours are like guided tours that users can create for others to follow. When a user follows a particular tour, the user is guided through an ordered set of moves, turns and/or zooms to see whatever the creator of the tour wants to highlight and/or talk about. The tour creator can make a set of comments that take priority over the standard comments and are shown prominently.
  • Locales are geo-located, and can be tagged, blogged and rated, and are fully searchable. Users can read and write comments or questions for any locale or image in the world in place-logs (or “plogs”). Users can search the various data by geographic location, by tags, by comments or by other criteria.
  • Embodiments provide tools that allow users anywhere in the world to create and organize large amounts of image data quickly and efficiently.
  • Embodiments can include integrated support for tags, comments, locale interlinks, zooms, user requests, and other data associated with locales and/or images.
  • Embodiments can allow many types of searches, including but not limited to geographic searches based on location, tag searches based on user-entered tags and place or landmark name searches.
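  • For illustration, a search of the kind described above could be served roughly as follows. This is a minimal sketch, not the patent's implementation: the in-memory spot records, field names, haversine radius filter, and tag filter are all assumptions made for the example.

```python
import math

# Hypothetical spot records; the field names are illustrative only.
SPOTS = [
    {"name": "Pier entrance", "lat": 47.605, "lng": -122.340, "tags": {"pier", "waterfront"}},
    {"name": "Market stall",  "lat": 47.609, "lng": -122.342, "tags": {"market", "food"}},
]

def haversine_km(lat1, lng1, lat2, lng2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lng2 - lng1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def search_spots(lat, lng, radius_km=1.0, tags=None):
    """Return names of spots within radius_km of (lat, lng), optionally filtered by tags."""
    hits = []
    for spot in SPOTS:
        if haversine_km(lat, lng, spot["lat"], spot["lng"]) > radius_km:
            continue
        if tags and not (set(tags) & spot["tags"]):
            continue
        hits.append(spot["name"])
    return hits

print(search_spots(47.606, -122.341, radius_km=1.0, tags=["market"]))  # ['Market stall']
```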
  • users can view specific areas or items of interest in high visual detail without requiring all the imagery for the spot to be of high resolution, and with no limits on the detail magnification. This allows zooming in on a feature of interest at arbitrarily high resolution, so extreme detail can be provided, while far less data is transferred to the client to show a particular view.
  • aspects of the invention provide users the ability to visually navigate remote places in a new, novel and natural way, using nothing more than still photos and a basic web browser.
  • aspects of the invention provide tools that allow anyone to create visually navigable locales from simple photos.
  • the process is not limited by proprietary tools or knowledge, and does not need special equipment. Nothing is required beyond an inexpensive digital camera and a standard web browser.
  • Spots can be created wherever anyone can take pictures, whether indoors or outdoors, in urban settings or in the middle of a forest.
  • aspects of the invention allow for navigationally interlinked locales, making it possible for many individuals to collectively blanket entire cities, and indeed the world, either through planned collaboration or after the fact. Interlinked locales make the whole more valuable than the sum of its parts.
  • aspects of the invention provide virtual reality (VR)-like navigation and views, but unlike traditional VR allow for virtually unlimited detail of objects or points of interest without significantly increasing data or bandwidth requirements.
  • aspects of the invention include geo-location of photos by simply clicking on a high-resolution map in a browser, using no plug-ins or extra software.
  • aspects of the invention provide a new and novel visual display of the “level of dirtiness” of documents or local data.
  • aspects of the invention provide for unique types of “embedded ads” directly within the first-person real-world views.
  • aspects of the invention have a proprietary provisional data scheme to greatly reduce the amount of “bad data” that gets into the system.
  • a method of the invention includes receiving user-generated digital images and user-generated data about the images from users; organizing and linking the images geographically according to the user-generated data, and displaying the images in response to a user request.
  • An embodiment includes but is not limited to five types of objects: views, spots, locales, zooms and trails.
  • a view as described herein is a single digital image, which can be a photographic image.
  • Each view is associated with a spot and can have a defined orientation associated with the view, including a lateral (compass) direction denoting a direction a camera was facing when the image was captured, as well as a vertical angle relative to a horizon of the location.
  • Views can include time references including time of day, date, and season.
  • a zoom as described herein can be a close-up image of an area or point of interest within a view, but it need not be simply a higher-resolution zoomed version of the original view. This gives users creative flexibility to make interesting and creative zoomed images. Views and zooms can also include tags, which are words or short phrases. In addition to latitude and longitude, views can also include elevation. A view's elevation can be specified by “floor” (as in a building), or in feet above sea level.
  • a spot as described herein has one or more views, taken from a single geographic location, facing in different lateral and/or vertical directions.
  • Each spot has a precise geographic location, which can be indicated by latitude and longitude, elevation, GPS coordinates, or other means.
  • spots include 8 lateral views, which can align, for example, with the 8 major compass directions.
  • a spot can have any number of views associated with the spot, facing in arbitrary directions.
  • Horizontally oriented lateral views give the user the ability to rotate the view left and right; conversely, views taken at other angles relative to horizontal allow the user to look up or down.
  • views taken from the same spot at different times can also be included, allowing users to watch how a location changes over time.
  • a locale as described herein is a coverage area of one or more spots. Spots can be interlinked within a locale, so users can navigate between spots. Locales can be interlinked when their borders become sufficiently close or overlap, allowing users to navigate among locales without leaving their first-person frame of reference. Interlinked locales are locales with linkage from one or more existing spots in one locale to one or more existing spots in another locale.
  • a trail as described herein is a path that leads a user through a specific sequence of views and/or zooms of one or more spots in one or more locales. Users can create a trail through their own locales, other users' locales, or a combination of both.
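  • The five object types lend themselves to a simple data model. The sketch below is one possible arrangement, assuming field names and types (e.g. compass_deg, anchor_xy, links) that the patent does not prescribe.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class Zoom:
    """Close-up or related image anchored at a point within a view."""
    image_path: str
    anchor_xy: Tuple[float, float]            # position of the zoom icon within the view
    tags: List[str] = field(default_factory=list)

@dataclass
class View:
    """A single digital image taken from a spot, facing a given direction."""
    image_path: str
    compass_deg: Optional[float] = None       # lateral (compass) direction
    vertical_deg: float = 0.0                 # angle relative to the horizon
    captured_at: Optional[str] = None         # time of day / date / season reference
    tags: List[str] = field(default_factory=list)
    zooms: List[Zoom] = field(default_factory=list)

@dataclass
class Spot:
    """One geographic location with one or more views."""
    lat: float
    lng: float
    elevation: Optional[float] = None         # feet above sea level, or a "floor"
    views: List[View] = field(default_factory=list)
    links: List[str] = field(default_factory=list)   # ids of spots reachable by a "move"

@dataclass
class Locale:
    """The coverage area of one or more interlinked spots."""
    name: str
    owner: str
    spots: dict = field(default_factory=dict)         # spot id -> Spot

@dataclass
class Trail:
    """An ordered sequence of views/zooms, possibly spanning several locales."""
    name: str
    steps: List[Tuple[str, str]] = field(default_factory=list)  # (spot id, view/zoom id)
```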
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented.
  • the computing system environment 100 is an example of a suitable computing environment; however, it is appreciated that other environments, systems, and devices may be used to implement various embodiments of the invention as described in more detail below.
  • Embodiments of the invention are operational with numerous other general-purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with embodiments of the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, mobile telephones, portable data assistants, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • Embodiments of the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer.
  • program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • Embodiments of the invention may also be practiced in distributed-computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules may be located in both local and remote computer storage media including memory storage devices.
  • an exemplary system for implementing an embodiment of the invention includes a computing device, such as computing device 100 .
  • the computing device 100 typically includes at least one processing unit 102 and memory 104 .
  • memory 104 may be volatile (such as random-access memory (RAM)), nonvolatile (such as read-only memory (ROM), flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in FIG. 1 by dashed line 106 .
  • the device 100 may have additional features, aspects, and functionality.
  • the device 100 may include additional storage (removable and/or non-removable) which may take the form of, but is not limited to, magnetic or optical disks or tapes.
  • additional storage is illustrated in FIG. 1 by removable storage 108 and non-removable storage 110 .
  • Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • Memory 104 , removable storage 108 and non-removable storage 110 are all examples of computer storage media.
  • Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 100 . Any such computer storage media may be part of device 100 .
  • the device 100 may also contain a communications connection 112 that allows the device to communicate with other devices.
  • the Communications connection 112 is an example of communication media.
  • Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media.
  • modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
  • the communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio-frequency (RF), infrared and other wireless media.
  • the term computer-readable media as used herein includes both storage media and communication media.
  • the device 100 may also have an input device 114 such as keyboard, mouse, pen, voice-input device, touch-input device, etc. Further, an output device 116 such as a display, speakers, printer, etc. may also be included. Additional input devices 114 and output devices 116 may be included depending on a desired functionality of the device 100 .
  • an embodiment of the present invention takes the form of an exemplary computer network system 200 .
  • the system 200 includes an electronic client device 210 , such as a personal computer or workstation or portable data assistant or mobile telephone, that is linked via a communication medium, such as a network 220 (e.g., the Internet), to an electronic device or system, such as a server 230 .
  • the server 230 may further be coupled, or otherwise have access, to a database 240 and a computer system 260 .
  • FIG. 2 includes one server 230 coupled to one client device 210 via the network 220 , it should be recognized that embodiments of the invention may be implemented using one or more such client devices coupled to one or more such servers.
  • the client device 210 and the server 230 may include all or fewer than all of the features associated with the device 100 illustrated in and discussed with reference to FIG. 1 .
  • the client device 210 includes or is otherwise coupled to a computer screen or display 250 .
  • the client device 210 may be used for various purposes such as network- and local-computing processes.
  • the client device 210 is linked via the network 220 to server 230 so that computer programs, such as, for example, a browser, running on the client device 210 can cooperate in two-way communication with server 230 .
  • the server 230 may be coupled to database 240 to retrieve information therefrom and to store information thereto.
  • Database 240 may include a plurality of different tables (not shown) that can be used by the server 230 to enable performance of various aspects of embodiments of the invention.
  • the server 230 may be coupled to the computer system 260 in a manner allowing the server to delegate certain processing functions to the computer system.
  • Consider a user desiring to electronically organize and map a series of photographs.
  • a user (not shown) of the client device 210 desiring to electronically map photographs uses a browser application running on the client device to access web content, which may, but need not, be served by the server 230 .
  • the user may download from the server 230 and install on the client device 210 a user interface module 280 comprising computer-executable instructions as described more fully hereinafter.
  • the user may receive the module 280 on a tangible computer-readable medium (not shown), such as, for example, a CD-ROM, and subsequently install the module on the client device 210 from the medium.
  • a user interface 130 may be displayed on the display device 250 .
  • Embodiments of the invention include various graphical user interfaces (GUIs) for allowing a user to interact with the embodiments.
  • a “Find Places” GUI 130 for finding locations and views is shown in FIG. 3 .
  • the GUI 130 includes a map 132 with a zoom bar 134 and navigation buttons 136 for changing an area displayed by the map 132 , including a “Center” button 138 .
  • the map 132 includes markers 140 which can denote locations of views, spots, and locales. Users can jump directly to any city, state, country or zip code by inputting the information in a “Place” field 142, or browse at random among markers 140 on the map 132, or search for, or filter based on, tags with a “Tags” field 148.
  • a “Current Locale” window 131 displays a name of, owner of, and number of spots contained in, the current locale.
  • the window 131 can also display clickable options such as zooming in to or walking around the current locale.
  • a legend 144 can be included, showing the spot density of each locale, or any other information of interest.
  • a clickable list 146 of locales can be included, allowing a user to select a locale from the list 146 for display on the map 132.
  • Tabs 149 can be included to allow users to quickly navigate between GUI functional areas.
  • a “Walk Around” GUI 150 is shown in FIG. 4 .
  • the GUI 150 includes an overhead map view 152 . Users can jump directly to a spot or feature of interest by clicking on any marker 140 in the overhead map view 152 .
  • the map 152 updates to the new location and orientation.
  • the map 152 also shows other spots that are in the vicinity, and the user can see at a glance which spots have already been “visited” by the type of marker 140 on each spot; in embodiments, empty circles 151 signify unvisited spots and filled circles 153 signify visited spots.
  • a current spot marker 155 is a filled circle with an arrow pointing in the current direction, and a destination marker (not shown) with a ‘+’ symbol can indicate the spot the user moves to if the user performs the “move-forward” action.
  • the overhead map 152 can be a top-down view, including but not limited to a map, satellite imagery, “bird's-eye view” imagery, an architectural floor plan, a hand-sketched diagram and a user-supplied image.
  • a top-down view provides an overview of the area of interest in enough detail to distinguish the positions of a locale's spots. If a custom top-down image is used, then the locale creator first geo-locates the top-down image. In some cases, a distant panoramic side view can be used in place of a top-down image if the area of interest is very linear, for example, a section of seashore.
  • a view window 154 displays digital images, which can include real, eye-level, photographic images. Some views have “zooms”, which can be close-up images of specific areas within a view, or information or images related to the view. Clicking on a zoom icon can animate the zoom image so it appears to grow from the on-screen position of its marking identifier into a larger image.
  • the server can provide images to match the real time of day, date, or season of the spot if images taken at different times are associated with the spot.
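  • One plausible way a server could pick among time-stamped views of a spot is to minimize the difference in local time of day. The helper below is only a sketch with assumed data; the patent does not specify a matching algorithm.

```python
from datetime import datetime

def minutes_of_day(dt: datetime) -> int:
    return dt.hour * 60 + dt.minute

def pick_view_for_time(views, now: datetime):
    """Pick the view whose capture time of day is closest to 'now' (wrapping at midnight)."""
    def circular_gap(a, b):
        d = abs(a - b)
        return min(d, 1440 - d)
    return min(views, key=lambda v: circular_gap(minutes_of_day(v["captured_at"]),
                                                 minutes_of_day(now)))

views = [
    {"name": "morning shot", "captured_at": datetime(2007, 6, 1, 8, 30)},
    {"name": "evening shot", "captured_at": datetime(2007, 6, 1, 20, 15)},
]
print(pick_view_for_time(views, datetime(2007, 8, 24, 21, 0))["name"])  # -> evening shot
```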
  • Locale, spot, view and feature (not shown) comment areas 154 , 156 , 158 can display comments made by users about the displayed locale, spot, and view, respectively.
  • Feature comments are related to a feature in the locale which may appear in more than one specific view or spot, e.g. a distant mountain peak that is visible from all easterly views in a locale.
  • Users can also post requests to be displayed in the areas 154, 156, 158, such as a request that digital images be captured at a specific spot.
  • Movement, turning and zooming are performed from a first-person perspective; these actions can be performed either by using the keyboard, or by using the mouse or other user interaction.
  • Other interactions include but are not limited to touch-screens, voice commands, and/or motion controls.
  • Users can rotate from a current direction to an adjacent direction. Users can navigate forward, backward, up, down, or sideways, and can navigate directly to other spots in the locale, and the view displayed will change accordingly.
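  • For a spot whose lateral views follow the eight major compass directions, rotating to an adjacent direction reduces to stepping through an ordered list. A sketch, with the direction ordering assumed:

```python
# Assumed ordering of the 8 major compass directions, clockwise from north.
DIRECTIONS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def rotate(current: str, turn: str) -> str:
    """Return the adjacent compass direction after a left or right turn."""
    i = DIRECTIONS.index(current)
    step = 1 if turn == "right" else -1
    return DIRECTIONS[(i + step) % len(DIRECTIONS)]

print(rotate("N", "right"))   # NE
print(rotate("N", "left"))    # NW
```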
  • Embodiments of the invention allow users to smoothly transition from one view to the next.
  • as the old view slides out, the new view slides in to take its place (left, right, up or down).
  • Embodiments allow locale creators to specify the amount of overlap between any two adjacent views, which allows the new view panning in to start out partially overlapped by the old view, if desired.
  • the new view can zoom in or out as the old view zooms out or in, and during this process the views can fade out and in.
  • Animated line graphics representing the new view can “float” above the old view and expand or contract to symbolize the movement to the new view.
  • An “Upload” GUI 160 shown in FIG. 5 enables users to upload digital images.
  • the interface 160 includes directions 162 and a “File Upload” field 164 .
  • the user selects either an individual .jpeg (or other photo file type) or a .zip file containing multiple images, and presses a ‘Send File’ button 166.
  • the upload progress is displayed to the user.
  • embodiments can process the raw uploaded image files and create multiple resolutions of said images for storage on the server to be used for various purposes. As each image or batch of images is being processed, the status of the processing for that batch can be displayed onscreen for the user to view.
  • Embodiments can allow the user to leave the interface 160 during image processing, and can allow the user to start using processed images before all images have been processed. Downloadable applications or plug-ins can be included, but are not required for uploading images.
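  • Server-side creation of multiple resolutions per uploaded image could look roughly like the Pillow sketch below; the target widths, output naming, and JPEG quality are illustrative choices, not part of the patent.

```python
from pathlib import Path
from PIL import Image

# Assumed target widths for the stored renditions.
TARGET_WIDTHS = {"thumb": 160, "view": 800, "zoom": 1600}

def process_upload(src: str, out_dir: str) -> dict:
    """Create one resized copy of the uploaded image per target width."""
    out = {}
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    with Image.open(src) as im:
        for label, width in TARGET_WIDTHS.items():
            ratio = width / im.width
            resized = im.resize((width, max(1, int(im.height * ratio))))
            dest = Path(out_dir) / f"{Path(src).stem}_{label}.jpg"
            resized.convert("RGB").save(dest, "JPEG", quality=85)
            out[label] = str(dest)
    return out
```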
  • a “Sort” GUI 170 allows users to sort images 171 by spot and orientation.
  • a spot list 172 lists all the spots associated with the current locale, displayed in a “My Locales” area 174 .
  • the user can click on any spot name to select a spot and display the spot's images in a “Current Spot” or “Selected Spot” display area 176 ; the area 176 displays views 177 associated with the current spot.
  • thumbnails are arranged in a grid, from which the user can sort the images 171 into spots corresponding to, for example, the physical location(s) from which they were taken. If the images 171 are in left-to-right order, then the user can select the first image, click on a “Grab 8 ” button 180 , and select seven more images in order. Alternatively, if the images 171 are not in left-to-right order, the user can select each image in order to get a left-to-right ordering. The user can add other images associated with a spot, like zooms or up/down shots.
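  • The “Grab 8” action described above could simply map eight thumbnails, taken in left-to-right order, onto the eight major compass directions. A sketch under that assumption:

```python
COMPASS_8 = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def grab_eight(selected_images):
    """Assign eight images, in left-to-right order, to the 8 major compass directions."""
    if len(selected_images) != 8:
        raise ValueError("Grab 8 expects exactly eight images")
    return dict(zip(COMPASS_8, selected_images))

print(grab_eight([f"img_{i}.jpg" for i in range(8)])["E"])  # img_2.jpg
```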
  • FIG. 6B shows an alternate embodiment of a Sort GUI 170 .
  • the Selected Spot area 176 includes eight views 177 associated with the eight major compass directions, and two zooms 185 associated with a view 177 of the spot. Users can associate zooms 185 with views 177 with either the Sort GUI 170 or the Locate/Link GUI 190 (shown below).
  • a zoom icon 186 is created on the view 177 at that point.
  • the user can reposition the zoom 185 by clicking and dragging a zoom icon 186 representing the zoom 185 within the view 177 .
  • the interface 170 lightly constrains the icon 186 within the edges of the associated view 177 , to allow easy positioning at the edges, but also allows deletion of the association via a quick dragging motion to escape the edges of the view 177 .
  • if the zoom icon 186 has been dragged outside its associated view 177, its graphical representation changes to highlight the fact that if the user releases the icon 186, the association will be deleted.
  • Zooms 185 can be associated with any view 177, multiple different zooms 185 can be associated with a single view 177, and one zoom 185 can be associated with different views 177; for example, when an item of interest lies in an area which visually overlaps two adjacent views 177.
  • Zooms 185 can be merely magnified views of an area of interest, or they can be taken from completely different angles to get a better view of an item of interest.
  • the zoom 185 could “peer” over or around a fence or other obstacle, or “peek” inside a window or through a wall of a building, if a user has access to an image of the inside of the building.
  • a view 177 of the outside of a restaurant could associate a position on the restaurant's front door with a zoom 185 of the restaurant's menu.
  • the zoom 185 could also be an image taken at a different time, day or season.
  • a Locate/Link GUI 190 shown in FIG. 7 , allows users to specify a geographical location for each spot and create links between spots.
  • the user selects a locale from the Locale list 192, and a map 194 including the locale data is displayed in a map pane 196. If the user is locating a new locale, the user can move the map 194 to a new location by typing the location name or other identifying information into a “Find Place” field 198 above the map pane 196, or the user can position the map 194 by panning and/or zooming in until they find the correct place.
  • a “Spots” area 1100 lists all spots associated with the current locale. Spots that have already been geo-located will have an indicator, such as a checkbox 1102 .
  • a user displays a desired location of the spot in the map pane 196 and selects a spot to geo-locate by clicking on the spot name in the Spots area 1100; a selected spot's views will be displayed in the Selected Spot area 1104, above the map pane 196.
  • the user clicks on the map 194 at the desired location of the spot and an indicator 1106 appears at the desired location on the map 194 .
  • a spot's exact location can be adjusted in fine or coarse detail by using the keyboard and/or mouse.
  • Other embodiments can correlate spots with a set of previously defined geographic coordinates, for example from a GPS device.
  • users can select the spot and use the arrow keys to move the spot marker 1106 on the map 194 , or hold down the “M” key (move) on the keyboard and click on the desired location on the map 194 , or input actual coordinate data.
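  • Geo-locating a spot from a click in the map pane can be approximated by interpolating the click position against the displayed map bounds. The function below is a sketch that ignores map projection (acceptable over small areas); the bounds and pixel values are illustrative.

```python
def click_to_latlng(px, py, map_w, map_h, north, south, west, east):
    """Convert a click at pixel (px, py) into (lat, lng) by linear interpolation
    against the displayed map bounds. Assumes y grows downward on screen and
    ignores map projection, which is a reasonable simplification over small areas."""
    lng = west + (px / map_w) * (east - west)
    lat = north - (py / map_h) * (north - south)
    return lat, lng

# Example: a 600x400 map pane showing a small area of Seattle.
print(click_to_latlng(300, 200, 600, 400,
                      north=47.62, south=47.60, west=-122.36, east=-122.32))
# -> (47.61, -122.34)
```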
  • Spots that have been geo-located can be linked together so users can navigate around the locale among the spots.
  • One embodiment allows users to link two spots by selecting a first spot by clicking on its marker 1106 or its name in the Spots area 1100 , and holding down the “C” key (connect) and clicking on a marker 1106 for a second spot.
  • a line appears between the spots, signifying that the spots are linked.
  • Spots can be linked with more than one other spot.
  • a link between spots is actually composed of two separate “moves”, one in each direction; i.e. a move from the first spot to the second spot, and a move from the second spot to the first spot.
  • Embodiments can allow users to delete moves and/or links, for example, by clicking on a link marker connecting two spots and allowing the user to choose which moves to delete.
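  • Because a link is really two directional moves, connecting two spots means recording a move in each direction, and a link can be dissolved one move at a time. A minimal sketch with assumed structures:

```python
# moves[spot_id] is the set of spot ids reachable by a forward "move" from that spot.
moves = {"A": set(), "B": set(), "C": set()}

def link(a, b):
    """Create a link: one move from a to b and one from b to a."""
    moves[a].add(b)
    moves[b].add(a)

def delete_move(src, dst):
    """Delete a single directional move (the reverse move, if any, is kept)."""
    moves[src].discard(dst)

link("A", "B")
delete_move("B", "A")      # keep A -> B, drop B -> A
print(moves)               # {'A': {'B'}, 'B': set(), 'C': set()}
```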
  • Embodiments can download various meta-data about a locale, its spots, views, tags and structure to the client, giving the ability to change various aspects of the client GUI (like sorting criteria and/or appearance of markers on the map) without querying the server.
  • the default legend might color the locale markers according to how many spots they contain, but users can change the marker colors or appearance to distinguish the visible locales by area type, e.g. residential, business, park, rural, or beach; or by user rating, or by date of creation, or any number of other criteria.
  • in a network-based client-server application, there are actions that directly modify data on the servers, and there are actions that can remain local to the user's computer until the changes are explicitly saved to the servers. There is overhead to maintaining or synchronizing the data between client and server.
  • embodiments can keep track of the number of changes made and/or the types of changes made, and can use that information to gradually alter the appearance of one or more aspects of the GUI on-screen as the user makes local changes, thereby giving more useful information to the user than a simple binary “dirty” indicator. For example, as the user is manipulating newly uploaded images, the modification indicator can change slightly each time the user modifies data; the color of the border of the graphical representation of the data might change from neutral to light red and eventually to deep red. Or, as shown in FIGS. 8A-8E , a button such as a “Save Locale” button 1110 can change color from white to dark red.
  • This aspect of the invention can be applied to general-purpose software applications, including desktop applications or any other software which manipulates data and for which those manipulations and/or the manipulated data can be saved.
  • Various software applications and GUIs have a visual mechanism to indicate that a document is “dirty” (has unsaved changes), but a binary representation is not nearly as useful as being able to see at a glance HOW dirty the document is at the moment.
  • the rate of change of the appearance of the graphical representation of the dirtiness with respect to the changes in the unsaved data can be set by a user.
  • a user could set a visual change to occur after adding, modifying, or deleting every ten characters, or after every hundred characters, or after every 1,000 characters.
  • Varied algorithms can be used to determine when and/or how often to update the indicator(s).
  • Embodiments can use different color schemes to make more obvious the amount of editing that has taken place since the last save. For example, faint green to bright green to faint yellow to bright yellow to faint orange to bright orange to faint red to bright red.
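  • The count of unsaved changes could drive the indicator along the faint-green-to-bright-red scale described above. The thresholds and hex colors in this sketch are illustrative assumptions.

```python
# Color steps from "no unsaved changes" to "very dirty"; thresholds are illustrative.
DIRTY_SCALE = [
    (0,   "#FFFFFF"),   # no unsaved changes
    (1,   "#CCFFCC"),   # faint green
    (5,   "#66FF66"),   # bright green
    (10,  "#FFFFCC"),   # faint yellow
    (20,  "#FFFF33"),   # bright yellow
    (35,  "#FFE0B3"),   # faint orange
    (50,  "#FF9900"),   # bright orange
    (75,  "#FFCCCC"),   # faint red
    (100, "#CC0000"),   # bright red
]

def dirtiness_color(unsaved_changes: int) -> str:
    """Return the indicator color for the current number of unsaved changes."""
    color = DIRTY_SCALE[0][1]
    for threshold, value in DIRTY_SCALE:
        if unsaved_changes >= threshold:
            color = value
    return color

print(dirtiness_color(0), dirtiness_color(12), dirtiness_color(200))
```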
  • Embodiments can include ads embedded within images.
  • an image of a billboard can have virtual ads placed directly in the billboard area, complete with PPC (pay per click) outbound linking.
  • a view 1120 of a restaurant can have a clickable embedded ad 1122 featuring the name of the restaurant and an offer of free coupons.
  • images of or near businesses or related places of interest can contain ads of many different kinds, including rollover hotspots with data, imagery or PPC outbound links.
  • Embedded ads can be in the form of zooms, where clicking on a zoom icon displays the ad.
  • the ad links could be a portion of a captured image provided by a user, or ads can be overlaid after the fact.
  • Clients who wish to keep ads (which may be a competitor's) off a particular locale can pay either to run ads of their choosing, or to run no ads whatsoever, on locale(s) and/or view(s) of their choosing.
  • Server-side software can “paste” small images/links/icons directly onto the views themselves dynamically at delivery time. For example, in the upper right corner of all images, or of all images belonging to a particular user, or for all images being viewed by a particular user, or for all users from Alabama, or all locales in Alabama, etc. By adding these ad images dynamically, embodiments can optionally reposition them to different parts of the views.
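  • Dynamically “pasting” an ad onto a view at delivery time amounts to a compositing step on the server. A Pillow sketch, with the corner placement and margin chosen arbitrarily for illustration:

```python
from PIL import Image

def paste_ad(view_path: str, ad_path: str, out_path: str, margin: int = 10):
    """Composite a small ad image into the upper-right corner of a view before delivery."""
    with Image.open(view_path) as view, Image.open(ad_path) as ad:
        view = view.convert("RGBA")
        ad = ad.convert("RGBA")
        x = view.width - ad.width - margin
        y = margin
        view.paste(ad, (x, y), ad)          # use the ad's alpha channel as the mask
        view.convert("RGB").save(out_path, "JPEG", quality=85)
```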
  • Users of the invention have a trust level. New users generally have a trust level of zero. As they contribute good, valid data, their trust level goes up. All locale and/or image data has a trust level equal to the trust level of the user that supplied the data.
  • Embodiments can allow businesses or other entities or groups to have restricted access to their locale data and/or images so that only authorized representatives may view, annotate, comment on and/or manipulate the data for one or more locales.
  • aspects of the invention can include allowing, in a controlled way, locale creators and locations to advertise their availabilities and needs, for example, in view, spot, or locale comment areas, or within views in their own locales.
  • Entities can also pay or bid for services using points rewarded to the entity through a point system.
  • Users are rewarded points for various activities, such as flagging inappropriate content or capturing one or more digital images of a particular location.
  • the users can offer the points to other users as incentives to perform some activity, such as capturing a digital image of a particular location or creating a locale.
  • Points can also be used to bid for a section of pixels on our pages, which are displayed to the general public. Users of the site can promote a favorite charity, a blog, a business, or anything to which a user wants to draw attention.
  • feature comments are included.
  • Feature comments are comments related to a particular feature in a locale which can be visible in many different views. Without this, it can sometimes be difficult to find all the comments related to a particular item of interest within a locale. Essentially, it's a sub-grouping of comments that can be associated separately from a particular spot or view.
  • the view can also be partially rotated, before or while moving forward/backward. This allows the new view to be aligned closer to the center of the currently displayed view, which helps the user maintain their visual frame of reference as they move to another spot, especially with any zooming animation.

Abstract

Systems and methods for photograph mapping are disclosed herein. In one embodiment, a first digital image and at least one user-generated datum are received from at least one user. The first image is geographically organized according to the at least one datum. The first image is associated with at least one location and at least one direction. The first image is provided to a user from a first-person perspective in response to a request.

Description

    PRIORITY CLAIM
  • This application claims the benefit of U.S. Provisional Application Ser. No. 60/840,134, filed on Aug. 24, 2006, which is herein incorporated by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • Consumers have an enormous and increasing need for visual detail and information on real-world places. This is easily evidenced by the aggressive moves into mapping and local search by Google, Yahoo, Microsoft, Amazon and others.
  • SUMMARY OF THE INVENTION
  • Systems and methods for photograph mapping are disclosed herein. In one embodiment, a first digital image and at least one user-generated datum are received from at least one user. The first image is geographically organized according to the at least one datum. The first image is associated with at least one location and at least one direction. The first image is provided to a user from a first-person perspective in response to a request.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings.
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented;
  • FIG. 2 is a functional block diagram of an exemplary operating environment in which an embodiment of the invention can be implemented;
  • FIG. 3 shows a “Find Places” GUI for finding locations and views;
  • FIG. 4 shows a “Walk Around” GUI in an embodiment;
  • FIG. 5 shows an “Upload” GUI in an embodiment;
  • FIG. 6A shows a sort GUI in an embodiment;
  • FIG. 6B shows an alternate embodiment of a Sort GUI;
  • FIG. 7 shows a Locate/Link GUI in an embodiment;
  • FIGS. 8A-E show a Save Locale button in multiple embodiments; and
  • FIG. 9 shows an example of a restaurant in one embodiment.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An embodiment of the current invention provides tools that allow anyone in the world to take photos, upload them to a database, and/or put them together to create their own navigable locales. It is just as easy to include indoor data as outdoor data. Users can seamlessly navigate between outdoor and indoor locations.
  • An embodiment of the current invention provides tools to make it relatively quick and easy to build navigable locales. Because one embodiment of the tools works within a standard browser, the tools can be made available to the general public anywhere in the world. Other embodiments can be built as standalone or downloadable applications to provide other functionality.
  • An embodiment of the invention works with still images, and any device capable of capturing digital images and transferring the images to a computer-readable memory is compatible with the invention. Various embodiments require no special hardware or software, no plug-ins or downloaded applications; a standard web browser is all that is needed. Embodiments give users the ability to create and view geo-located, navigationally linked images of real-world locations, and allow anyone to blanket an area with photos in an organized way so others can use their web browser to visually “walk around” that area with a first-person perspective. This applies to any type of place, anywhere in the world: businesses, homes, parks, travel destinations, etc. Embodiments are designed to allow tight physical density for this type of visual navigation, but also allow for arbitrary physical density, where users are free to pack spots as tightly or loosely as they wish.
  • Innovations of the current invention include an easy-to-use system for managing large amounts of photographic images and/or linking them together, as well as an easy way to navigate these photographs.
  • Users of the current invention can interconnect locales made by many people to completely “blanket” a region with images, and eventually the entire world. By allowing users to interlink their locales, embodiments enable users to create their own virtual communities oriented around very specific places and/or activities or interests. Users don't even need to live in or near these places to contribute.
  • Users can create “tours” (or “trails” or “threads”) highlighting spots or views of interest through any existing locales, including locales of other users. The tours are like guided tours that users can create for others to follow. When a user follows a particular tour, the user is guided through an ordered set of moves, turns and/or zooms to see whatever the creator of the tour wants to highlight and/or talk about. The tour creator can make a set of comments that take priority over the standard comments and are shown prominently.
  • Locales are geo-located, and can be tagged, blogged and rated, and are fully searchable. Users can read and write comments or questions for any locale or image in the world in place-logs (or “plogs”). Users can search the various data by geographic location, by tags, by comments or by other criteria.
  • Embodiments provide tools that allow users anywhere in the world to create and organize large amounts of image data quickly and efficiently. Embodiments can include integrated support for tags, comments, locale interlinks, zooms, user requests, and other data associated with locales and/or images. Embodiments can allow many types of searches, including but not limited to geographic searches based on location, tag searches based on user-entered tags, and place or landmark name searches. There are also general map-browsing tools provided by the embedded map engine, such as panning, zooming and changing map types.
  • With embodiments of the current invention, users can view specific areas or items of interest in high visual detail without requiring all the imagery for the spot to be of high resolution, and with no limits on the detail magnification. This allows zooming in on a feature of interest at arbitrarily high resolution, so extreme detail can be provided, while far less data is transferred to the client to show a particular view.
  • Aspects of the invention provide users the ability to visually navigate remote places in a new, novel and natural way, using nothing more than still photos and a basic web browser.
  • Aspects of the invention provide tools that allow anyone to create visually navigable locales from simple photos. The process is not limited by proprietary tools or knowledge, and does not need special equipment. Nothing is required beyond an inexpensive digital camera and a standard web browser.
  • In aspects of the invention, users have the freedom to move around similarly to how they move in real life: they walk forward to other spots within their forward vision, and turn left/right/up/down as if they were there. Navigational linkage between spots is explicit, not merely implied by location and direction. This creates logically cohesive data.
  • Spots can be created wherever anyone can take pictures, whether indoors or outdoors, in urban settings or in the middle of a forest.
  • Aspects of the invention allow for navigationally interlinked locales, making it possible for many individuals to collectively blanket entire cities, and indeed the world, either through planned collaboration or after the fact. Interlinked locales make the whole more valuable than the sum of its parts.
  • Aspects of the invention provide virtual reality (VR)-like navigation and views, but unlike traditional VR allow for virtually unlimited detail of objects or points of interest without significantly increasing data or bandwidth requirements.
  • Aspects of the invention include geo-location of photos by simply clicking on a high-resolution map in a browser, using no plug-ins or extra software.
  • Aspects of the invention provide a new and novel visual display of the “level of dirtiness” of documents or local data.
  • With aspects of the invention, anyone can document a particular location over time: different times of day, times of year, or changes over an extended period.
  • Aspects of the invention provide for unique types of “embedded ads” directly within the first-person real-world views.
  • Aspects of the invention have a proprietary provisional data scheme to greatly reduce the amount of “bad data” that gets into the system.
  • A method of the invention includes receiving user-generated digital images and user-generated data about the images from users; organizing and linking the images geographically according to the user-generated data, and displaying the images in response to a user request.
  • An embodiment includes but is not limited to five types of objects: views, spots, locales, zooms and trails.
  • A view as described herein is a single digital image, which can be a photographic image. Each view is associated with a spot and can have a defined orientation associated with the view, including a lateral (compass) direction denoting a direction a camera was facing when the image was captured, as well as a vertical angle relative to a horizon of the location. Views can include time references including time of day, date, and season.
  • A zoom as described herein can be a close-up image of an area or point of interest within a view, but it need not be simply a higher-resolution zoomed version of the original view. This gives users creative flexibility to make interesting and creative zoomed images. Views and zooms can also include tags, which are words or short phrases. In addition to latitude and longitude, views can also include elevation. A view's elevation can be specified by “floor” (as in a building), or in feet above sea level.
  • A spot as described herein has one or more views, taken from a single geographic location, facing in different lateral and/or vertical directions. Each spot has a precise geographic location, which can be indicated by latitude and longitude, elevation, GPS coordinates, or other means. In one embodiment, spots include 8 lateral views, which can align, for example, with the 8 major compass directions. In other embodiments, a spot can have any number of views associated with the spot, facing in arbitrary directions. Horizontally oriented lateral views give the user the ability to rotate the view left and right; conversely, views taken at other angles relative to horizontal allow the user to look up or down. Furthermore, views taken from the same spot at different times can also be included, allowing users to watch how a location changes over time.
  • A locale as described herein is a coverage area of one or more spots. Spots can be interlinked within a locale, so users can navigate between spots. Locales can be interlinked when their borders become sufficiently close or overlap, allowing users to navigate among locales without leaving their first-person frame of reference. Interlinked locales are locales with linkage from one or more existing spots in one locale to one or more existing spots in another locale.
  • A trail as described herein is a path that leads a user through a specific sequence of views and/or zooms of one or more spots in one or more locales. Users can create a trail through their own locales, other users' locales, or a combination of both.
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented. The computing system environment 100, as illustrated, is an example of a suitable computing environment; however, it is appreciated that other environments, systems, and devices may be used to implement various embodiments of the invention as described in more detail below.
  • Embodiments of the invention are operational with numerous other general-purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with embodiments of the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, mobile telephones, portable data assistants, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • Embodiments of the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Embodiments of the invention may also be practiced in distributed-computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
  • With reference to FIG. 1, an exemplary system for implementing an embodiment of the invention includes a computing device, such as computing device 100. The computing device 100 typically includes at least one processing unit 102 and memory 104.
  • Depending on the exact configuration and type of computing device, memory 104 may be volatile (such as random-access memory (RAM)), nonvolatile (such as read-only memory (ROM), flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in FIG. 1 by dashed line 106.
  • Additionally, the device 100 may have additional features, aspects, and functionality. For example, the device 100 may include additional storage (removable and/or non-removable) which may take the form of, but is not limited to, magnetic or optical disks or tapes. Such additional storage is illustrated in FIG. 1 by removable storage 108 and non-removable storage 110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Memory 104, removable storage 108 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 100. Any such computer storage media may be part of device 100.
  • The device 100 may also contain a communications connection 112 that allows the device to communicate with other devices. The Communications connection 112 is an example of communication media. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, the communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio-frequency (RF), infrared and other wireless media. The term computer-readable media as used herein includes both storage media and communication media.
  • The device 100 may also have an input device 114 such as keyboard, mouse, pen, voice-input device, touch-input device, etc. Further, an output device 116 such as a display, speakers, printer, etc. may also be included. Additional input devices 114 and output devices 116 may be included depending on a desired functionality of the device 100.
  • Referring now to FIG. 2, an embodiment of the present invention takes the form of an exemplary computer network system 200. The system 200 includes an electronic client device 210, such as a personal computer or workstation or portable data assistant or mobile telephone, that is linked via a communication medium, such as a network 220 (e.g., the Internet), to an electronic device or system, such as a server 230. The server 230 may further be coupled, or otherwise have access, to a database 240 and a computer system 260. Although the embodiment illustrated in FIG. 2 includes one server 230 coupled to one client device 210 via the network 220, it should be recognized that embodiments of the invention may be implemented using one or more such client devices coupled to one or more such servers.
  • The client device 210 and the server 230 may include all or fewer than all of the features associated with the device 100 illustrated in and discussed with reference to FIG. 1. The client device 210 includes or is otherwise coupled to a computer screen or display 250. The client device 210 may be used for various purposes such as network- and local-computing processes.
  • The client device 210 is linked via the network 220 to server 230 so that computer programs, such as, for example, a browser, running on the client device 210 can cooperate in two-way communication with server 230. The server 230 may be coupled to database 240 to retrieve information therefrom and to store information thereto. Database 240 may include a plurality of different tables (not shown) that can be used by the server 230 to enable performance of various aspects of embodiments of the invention. Additionally, the server 230 may be coupled to the computer system 260 in a manner allowing the server to delegate certain processing functions to the computer system.
  • Still referring to FIG. 2, and in operation according to an embodiment of the invention, a user (not shown) of the client device 210 desiring to electronically map photographs uses a browser application running on the client device to access web content, which may, but need not, be served by the server 230. Specifically, by employing an appropriate uniform resource locator (URL) in a known manner, the user may download from the server 230 and install on the client device 210 a user interface module 280 comprising computer-executable instructions as described more fully hereinafter. Alternatively, the user may receive the module 280 on a tangible computer-readable medium (not shown), such as, for example, a CD-ROM, and subsequently install the module on the client device 210 from the medium.
  • Upon execution of the module 280 by the client device 210, and referring to FIG. 3, a user interface 130 may be displayed on the display device 250.
• Embodiments of the invention include various graphical user interfaces (GUIs) for allowing a user to interact with the embodiments. A “Find Places” GUI 130 for finding locations and views is shown in FIG. 3. The GUI 130 includes a map 132 with a zoom bar 134 and navigation buttons 136 for changing an area displayed by the map 132, including a “Center” button 138. The map 132 includes markers 140 which can denote locations of views, spots, and locales. Users can jump directly to any city, state, country or zip code by inputting the information in a “Place” field 142, or browse at random among markers 140 on the map 132, or search for, or filter based on, tags with a “Tags” field 148. A “Current Locale” window 131 displays the name of, owner of, and number of spots contained in, the current locale. The window 131 can also display clickable options such as zooming in to or walking around the current locale. A legend 144 can be included, showing the spot density of each locale, or any other information of interest. A clickable list 146 of locales can be included, allowing a user to select a locale from the list 146 for display on the map 132. Tabs 149 can be included to allow users to quickly navigate between GUI functional areas.
• A “Walk Around” GUI 150 is shown in FIG. 4. The GUI 150 includes an overhead map view 152. Users can jump directly to a spot or feature of interest by clicking on any marker 140 in the overhead map view 152. When a user turns, or moves to another spot, the map 152 updates to the new location and orientation. The map 152 also shows other spots that are in the vicinity, and the user can see at a glance which spots have already been “visited” by the type of marker 140 on each spot; in embodiments, empty circles 151 signify unvisited spots and filled circles 153 signify visited spots. A current spot marker 155 is a filled circle with an arrow pointing in the current direction, and a destination marker (not shown) with a ‘+’ symbol can indicate the spot the user will move to if the user performs the “move-forward” action.
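As an illustration of the marker logic just described, the following is a minimal Python sketch, assuming hypothetical Spot records and style names, of how the overhead map might pick a marker appearance for each spot.

```python
# Minimal sketch of overhead-map marker selection; Spot, marker_style, and the
# style names are assumptions for illustration, not part of the disclosure.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Spot:
    spot_id: str
    visited: bool = False

def marker_style(spot: Spot, current_id: str, destination_id: Optional[str]) -> str:
    if spot.spot_id == current_id:
        return "filled-circle-with-arrow"   # current spot marker 155, arrow shows facing direction
    if destination_id is not None and spot.spot_id == destination_id:
        return "plus-symbol"                # destination reached by the "move-forward" action
    return "filled-circle" if spot.visited else "empty-circle"  # visited marker 153 / unvisited marker 151
```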
  • The overhead map 152 can be a top-down view, including but not limited to a map, satellite imagery, “bird's-eye view” imagery, an architectural floor plan, a hand-sketched diagram and a user-supplied image. A top-down view provides an overview of the area of interest in enough detail to distinguish the positions of a locale's spots. If a custom top-down image is used, then the locale creator first geo-locates the top-down image. In some cases, a distant panoramic side view can be used in place of a top-down image if the area of interest is very linear, for example, a section of seashore.
  • Alongside the overhead map 152 is a view window 154 for displaying digital images, which can include real, eye-level, photographic images. Some views have “zooms”, which can be close-up images of specific areas within a view, or information or images related to the view. Clicking on a zoom icon can animate the zoom image so it appears to grow from the position on the screen of its marking identifier into a larger image. The server can provide images to match the real time of day, date, or season of the spot if images taken at different times are associated with the spot.
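As one way such time matching could work, the following Python sketch, assuming a hypothetical list of time-tagged images per view, picks the stored image whose capture month and hour are closest to the viewer's current time.

```python
# Minimal sketch of matching a view's image to the current time of day and season;
# ViewImage and its fields are assumptions for illustration.
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class ViewImage:
    url: str
    month: int   # 1-12, used as a season proxy
    hour: int    # 0-23, used as a time-of-day proxy

def pick_image(images: List[ViewImage], now: datetime) -> ViewImage:
    def distance(img: ViewImage) -> int:
        month_gap = min(abs(img.month - now.month), 12 - abs(img.month - now.month))
        hour_gap = min(abs(img.hour - now.hour), 24 - abs(img.hour - now.hour))
        return month_gap * 24 + hour_gap   # weight season more heavily than hour of day
    return min(images, key=distance)
```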
• Locale, spot, view and feature (not shown) comment areas 154, 156, 158 can display comments made by users about the displayed locale, spot, and view, respectively. Feature comments relate to a feature in the locale which may appear in more than one specific view or spot, e.g., a distant mountain peak that is visible from all easterly views in a locale. Users can also post requests in the areas 154, 156, 158, such as a request that digital images be captured at a specific spot.
  • Movement, turning and zooming are performed from a first-person perspective; these actions can be performed either by using the keyboard, or by using the mouse or other user interaction. Other interactions include but are not limited to touch-screens, voice commands, and/or motion controls. Users can rotate from a current direction to an adjacent direction. Users can navigate forward, backward, up, down, or sideways, and can navigate directly to other spots in the locale, and the view displayed will change accordingly.
  • Embodiments of the invention allow users to smoothly transition from one view to the next. When the user rotates the current direction, the old view slides out, and as the old view slides out, the new view slides in to take its place (left, right, up or down). Embodiments allow locale creators to specify the amount of overlap between any two adjacent views, which allows the new view panning in to start out partially overlapped by the old view, if desired. In the case of moving forward and backward, the new view can zoom in or out as the old view zooms out or in, and during this process the views can fade out and in. Animated line graphics representing the new view can “float” above the old view and expand or contract to symbolize the movement to the new view.
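To make the overlap parameter concrete, the following Python sketch, assuming a pixel view width and an overlap fraction chosen by the locale creator, computes where the old and new views sit at a given point in the slide animation.

```python
# Minimal sketch of the sliding view transition; the coordinate convention and
# the handling of the overlap parameter are assumptions for illustration.
from typing import Tuple

def slide_positions(view_width: float, overlap: float, t: float) -> Tuple[float, float]:
    """x-positions of (old view, new view) at animation progress t in [0, 1],
    for a rotation to the right; negate both values for a rotation to the left."""
    travel = view_width * (1.0 - overlap)   # distance both views slide in total
    old_x = -travel * t                     # old view slides out to the left
    new_x = travel * (1.0 - t)              # new view slides in; starts partially under the old view
    return (old_x, new_x)
```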
• An “Upload” GUI 160 shown in FIG. 5 enables users to upload digital images. The interface 160 includes directions 162 and a “File Upload” field 164. The user selects either an individual image file, such as a .jpeg or other photo file type, or a .zip file containing multiple images, and presses a ‘Send File’ button 166. While the file is being uploaded, the upload progress is displayed to the user. Once the file is uploaded, embodiments can process the raw uploaded image files and create multiple resolutions of said images for storage on the server to be used for various purposes. As each image or batch of images is processed, the status of the processing for that batch can be displayed onscreen for the user to view. Embodiments can allow the user to leave the interface 160 during image processing, and can allow the user to start using processed images before all images have been processed. Downloadable applications or plug-ins can be included, but are not required for uploading images.
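One plausible implementation of the multiple-resolution step is sketched below in Python using the Pillow imaging library; the size presets and the output naming are assumptions for illustration.

```python
# Minimal sketch of creating multiple resolutions of an uploaded image;
# the SIZES presets and the output file naming are assumptions for illustration.
from pathlib import Path
from typing import Dict
from PIL import Image

SIZES = {"thumb": 160, "screen": 1024, "full": 2048}   # longest edge in pixels (assumed presets)

def process_upload(upload_path: str, out_dir: str) -> Dict[str, str]:
    outputs: Dict[str, str] = {}
    with Image.open(upload_path) as img:
        for name, max_edge in SIZES.items():
            copy = img.copy()
            copy.thumbnail((max_edge, max_edge))        # preserves aspect ratio, never upscales
            out_path = Path(out_dir) / f"{Path(upload_path).stem}_{name}.jpg"
            copy.convert("RGB").save(out_path, "JPEG")  # normalize to JPEG for web delivery
            outputs[name] = str(out_path)
    return outputs
```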
  • A “Sort” GUI 170, shown in FIG. 6A, allows users to sort images 171 by spot and orientation. A spot list 172 lists all the spots associated with the current locale, displayed in a “My Locales” area 174. The user can click on any spot name to select a spot and display the spot's images in a “Current Spot” or “Selected Spot” display area 176; the area 176 displays views 177 associated with the current spot.
• Uploaded images which have not been assigned to a spot appear in the user's “Unsorted Images” bin 178. In an embodiment, thumbnails are arranged in a grid, from which the user can sort the images 171 into spots corresponding to, for example, the physical location(s) from which they were taken. If the images 171 are in left-to-right order, then the user can select the first image, click on a “Grab 8” button 180, and select seven more images in order. Alternatively, if the images 171 are not in left-to-right order, the user can select each image in order to get a left-to-right ordering. The user can add other images associated with a spot, like zooms or up/down shots.
• When all images are selected for a desired spot, the user clicks a “New Spot” button 182 and is prompted for a spot name. When the user enters a name, a new spot is created and added to the spot list 172 for the current locale. Spots can be renamed by clicking a “Rename” button 184 and typing a new name. Geo-located spots can be “dumped” back into the Unsorted Images bin 178 by clicking a “Dump to Unsorted” button 179. When a spot is dumped, its images are kept together as a group within the Unsorted Images bin, but other references to it are deleted, such as its location marker 1106 on the map.
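The sort actions above can be modeled with simple in-memory structures; the following Python sketch, with hypothetical Locale and Spot records, shows creating a new spot from selected images and dumping a spot back to the Unsorted Images bin.

```python
# Minimal sketch of the "New Spot" and "Dump to Unsorted" actions; the Locale
# and Spot structures are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Spot:
    name: str
    views: List[str] = field(default_factory=list)      # image ids in left-to-right order

@dataclass
class Locale:
    unsorted: List[str] = field(default_factory=list)   # the Unsorted Images bin
    spots: List[Spot] = field(default_factory=list)

def new_spot(locale: Locale, name: str, selected: List[str]) -> Spot:
    spot = Spot(name=name, views=list(selected))
    for image_id in selected:
        locale.unsorted.remove(image_id)                 # images leave the Unsorted Images bin
    locale.spots.append(spot)                            # spot appears in the locale's spot list
    return spot

def dump_to_unsorted(locale: Locale, spot: Spot) -> None:
    locale.unsorted.extend(spot.views)                   # images stay together as a group
    locale.spots.remove(spot)                            # other references (e.g. its map marker) are dropped
```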
• FIG. 6B shows an alternate embodiment of the Sort GUI 170. The Selected Spot area 176 includes eight views 177 associated with the eight major compass directions, and two zooms 185 associated with a view 177 of the spot. Users can associate zooms 185 with views 177 using either the Sort GUI 170 or the Locate/Link GUI 190 (described below).
  • To associate a zoom 185 with a particular point of interest on a view 177, the user clicks and drags the zoom 185 onto the desired view 177, and a zoom icon 186 is created on the view 177 at that point. After the zoom 185 has been associated with its view 177, the user can reposition the zoom 185 by clicking and dragging a zoom icon 186 representing the zoom 185 within the view 177. The interface 170 lightly constrains the icon 186 within the edges of the associated view 177, to allow easy positioning at the edges, but also allows deletion of the association via a quick dragging motion to escape the edges of the view 177. When the zoom icon 186 has been dragged outside its associated view 177, its graphical representation 186 changes to highlight the fact that if the user releases the icon 186, the association will be deleted.
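One way to implement this drag behavior is sketched below in Python with hypothetical pixel coordinates: positions are lightly clamped to the view's edges, while a drag that escapes the view by more than a small margin deletes the association.

```python
# Minimal sketch of dropping a zoom icon on a view; the Rect type and the
# ESCAPE_MARGIN threshold are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Rect:
    left: float
    top: float
    right: float
    bottom: float

ESCAPE_MARGIN = 24.0   # pixels beyond the view edge that count as escaping (assumed)

def drop_zoom_icon(x: float, y: float, view: Rect) -> Optional[Tuple[float, float]]:
    """Return the clamped icon position, or None if the association should be deleted."""
    escaped = (x < view.left - ESCAPE_MARGIN or x > view.right + ESCAPE_MARGIN or
               y < view.top - ESCAPE_MARGIN or y > view.bottom + ESCAPE_MARGIN)
    if escaped:
        return None                                     # quick drag well outside the view: delete the association
    clamped_x = min(max(x, view.left), view.right)      # light constraint to the edges
    clamped_y = min(max(y, view.top), view.bottom)
    return (clamped_x, clamped_y)
```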
• Zooms 185 can be associated with any view 177, multiple different zooms 185 can be associated with a single view 177, and one zoom 185 can be associated with different views 177; for example, when an item of interest lies in an area which visually overlaps two adjacent views 177. Zooms 185 can be merely magnified views of an area of interest, or can be taken from completely different angles to get a better view of an item of interest. The zoom 185 could “peer” over or around a fence or other obstacle, or “peek” inside a window or through a wall of a building, if a user has access to an image of the inside of the building. A view 177 of the outside of a restaurant could associate a position on the restaurant's front door with a zoom 185 of the restaurant's menu. The zoom 185 could also be an image taken at a different time, day, or season.
  • A Locate/Link GUI 190, shown in FIG. 7, allows users to specify a geographical location for each spot and create links between spots.
• For existing locales, the user selects a locale from the Locale list 192, and a map 194 including the locale data is displayed in a map pane 196. If the user is locating a new locale, the user can move the map 194 to a new location by typing the location name or other identifying information into a “Find Place” field 198 above the map pane 196, or the user can position the map 194 by panning and/or zooming in until the correct place is found.
  • A “Spots” area 1100 lists all spots associated with the current locale. Spots that have already been geo-located will have an indicator, such as a checkbox 1102.
  • To geo-locate a spot, a user displays a desired location of the spot in the map pane 196 and selects a spot to geo-locate by clicking on the spot name in the Spots area 1100—a selected spot's views will be displayed in the Selected Spot area 1104, above the map pane 196. The user then clicks on the map 194 at the desired location of the spot and an indicator 1106 appears at the desired location on the map 194. Once placed, a spot's exact location can be adjusted in fine or coarse detail by using the keyboard and/or mouse. Other embodiments can correlate spots with a set of previously defined geographic coordinates, for example from a GPS device.
  • To move a spot that has already been located, users can select the spot and use the arrow keys to move the spot marker 1106 on the map 194, or hold down the “M” key (move) on the keyboard and click on the desired location on the map 194, or input actual coordinate data.
  • Spots that have been geo-located can be linked together so users can navigate around the locale among the spots. One embodiment allows users to link two spots by selecting a first spot by clicking on its marker 1106 or its name in the Spots area 1100, and holding down the “C” key (connect) and clicking on a marker 1106 for a second spot. A line (not shown) appears between the spots, signifying that the spots are linked. Spots can be linked with more than one other spot.
• A link between spots actually consists of two separate “moves”, one in each direction; i.e., a move from the first spot to the second spot, and a move from the second spot to the first spot.
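In code, a link can be modeled as a pair of directed moves; the following Python sketch, with a hypothetical Move record, creates both moves when two spots are linked and lets a single move be deleted independently.

```python
# Minimal sketch of links as two directed moves; the Move record and the
# set-based storage are assumptions for illustration.
from dataclasses import dataclass
from typing import Set

@dataclass(frozen=True)
class Move:
    from_spot: str
    to_spot: str

def link_spots(moves: Set[Move], first: str, second: str) -> None:
    moves.add(Move(first, second))      # move from the first spot to the second
    moves.add(Move(second, first))      # and the reverse move

def delete_move(moves: Set[Move], from_spot: str, to_spot: str) -> None:
    moves.discard(Move(from_spot, to_spot))   # deleting one direction leaves a one-way link
```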
• Embodiments can allow users to delete moves and/or links, for example, by clicking on a link marker connecting two spots and allowing the user to choose which moves to delete.
• Embodiments can download various meta-data about a locale, its spots, views, tags and structure to the client, giving the client the ability to change various aspects of the GUI (like sorting criteria and/or appearance of markers on the map) without querying the server. For example, the default legend might color the locale markers according to how many spots they contain, but users can change the marker colors or appearance to distinguish the visible locales by area type (e.g., Residential, Business, Park, Rural, Beach), by user rating, by date of creation, or by any number of other criteria.
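A client holding that metadata can restyle markers locally; the following Python sketch, with hypothetical metadata fields and legend rules, recolors locale markers by spot count, area type, or rating without another server query.

```python
# Minimal sketch of client-side marker restyling from downloaded metadata;
# the field names and legend rules are assumptions for illustration.
from typing import Callable, Dict, List

LEGENDS: Dict[str, Callable[[dict], str]] = {
    "spot_count": lambda m: "red" if m["spots"] >= 50 else ("orange" if m["spots"] >= 10 else "yellow"),
    "area_type":  lambda m: {"Residential": "green", "Business": "blue", "Park": "teal"}.get(m["area"], "gray"),
    "rating":     lambda m: "gold" if m["rating"] >= 4 else "silver",
}

def restyle_markers(locales: List[dict], legend: str) -> Dict[str, str]:
    """Map each locale id to a marker color under the chosen legend, entirely client-side."""
    color_of = LEGENDS[legend]
    return {loc["id"]: color_of(loc) for loc in locales}
```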
• In a network-based client-server application, some actions directly modify data on the servers, while other actions can remain local to the user's computer until the changes are explicitly saved to the servers. There is an overhead to maintaining or synchronizing the data between client and server. When changes that are local to the user's computer occur, embodiments can keep track of the number of changes made and/or the types of changes made, and can use that information to gradually alter the appearance of one or more aspects of the on-screen GUI as the user makes local changes, thereby giving more useful information to the user than a simple binary “dirty” indicator. For example, as the user is manipulating newly uploaded images, the modification indicator can change slightly each time the user modifies data; e.g., the color of the border of the graphical representation of the data changes from neutral to light red and eventually to deep red. Or, as shown in FIGS. 8A-8E, a button such as a “Save Locale” button 1110 can change color from white to dark red.
• This aspect of the invention can be applied to general-purpose software applications, including desktop applications or any other software which manipulates data and for which those manipulations and/or the manipulated data can be saved. Various software applications and GUIs have a visual mechanism to indicate that a document is “dirty” (has unsaved changes), but a binary representation is not nearly as useful as being able to see at a glance how dirty the document is at the moment. The rate at which the appearance of the graphical representation changes with respect to the amount of unsaved data can be set by a user. Thus, for a text document, a user could set a visual change to occur after adding, modifying, or deleting every ten characters, or after every hundred characters, or after every 1,000 characters. Varied algorithms can be used to determine when and/or how often to update the indicator(s). Embodiments can use different color schemes to make more obvious the amount of editing that has taken place since the last save; for example, faint green to bright green, faint yellow to bright yellow, faint orange to bright orange, and faint red to bright red.
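The graded indicator reduces to a mapping from the number of unsaved changes to a point on a color ramp; the following Python sketch assumes a hypothetical ramp and a user-settable step size.

```python
# Minimal sketch of the graded "dirtiness" indicator; the color ramp and the
# default step size are assumptions for illustration.
RAMP = ["#ffffff", "#ffd6d6", "#ff9e9e", "#ff5f5f", "#d11a1a", "#8b0000"]   # white to deep red (assumed)

def dirty_color(unsaved_changes: int, changes_per_step: int = 10) -> str:
    """Map the count of unsaved changes to a color; saturates at the deepest shade."""
    if unsaved_changes <= 0:
        return RAMP[0]                                   # clean document
    step = min(unsaved_changes // changes_per_step + 1, len(RAMP) - 1)
    return RAMP[step]
```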
  • Other visual indicators can optionally be offered to make it even more obvious to the user how much editing has taken place since the last save. Instead of, or in addition to gradually changing colors, a strategically located icon or other graphical representation could change through a set of different appearances, each of a more urgent nature than the last.
  • Embodiments can include ads embedded within images. For example, an image of a billboard can have virtual ads placed directly in the billboard area, complete with PPC (pay per click) outbound linking. As shown in FIG. 9, a view 1120 of a restaurant can have a clickable embedded ad 1122 featuring the name of the restaurant and an offer of free coupons. Also, images of or near businesses or related places of interest can contain ads of many different kinds, including rollover hotspots with data, imagery or PPC outbound links. Embedded ads can be in the form of zooms, where clicking on a zoom icon displays the ad. The ad links could be a portion of a captured image provided by a user, or ads can be overlaid after the fact.
  • Clients who wish to keep ads (which may be a competitor's) off a particular locale can pay to either run ads of their choosing, or to run no ads whatsoever on locale(s) and/or view(s) of their choosing.
• Server-side software can “paste” small images/links/icons directly onto the views themselves dynamically at delivery time, for example, in the upper right corner of all images, of all images belonging to a particular user, of all images being viewed by a particular user, of all images for users from Alabama, or of all locales in Alabama, etc. By adding these ad images dynamically, embodiments can optionally reposition them to different parts of the views. Users of the invention have a trust level. New users generally have a trust level of zero. As they contribute good, valid data, their trust level goes up. All locale and/or image data has a trust level equal to the trust level of the user that supplied the data. As more trusted users view the data without flagging it as bad, corrupt or otherwise inappropriate, the trust level of the data increases. Users can specify that only data of a certain trust level or higher be displayed. Embodiments can allow businesses or other entities or groups to have restricted access to their locale data and/or images so that only authorized representatives may view, annotate, comment on and/or manipulate the data for one or more locales.
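The trust rules can be sketched as simple bookkeeping; the following Python example, with a hypothetical record and adjustment amounts, shows data inheriting its contributor's trust level, gaining trust from unflagged views by more trusted users, and being filtered against a viewer's minimum-trust setting.

```python
# Minimal sketch of trust-level bookkeeping; the Contribution record and the
# specific adjustment amounts are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Contribution:
    owner_trust: float
    trust: float = 0.0

    def __post_init__(self) -> None:
        self.trust = self.owner_trust                       # data starts at the contributor's trust level

def record_view(item: Contribution, viewer_trust: float, flagged: bool) -> None:
    if flagged:
        item.trust = max(item.trust - viewer_trust, 0.0)    # penalty scaled by the flagger's trust (assumed)
    elif viewer_trust > item.trust:
        item.trust += 0.1                                   # unflagged view by a more trusted user raises trust

def visible(item: Contribution, min_trust: float) -> bool:
    return item.trust >= min_trust                          # honor the viewer's minimum-trust preference
```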
  • Some entities may be willing to pay to have high quality locales created for their locations. Aspects of the invention can include allowing, in a controlled way, locale creators and locations to advertise their availabilities and needs, for example, in view, spot, or locale comment areas, or within views in their own locales.
• Entities can also pay or bid for services using points awarded to the entity through a point system. Users are awarded points for various activities, such as flagging inappropriate content or capturing one or more digital images of a particular location. As users accumulate points, the users can offer the points to other users as incentives to perform some activity, such as capturing a digital image of a particular location or creating a locale. Points can also be used to bid for a section of pixels on the system's pages, which are displayed to the general public. Users of the site can promote a favorite charity, a blog, a business, or anything else to which a user wants to draw attention.
• In an alternate embodiment, feature comments are included. Feature comments are comments related to a particular feature in a locale which can be visible in many different views. Without feature comments, it can sometimes be difficult to find all the comments related to a particular item of interest within a locale. Essentially, feature comments form a sub-grouping of comments that can be associated with a feature separately from a particular spot or view.
  • In yet another embodiment the view can also be partially rotated, before or while moving forward/backward. This allows the new view to be aligned closer to the center of the currently displayed view, which helps the user maintain their visual frame of reference as they move to another spot—especially with any zooming animation.
  • While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.

Claims (4)

1. A method comprising:
receiving a first digital image, and at least one user-generated datum, from at least one user;
organizing the first image geographically according to the at least one datum, including associating the first image with at least one location and at least one direction; and
providing, from a first-person perspective, at least one user-generated digital image in response to a user request.
2. The method of claim 1, further comprising:
receiving additional images and data;
organizing the additional images and the first image spatially according to the data; and
providing the additional images and data in response to a user request.
3. The method of claim 2, wherein organizing the additional images includes linking the images such that a user can access the images from a first-person perspective.
4. A system comprising:
a computer-readable memory including at least one user-generated digital image and at least one user-generated datum;
a processor in data communication with the memory and a network; the processor comprising:
a first component configured to receive a first digital image, and at least one user-generated datum, from at least one user;
a second component configured to organize the first image geographically according to the at least one datum, including associating the first image with at least one location and at least one direction; and
a third component configured to provide, from a first-person perspective, at least one user-generated digital image in response to a user request.
US11/844,203 2006-08-24 2007-08-23 Systems and methods for photograph mapping Abandoned US20080077597A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US11/844,203 US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping
US12/438,360 US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
PCT/US2007/076718 WO2008024949A2 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13/196,044 US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14/624,095 US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15/841,190 US10776442B2 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping
US17/021,326 US20210073305A1 (en) 2006-08-24 2020-09-15 Systems and methods for photograph mapping

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US84013406P 2006-08-24 2006-08-24
US11/844,203 US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/438,360 Continuation US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
PCT/US2007/076718 Continuation WO2008024949A2 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping

Publications (1)

Publication Number Publication Date
US20080077597A1 true US20080077597A1 (en) 2008-03-27

Family

ID=39107708

Family Applications (6)

Application Number Title Priority Date Filing Date
US11/844,203 Abandoned US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping
US12/438,360 Abandoned US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13/196,044 Expired - Fee Related US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14/624,095 Expired - Fee Related US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15/841,190 Active US10776442B2 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping
US17/021,326 Abandoned US20210073305A1 (en) 2006-08-24 2020-09-15 Systems and methods for photograph mapping

Family Applications After (5)

Application Number Title Priority Date Filing Date
US12/438,360 Abandoned US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13/196,044 Expired - Fee Related US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14/624,095 Expired - Fee Related US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15/841,190 Active US10776442B2 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping
US17/021,326 Abandoned US20210073305A1 (en) 2006-08-24 2020-09-15 Systems and methods for photograph mapping

Country Status (2)

Country Link
US (6) US20080077597A1 (en)
WO (1) WO2008024949A2 (en)

Cited By (128)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090240653A1 (en) * 2008-03-21 2009-09-24 Kistler Peter Cornelius Method for extracting attribute data from a media file
WO2010024873A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20100070897A1 (en) * 2008-09-15 2010-03-18 Andrew Aymeloglu Modal-less interface enhancements
US20100214302A1 (en) * 2009-02-24 2010-08-26 Ryan Melcher System and method for supplementing an image gallery with status indicators
US20110007094A1 (en) * 2008-08-28 2011-01-13 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20110055749A1 (en) * 2009-08-26 2011-03-03 Apple Inc. Tracking Device Movement and Captured Images
WO2011163351A2 (en) * 2010-06-22 2011-12-29 Ohio University Immersive video intelligence network
US8487957B1 (en) * 2007-05-29 2013-07-16 Google Inc. Displaying and navigating within photo placemarks in a geographic information system, and applications thereof
US8584013B1 (en) * 2007-03-20 2013-11-12 Google Inc. Temporal layers for presenting personalization markers on imagery
US20140047381A1 (en) * 2012-08-10 2014-02-13 Microsoft Corporation 3d data environment navigation tool
US20140068445A1 (en) * 2012-09-06 2014-03-06 Sap Ag Systems and Methods for Mobile Access to Enterprise Work Area Information
US8713467B1 (en) 2013-08-09 2014-04-29 Palantir Technologies, Inc. Context-sensitive views
US8782564B2 (en) 2008-03-21 2014-07-15 Trimble Navigation Limited Method for collaborative display of geographic data
US8799799B1 (en) 2013-05-07 2014-08-05 Palantir Technologies Inc. Interactive geospatial map
US8812960B1 (en) 2013-10-07 2014-08-19 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US8832594B1 (en) 2013-11-04 2014-09-09 Palantir Technologies Inc. Space-optimized display of multi-column tables with selective text truncation based on a combined text width
US8855999B1 (en) 2013-03-15 2014-10-07 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US20140304312A1 (en) * 2013-04-05 2014-10-09 Dropbox, Inc. Ordering content items
US8868486B2 (en) 2013-03-15 2014-10-21 Palantir Technologies Inc. Time-sensitive cube
US8917274B2 (en) 2013-03-15 2014-12-23 Palantir Technologies Inc. Event matrix based on integrated data
US8924872B1 (en) 2013-10-18 2014-12-30 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US8930897B2 (en) 2013-03-15 2015-01-06 Palantir Technologies Inc. Data integration tool
US8938686B1 (en) 2013-10-03 2015-01-20 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US8937619B2 (en) 2013-03-15 2015-01-20 Palantir Technologies Inc. Generating an object time series from data objects
US9009827B1 (en) 2014-02-20 2015-04-14 Palantir Technologies Inc. Security sharing system
US9009171B1 (en) 2014-05-02 2015-04-14 Palantir Technologies Inc. Systems and methods for active column filtering
US9021260B1 (en) 2014-07-03 2015-04-28 Palantir Technologies Inc. Malware data item analysis
US9021384B1 (en) 2013-11-04 2015-04-28 Palantir Technologies Inc. Interactive vehicle information map
US20150130838A1 (en) * 2013-11-13 2015-05-14 Sony Corporation Display control device, display control method, and program
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
US9202249B1 (en) 2014-07-03 2015-12-01 Palantir Technologies Inc. Data item clustering and analysis
US9223773B2 (en) 2013-08-08 2015-12-29 Palantir Technologies Inc. Template system for custom document generation
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US9552615B2 (en) 2013-12-20 2017-01-24 Palantir Technologies Inc. Automated database analysis to detect malfeasance
US9571971B2 (en) 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
USD780210S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780211S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780797S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US9740369B2 (en) 2013-03-15 2017-08-22 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9785328B2 (en) 2014-10-06 2017-10-10 Palantir Technologies Inc. Presentation of multivariate data on a graphical user interface of a computing system
US9785307B1 (en) * 2012-09-27 2017-10-10 Open Text Corporation Reorder and selection persistence of displayed objects
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
US9836580B2 (en) 2014-03-21 2017-12-05 Palantir Technologies Inc. Provider portal
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9886467B2 (en) 2015-03-19 2018-02-06 Palantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9898167B2 (en) 2013-03-15 2018-02-20 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US9934222B2 (en) 2014-04-22 2018-04-03 Google Llc Providing a thumbnail image that follows a main image
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
US20180285550A1 (en) * 2017-04-03 2018-10-04 Cleveland State University Shoulder-surfing resistant authentication methods and systems
US10102369B2 (en) 2015-08-19 2018-10-16 Palantir Technologies Inc. Checkout system executable code monitoring, and user account compromise determination system
US10120857B2 (en) 2013-03-15 2018-11-06 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US10180929B1 (en) 2014-06-30 2019-01-15 Palantir Technologies, Inc. Systems and methods for identifying key phrase clusters within documents
US10180977B2 (en) 2014-03-18 2019-01-15 Palantir Technologies Inc. Determining and extracting changed data from a data source
US10198515B1 (en) 2013-12-10 2019-02-05 Palantir Technologies Inc. System and method for aggregating data from a plurality of data sources
US10216801B2 (en) 2013-03-15 2019-02-26 Palantir Technologies Inc. Generating data clusters
US10229284B2 (en) 2007-02-21 2019-03-12 Palantir Technologies Inc. Providing unique views of data based on changes or rules
US10230746B2 (en) 2014-01-03 2019-03-12 Palantir Technologies Inc. System and method for evaluating network threats and usage
US10275778B1 (en) 2013-03-15 2019-04-30 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation based on automatic malfeasance clustering of related data in various data structures
US10296617B1 (en) 2015-10-05 2019-05-21 Palantir Technologies Inc. Searches of highly structured data
US10318630B1 (en) 2016-11-21 2019-06-11 Palantir Technologies Inc. Analysis of large bodies of textual data
US10324609B2 (en) 2016-07-21 2019-06-18 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10356032B2 (en) 2013-12-26 2019-07-16 Palantir Technologies Inc. System and method for detecting confidential information emails
US10362133B1 (en) 2014-12-22 2019-07-23 Palantir Technologies Inc. Communication data processing architecture
US10365804B1 (en) * 2014-02-20 2019-07-30 Google Llc Manipulation of maps as documents
US10372879B2 (en) 2014-12-31 2019-08-06 Palantir Technologies Inc. Medical claims lead summary report generation
US10387834B2 (en) 2015-01-21 2019-08-20 Palantir Technologies Inc. Systems and methods for accessing and storing snapshots of a remote application in a document
US10403011B1 (en) 2017-07-18 2019-09-03 Palantir Technologies Inc. Passing system with an interactive user interface
US10423582B2 (en) 2011-06-23 2019-09-24 Palantir Technologies, Inc. System and method for investigating large amounts of data
US10437840B1 (en) 2016-08-19 2019-10-08 Palantir Technologies Inc. Focused probabilistic entity resolution from multiple data sources
US10437612B1 (en) 2015-12-30 2019-10-08 Palantir Technologies Inc. Composite graphical interface with shareable data-objects
US10444940B2 (en) 2015-08-17 2019-10-15 Palantir Technologies Inc. Interactive geospatial map
US10452678B2 (en) 2013-03-15 2019-10-22 Palantir Technologies Inc. Filter chains for exploring large data sets
US10460602B1 (en) 2016-12-28 2019-10-29 Palantir Technologies Inc. Interactive vehicle information mapping system
US10484407B2 (en) 2015-08-06 2019-11-19 Palantir Technologies Inc. Systems, methods, user interfaces, and computer-readable media for investigating potential malicious communications
US10489391B1 (en) 2015-08-17 2019-11-26 Palantir Technologies Inc. Systems and methods for grouping and enriching data items accessed from one or more databases for presentation in a user interface
US10552994B2 (en) 2014-12-22 2020-02-04 Palantir Technologies Inc. Systems and interactive user interfaces for dynamic retrieval, analysis, and triage of data items
US10572496B1 (en) 2014-07-03 2020-02-25 Palantir Technologies Inc. Distributed workflow system and database with access controls for city resiliency
US10572487B1 (en) 2015-10-30 2020-02-25 Palantir Technologies Inc. Periodic database search manager for multiple data sources
US20200064981A1 (en) * 2018-08-22 2020-02-27 International Business Machines Corporation Configuring an application for launching
US10628834B1 (en) 2015-06-16 2020-04-21 Palantir Technologies Inc. Fraud lead detection system for efficiently processing database-stored data and automatically generating natural language explanatory information of system results for display in interactive user interfaces
US10636097B2 (en) 2015-07-21 2020-04-28 Palantir Technologies Inc. Systems and models for data analytics
US10678860B1 (en) 2015-12-17 2020-06-09 Palantir Technologies, Inc. Automatic generation of composite datasets based on hierarchical fields
US10698938B2 (en) 2016-03-18 2020-06-30 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US10706434B1 (en) 2015-09-01 2020-07-07 Palantir Technologies Inc. Methods and systems for determining location information
US10719188B2 (en) 2016-07-21 2020-07-21 Palantir Technologies Inc. Cached database and synchronization system for providing dynamic linked panels in user interface
US10754822B1 (en) 2018-04-18 2020-08-25 Palantir Technologies Inc. Systems and methods for ontology migration
US10795723B2 (en) 2014-03-04 2020-10-06 Palantir Technologies Inc. Mobile tasks
US10817513B2 (en) 2013-03-14 2020-10-27 Palantir Technologies Inc. Fair scheduling for mixed-query loads
US10839144B2 (en) 2015-12-29 2020-11-17 Palantir Technologies Inc. Real-time document annotation
US10853378B1 (en) 2015-08-25 2020-12-01 Palantir Technologies Inc. Electronic note management via a connected entity graph
US10885021B1 (en) 2018-05-02 2021-01-05 Palantir Technologies Inc. Interactive interpreter and graphical user interface
US10956406B2 (en) 2017-06-12 2021-03-23 Palantir Technologies Inc. Propagated deletion of database records and derived data
US11119630B1 (en) 2018-06-19 2021-09-14 Palantir Technologies Inc. Artificial intelligence assisted evaluations and user interface for same
US11138180B2 (en) 2011-09-02 2021-10-05 Palantir Technologies Inc. Transaction protocol for reading database values
US11150917B2 (en) 2015-08-26 2021-10-19 Palantir Technologies Inc. System for data aggregation and analysis of data from a plurality of data sources
US11302426B1 (en) 2015-01-02 2022-04-12 Palantir Technologies Inc. Unified data interface and system
US11481433B2 (en) 2011-06-09 2022-10-25 MemoryWeb, LLC Method and apparatus for managing digital files
US11599369B1 (en) 2018-03-08 2023-03-07 Palantir Technologies Inc. Graphical user interface configuration system

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050134945A1 (en) * 2003-12-17 2005-06-23 Canon Information Systems Research Australia Pty. Ltd. 3D view for digital photograph management
MX2010001101A (en) 2007-07-27 2010-06-25 Intertrust Tech Corp Content publishing systems and methods.
US8825387B2 (en) * 2008-07-25 2014-09-02 Navteq B.V. Positioning open area maps
US8099237B2 (en) 2008-07-25 2012-01-17 Navteq North America, Llc Open area maps
US20100021013A1 (en) * 2008-07-25 2010-01-28 Gale William N Open area maps with guidance
CN102483751A (en) * 2009-10-15 2012-05-30 博世汽车部件(苏州)有限公司 Navigation system and method with improved destination searching
US8952983B2 (en) 2010-11-04 2015-02-10 Nokia Corporation Method and apparatus for annotating point of interest information
US9639857B2 (en) * 2011-09-30 2017-05-02 Nokia Technologies Oy Method and apparatus for associating commenting information with one or more objects
US20150153933A1 (en) * 2012-03-16 2015-06-04 Google Inc. Navigating Discrete Photos and Panoramas
US11606992B2 (en) 2012-04-18 2023-03-21 Nike, Inc. Vented garment
US9313344B2 (en) 2012-06-01 2016-04-12 Blackberry Limited Methods and apparatus for use in mapping identified visual features of visual images to location areas
US9292264B2 (en) 2013-03-15 2016-03-22 Paschar Llc Mobile device user interface advertising software development kit
US9827714B1 (en) 2014-05-16 2017-11-28 Google Llc Method and system for 3-D printing of 3-D object models in interactive content items
US11406148B2 (en) * 2015-10-07 2022-08-09 Nike, Inc. Vented garment
US11019865B2 (en) 2016-10-06 2021-06-01 Nike, Inc. Insulated garment
US10743596B2 (en) 2016-10-06 2020-08-18 Nike, Inc. Insulated vented garment formed using non-woven polymer sheets


Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6956573B1 (en) * 1996-11-15 2005-10-18 Sarnoff Corporation Method and apparatus for efficiently representing storing and accessing video information
US6266684B1 (en) * 1997-08-06 2001-07-24 Adobe Systems Incorporated Creating and saving multi-frame web pages
US6552744B2 (en) * 1997-09-26 2003-04-22 Roxio, Inc. Virtual reality camera
US6106460A (en) * 1998-03-26 2000-08-22 Scimed Life Systems, Inc. Interface for controlling the display of images of diagnostic or therapeutic instruments in interior body regions and related data
US6246412B1 (en) * 1998-06-18 2001-06-12 Microsoft Corporation Interactive construction and refinement of 3D models from multiple panoramic images
JP3646582B2 (en) * 1998-09-28 2005-05-11 富士通株式会社 Electronic information display method, electronic information browsing apparatus, and electronic information browsing program storage medium
US6895126B2 (en) * 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
US6904160B2 (en) * 2000-10-18 2005-06-07 Red Hen Systems, Inc. Method for matching geographic information with recorded images
US7013289B2 (en) * 2001-02-21 2006-03-14 Michel Horn Global electronic commerce system
US7909696B2 (en) * 2001-08-09 2011-03-22 Igt Game interaction in 3-D gaming environments
EP1304626A1 (en) * 2001-10-18 2003-04-23 Sun Microsystems, Inc. Managing modified documents
US7187377B1 (en) * 2002-06-28 2007-03-06 Microsoft Corporation Three-dimensional virtual tour method and system
US7317449B2 (en) * 2004-03-02 2008-01-08 Microsoft Corporation Key-based advanced navigation techniques
US7327349B2 (en) * 2004-03-02 2008-02-05 Microsoft Corporation Advanced navigation techniques for portable devices
US7548936B2 (en) * 2005-01-12 2009-06-16 Microsoft Corporation Systems and methods to present web image search results for effective image browsing
US20060230051A1 (en) * 2005-04-08 2006-10-12 Muds Springs Geographers Inc. Method to share and exchange geographic based information
EP1866809A1 (en) * 2005-04-08 2007-12-19 Juicy Tours Inc. Architecture for creating, organizing, editing, management and delivery of locationally-specific information to a user in the field
US20090004410A1 (en) * 2005-05-12 2009-01-01 Thomson Stephen C Spatial graphical user interface and method for using the same
EP1902581B1 (en) * 2005-07-13 2015-06-24 Grape Technology Group, Inc. System and method for providing mobile device services using sms communications
AU2005203074A1 (en) * 2005-07-14 2007-02-01 Canon Information Systems Research Australia Pty Ltd Image browser
US8510669B2 (en) * 2006-02-06 2013-08-13 Yahoo! Inc. Method and system for presenting photos on a website
KR100641791B1 (en) * 2006-02-14 2006-11-02 (주)올라웍스 Tagging Method and System for Digital Data
US7797019B2 (en) * 2006-03-29 2010-09-14 Research In Motion Limited Shared image database with geographic navigation

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020029226A1 (en) * 2000-09-05 2002-03-07 Gang Li Method for combining data with maps
US20050073443A1 (en) * 2003-02-14 2005-04-07 Networks In Motion, Inc. Method and system for saving and retrieving spatial related information
US20050278371A1 (en) * 2004-06-15 2005-12-15 Karsten Funk Method and system for georeferential blogging, bookmarking a location, and advanced off-board data processing for mobile systems
US20060195475A1 (en) * 2005-02-28 2006-08-31 Microsoft Corporation Automatic digital image grouping using criteria based on image metadata and spatial information
US20070070233A1 (en) * 2005-09-28 2007-03-29 Patterson Raul D System and method for correlating captured images with their site locations on maps

Cited By (281)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10719621B2 (en) 2007-02-21 2020-07-21 Palantir Technologies Inc. Providing unique views of data based on changes or rules
US10229284B2 (en) 2007-02-21 2019-03-12 Palantir Technologies Inc. Providing unique views of data based on changes or rules
US11636138B1 (en) 2007-03-20 2023-04-25 Google Llc Temporal layers for presenting personalization markers on imagery
US10585920B2 (en) 2007-03-20 2020-03-10 Google Llc Temporal layers for presenting personalization markers on imagery
US8584013B1 (en) * 2007-03-20 2013-11-12 Google Inc. Temporal layers for presenting personalization markers on imagery
US8487957B1 (en) * 2007-05-29 2013-07-16 Google Inc. Displaying and navigating within photo placemarks in a geographic information system, and applications thereof
US9280258B1 (en) 2007-05-29 2016-03-08 Google Inc. Displaying and navigating within photo placemarks in a geographic information system and applications thereof
US20090240653A1 (en) * 2008-03-21 2009-09-24 Kistler Peter Cornelius Method for extracting attribute data from a media file
US8898179B2 (en) * 2008-03-21 2014-11-25 Trimble Navigation Limited Method for extracting attribute data from a media file
US8782564B2 (en) 2008-03-21 2014-07-15 Trimble Navigation Limited Method for collaborative display of geographic data
US9099057B2 (en) 2008-08-28 2015-08-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8737683B2 (en) 2008-08-28 2014-05-27 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8295550B2 (en) 2008-08-28 2012-10-23 Google Inc. Architectures and methods for creating and representing time-dependent imagery
WO2010024873A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20100054527A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architecture and methods for creating and representing time-dependent imagery
US20110007094A1 (en) * 2008-08-28 2011-01-13 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US9542723B2 (en) 2008-08-28 2017-01-10 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8872847B2 (en) * 2008-08-28 2014-10-28 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8077918B2 (en) 2008-08-28 2011-12-13 Google, Inc. Architectures and methods for creating and representing time-dependent imagery
US9916070B1 (en) 2008-08-28 2018-03-13 Google Llc Architectures and methods for creating and representing time-dependent imagery
US8520977B2 (en) 2008-08-28 2013-08-27 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US10248294B2 (en) 2008-09-15 2019-04-02 Palantir Technologies, Inc. Modal-less interface enhancements
US10747952B2 (en) 2008-09-15 2020-08-18 Palantir Technologies, Inc. Automatic creation and server push of multiple distinct drafts
US20100070897A1 (en) * 2008-09-15 2010-03-18 Andrew Aymeloglu Modal-less interface enhancements
US9383911B2 (en) * 2008-09-15 2016-07-05 Palantir Technologies, Inc. Modal-less interface enhancements
US10607272B2 (en) 2009-02-24 2020-03-31 Ebay Inc. Supplementing an image gallery with status indicators
US11361360B2 (en) 2009-02-24 2022-06-14 Ebay Inc. Supplementing an image gallery with status indicators
US9406042B2 (en) * 2009-02-24 2016-08-02 Ebay Inc. System and method for supplementing an image gallery with status indicators
US10169801B2 (en) 2009-02-24 2019-01-01 Ebay Inc. System and method for supplementing an image gallery with status indicators
US11651409B2 (en) 2009-02-24 2023-05-16 Ebay Inc. Supplementing an image gallery with status indicators
US20100214302A1 (en) * 2009-02-24 2010-08-26 Ryan Melcher System and method for supplementing an image gallery with status indicators
US20110055749A1 (en) * 2009-08-26 2011-03-03 Apple Inc. Tracking Device Movement and Captured Images
US8839131B2 (en) * 2009-08-26 2014-09-16 Apple Inc. Tracking device movement and captured images
WO2011163351A2 (en) * 2010-06-22 2011-12-29 Ohio University Immersive video intelligence network
WO2011163351A3 (en) * 2010-06-22 2014-04-10 Ohio University Immersive video intelligence network
US11899726B2 (en) 2011-06-09 2024-02-13 MemoryWeb, LLC Method and apparatus for managing digital files
US11636149B1 (en) 2011-06-09 2023-04-25 MemoryWeb, LLC Method and apparatus for managing digital files
US11768882B2 (en) 2011-06-09 2023-09-26 MemoryWeb, LLC Method and apparatus for managing digital files
US11599573B1 (en) 2011-06-09 2023-03-07 MemoryWeb, LLC Method and apparatus for managing digital files
US11481433B2 (en) 2011-06-09 2022-10-25 MemoryWeb, LLC Method and apparatus for managing digital files
US11636150B2 (en) 2011-06-09 2023-04-25 MemoryWeb, LLC Method and apparatus for managing digital files
US10423582B2 (en) 2011-06-23 2019-09-24 Palantir Technologies, Inc. System and method for investigating large amounts of data
US11392550B2 (en) 2011-06-23 2022-07-19 Palantir Technologies Inc. System and method for investigating large amounts of data
US10706220B2 (en) 2011-08-25 2020-07-07 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US11138180B2 (en) 2011-09-02 2021-10-05 Palantir Technologies Inc. Transaction protocol for reading database values
US10008015B2 (en) 2012-08-10 2018-06-26 Microsoft Technology Licensing, Llc Generating scenes and tours in a spreadsheet application
US9996953B2 (en) 2012-08-10 2018-06-12 Microsoft Technology Licensing, Llc Three-dimensional annotation facing
US20140047381A1 (en) * 2012-08-10 2014-02-13 Microsoft Corporation 3d data environment navigation tool
US9881396B2 (en) 2012-08-10 2018-01-30 Microsoft Technology Licensing, Llc Displaying temporal information in a spreadsheet application
US9317963B2 (en) 2012-08-10 2016-04-19 Microsoft Technology Licensing, Llc Generating scenes and tours in a spreadsheet application
US20140068445A1 (en) * 2012-09-06 2014-03-06 Sap Ag Systems and Methods for Mobile Access to Enterprise Work Area Information
US10474327B2 (en) 2012-09-27 2019-11-12 Open Text Corporation Reorder and selection persistence of displayed objects
US9785307B1 (en) * 2012-09-27 2017-10-10 Open Text Corporation Reorder and selection persistence of displayed objects
US10866701B2 (en) 2012-09-27 2020-12-15 Open Text Corporation Reorder and selection persistence of displayed objects
US11182204B2 (en) 2012-10-22 2021-11-23 Palantir Technologies Inc. System and method for batch evaluation programs
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9380431B1 (en) 2013-01-31 2016-06-28 Palantir Technologies, Inc. Use of teams in a mobile application
US10743133B2 (en) 2013-01-31 2020-08-11 Palantir Technologies Inc. Populating property values of event objects of an object-centric data model using image metadata
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US10313833B2 (en) 2013-01-31 2019-06-04 Palantir Technologies Inc. Populating property values of event objects of an object-centric data model using image metadata
US10997363B2 (en) 2013-03-14 2021-05-04 Palantir Technologies Inc. Method of generating objects and links from mobile reports
US10817513B2 (en) 2013-03-14 2020-10-27 Palantir Technologies Inc. Fair scheduling for mixed-query loads
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
US8868486B2 (en) 2013-03-15 2014-10-21 Palantir Technologies Inc. Time-sensitive cube
US10452678B2 (en) 2013-03-15 2019-10-22 Palantir Technologies Inc. Filter chains for exploring large data sets
US10264014B2 (en) 2013-03-15 2019-04-16 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation based on automatic clustering of related data in various data structures
US9852205B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. Time-sensitive cube
US10977279B2 (en) 2013-03-15 2021-04-13 Palantir Technologies Inc. Time-sensitive cube
US8930897B2 (en) 2013-03-15 2015-01-06 Palantir Technologies Inc. Data integration tool
US10482097B2 (en) 2013-03-15 2019-11-19 Palantir Technologies Inc. System and method for generating event visualizations
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9779525B2 (en) 2013-03-15 2017-10-03 Palantir Technologies Inc. Generating object time series from data objects
US8937619B2 (en) 2013-03-15 2015-01-20 Palantir Technologies Inc. Generating an object time series from data objects
US9740369B2 (en) 2013-03-15 2017-08-22 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US10453229B2 (en) 2013-03-15 2019-10-22 Palantir Technologies Inc. Generating object time series from data objects
US8855999B1 (en) 2013-03-15 2014-10-07 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US10216801B2 (en) 2013-03-15 2019-02-26 Palantir Technologies Inc. Generating data clusters
US10809888B2 (en) 2013-03-15 2020-10-20 Palantir Technologies, Inc. Systems and methods for providing a tagging interface for external content
US10120857B2 (en) 2013-03-15 2018-11-06 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US9898167B2 (en) 2013-03-15 2018-02-20 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US10275778B1 (en) 2013-03-15 2019-04-30 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation based on automatic malfeasance clustering of related data in various data structures
US9646396B2 (en) 2013-03-15 2017-05-09 Palantir Technologies Inc. Generating object time series and data objects
US8917274B2 (en) 2013-03-15 2014-12-23 Palantir Technologies Inc. Event matrix based on integrated data
US9852195B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. System and method for generating event visualizations
US20140304312A1 (en) * 2013-04-05 2014-10-09 Dropbox, Inc. Ordering content items
US9152646B2 (en) * 2013-04-05 2015-10-06 Dropbox, Inc. Ordering content items
US8799799B1 (en) 2013-05-07 2014-08-05 Palantir Technologies Inc. Interactive geospatial map
US10360705B2 (en) 2013-05-07 2019-07-23 Palantir Technologies Inc. Interactive data object map
US9953445B2 (en) 2013-05-07 2018-04-24 Palantir Technologies Inc. Interactive data object map
US10976892B2 (en) 2013-08-08 2021-04-13 Palantir Technologies Inc. Long click display of a context menu
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US10699071B2 (en) 2013-08-08 2020-06-30 Palantir Technologies Inc. Systems and methods for template based custom document generation
US9223773B2 (en) 2013-08-08 2015-12-29 Palantir Technologies Inc. Template system for custom document generation
US8713467B1 (en) 2013-08-09 2014-04-29 Palantir Technologies, Inc. Context-sensitive views
US9921734B2 (en) 2013-08-09 2018-03-20 Palantir Technologies Inc. Context-sensitive views
US10545655B2 (en) 2013-08-09 2020-01-28 Palantir Technologies Inc. Context-sensitive views
US9557882B2 (en) 2013-08-09 2017-01-31 Palantir Technologies Inc. Context-sensitive views
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US10732803B2 (en) 2013-09-24 2020-08-04 Palantir Technologies Inc. Presentation and analysis of user interaction data
US8938686B1 (en) 2013-10-03 2015-01-20 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US9996229B2 (en) 2013-10-03 2018-06-12 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US9864493B2 (en) 2013-10-07 2018-01-09 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US10635276B2 (en) 2013-10-07 2020-04-28 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US8812960B1 (en) 2013-10-07 2014-08-19 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US10877638B2 (en) 2013-10-18 2020-12-29 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US9514200B2 (en) 2013-10-18 2016-12-06 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US10719527B2 (en) 2013-10-18 2020-07-21 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US8924872B1 (en) 2013-10-18 2014-12-30 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US10042524B2 (en) 2013-10-18 2018-08-07 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US8832594B1 (en) 2013-11-04 2014-09-09 Palantir Technologies Inc. Space-optimized display of multi-column tables with selective text truncation based on a combined text width
US9021384B1 (en) 2013-11-04 2015-04-28 Palantir Technologies Inc. Interactive vehicle information map
US10262047B1 (en) 2013-11-04 2019-04-16 Palantir Technologies Inc. Interactive vehicle information map
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US11100174B2 (en) 2013-11-11 2021-08-24 Palantir Technologies Inc. Simple web search
US20150130838A1 (en) * 2013-11-13 2015-05-14 Sony Corporation Display control device, display control method, and program
US10832448B2 (en) 2013-11-13 2020-11-10 Sony Corporation Display control device, display control method, and program
US10115210B2 (en) * 2013-11-13 2018-10-30 Sony Corporation Display control device, display control method, and program
US10198515B1 (en) 2013-12-10 2019-02-05 Palantir Technologies Inc. System and method for aggregating data from a plurality of data sources
US11138279B1 (en) 2013-12-10 2021-10-05 Palantir Technologies Inc. System and method for aggregating data from a plurality of data sources
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US10025834B2 (en) 2013-12-16 2018-07-17 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US9734217B2 (en) 2013-12-16 2017-08-15 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US9552615B2 (en) 2013-12-20 2017-01-24 Palantir Technologies Inc. Automated database analysis to detect malfeasance
US10356032B2 (en) 2013-12-26 2019-07-16 Palantir Technologies Inc. System and method for detecting confidential information emails
US10805321B2 (en) 2014-01-03 2020-10-13 Palantir Technologies Inc. System and method for evaluating network threats and usage
US10230746B2 (en) 2014-01-03 2019-03-12 Palantir Technologies Inc. System and method for evaluating network threats and usage
US10901583B2 (en) 2014-01-03 2021-01-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US10120545B2 (en) 2014-01-03 2018-11-06 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9923925B2 (en) 2014-02-20 2018-03-20 Palantir Technologies Inc. Cyber security sharing and identification system
US10365804B1 (en) * 2014-02-20 2019-07-30 Google Llc Manipulation of maps as documents
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US10402054B2 (en) 2014-02-20 2019-09-03 Palantir Technologies Inc. Relationship visualizations
US9009827B1 (en) 2014-02-20 2015-04-14 Palantir Technologies Inc. Security sharing system
US10873603B2 (en) 2014-02-20 2020-12-22 Palantir Technologies Inc. Cyber security sharing and identification system
US10795723B2 (en) 2014-03-04 2020-10-06 Palantir Technologies Inc. Mobile tasks
US10180977B2 (en) 2014-03-18 2019-01-15 Palantir Technologies Inc. Determining and extracting changed data from a data source
US10853454B2 (en) 2014-03-21 2020-12-01 Palantir Technologies Inc. Provider portal
US9836580B2 (en) 2014-03-21 2017-12-05 Palantir Technologies Inc. Provider portal
USD780211S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD868092S1 (en) 2014-04-22 2019-11-26 Google Llc Display screen with graphical user interface or portion thereof
USD835147S1 (en) 2014-04-22 2018-12-04 Google Llc Display screen with graphical user interface or portion thereof
USD780210S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780794S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780795S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780796S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780777S1 (en) * 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780797S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD830407S1 (en) 2014-04-22 2018-10-09 Google Llc Display screen with graphical user interface or portion thereof
USD830399S1 (en) 2014-04-22 2018-10-09 Google Llc Display screen with graphical user interface or portion thereof
USD781337S1 (en) 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
US11860923B2 (en) 2014-04-22 2024-01-02 Google Llc Providing a thumbnail image that follows a main image
USD829737S1 (en) 2014-04-22 2018-10-02 Google Llc Display screen with graphical user interface or portion thereof
USD781317S1 (en) * 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
USD781318S1 (en) * 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
USD791813S1 (en) 2014-04-22 2017-07-11 Google Inc. Display screen with graphical user interface or portion thereof
US9972121B2 (en) * 2014-04-22 2018-05-15 Google Llc Selecting time-distributed panoramic images for display
USD1008302S1 (en) 2014-04-22 2023-12-19 Google Llc Display screen with graphical user interface or portion thereof
USD1006046S1 (en) 2014-04-22 2023-11-28 Google Llc Display screen with graphical user interface or portion thereof
USD791811S1 (en) 2014-04-22 2017-07-11 Google Inc. Display screen with graphical user interface or portion thereof
USD994696S1 (en) 2014-04-22 2023-08-08 Google Llc Display screen with graphical user interface or portion thereof
USD792460S1 (en) 2014-04-22 2017-07-18 Google Inc. Display screen with graphical user interface or portion thereof
USD933691S1 (en) 2014-04-22 2021-10-19 Google Llc Display screen with graphical user interface or portion thereof
USD934281S1 (en) 2014-04-22 2021-10-26 Google Llc Display screen with graphical user interface or portion thereof
US11163813B2 (en) 2014-04-22 2021-11-02 Google Llc Providing a thumbnail image that follows a main image
US9934222B2 (en) 2014-04-22 2018-04-03 Google Llc Providing a thumbnail image that follows a main image
USD877765S1 (en) 2014-04-22 2020-03-10 Google Llc Display screen with graphical user interface or portion thereof
US10540804B2 (en) 2014-04-22 2020-01-21 Google Llc Selecting time-distributed panoramic images for display
USD868093S1 (en) 2014-04-22 2019-11-26 Google Llc Display screen with graphical user interface or portion thereof
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US10871887B2 (en) 2014-04-28 2020-12-22 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9009171B1 (en) 2014-05-02 2015-04-14 Palantir Technologies Inc. Systems and methods for active column filtering
US9449035B2 (en) 2014-05-02 2016-09-20 Palantir Technologies Inc. Systems and methods for active column filtering
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US11341178B2 (en) 2014-06-30 2022-05-24 Palantir Technologies Inc. Systems and methods for key phrase characterization of documents
US10162887B2 (en) 2014-06-30 2018-12-25 Palantir Technologies Inc. Systems and methods for key phrase characterization of documents
US10180929B1 (en) 2014-06-30 2019-01-15 Palantir Technologies, Inc. Systems and methods for identifying key phrase clusters within documents
US10929436B2 (en) 2014-07-03 2021-02-23 Palantir Technologies Inc. System and method for news events detection and visualization
US9021260B1 (en) 2014-07-03 2015-04-28 Palantir Technologies Inc. Malware data item analysis
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US10798116B2 (en) 2014-07-03 2020-10-06 Palantir Technologies Inc. External malware data item clustering and analysis
US9998485B2 (en) 2014-07-03 2018-06-12 Palantir Technologies, Inc. Network intrusion data item clustering and analysis
US9202249B1 (en) 2014-07-03 2015-12-01 Palantir Technologies Inc. Data item clustering and analysis
US9344447B2 (en) 2014-07-03 2016-05-17 Palantir Technologies Inc. Internal malware data item clustering and analysis
US9298678B2 (en) 2014-07-03 2016-03-29 Palantir Technologies Inc. System and method for news events detection and visualization
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US10572496B1 (en) 2014-07-03 2020-02-25 Palantir Technologies Inc. Distributed workflow system and database with access controls for city resiliency
US9880696B2 (en) 2014-09-03 2018-01-30 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10866685B2 (en) 2014-09-03 2020-12-15 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10360702B2 (en) 2014-10-03 2019-07-23 Palantir Technologies Inc. Time-series analysis system
US11004244B2 (en) 2014-10-03 2021-05-11 Palantir Technologies Inc. Time-series analysis system
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US10664490B2 (en) 2014-10-03 2020-05-26 Palantir Technologies Inc. Data aggregation and analysis system
US9785328B2 (en) 2014-10-06 2017-10-10 Palantir Technologies Inc. Presentation of multivariate data on a graphical user interface of a computing system
US10437450B2 (en) 2014-10-06 2019-10-08 Palantir Technologies Inc. Presentation of multivariate data on a graphical user interface of a computing system
US11275753B2 (en) 2014-10-16 2022-03-15 Palantir Technologies Inc. Schematic and database linking system
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US10853338B2 (en) 2014-11-05 2020-12-01 Palantir Technologies Inc. Universal data pipeline
US10191926B2 (en) 2014-11-05 2019-01-29 Palantir Technologies, Inc. Universal data pipeline
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US9558352B1 (en) 2014-11-06 2017-01-31 Palantir Technologies Inc. Malicious software detection in a computing system
US10135863B2 (en) 2014-11-06 2018-11-20 Palantir Technologies Inc. Malicious software detection in a computing system
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US10728277B2 (en) 2014-11-06 2020-07-28 Palantir Technologies Inc. Malicious software detection in a computing system
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US10362133B1 (en) 2014-12-22 2019-07-23 Palantir Technologies Inc. Communication data processing architecture
US10447712B2 (en) 2014-12-22 2019-10-15 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US11252248B2 (en) 2014-12-22 2022-02-15 Palantir Technologies Inc. Communication data processing architecture
US10552994B2 (en) 2014-12-22 2020-02-04 Palantir Technologies Inc. Systems and interactive user interfaces for dynamic retrieval, analysis, and triage of data items
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9589299B2 (en) 2014-12-22 2017-03-07 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US10838697B2 (en) 2014-12-29 2020-11-17 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US10157200B2 (en) 2014-12-29 2018-12-18 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9870389B2 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US10127021B1 (en) 2014-12-29 2018-11-13 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US10552998B2 (en) 2014-12-29 2020-02-04 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US10372879B2 (en) 2014-12-31 2019-08-06 Palantir Technologies Inc. Medical claims lead summary report generation
US11030581B2 (en) 2014-12-31 2021-06-08 Palantir Technologies Inc. Medical claims lead summary report generation
US11302426B1 (en) 2015-01-02 2022-04-12 Palantir Technologies Inc. Unified data interface and system
US10387834B2 (en) 2015-01-21 2019-08-20 Palantir Technologies Inc. Systems and methods for accessing and storing snapshots of a remote application in a document
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US10474326B2 (en) 2015-02-25 2019-11-12 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US10459619B2 (en) 2015-03-16 2019-10-29 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9886467B2 (en) 2015-03-19 2018-02-06 Palantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9571972B2 (en) * 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
US9571971B2 (en) 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
US10628834B1 (en) 2015-06-16 2020-04-21 Palantir Technologies Inc. Fraud lead detection system for efficiently processing database-stored data and automatically generating natural language explanatory information of system results for display in interactive user interfaces
US10636097B2 (en) 2015-07-21 2020-04-28 Palantir Technologies Inc. Systems and models for data analytics
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US10223748B2 (en) 2015-07-30 2019-03-05 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US11501369B2 (en) 2015-07-30 2022-11-15 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US10484407B2 (en) 2015-08-06 2019-11-19 Palantir Technologies Inc. Systems, methods, user interfaces, and computer-readable media for investigating potential malicious communications
US10444940B2 (en) 2015-08-17 2019-10-15 Palantir Technologies Inc. Interactive geospatial map
US10444941B2 (en) 2015-08-17 2019-10-15 Palantir Technologies Inc. Interactive geospatial map
US10489391B1 (en) 2015-08-17 2019-11-26 Palantir Technologies Inc. Systems and methods for grouping and enriching data items accessed from one or more databases for presentation in a user interface
US10922404B2 (en) 2015-08-19 2021-02-16 Palantir Technologies Inc. Checkout system executable code monitoring, and user account compromise determination system
US10102369B2 (en) 2015-08-19 2018-10-16 Palantir Technologies Inc. Checkout system executable code monitoring, and user account compromise determination system
US10853378B1 (en) 2015-08-25 2020-12-01 Palantir Technologies Inc. Electronic note management via a connected entity graph
US11150917B2 (en) 2015-08-26 2021-10-19 Palantir Technologies Inc. System for data aggregation and analysis of data from a plurality of data sources
US11934847B2 (en) 2015-08-26 2024-03-19 Palantir Technologies Inc. System for data aggregation and analysis of data from a plurality of data sources
US11048706B2 (en) 2015-08-28 2021-06-29 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US10346410B2 (en) 2015-08-28 2019-07-09 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US10706434B1 (en) 2015-09-01 2020-07-07 Palantir Technologies Inc. Methods and systems for determining location information
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US11080296B2 (en) 2015-09-09 2021-08-03 Palantir Technologies Inc. Domain-specific language for dataset transformations
US10296617B1 (en) 2015-10-05 2019-05-21 Palantir Technologies Inc. Searches of highly structured data
US10572487B1 (en) 2015-10-30 2020-02-25 Palantir Technologies Inc. Periodic database search manager for multiple data sources
US10678860B1 (en) 2015-12-17 2020-06-09 Palantir Technologies, Inc. Automatic generation of composite datasets based on hierarchical fields
US11625529B2 (en) 2015-12-29 2023-04-11 Palantir Technologies Inc. Real-time document annotation
US10540061B2 (en) 2015-12-29 2020-01-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
US10839144B2 (en) 2015-12-29 2020-11-17 Palantir Technologies Inc. Real-time document annotation
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
US10437612B1 (en) 2015-12-30 2019-10-08 Palantir Technologies Inc. Composite graphical interface with shareable data-objects
US10698938B2 (en) 2016-03-18 2020-06-30 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US10698594B2 (en) 2016-07-21 2020-06-30 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10719188B2 (en) 2016-07-21 2020-07-21 Palantir Technologies Inc. Cached database and synchronization system for providing dynamic linked panels in user interface
US10324609B2 (en) 2016-07-21 2019-06-18 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US10437840B1 (en) 2016-08-19 2019-10-08 Palantir Technologies Inc. Focused probabilistic entity resolution from multiple data sources
US10318630B1 (en) 2016-11-21 2019-06-11 Palantir Technologies Inc. Analysis of large bodies of textual data
US10460602B1 (en) 2016-12-28 2019-10-29 Palantir Technologies Inc. Interactive vehicle information mapping system
US10956552B2 (en) * 2017-04-03 2021-03-23 Cleveland State University Shoulder-surfing resistant authentication methods and systems
US20180285550A1 (en) * 2017-04-03 2018-10-04 Cleveland State University Shoulder-surfing resistant authentication methods and systems
US10956406B2 (en) 2017-06-12 2021-03-23 Palantir Technologies Inc. Propagated deletion of database records and derived data
US10403011B1 (en) 2017-07-18 2019-09-03 Palantir Technologies Inc. Passing system with an interactive user interface
US11599369B1 (en) 2018-03-08 2023-03-07 Palantir Technologies Inc. Graphical user interface configuration system
US10754822B1 (en) 2018-04-18 2020-08-25 Palantir Technologies Inc. Systems and methods for ontology migration
US10885021B1 (en) 2018-05-02 2021-01-05 Palantir Technologies Inc. Interactive interpreter and graphical user interface
US11119630B1 (en) 2018-06-19 2021-09-14 Palantir Technologies Inc. Artificial intelligence assisted evaluations and user interface for same
US10824296B2 (en) * 2018-08-22 2020-11-03 International Business Machines Corporation Configuring an application for launching
US20200064981A1 (en) * 2018-08-22 2020-02-27 International Business Machines Corporation Configuring an application for launching

Also Published As

Publication number Publication date
US20180267983A1 (en) 2018-09-20
US20210073305A1 (en) 2021-03-11
US10776442B2 (en) 2020-09-15
US8990239B2 (en) 2015-03-24
WO2008024949A2 (en) 2008-02-28
US9881093B2 (en) 2018-01-30
WO2008024949A3 (en) 2008-08-14
US20150302018A1 (en) 2015-10-22
US20120243804A1 (en) 2012-09-27
US20100235350A1 (en) 2010-09-16

Similar Documents

Publication Title
US20210073305A1 (en) Systems and methods for photograph mapping
CA2658304C (en) Panoramic ring user interface
US9916673B2 (en) Method and apparatus for rendering a perspective view of objects and content related thereto for location-based services on mobile device
CN102129812B (en) Viewing media in the context of street-level images
US9269190B1 (en) System and method for displaying transitions between map views
US8812990B2 (en) Method and apparatus for presenting a first person world view of content
KR101298422B1 (en) Techniques for manipulating panoramas
US20120221552A1 (en) Method and apparatus for providing an active search user interface element
US20110283223A1 (en) Method and apparatus for rendering user interface for location-based service having main view portion and preview portion
US20120240077A1 (en) Method and apparatus for displaying interactive preview information in a location-based user interface
Hoelzl et al. Google Street View: navigating the operative image
US20130132846A1 (en) Multiple concurrent contributor mapping system and method
Beeharee et al. Exploiting real world knowledge in ubiquitous applications
US10521943B1 (en) Lot planning
Carboni et al. GeoPix: image retrieval on the geo web, from camera click to mouse click
US10102597B1 (en) Internet based interactive graphical interface for real estate listings
Yan et al. Design and implementation of a mobile gis for field data collection
Bajjali 3-D Visualization
US20110258585A1 (en) Apparatus, Method, Computer Program and User Interface
Hanchard The Socio-technical Development of Digital Maps
Asra et al. Web Based GIS for Tourism in Mysore and adjoining Mandya District
Liu B. Sc., Nankai, Tianjin, China, 1987
KR20100066343A (en) Apparatus and method for updating spatial information digital contents

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION