US20080077597A1 - Systems and methods for photograph mapping - Google Patents


Info

Publication number
US20080077597A1
Authority
US
Grant status
Application
Patent type
Prior art keywords
user
image
images
spot
view
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11844203
Inventor
Lance Butler
Original Assignee
Lance Butler
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 17/00: Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F 17/30: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 17/30861: Retrieval from the Internet, e.g. browsers
    • G06F 17/30864: Retrieval from the Internet by querying, e.g. search engines or meta-search engines, crawling techniques, push systems
    • G06F 17/3087: Spatially dependent indexing and retrieval, e.g. location-dependent results to queries
    • G06F 17/30244: Information retrieval in image databases
    • G06F 17/30265: Information retrieval in image databases based on information manually generated or not derived from the image data
    • G06F 17/30268: Information retrieval in image databases using manually generated information, e.g. tags, keywords, comments, manually generated location and time information
    • G06F 17/30286: Information retrieval in structured data stores
    • G06F 17/30386: Retrieval requests
    • G06F 17/30424: Query processing
    • G06F 17/30522: Query processing with adaptation to user needs
    • G06F 17/3053: Query processing with adaptation to user needs using ranking

Abstract

Systems and methods for photograph mapping are disclosed herein. In one embodiment a first digital image and at least one user-generated datum is received from at least one user. The first image is geographically organized according to the at least one datum. The first image is associated with at least one location and at least one direction. The first image is provided from a first person perspective to a user in response to a request.

Description

    PRIORITY CLAIM
  • This application claims the benefit of U.S. Provisional Application Ser. No. 60/840,134, filed on Aug. 24, 2006, which is herein incorporated by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • Consumers have an enormous and increasing need for visual detail and information on real-world places. This is easily evidenced by the aggressive moves into mapping and local search by Google, Yahoo, Microsoft, Amazon and others.
  • SUMMARY OF THE INVENTION
  • Systems and methods for photograph mapping are disclosed herein. In one embodiment a first digital image and at least one user-generated datum is received from at least one user. The first image is geographically organized according to the at least one datum. The first image is associated with at least one location and at least one direction. The first image is provided from a first person perspective to a user in response to a request.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The preferred and alternative embodiments of the present invention are described in detail below with reference to the following drawings.
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented;
  • FIG. 2 is a functional block diagram of an exemplary operating environment in which an embodiment of the invention can be implemented;
  • FIG. 3 shows a “Find Places” GUI for finding locations and views;
  • FIG. 4 shows a “Walk Around” GUI in an embodiment;
  • FIG. 5 shows an “Upload” GUI in an embodiment;
  • FIG. 6A shows a sort GUI in an embodiment;
  • FIG. 6B shows an alternate embodiment of a Sort GUI;
  • FIG. 7 shows a Locate/Link GUI in an embodiment;
  • FIGS. 8A-E show a Save Locale button in multiple embodiments; and
  • FIG. 9 shows an example of a restaurant in one embodiment.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • An embodiment of the current invention provides tools that allow anyone in the world to take photos, upload them to a database, and/or put them together to create their own navigable locales. It is just as easy to include indoor data as outdoor data. Users can seamlessly navigate between outdoor and indoor locations.
  • An embodiment of the current invention provides tools to make it relatively quick and easy to build navigable locales. Because one embodiment of the tools works within a standard browser, the tools can be made available to the general public anywhere in the world. Other embodiments can be built as standalone or downloadable applications to provide other functionality.
  • An embodiment of the invention works with still images, and any device capable of capturing digital images and transferring the images to a computer-readable memory is compatible with the invention. Various embodiments require no special hardware or software, no plug-ins or downloaded applications; a standard web browser is all that is needed. Embodiments give users the ability to create and view geo-located, navigationally linked images of real-world locations, and allow anyone to blanket an area with photos in an organized way so others can use their web browser to visually “walk around” that area from a first-person perspective. Users can capture any type of place, anywhere in the world: businesses, homes, parks, travel destinations, etc. Embodiments are designed to allow tight physical density for this type of visual navigation, but also allow for arbitrary physical density, where users are free to pack spots as tightly or loosely as they wish.
  • Innovations of the current invention include an easy-to-use system for managing large amounts of photographic images and/or linking them together, as well as an easy way to navigate these photographs.
  • Users of the current invention can interconnect locales made by many people to completely “blanket” a region with images, and eventually the entire world. By allowing users to interlink their locales, embodiments enable users to create their own virtual communities oriented around very specific places and/or activities or interests. Users don't even need to live in or near these places to contribute.
  • Users can create “tours” (or “trails” or “threads”) highlighting spots or views of interest through any existing locales, including locales of other users. The tours are like guided tours that users can create for others to follow. When a user follows a particular tour, the user is guided through an ordered set of moves, turns and/or zooms to see whatever the creator of the tour wants to highlight and/or talk about. The tour creator can make a set of comments that take priority over the standard comments, showing prominently.
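As a minimal sketch (not the patent's implementation), a tour can be modeled as an ordered list of steps whose creator comments take priority over the standard comments; the names `TourStep`, `Tour`, and `play` are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TourStep:
    """One stop on a tour: which spot and view to show, plus an optional
    creator comment that displays prominently, overriding standard comments."""
    spot_id: str
    view_direction: str            # e.g. "N", "NE"
    creator_comment: Optional[str] = None

@dataclass
class Tour:
    name: str
    steps: list = field(default_factory=list)

    def play(self, standard_comments):
        """Yield (spot_id, direction, comment) in order; the tour creator's
        comment takes priority, falling back to the spot's standard comment."""
        for step in self.steps:
            comment = step.creator_comment or standard_comments.get(step.spot_id, "")
            yield step.spot_id, step.view_direction, comment
```

A client following the tour would render each yielded view in sequence, performing the corresponding moves, turns, and zooms.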
  • Locales are geo-located, and can be tagged, blogged and rated, and are fully searchable. Users can read and write comments or questions for any locale or image in the world in place-logs (or “plogs”). Users can search the various data by geographic location, by tags, by comments or by other criteria.
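The search behavior described above can be illustrated with a minimal filter over a hypothetical locale record; the `lat`, `lng`, and `tags` keys are assumptions for the sketch, not the patent's data model:

```python
def search_locales(locales, tag=None, bbox=None):
    """Filter locale records by tag and/or geographic bounding box.

    locales: iterable of dicts with "lat", "lng", and "tags" keys (assumed schema).
    bbox: (min_lat, min_lng, max_lat, max_lng), or None to skip the geographic test.
    """
    results = []
    for loc in locales:
        if tag is not None and tag not in loc["tags"]:
            continue
        if bbox is not None:
            min_lat, min_lng, max_lat, max_lng = bbox
            if not (min_lat <= loc["lat"] <= max_lat and
                    min_lng <= loc["lng"] <= max_lng):
                continue
        results.append(loc)
    return results
```

Comment and place-name searches would follow the same pattern, matching against plog text or locale names instead of tags.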
  • Embodiments provide tools that allow users anywhere in the world to create and organize large amounts of image data quickly and efficiently. Embodiments can include integrated support for tags, comments, locale interlinks, zooms, user requests, and other data associated with locales and/or images. Embodiments can allow many types of searches, including but not limited to geographic searches based on location, tag searches based on user-entered tags and place or landmark name searches. There are general map browsing tools provided by the embedded map engine, i.e. panning, zooming and changing map types.
  • With embodiments of the current invention, users can view specific areas or items of interest in high visual detail without requiring all the imagery for the spot to be of high resolution, and with no limits on the detail magnification. This allows zooming in on a feature of interest of arbitrarily high resolution, so extreme details can be provided, and allows far less data to be transferred to the client to show a particular view.
  • Aspects of the invention provide users the ability to visually navigate remote places in a novel and natural way, using nothing more than still photos and a basic web browser.
  • Aspects of the invention provide tools that allow anyone to create visually navigable locales from simple photos. The process is not limited by proprietary tools or knowledge, and does not need special equipment. Nothing is required beyond an inexpensive digital camera and a standard web browser.
  • In aspects of the invention, users have the freedom to move around much as they would in real life, i.e. walking forward to other spots within their forward vision and turning left/right/up/down as if they were there. Navigational linkage between spots is explicit, not merely implied by location and direction. This creates logically cohesive data.
  • Spots can be created wherever anyone can take pictures, i.e. indoor/outdoor, urban settings or in the middle of a forest.
  • Aspects of the invention allow for navigationally interlinked locales, making it possible for many individuals to collectively blanket entire cities, and indeed the world, either through planned collaboration or after the fact. Interlinked locales make the whole more valuable than the sum of the parts.
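The test for when two locales' borders "become sufficiently close" to interlink (described with the locale definition below) could, as one illustrative sketch, be a great-circle distance check between the locales' spots. The 50-meter threshold and function names here are assumptions, not the patent's method:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lng1, lat2, lng2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = radians(lat1), radians(lat2)
    dp, dl = radians(lat2 - lat1), radians(lng2 - lng1)
    a = sin(dp / 2) ** 2 + cos(p1) * cos(p2) * sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def can_interlink(spots_a, spots_b, threshold_m=50.0):
    """True if any (lat, lng) spot in locale A lies within threshold_m
    of any spot in locale B, i.e. the borders are 'sufficiently close'."""
    return any(haversine_m(a[0], a[1], b[0], b[1]) <= threshold_m
               for a in spots_a for b in spots_b)
```

With a check like this, interlinks could be suggested automatically whenever a user geo-locates a spot near another user's locale.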
  • Aspects of the invention provide virtual reality (VR)-like navigation and views, but unlike traditional VR allows for virtually unlimited detail of objects or points of interest without significantly increasing data or bandwidth requirements.
  • Aspects of the invention include geo-location of photos by simply clicking on a high-resolution map in a browser, using no plug-ins or extra software.
  • Aspects of the invention provide a new and novel visual display of the “level of dirtiness” of documents or local data.
  • With aspects of the invention, anyone can document a particular location over time. Different times of day, times of year, or changes over an extended period of time.
  • Aspects of the invention provide for unique types of “embedded ads” directly within the first-person real-world views.
  • Aspects of the invention have a proprietary provisional data scheme to greatly reduce the amount of “bad data” that gets into the system.
  • A method of the invention includes receiving user-generated digital images and user-generated data about the images from users; organizing and linking the images geographically according to the user-generated data, and displaying the images in response to a user request.
  • An embodiment includes but is not limited to five types of objects: views, spots, locales, zooms and trails.
  • A view, as described herein, is a single digital image, which can be a photographic image. Each view is associated with a spot and can have a defined orientation associated with the view, including a lateral (compass) direction denoting a direction a camera was facing when the image was captured, as well as a vertical angle relative to a horizon of the location. Views can include time references including time of day, date, and season.
  • A zoom, as described herein, can be a close-up image of an area or point of interest within a view, but it need not be simply a higher-resolution zoomed version of the original view. This gives users creative flexibility to make interesting and creative zoomed images. Views and zooms can also include tags, which are words or short phrases. In addition to latitude and longitude, views can also include elevation. A view's elevation can be specified by “floor” (as in a building), or in feet above sea level.
  • A spot, as described herein, has one or more views, taken from a single geographic location, facing in different lateral and/or vertical directions. Each spot has a precise geographic location, which can be indicated by latitude and longitude, elevation, GPS coordinates, or other means. In one embodiment, spots include 8 lateral views, which can align, for example, with the 8 major compass directions. In other embodiments, a spot can have any number of views associated with the spot, facing in arbitrary directions. Horizontally oriented lateral views give the user the ability to rotate the view left and right, while views taken at other angles relative to horizontal allow the user to look up or down. Furthermore, views taken from the same spot at different times can also be included, allowing users to watch how a location changes over time.
  • A locale, as described herein, is a coverage area of one or more spots. Spots can be interlinked within a locale, so users can navigate between spots. Locales can be interlinked when their borders become sufficiently close or overlap, allowing users to navigate among locales without leaving their first-person frame of reference. Interlinked locales are locales with linkage from one or more existing spots in one locale to one or more existing spots in another locale.
  • A trail, as described herein, is a path that leads a user through a specific sequence of views and/or zooms of one or more spots in one or more locales. Users can create a trail through their own locales, other users' locales, or a combination of both.
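The object types defined above might be sketched as a data model like the following; the field names and container shapes are assumptions for illustration, not the patent's schema:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class View:
    """A single digital image taken from a spot in a given orientation."""
    image_url: str
    direction: str                  # lateral compass direction, e.g. "N", "SE"
    vertical_angle: float = 0.0     # degrees relative to the horizon
    tags: list = field(default_factory=list)
    taken_at: Optional[str] = None  # time reference: time of day, date, or season

@dataclass
class Spot:
    """One precise geographic location with one or more views."""
    name: str
    lat: float
    lng: float
    elevation: Optional[float] = None   # feet above sea level, or a building floor
    views: dict = field(default_factory=dict)  # direction -> View
    links: list = field(default_factory=list)  # names of navigably linked spots

@dataclass
class Locale:
    """A coverage area of interlinked spots, owned by its creator."""
    name: str
    owner: str
    spots: dict = field(default_factory=dict)  # spot name -> Spot
```

Zooms and trails would layer on top of this: a zoom referencing a point within a view, and a trail referencing an ordered sequence of views.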
  • FIG. 1 illustrates an example of a computing system environment 100 on which an embodiment of the invention may be implemented. The computing system environment 100, as illustrated, is an example of a suitable computing environment, however it is appreciated that other environments, systems, and devices may be used to implement various embodiments of the invention as described in more detail below.
  • Embodiments of the invention are operational with numerous other general-purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that may be suitable for use with embodiments of the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set-top boxes, programmable consumer electronics, mobile telephones, portable data assistants, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like.
  • Embodiments of the invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. Embodiments of the invention may also be practiced in distributed-computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
  • With reference to FIG. 1, an exemplary system for implementing an embodiment of the invention includes a computing device, such as computing device 100. The computing device 100 typically includes at least one processing unit 102 and memory 104.
  • Depending on the exact configuration and type of computing device, memory 104 may be volatile (such as random-access memory (RAM)), nonvolatile (such as read-only memory (ROM), flash memory, etc.) or some combination of the two. This most basic configuration is illustrated in FIG. 1 by dashed line 106.
  • Additionally, the device 100 may have additional features, aspects, and functionality. For example, the device 100 may include additional storage (removable and/or non-removable) which may take the form of, but is not limited to, magnetic or optical disks or tapes. Such additional storage is illustrated in FIG. 1 by removable storage 108 and non-removable storage 110. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. Memory 104, removable storage 108 and non-removable storage 110 are all examples of computer storage media. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by device 100. Any such computer storage media may be part of device 100.
  • The device 100 may also contain a communications connection 112 that allows the device to communicate with other devices. The communications connection 112 is an example of communication media. Communication media typically embodies computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism and includes any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, the communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio-frequency (RF), infrared and other wireless media. The term computer-readable media as used herein includes both storage media and communication media.
  • The device 100 may also have an input device 114 such as keyboard, mouse, pen, voice-input device, touch-input device, etc. Further, an output device 116 such as a display, speakers, printer, etc. may also be included. Additional input devices 114 and output devices 116 may be included depending on a desired functionality of the device 100.
  • Referring now to FIG. 2, an embodiment of the present invention takes the form of an exemplary computer network system 200. The system 200 includes an electronic client device 210, such as a personal computer or workstation or portable data assistant or mobile telephone, that is linked via a communication medium, such as a network 220 (e.g., the Internet), to an electronic device or system, such as a server 230. The server 230 may further be coupled, or otherwise have access, to a database 240 and a computer system 260. Although the embodiment illustrated in FIG. 2 includes one server 230 coupled to one client device 210 via the network 220, it should be recognized that embodiments of the invention may be implemented using one or more such client devices coupled to one or more such servers.
  • The client device 210 and the server 230 may include all or fewer than all of the features associated with the device 100 illustrated in and discussed with reference to FIG. 1. The client device 210 includes or is otherwise coupled to a computer screen or display 250. The client device 210 may be used for various purposes such as network- and local-computing processes.
  • The client device 210 is linked via the network 220 to server 230 so that computer programs, such as, for example, a browser, running on the client device 210 can cooperate in two-way communication with server 230. The server 230 may be coupled to database 240 to retrieve information therefrom and to store information thereto. Database 240 may include a plurality of different tables (not shown) that can be used by the server 230 to enable performance of various aspects of embodiments of the invention. Additionally, the server 230 may be coupled to the computer system 260 in a manner allowing the server to delegate certain processing functions to the computer system.
  • Still referring to FIG. 2, and in operation according to an embodiment of the invention, a user (not shown) of the client device 210 desiring to electronically map photographs uses a browser application running on the client device to access web content, which may, but need not, be served by the server 230. Specifically, by employing an appropriate uniform resource locator (URL) in a known manner, the user may download from the server 230 and install on the client device 210 a user interface module 280 comprising computer-executable instructions as described more fully hereinafter. Alternatively, the user may receive the module 280 on a tangible computer-readable medium (not shown), such as, for example, a CD-ROM, and subsequently install the module on the client device 210 from the medium.
  • Upon execution of the module 280 by the client device 210, and referring to FIG. 3, a user interface 130 may be displayed on the display device 250.
  • Embodiments of the invention include various graphical user interfaces (GUIs) for allowing a user to interact with the embodiments. A “Find Places” GUI 130 for finding locations and views is shown in FIG. 3. The GUI 130 includes a map 132 with a zoom bar 134 and navigation buttons 136 for changing an area displayed by the map 132, including a “Center” button 138. The map 132 includes markers 140 which can denote locations of views, spots, and locales. Users can jump directly to any city, state, country or zip code by inputting the information in a “Place” field 142, or browse at random among markers 140 on the map 132, or search for, or filter based on, tags with a “Tags” field 148. A “Current Locale” window 131 displays a name of, owner of, and number of spots contained in, the current locale. The window 131 can also display clickable options such as zooming in to or walking around the current locale. A legend 144 can be included, showing the spot density of each locale, or any other information of interest. A clickable list 146 of locales can be included, allowing a user to select a locale from the list 146 for display on the map 132. Tabs 149 can be included to allow users to quickly navigate between GUI functional areas.
  • A “Walk Around” GUI 150 is shown in FIG. 4. The GUI 150 includes an overhead map view 152. Users can jump directly to a spot or feature of interest by clicking on any marker 140 in the overhead map view 152. When a user turns, or moves to another spot, the map 152 updates to the new location and orientation. The map 152 also shows other spots that are in the vicinity, and the user can see at a glance which spots have already been “visited” by the type of marker 140 on each spot; in embodiments, empty circles 151 signify unvisited spots and filled circles 153 signify visited spots. A current spot marker 155 is a filled circle with an arrow pointing in the current direction, and a destination marker (not shown) with a ‘+’ symbol can indicate the spot the user moves to if the user performs the “move-forward” action.
  • The overhead map 152 can be a top-down view, including but not limited to a map, satellite imagery, “bird's-eye view” imagery, an architectural floor plan, a hand-sketched diagram and a user-supplied image. A top-down view provides an overview of the area of interest in enough detail to distinguish the positions of a locale's spots. If a custom top-down image is used, then the locale creator first geo-locates the top-down image. In some cases, a distant panoramic side view can be used in place of a top-down image if the area of interest is very linear, for example, a section of seashore.
  • Alongside the overhead map 152 is a view window 154 for displaying digital images, which can include real, eye-level, photographic images. Some views have “zooms”, which can be close-up images of specific areas within a view, or information or images related to the view. Clicking on a zoom icon can animate the zoom image so it appears to grow from the position on the screen of its marking identifier into a larger image. The server can provide images to match the real time of day, date, or season of the spot if images taken at different times are associated with the spot.
  • Locale, spot, view and feature (not shown) comment areas 154, 156, 158 can display comments made by users about the displayed locale, spot, and view, respectively. Feature comments are related to a feature in the locale which may appear in more than one specific view or spot, i.e. a distant mountain peak that is visible from all easterly views in a locale. Users can also make requests to be displayed in the areas 154, 156, 158, such as a request that digital images be captured at a specific spot.
  • Movement, turning and zooming are performed from a first-person perspective; these actions can be performed either by using the keyboard, or by using the mouse or other user interaction. Other interactions include but are not limited to touch-screens, voice commands, and/or motion controls. Users can rotate from a current direction to an adjacent direction. Users can navigate forward, backward, up, down, or sideways, and can navigate directly to other spots in the locale, and the view displayed will change accordingly.
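Rotation among the eight lateral views and the explicit move-forward linkage described above can be sketched as follows; the link-table shape is an assumption for illustration:

```python
# The 8 major compass directions in clockwise order.
COMPASS = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]

def rotate(direction, clockwise=True):
    """Rotate from the current lateral view to the adjacent one (a 45-degree
    step left or right), wrapping around the compass."""
    i = COMPASS.index(direction)
    step = 1 if clockwise else -1
    return COMPASS[(i + step) % len(COMPASS)]

def move_forward(current_spot, direction, links):
    """Follow the explicit navigational link, if any, for the facing direction.

    links: {(spot_name, direction): destination_spot_name}. Linkage is explicit
    rather than inferred from raw coordinates, keeping the data logically
    cohesive; with no link, the user stays put.
    """
    return links.get((current_spot, direction), current_spot)
```

On each rotation or move, the client would swap in the destination view and update the overhead map's current-spot marker.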
  • Embodiments of the invention allow users to smoothly transition from one view to the next. When the user rotates the current direction, the old view slides out, and as the old view slides out, the new view slides in to take its place (left, right, up or down). Embodiments allow locale creators to specify the amount of overlap between any two adjacent views, which allows the new view panning in to start out partially overlapped by the old view, if desired. In the case of moving forward and backward, the new view can zoom in or out as the old view zooms out or in, and during this process the views can fade out and in. Animated line graphics representing the new view can “float” above the old view and expand or contract to symbolize the movement to the new view.
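One way to drive the sliding transition, including the creator-specified overlap between adjacent views, is to precompute per-frame pixel offsets; this is a hypothetical sketch of the geometry, not the patent's animation code:

```python
def slide_offsets(width, steps, overlap=0.0):
    """Per-frame x-offsets for sliding an old view out (left) while the new
    view slides in from the right.

    overlap is the fraction of the old view already covered by the incoming
    view at frame 0, so adjacent views with shared content can start partially
    overlapped. Returns a list of (old_x, new_x) integer pixel offsets.
    """
    travel = width * (1.0 - overlap)  # distance the incoming view must cover
    frames = []
    for i in range(steps + 1):
        t = i / steps
        frames.append((round(-travel * t), round(travel * (1 - t))))
    return frames
```

Vertical turns would use the same offsets on the y-axis, and moving forward or backward would substitute a zoom-and-crossfade for the slide.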
  • An “Upload” GUI 160 shown in FIG. 5 enables users to upload digital images. The interface 160 includes directions 162 and a “File Upload” field 164. The user selects either an individual .jpeg or other photo file type, or .zip file containing multiple images, and presses a ‘Send File’ button 166. While the file is being uploaded, the upload progress is displayed to the user. Once the file is uploaded, embodiments can process the raw uploaded image files and create multiple resolutions of said images for storage on the server to be used for various purposes. As each image or batch of images is being processed, the status of the processing for that batch can be displayed onscreen for the user to view. Embodiments can allow the user to leave the interface 160 during image processing, and can allow the user to start using processed images before all images have been processed. Downloadable applications or plug-ins can be included, but are not required for uploading images.
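Creating "multiple resolutions of said images" typically means a fixed set of aspect-preserving renditions per upload. The tier widths below are illustrative assumptions, and the actual resampling would be done with an imaging library; this sketch only computes the target dimensions:

```python
def resolution_tiers(width, height, max_widths=(1600, 800, 400, 100)):
    """Compute pixel dimensions for each stored rendition of an uploaded image,
    preserving the aspect ratio and never upscaling the original."""
    tiers = []
    for mw in max_widths:
        if width <= mw:
            tiers.append((width, height))  # original is already small enough
        else:
            scale = mw / width
            tiers.append((mw, round(height * scale)))
    return tiers
```

The smallest tier would serve thumbnails in the Sort GUI, while larger tiers back the view window and zooms.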
  • A “Sort” GUI 170, shown in FIG. 6A, allows users to sort images 171 by spot and orientation. A spot list 172 lists all the spots associated with the current locale, displayed in a “My Locales” area 174. The user can click on any spot name to select a spot and display the spot's images in a “Current Spot” or “Selected Spot” display area 176; the area 176 displays views 177 associated with the current spot.
  • Uploaded images which have not been assigned to a spot appear in the user's “Unsorted Images” bin 178. In an embodiment, thumbnails are arranged in a grid, from which the user can sort the images 171 into spots corresponding to, for example, the physical location(s) from which they were taken. If the images 171 are in left-to-right order, then the user can select the first image, click on a “Grab 8” button 180, and select seven more images in order. Alternatively, if the images 171 are not in left-to-right order, the user can select each image in order to get a left-to-right ordering. The user can add other images associated with a spot, like zooms or up/down shots.
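The "Grab 8" workflow amounts to partitioning an ordered list of uploaded images into groups of eight lateral views; a minimal sketch, with the function name as an assumption:

```python
def grab_spots(unsorted, group_size=8):
    """Partition a left-to-right ordered list of image ids into spot-sized
    groups, as with the "Grab 8" button. Returns (complete_groups, leftover):
    each complete group can become one spot's lateral views, and any trailing
    partial group stays in the Unsorted Images bin."""
    groups = [unsorted[i:i + group_size]
              for i in range(0, len(unsorted), group_size)]
    complete = [g for g in groups if len(g) == group_size]
    leftover = [img for g in groups if len(g) < group_size for img in g]
    return complete, leftover
```

Images that are not in left-to-right order would instead be selected individually, as the text above describes.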
  • When all images are selected for a desired spot, the user clicks a “New Spot” button 182 and is prompted for a spot name. When the user enters a name, a new spot is created, and is added to the spot list 172 for the current locale. Spots can be renamed by clicking a “Rename” button 184 and typing a new name. Geo-located spots can be “dumped” back into the Unsorted Images bin 178 by clicking a “Dump to Unsorted” button 179. When a spot is dumped, its images are kept together as a group, within the Unsorted Images bin, but other references to it are deleted, such as its location indicator 1106 on the map 194.
  • FIG. 6B shows an alternate embodiment of a Sort GUI 170. The Selected Spot area 176 includes eight views 177 associated with the eight major compass directions, and two zooms 185 associated with a view 177 of the spot. Users can associate zooms 185 with views 177 with either the Sort GUI 170 or the Locate/Link GUI 190 (shown below).
  • To associate a zoom 185 with a particular point of interest on a view 177, the user clicks and drags the zoom 185 onto the desired view 177, and a zoom icon 186 is created on the view 177 at that point. After the zoom 185 has been associated with its view 177, the user can reposition the zoom 185 by clicking and dragging a zoom icon 186 representing the zoom 185 within the view 177. The interface 170 lightly constrains the icon 186 within the edges of the associated view 177, to allow easy positioning at the edges, but also allows deletion of the association via a quick dragging motion to escape the edges of the view 177. When the zoom icon 186 has been dragged outside its associated view 177, its graphical representation 186 changes to highlight the fact that if the user releases the icon 186, the association will be deleted.
  • Zooms 185 can be associated with any view 177, multiple different zooms 185 can be associated with a single view 177, and one zoom 185 can be associated with different views 177; for example, when an item of interest lies in an area which visually overlaps two adjacent views 177. Zooms 185 can be merely magnified views of an area of interest, or they can be taken from entirely different angles to get a better view of an item of interest. A zoom 185 could “peer over, or around” a fence or other obstacle, “peek” inside a window, or see through a wall of a building if a user has access to an image of the inside of the building. A view 177 of the outside of a restaurant could associate a position on the restaurant's front door with a zoom 185 of the restaurant's menu. A zoom 185 could also be an image taken at a different time, day, or season.
  • A Locate/Link GUI 190, shown in FIG. 7, allows users to specify a geographical location for each spot and create links between spots.
  • For existing locales, the user selects a locale from the Locale list 192, and a map 194 including the locale data is displayed in a map pane 196. If the user is locating a new locale, the user can move the map 194 to a new location by typing the location name or other identifying information into a “Find Place” field 198 above the map pane 196, or can position the map 194 by panning and/or zooming until the correct place is found.
  • A “Spots” area 1100 lists all spots associated with the current locale. Spots that have already been geo-located will have an indicator, such as a checkbox 1102.
  • To geo-locate a spot, a user displays a desired location of the spot in the map pane 196 and selects a spot to geo-locate by clicking on the spot name in the Spots area 1100—a selected spot's views will be displayed in the Selected Spot area 1104, above the map pane 196. The user then clicks on the map 194 at the desired location of the spot and an indicator 1106 appears at the desired location on the map 194. Once placed, a spot's exact location can be adjusted in fine or coarse detail by using the keyboard and/or mouse. Other embodiments can correlate spots with a set of previously defined geographic coordinates, for example from a GPS device.
  • To move a spot that has already been located, users can select the spot and use the arrow keys to move the spot marker 1106 on the map 194, or hold down the “M” key (move) on the keyboard and click on the desired location on the map 194, or input actual coordinate data.
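A spot's geo-location and the keyboard-based fine adjustment described above might be modeled as follows; the step size and all names are assumptions for illustration, not from the patent:

```python
# Illustrative model of a geo-located spot and arrow-key nudging.
ARROW_DELTAS = {"up": (1, 0), "down": (-1, 0), "left": (0, -1), "right": (0, 1)}

class Spot:
    def __init__(self, name, lat=None, lon=None):
        self.name, self.lat, self.lon = name, lat, lon

    @property
    def located(self):
        """A spot is geo-located once it has both coordinates."""
        return self.lat is not None and self.lon is not None

    def nudge(self, key, step=0.00001):
        """Fine adjustment (roughly one meter per keypress at this step)."""
        dlat, dlon = ARROW_DELTAS[key]
        self.lat += dlat * step
        self.lon += dlon * step
```

Coarse adjustment would simply use a larger `step`, and direct coordinate input (e.g., from a GPS device) would set `lat`/`lon` outright.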
  • Spots that have been geo-located can be linked together so users can navigate around the locale among the spots. One embodiment allows users to link two spots by selecting a first spot by clicking on its marker 1106 or its name in the Spots area 1100, and holding down the “C” key (connect) and clicking on a marker 1106 for a second spot. A line (not shown) appears between the spots, signifying that the spots are linked. Spots can be linked with more than one other spot.
  • A link between spots actually comprises two separate “moves,” one in each direction; i.e., a move from the first spot to the second spot, and a move from the second spot to the first spot.
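Because a link decomposes into two directed moves, removing only one of them leaves a one-way connection between the spots. A hypothetical representation of this structure:

```python
# Hypothetical sketch: a link between spots stored as two directed moves.
def link_spots(moves, a, b):
    """Create a two-way link: one move in each direction."""
    moves.add((a, b))
    moves.add((b, a))

def delete_move(moves, a, b):
    """Remove one direction only; the reverse move may remain."""
    moves.discard((a, b))

moves = set()
link_spots(moves, "spot1", "spot2")
delete_move(moves, "spot1", "spot2")   # a one-way move remains
```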
  • Embodiments can allow users to delete moves and/or links, for example, by clicking on a link marker connecting two spots and allowing the user to choose which moves to delete.
  • Embodiments can download various meta-data about a locale, its spots, views, tags and structure to the client, giving the ability to change various aspects of the client GUI (like sorting criteria and/or appearance of markers on the map) without querying the server. For example, the default legend might color the locale markers according to how many spots they contain, but users can change the marker colors or appearance to distinguish the visible locales by area type, e.g., Residential, Business, Park, Rural, or Beach; or by user rating, date of creation, or any number of other criteria.
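With the locale metadata cached on the client, restyling the markers is a purely local computation. A sketch, assuming simple criterion-to-color mappings (all field names and thresholds are invented):

```python
# Sketch: recolor locale markers locally from cached metadata,
# without a server round trip. Field names are assumptions.
AREA_COLORS = {"Residential": "green", "Business": "blue", "Park": "teal",
               "Rural": "brown", "Beach": "gold"}

def marker_color(locale, criterion):
    """Pick a marker color for a locale under the selected legend."""
    if criterion == "spot_count":
        n = len(locale["spots"])
        return "red" if n > 20 else "orange" if n > 5 else "yellow"
    if criterion == "area_type":
        return AREA_COLORS.get(locale["area_type"], "gray")
    return "gray"
```

Switching the legend then only re-runs this function over the cached locales and redraws the markers.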
  • In a network-based client-server application, some actions directly modify data on the servers, while others can remain local to the user's computer until the changes are explicitly saved to the servers. There is overhead to maintaining or synchronizing the data between client and server. When changes local to the user's computer occur, embodiments can keep track of the number and/or types of changes made, and can use that information to gradually alter the appearance of one or more aspects of the on-screen GUI as the user makes local changes, thereby giving the user more useful information than a simple binary “dirty” indicator. For example, as the user manipulates newly uploaded images, the modification indicator can change slightly each time the user modifies data; the color of the border of the graphical representation of the data may change from neutral to light red and eventually to deep red. Or, as shown in FIGS. 8A-8E, a button such as a “Save Locale” button 1110 can change color from white to dark red.
  • This aspect of the invention can be applied to general-purpose software applications, including desktop applications or any other software which manipulates data and for which those manipulations and/or the manipulated data can be saved. Various software applications and GUIs have a visual mechanism to indicate that a document is “dirty” (has unsaved changes), but a binary representation is not nearly as useful as being able to see at a glance how dirty the document is at the moment. The rate at which the appearance of the graphical representation of the dirtiness changes with respect to the changes in the unsaved data can be set by a user. Thus, for a text document, a user could set a visual change to occur after adding, modifying, or deleting every ten characters, or after every hundred characters, or after every 1,000 characters. Varied algorithms can be used to determine when and/or how often to update the indicator(s). Embodiments can use different color schemes to make the amount of editing since the last save more obvious; for example, a progression from faint green to bright green, faint yellow to bright yellow, faint orange to bright orange, and faint red to bright red.
  • Other visual indicators can optionally be offered to make it even more obvious to the user how much editing has taken place since the last save. Instead of, or in addition to gradually changing colors, a strategically located icon or other graphical representation could change through a set of different appearances, each of a more urgent nature than the last.
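The graded indicator described above reduces to mapping an unsaved-change count onto a color scale. A minimal sketch, with an invented gradient and threshold:

```python
# Illustrative graded "dirty" indicator: map the number of unsaved
# changes to a color step instead of a binary saved/unsaved flag.
GRADIENT = ["white", "faint red", "light red", "red", "deep red"]

def dirty_color(change_count, changes_per_step=10):
    """Advance one color step per `changes_per_step` unsaved changes."""
    step = min(change_count // changes_per_step, len(GRADIENT) - 1)
    return GRADIENT[step]
```

A user-configurable rate, as described for text documents, corresponds to changing `changes_per_step`; saving resets `change_count` to zero and the indicator back to its neutral color.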
  • Embodiments can include ads embedded within images. For example, an image of a billboard can have virtual ads placed directly in the billboard area, complete with PPC (pay per click) outbound linking. As shown in FIG. 9, a view 1120 of a restaurant can have a clickable embedded ad 1122 featuring the name of the restaurant and an offer of free coupons. Also, images of or near businesses or related places of interest can contain ads of many different kinds, including rollover hotspots with data, imagery or PPC outbound links. Embedded ads can be in the form of zooms, where clicking on a zoom icon displays the ad. The ad links could be a portion of a captured image provided by a user, or ads can be overlaid after the fact.
  • Clients who wish to keep ads (which may be a competitor's) off a particular locale can pay to either run ads of their choosing, or to run no ads whatsoever on locale(s) and/or view(s) of their choosing.
  • Server-side software can “paste” small images/links/icons directly onto the views themselves dynamically at delivery time; for example, in the upper right corner of all images, of all images belonging to a particular user, of all images being viewed by a particular user, of all users from Alabama, of all locales in Alabama, etc. By adding these ad images dynamically, embodiments can optionally reposition them to different parts of the views.
  • Users of the invention have a trust level. New users generally have a trust level of zero. As they contribute good, valid data, their trust level goes up. All locale and/or image data has a trust level equal to the trust level of the user that supplied the data. As more trusted users view the data without flagging it as bad, corrupt or otherwise inappropriate, the trust level of the data increases. Users can specify that only data of a certain trust level or higher be displayed. Embodiments can allow businesses or other entities or groups to have restricted access to their locale data and/or images so that only authorized representatives may view, annotate, comment on and/or manipulate the data for one or more locales.
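The trust model described above (data inherits its uploader's trust level, then accrues trust as more-trusted users view it without flagging it) might be sketched like this; the increment and flagging penalty are assumptions, since the patent does not specify them:

```python
# Sketch of the trust model: data starts at its uploader's trust level
# and gains trust as more-trusted users view it without flagging it.
class DataItem:
    def __init__(self, uploader_trust):
        self.trust = uploader_trust

    def viewed_by(self, viewer_trust, flagged=False):
        if flagged:
            self.trust = 0                  # assumed penalty for bad data
        elif viewer_trust > self.trust:
            self.trust += 1                 # assumed unit increment

def visible(items, min_trust):
    """Only show data at or above the user's chosen trust threshold."""
    return [d for d in items if d.trust >= min_trust]
```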
  • Some entities may be willing to pay to have high quality locales created for their locations. Aspects of the invention can include allowing, in a controlled way, locale creators and locations to advertise their availabilities and needs, for example, in view, spot, or locale comment areas, or within views in their own locales.
  • Entities can also pay or bid for services using points rewarded to the entity through a point system. Users are rewarded points for various activities, such as flagging inappropriate content or capturing one or more digital images of a particular location. As users accumulate points, the users can offer the points to other users as incentives to perform some activity, such as capturing a digital image of a particular location or creating a locale. Points can also be used to bid for a section of pixels on our pages, which are displayed to the general public. Users of the site can promote a favorite charity, a blog, a business, or anything to which a user wants to draw attention.
  • In an alternate embodiment, feature comments are included. Feature comments are comments related to a particular feature in a locale which can be visible in many different views. Without this, it can sometimes be difficult to find all the comments related to a particular item of interest within a locale. Essentially, a feature comment group is a sub-grouping of comments that can be associated with a feature independently of any particular spot or view.
  • In yet another embodiment, the view can also be partially rotated before or while moving forward/backward. This allows the new view to be aligned closer to the center of the currently displayed view, which helps the user maintain their visual frame of reference as they move to another spot, especially with any zooming animation.
  • While the preferred embodiment of the invention has been illustrated and described, as noted above, many changes can be made without departing from the spirit and scope of the invention. Accordingly, the scope of the invention is not limited by the disclosure of the preferred embodiment. Instead, the invention should be determined entirely by reference to the claims that follow.

Claims (4)

  1. A method comprising:
    receiving a first digital image, and at least one user-generated datum, from at least one user;
    organizing the first image geographically according to the at least one datum, including associating the first image with at least one location and at least one direction; and
    providing, from a first-person perspective, at least one user-generated digital image in response to a user request.
  2. The method of claim 1, further comprising:
    receiving additional images and data;
    organizing the additional images and the first image spatially according to the data; and
    providing the additional images and data in response to a user request.
  3. The method of claim 2, wherein organizing the additional images includes linking the images such that a user can access the images from a first-person perspective.
  4. A system comprising:
    a computer-readable memory including at least one user-generated digital image and at least one user-generated datum;
    a processor in data communication with the memory and a network; the processor comprising:
    a first component configured to receive a first digital image, and at least one user-generated datum, from at least one user;
    a second component configured to organize the first image geographically according to the at least one datum, including associating the first image with at least one location and at least one direction; and
    a third component configured to provide, from a first-person perspective, at least one user-generated digital image in response to a user request.
US11844203 2006-08-24 2007-08-23 Systems and methods for photograph mapping Abandoned US20080077597A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US84013406 2006-08-24 2006-08-24
US11844203 US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US11844203 US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping
US12438360 US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
PCT/US2007/076718 WO2008024949A3 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13196044 US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14624095 US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15841190 US20180267983A1 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping

Related Child Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2007/076718 Continuation WO2008024949A3 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US43836009 Continuation 2009-02-20 2009-02-20

Publications (1)

Publication Number Publication Date
US20080077597A1 (en) 2008-03-27

Family

ID=39107708

Family Applications (5)

Application Number Title Priority Date Filing Date
US11844203 Abandoned US20080077597A1 (en) 2006-08-24 2007-08-23 Systems and methods for photograph mapping
US12438360 Abandoned US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13196044 Active US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14624095 Active US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15841190 Pending US20180267983A1 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping

Family Applications After (4)

Application Number Title Priority Date Filing Date
US12438360 Abandoned US20100235350A1 (en) 2006-08-24 2007-08-24 Systems and methods for photograph mapping
US13196044 Active US8990239B2 (en) 2006-08-24 2011-08-02 Systems and methods for photograph mapping
US14624095 Active US9881093B2 (en) 2006-08-24 2015-02-17 Systems and methods for photograph mapping
US15841190 Pending US20180267983A1 (en) 2006-08-24 2017-12-13 Systems and methods for photograph mapping

Country Status (2)

Country Link
US (5) US20080077597A1 (en)
WO (1) WO2008024949A3 (en)

Cited By (79)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090240653A1 (en) * 2008-03-21 2009-09-24 Kistler Peter Cornelius Method for extracting attribute data from a media file
WO2010024873A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20100070897A1 (en) * 2008-09-15 2010-03-18 Andrew Aymeloglu Modal-less interface enhancements
US20100214302A1 (en) * 2009-02-24 2010-08-26 Ryan Melcher System and method for supplementing an image gallery with status indicators
US20110007094A1 (en) * 2008-08-28 2011-01-13 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20110055749A1 (en) * 2009-08-26 2011-03-03 Apple Inc. Tracking Device Movement and Captured Images
WO2011163351A2 (en) * 2010-06-22 2011-12-29 Ohio University Immersive video intelligence network
US8487957B1 (en) * 2007-05-29 2013-07-16 Google Inc. Displaying and navigating within photo placemarks in a geographic information system, and applications thereof
US8584013B1 (en) * 2007-03-20 2013-11-12 Google Inc. Temporal layers for presenting personalization markers on imagery
US20140047381A1 (en) * 2012-08-10 2014-02-13 Microsoft Corporation 3d data environment navigation tool
US20140068445A1 (en) * 2012-09-06 2014-03-06 Sap Ag Systems and Methods for Mobile Access to Enterprise Work Area Information
US8713467B1 (en) 2013-08-09 2014-04-29 Palantir Technologies, Inc. Context-sensitive views
US8782564B2 (en) 2008-03-21 2014-07-15 Trimble Navigation Limited Method for collaborative display of geographic data
US8799799B1 (en) 2013-05-07 2014-08-05 Palantir Technologies Inc. Interactive geospatial map
US8812960B1 (en) 2013-10-07 2014-08-19 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US8832594B1 (en) 2013-11-04 2014-09-09 Palantir Technologies Inc. Space-optimized display of multi-column tables with selective text truncation based on a combined text width
US8855999B1 (en) 2013-03-15 2014-10-07 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US20140304312A1 (en) * 2013-04-05 2014-10-09 Dropbox, Inc. Ordering content items
US8868486B2 (en) 2013-03-15 2014-10-21 Palantir Technologies Inc. Time-sensitive cube
US8917274B2 (en) 2013-03-15 2014-12-23 Palantir Technologies Inc. Event matrix based on integrated data
US8924872B1 (en) 2013-10-18 2014-12-30 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US8930897B2 (en) 2013-03-15 2015-01-06 Palantir Technologies Inc. Data integration tool
US8938686B1 (en) 2013-10-03 2015-01-20 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US8937619B2 (en) 2013-03-15 2015-01-20 Palantir Technologies Inc. Generating an object time series from data objects
US9009827B1 (en) 2014-02-20 2015-04-14 Palantir Technologies Inc. Security sharing system
US9009171B1 (en) 2014-05-02 2015-04-14 Palantir Technologies Inc. Systems and methods for active column filtering
US9021384B1 (en) 2013-11-04 2015-04-28 Palantir Technologies Inc. Interactive vehicle information map
US9021260B1 (en) 2014-07-03 2015-04-28 Palantir Technologies Inc. Malware data item analysis
US20150130838A1 (en) * 2013-11-13 2015-05-14 Sony Corporation Display control device, display control method, and program
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
US9202249B1 (en) 2014-07-03 2015-12-01 Palantir Technologies Inc. Data item clustering and analysis
US9223773B2 (en) 2013-08-08 2015-12-29 Palatir Technologies Inc. Template system for custom document generation
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US9552615B2 (en) 2013-12-20 2017-01-24 Palantir Technologies Inc. Automated database analysis to detect malfeasance
US9571972B2 (en) * 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
USD780210S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780211S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780797S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US9740369B2 (en) 2013-03-15 2017-08-22 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9785307B1 (en) * 2012-09-27 2017-10-10 Open Text Corporation Reorder and selection persistence of displayed objects
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US9785328B2 (en) 2014-10-06 2017-10-10 Palantir Technologies Inc. Presentation of multivariate data on a graphical user interface of a computing system
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
US9836580B2 (en) 2014-03-21 2017-12-05 Palantir Technologies Inc. Provider portal
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9886467B2 (en) 2015-03-19 2018-02-06 Plantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US9898167B2 (en) 2013-03-15 2018-02-20 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US9934222B2 (en) 2014-04-22 2018-04-03 Google Llc Providing a thumbnail image that follows a main image
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
USD829737S1 (en) 2016-07-22 2018-10-02 Google Llc Display screen with graphical user interface or portion thereof

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050134945A1 (en) * 2003-12-17 2005-06-23 Canon Information Systems Research Australia Pty. Ltd. 3D view for digital photograph management
CA2694200C (en) * 2007-07-27 2015-06-16 Intertrust Technologies Corporation Content publishing systems and methods
US8099237B2 (en) 2008-07-25 2012-01-17 Navteq North America, Llc Open area maps
US20100021013A1 (en) * 2008-07-25 2010-01-28 Gale William N Open area maps with guidance
US8825387B2 (en) * 2008-07-25 2014-09-02 Navteq B.V. Positioning open area maps
EP2488964A4 (en) * 2009-10-15 2017-11-29 Bosch Automotive Products (Suzhou) Co., Ltd. Navigation system and method with improved destination searching
US8952983B2 (en) 2010-11-04 2015-02-10 Nokia Corporation Method and apparatus for annotating point of interest information
US9639857B2 (en) * 2011-09-30 2017-05-02 Nokia Technologies Oy Method and apparatus for associating commenting information with one or more objects
US20150153933A1 (en) * 2012-03-16 2015-06-04 Google Inc. Navigating Discrete Photos and Panoramas
US9313344B2 (en) 2012-06-01 2016-04-12 Blackberry Limited Methods and apparatus for use in mapping identified visual features of visual images to location areas
US9292264B2 (en) 2013-03-15 2016-03-22 Paschar Llc Mobile device user interface advertising software development kit
US9827714B1 (en) 2014-05-16 2017-11-28 Google Llc Method and system for 3-D printing of 3-D object models in interactive content items

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020029226A1 (en) * 2000-09-05 2002-03-07 Gang Li Method for combining data with maps
US20050073443A1 (en) * 2003-02-14 2005-04-07 Networks In Motion, Inc. Method and system for saving and retrieving spatial related information
US20050278371A1 (en) * 2004-06-15 2005-12-15 Karsten Funk Method and system for georeferential blogging, bookmarking a location, and advanced off-board data processing for mobile systems
US20060195475A1 (en) * 2005-02-28 2006-08-31 Microsoft Corporation Automatic digital image grouping using criteria based on image metadata and spatial information
US20070070233A1 (en) * 2005-09-28 2007-03-29 Patterson Raul D System and method for correlating captured images with their site locations on maps

Family Cites Families (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6956573B1 (en) * 1996-11-15 2005-10-18 Sarnoff Corporation Method and apparatus for efficiently representing storing and accessing video information
US6266684B1 (en) * 1997-08-06 2001-07-24 Adobe Systems Incorporated Creating and saving multi-frame web pages
US6552744B2 (en) * 1997-09-26 2003-04-22 Roxio, Inc. Virtual reality camera
US6106460A (en) * 1998-03-26 2000-08-22 Scimed Life Systems, Inc. Interface for controlling the display of images of diagnostic or therapeutic instruments in interior body regions and related data
US6246412B1 (en) * 1998-06-18 2001-06-12 Microsoft Corporation Interactive construction and refinement of 3D models from multiple panoramic images
JP3646582B2 (en) * 1998-09-28 2005-05-11 富士通株式会社 Electronic information display method, an electronic information browsing device and the electronic information browsing program storage medium
US6895126B2 (en) * 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
US6904160B2 (en) * 2000-10-18 2005-06-07 Red Hen Systems, Inc. Method for matching geographic information with recorded images
US7013289B2 (en) * 2001-02-21 2006-03-14 Michel Horn Global electronic commerce system
US7909696B2 (en) * 2001-08-09 2011-03-22 Igt Game interaction in 3-D gaming environments
EP1304626A1 (en) * 2001-10-18 2003-04-23 Sun Microsystems Inc. Managing modified documents
US7187377B1 (en) * 2002-06-28 2007-03-06 Microsoft Corporation Three-dimensional virtual tour method and system
US7327349B2 (en) * 2004-03-02 2008-02-05 Microsoft Corporation Advanced navigation techniques for portable devices
US7317449B2 (en) * 2004-03-02 2008-01-08 Microsoft Corporation Key-based advanced navigation techniques
US7548936B2 (en) * 2005-01-12 2009-06-16 Microsoft Corporation Systems and methods to present web image search results for effective image browsing
US20060230051A1 (en) * 2005-04-08 2006-10-12 Muds Springs Geographers Inc. Method to share and exchange geographic based information
US7652594B2 (en) * 2005-04-08 2010-01-26 Trigger California, Inc. Architecture for creating, organizing, editing, management and delivery of locationally-specific information to a user in the field
US20090004410A1 (en) * 2005-05-12 2009-01-01 Thomson Stephen C Spatial graphical user interface and method for using the same
WO2007008929A3 (en) * 2005-07-13 2009-05-07 Grape Technology Group Inc System and method for providing mobile device services using sms communications
WO2007006075A1 (en) * 2005-07-14 2007-01-18 Canon Information Systems Research Australia Pty Ltd Image browser
US8510669B2 (en) * 2006-02-06 2013-08-13 Yahoo! Inc. Method and system for presenting photos on a website
KR100641791B1 (en) * 2006-02-14 2006-11-02 (주)올라웍스 Tagging Method and System for Digital Data
US7797019B2 (en) * 2006-03-29 2010-09-14 Research In Motion Limited Shared image database with geographic navigation


Cited By (136)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8584013B1 (en) * 2007-03-20 2013-11-12 Google Inc. Temporal layers for presenting personalization markers on imagery
US9280258B1 (en) 2007-05-29 2016-03-08 Google Inc. Displaying and navigating within photo placemarks in a geographic information system and applications thereof
US8487957B1 (en) * 2007-05-29 2013-07-16 Google Inc. Displaying and navigating within photo placemarks in a geographic information system, and applications thereof
US8782564B2 (en) 2008-03-21 2014-07-15 Trimble Navigation Limited Method for collaborative display of geographic data
US8898179B2 (en) * 2008-03-21 2014-11-25 Trimble Navigation Limited Method for extracting attribute data from a media file
US20090240653A1 (en) * 2008-03-21 2009-09-24 Kistler Peter Cornelius Method for extracting attribute data from a media file
US8872847B2 (en) * 2008-08-28 2014-10-28 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8077918B2 (en) 2008-08-28 2011-12-13 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20110007094A1 (en) * 2008-08-28 2011-01-13 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8295550B2 (en) 2008-08-28 2012-10-23 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8737683B2 (en) 2008-08-28 2014-05-27 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US8520977B2 (en) 2008-08-28 2013-08-27 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US9542723B2 (en) 2008-08-28 2017-01-10 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US20100054527A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architecture and methods for creating and representing time-dependent imagery
WO2010024873A1 (en) * 2008-08-28 2010-03-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US9916070B1 (en) 2008-08-28 2018-03-13 Google Llc Architectures and methods for creating and representing time-dependent imagery
US9099057B2 (en) 2008-08-28 2015-08-04 Google Inc. Architectures and methods for creating and representing time-dependent imagery
US9383911B2 (en) * 2008-09-15 2016-07-05 Palantir Technologies, Inc. Modal-less interface enhancements
US20100070897A1 (en) * 2008-09-15 2010-03-18 Andrew Aymeloglu Modal-less interface enhancements
US9406042B2 (en) * 2009-02-24 2016-08-02 Ebay Inc. System and method for supplementing an image gallery with status indicators
US20100214302A1 (en) * 2009-02-24 2010-08-26 Ryan Melcher System and method for supplementing an image gallery with status indicators
US8839131B2 (en) * 2009-08-26 2014-09-16 Apple Inc. Tracking device movement and captured images
US20110055749A1 (en) * 2009-08-26 2011-03-03 Apple Inc. Tracking Device Movement and Captured Images
WO2011163351A2 (en) * 2010-06-22 2011-12-29 Ohio University Immersive video intelligence network
WO2011163351A3 (en) * 2010-06-22 2014-04-10 Ohio University Immersive video intelligence network
US9880987B2 (en) 2011-08-25 2018-01-30 Palantir Technologies, Inc. System and method for parameterizing documents for automatic workflow generation
US9881396B2 (en) 2012-08-10 2018-01-30 Microsoft Technology Licensing, Llc Displaying temporal information in a spreadsheet application
US20140047381A1 (en) * 2012-08-10 2014-02-13 Microsoft Corporation 3d data environment navigation tool
US10008015B2 (en) 2012-08-10 2018-06-26 Microsoft Technology Licensing, Llc Generating scenes and tours in a spreadsheet application
US9996953B2 (en) 2012-08-10 2018-06-12 Microsoft Technology Licensing, Llc Three-dimensional annotation facing
US9317963B2 (en) 2012-08-10 2016-04-19 Microsoft Technology Licensing, Llc Generating scenes and tours in a spreadsheet application
US20140068445A1 (en) * 2012-09-06 2014-03-06 Sap Ag Systems and Methods for Mobile Access to Enterprise Work Area Information
US9785307B1 (en) * 2012-09-27 2017-10-10 Open Text Corporation Reorder and selection persistence of displayed objects
US9898335B1 (en) 2012-10-22 2018-02-20 Palantir Technologies Inc. System and method for batch evaluation programs
US9380431B1 (en) 2013-01-31 2016-06-28 Palantir Technologies, Inc. Use of teams in a mobile application
US9123086B1 (en) 2013-01-31 2015-09-01 Palantir Technologies, Inc. Automatically generating event objects from images
US10037314B2 (en) 2013-03-14 2018-07-31 Palantir Technologies, Inc. Mobile reports
US8930897B2 (en) 2013-03-15 2015-01-06 Palantir Technologies Inc. Data integration tool
US8917274B2 (en) 2013-03-15 2014-12-23 Palantir Technologies Inc. Event matrix based on integrated data
US8868486B2 (en) 2013-03-15 2014-10-21 Palantir Technologies Inc. Time-sensitive cube
US8855999B1 (en) 2013-03-15 2014-10-07 Palantir Technologies Inc. Method and system for generating a parser and parsing complex data
US9965937B2 (en) 2013-03-15 2018-05-08 Palantir Technologies Inc. External malware data item clustering and analysis
US9646396B2 (en) 2013-03-15 2017-05-09 Palantir Technologies Inc. Generating object time series and data objects
US9852195B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. System and method for generating event visualizations
US9740369B2 (en) 2013-03-15 2017-08-22 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US9779525B2 (en) 2013-03-15 2017-10-03 Palantir Technologies Inc. Generating object time series from data objects
US8937619B2 (en) 2013-03-15 2015-01-20 Palantir Technologies Inc. Generating an object time series from data objects
US9852205B2 (en) 2013-03-15 2017-12-26 Palantir Technologies Inc. Time-sensitive cube
US9898167B2 (en) 2013-03-15 2018-02-20 Palantir Technologies Inc. Systems and methods for providing a tagging interface for external content
US20140304312A1 (en) * 2013-04-05 2014-10-09 Dropbox, Inc. Ordering content items
US9152646B2 (en) * 2013-04-05 2015-10-06 Dropbox, Inc. Ordering content items
US9953445B2 (en) 2013-05-07 2018-04-24 Palantir Technologies Inc. Interactive data object map
US8799799B1 (en) 2013-05-07 2014-08-05 Palantir Technologies Inc. Interactive geospatial map
US9335897B2 (en) 2013-08-08 2016-05-10 Palantir Technologies Inc. Long click display of a context menu
US9223773B2 (en) 2013-08-08 2015-12-29 Palantir Technologies Inc. Template system for custom document generation
US8713467B1 (en) 2013-08-09 2014-04-29 Palantir Technologies, Inc. Context-sensitive views
US9921734B2 (en) 2013-08-09 2018-03-20 Palantir Technologies Inc. Context-sensitive views
US9557882B2 (en) 2013-08-09 2017-01-31 Palantir Technologies Inc. Context-sensitive views
US9785317B2 (en) 2013-09-24 2017-10-10 Palantir Technologies Inc. Presentation and analysis of user interaction data
US8938686B1 (en) 2013-10-03 2015-01-20 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US9996229B2 (en) 2013-10-03 2018-06-12 Palantir Technologies Inc. Systems and methods for analyzing performance of an entity
US8812960B1 (en) 2013-10-07 2014-08-19 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US9864493B2 (en) 2013-10-07 2018-01-09 Palantir Technologies Inc. Cohort-based presentation of user interaction data
US10042524B2 (en) 2013-10-18 2018-08-07 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US8924872B1 (en) 2013-10-18 2014-12-30 Palantir Technologies Inc. Overview user interface of emergency call data of a law enforcement agency
US9514200B2 (en) 2013-10-18 2016-12-06 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US9116975B2 (en) 2013-10-18 2015-08-25 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive simultaneous querying of multiple data stores
US8832594B1 (en) 2013-11-04 2014-09-09 Palantir Technologies Inc. Space-optimized display of multi-column tables with selective text truncation based on a combined text width
US9021384B1 (en) 2013-11-04 2015-04-28 Palantir Technologies Inc. Interactive vehicle information map
US10037383B2 (en) 2013-11-11 2018-07-31 Palantir Technologies, Inc. Simple web search
US20150130838A1 (en) * 2013-11-13 2015-05-14 Sony Corporation Display control device, display control method, and program
US9734217B2 (en) 2013-12-16 2017-08-15 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US9727622B2 (en) 2013-12-16 2017-08-08 Palantir Technologies, Inc. Methods and systems for analyzing entity performance
US10025834B2 (en) 2013-12-16 2018-07-17 Palantir Technologies Inc. Methods and systems for analyzing entity performance
US9552615B2 (en) 2013-12-20 2017-01-24 Palantir Technologies Inc. Automated database analysis to detect malfeasance
US9043696B1 (en) 2014-01-03 2015-05-26 Palantir Technologies Inc. Systems and methods for visual definition of data associations
US9483162B2 (en) 2014-02-20 2016-11-01 Palantir Technologies Inc. Relationship visualizations
US9009827B1 (en) 2014-02-20 2015-04-14 Palantir Technologies Inc. Security sharing system
US9923925B2 (en) 2014-02-20 2018-03-20 Palantir Technologies Inc. Cyber security sharing and identification system
US9836580B2 (en) 2014-03-21 2017-12-05 Palantir Technologies Inc. Provider portal
USD781317S1 (en) * 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
USD780795S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780796S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD791813S1 (en) 2014-04-22 2017-07-11 Google Inc. Display screen with graphical user interface or portion thereof
USD791811S1 (en) 2014-04-22 2017-07-11 Google Inc. Display screen with graphical user interface or portion thereof
USD781337S1 (en) 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
USD792460S1 (en) 2014-04-22 2017-07-18 Google Inc. Display screen with graphical user interface or portion thereof
USD780211S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
USD780210S1 (en) 2014-04-22 2017-02-28 Google Inc. Display screen with graphical user interface or portion thereof
US9934222B2 (en) 2014-04-22 2018-04-03 Google Llc Providing a thumbnail image that follows a main image
USD781318S1 (en) * 2014-04-22 2017-03-14 Google Inc. Display screen with graphical user interface or portion thereof
US20150302633A1 (en) * 2014-04-22 2015-10-22 Google Inc. Selecting time-distributed panoramic images for display
USD780794S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780777S1 (en) * 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
USD780797S1 (en) 2014-04-22 2017-03-07 Google Inc. Display screen with graphical user interface or portion thereof
US9972121B2 (en) * 2014-04-22 2018-05-15 Google Llc Selecting time-distributed panoramic images for display
US9857958B2 (en) 2014-04-28 2018-01-02 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive access of, investigation of, and analysis of data objects stored in one or more databases
US9009171B1 (en) 2014-05-02 2015-04-14 Palantir Technologies Inc. Systems and methods for active column filtering
US9449035B2 (en) 2014-05-02 2016-09-20 Palantir Technologies Inc. Systems and methods for active column filtering
US9619557B2 (en) 2014-06-30 2017-04-11 Palantir Technologies, Inc. Systems and methods for key phrase characterization of documents
US9021260B1 (en) 2014-07-03 2015-04-28 Palantir Technologies Inc. Malware data item analysis
US9785773B2 (en) 2014-07-03 2017-10-10 Palantir Technologies Inc. Malware data item analysis
US9998485B2 (en) 2014-07-03 2018-06-12 Palantir Technologies, Inc. Network intrusion data item clustering and analysis
US9298678B2 (en) 2014-07-03 2016-03-29 Palantir Technologies Inc. System and method for news events detection and visualization
US9256664B2 (en) 2014-07-03 2016-02-09 Palantir Technologies Inc. System and method for news events detection and visualization
US9344447B2 (en) 2014-07-03 2016-05-17 Palantir Technologies Inc. Internal malware data item clustering and analysis
US9202249B1 (en) 2014-07-03 2015-12-01 Palantir Technologies Inc. Data item clustering and analysis
US9880696B2 (en) 2014-09-03 2018-01-30 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9454281B2 (en) 2014-09-03 2016-09-27 Palantir Technologies Inc. System for providing dynamic linked panels in user interface
US9501851B2 (en) 2014-10-03 2016-11-22 Palantir Technologies Inc. Time-series analysis system
US9767172B2 (en) 2014-10-03 2017-09-19 Palantir Technologies Inc. Data aggregation and analysis system
US9785328B2 (en) 2014-10-06 2017-10-10 Palantir Technologies Inc. Presentation of multivariate data on a graphical user interface of a computing system
US9984133B2 (en) 2014-10-16 2018-05-29 Palantir Technologies Inc. Schematic and database linking system
US9946738B2 (en) 2014-11-05 2018-04-17 Palantir Technologies, Inc. Universal data pipeline
US9043894B1 (en) 2014-11-06 2015-05-26 Palantir Technologies Inc. Malicious software detection in a computing system
US9558352B1 (en) 2014-11-06 2017-01-31 Palantir Technologies Inc. Malicious software detection in a computing system
US9589299B2 (en) 2014-12-22 2017-03-07 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9367872B1 (en) 2014-12-22 2016-06-14 Palantir Technologies Inc. Systems and user interfaces for dynamic and interactive investigation of bad actor behavior based on automatic clustering of related data in various data structures
US9898528B2 (en) 2014-12-22 2018-02-20 Palantir Technologies Inc. Concept indexing among database of documents using machine learning techniques
US9870389B2 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9870205B1 (en) 2014-12-29 2018-01-16 Palantir Technologies Inc. Storing logical units of program code generated using a dynamic programming notebook user interface
US9817563B1 (en) 2014-12-29 2017-11-14 Palantir Technologies Inc. System and method of generating data points from one or more data stores of data items for chart creation and manipulation
US9335911B1 (en) 2014-12-29 2016-05-10 Palantir Technologies Inc. Interactive user interface for dynamic data analysis exploration and query processing
US9727560B2 (en) 2015-02-25 2017-08-08 Palantir Technologies Inc. Systems and methods for organizing and identifying documents via hierarchies and dimensions of tags
US9891808B2 (en) 2015-03-16 2018-02-13 Palantir Technologies Inc. Interactive user interfaces for location-based data analysis
US9886467B2 (en) 2015-03-19 2018-02-06 Palantir Technologies Inc. System and method for comparing and visualizing data entities and data entity series
US9571971B2 (en) 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
US9571972B2 (en) * 2015-04-24 2017-02-14 International Business Machines Corporation Managing crowd sourced data acquisition
US9454785B1 (en) 2015-07-30 2016-09-27 Palantir Technologies Inc. Systems and user interfaces for holistic, data-driven investigation of bad actor behavior based on clustering and scoring of related data
US9996595B2 (en) 2015-08-03 2018-06-12 Palantir Technologies, Inc. Providing full data provenance visualization for versioned datasets
US9898509B2 (en) 2015-08-28 2018-02-20 Palantir Technologies Inc. Malicious activity detection system capable of efficiently processing data accessed from databases and generating alerts for display in interactive user interfaces
US9965534B2 (en) 2015-09-09 2018-05-08 Palantir Technologies, Inc. Domain-specific language for dataset transformations
US9823818B1 (en) 2015-12-29 2017-11-21 Palantir Technologies Inc. Systems and interactive user interfaces for automatic generation of temporal representation of data objects
USD829737S1 (en) 2016-07-22 2018-10-02 Google Llc Display screen with graphical user interface or portion thereof
USD830407S1 (en) 2017-05-30 2018-10-09 Google Llc Display screen with graphical user interface or portion thereof
USD830399S1 (en) 2017-05-30 2018-10-09 Google Llc Display screen with graphical user interface or portion thereof

Also Published As

Publication number Publication date Type
WO2008024949A3 (en) 2008-08-14 application
US20100235350A1 (en) 2010-09-16 application
WO2008024949A2 (en) 2008-02-28 application
US20180267983A1 (en) 2018-09-20 application
US8990239B2 (en) 2015-03-24 grant
US20150302018A1 (en) 2015-10-22 application
US9881093B2 (en) 2018-01-30 grant
US20120243804A1 (en) 2012-09-27 application

Similar Documents

Publication Publication Date Title
US6577714B1 (en) Map-based directory system
US7827507B2 (en) System to navigate within images spatially referenced to a computed space
US7564377B2 (en) Real-time virtual earth driving information
US7130742B2 (en) Electronic guide system, contents server for electronic guide system, portable electronic guide device, and information processing method for electronic guide system
US8051089B2 (en) Systems and methods for location-based real estate service
US20040225635A1 (en) Browsing user interface for a geo-coded media database
US20070233367A1 (en) Methods for Interaction, Sharing, and Exploration over Geographical Locations
US20120105475A1 (en) Range of Focus in an Augmented Reality Application
US20100250581A1 (en) System and method of displaying images based on environmental conditions
US20070011271A1 (en) Multi-source data retrieval system
US20110313649A1 (en) Method and apparatus for providing smart zooming of a geographic representation
US20100325563A1 (en) Augmenting a field of view
US20060156228A1 (en) Spatially driven content presentation in a cellular environment
US7746376B2 (en) Method and apparatus for accessing multi-dimensional mapping and information
US7142196B1 (en) Geographical data markup on a personal digital assistant (PDA)
US20100094548A1 (en) Methods and systems of advanced real estate searching
US20100122208A1 (en) Panoramic Mapping Display
US20130321461A1 (en) Method and System for Navigation to Interior View Imagery from Street Level Imagery
US20090055776A1 (en) Position based multi-dimensional locating system and method
US20150338233A1 (en) Geotagging Structured Data
Haklay et al. Web mapping 2.0: The neogeography of the GeoWeb
US20080033641A1 (en) Method of generating a three-dimensional interactive tour of a geographic location
US20120078503A1 (en) System and method for the collaborative collection, assignment, visualization, analysis, and modification of probable genealogical relationships based on geo-spatial and temporal proximity
US20110050732A1 (en) Method and apparatus for customizing map presentations based on user interests
US20080133579A1 (en) Map service system and method