US20150070356A1 - Techniques to manage map information illustrating a transition between views - Google Patents

Techniques to manage map information illustrating a transition between views

Info

Publication number
US20150070356A1
Authority
US
United States
Prior art keywords
map
view
dimensional
transition
sphere
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/251,813
Inventor
Alexandre da Veiga
Ehab Sobhy
Michael Kallay
Ian Wood
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Microsoft Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Microsoft Corp filed Critical Microsoft Corp
Priority to US 14/251,813 (published as US20150070356A1)
Assigned to MICROSOFT CORPORATION (assignment of assignors interest; see document for details). Assignors: DA VEIGA, Alexandre; WOOD, IAN; KALLAY, MICHAEL
Priority to CN201480058945.7A (CN105917384A)
Priority to PCT/US2014/054679 (WO2015038506A1)
Priority to EP14767264.6A (EP3044763A1)
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC (assignment of assignors interest; see document for details). Assignors: MICROSOFT CORPORATION
Publication of US20150070356A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/08 Volume rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/904 Browsing; Visualisation therefor
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/001 Texturing; Colouring; Generation of texture or colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T13/00 Animation
    • G06T13/20 3D [Three Dimensional] animation
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37 Details of the operation on graphic patterns
    • G09G5/377 Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00 Aspects of display data processing
    • G09G2340/12 Overlay of images, i.e. displayed pixel being the result of switching between the corresponding input pixels

Definitions

  • Digital maps are becoming a universal platform for conveying map information representing locations of people, places, objects and events. As more map information is presented on a digital map, it becomes necessary to ensure the map information is presented to a user in a meaningful way. Further, digital maps are becoming more interactive to allow a user to manipulate a digital map to view particular map items of interest. In addition, the sheer volume of map information consumes significant computing and communications resources. As a result, enhanced techniques are needed to manage and manipulate a digital map to efficiently convey map information.
  • Embodiments are generally directed to enhanced techniques to manage digital maps. Some embodiments are particularly directed to enhanced techniques to manage map information for a digital map in an efficient and effective manner to facilitate consumption by a user.
  • a map application may comprise a map transition component arranged to provide map views that can be smoothly transitioned between two dimensional (2D) and three dimensional (3D) views.
  • a digital map may be switched between a 3D view of a globe and a 2D view of a flat map utilizing a series of intermediate views to simulate a smooth animated transition between views.
  • the map transition component may be operative on the logic circuit to render a set of views of a digital map for presentation on a display element in which a three dimensional view comprises data representing a spatially correct surface and the intermediate views correspond to a transition between views, each intermediate view to comprise mapping data between the spatially correct surface and a substantially spherical coordinate system.
  • a map application may comprise a map color component configured to cooperate with the map transition component during map view transitioning.
  • An example map view displaying a range of colors that represent multiple categories for a given location may transition into a flatter map view (e.g., a 2D view) with a coloring assignment that reflects the flattening of map data.
  • a digital map may provide regions with blended colors, with each color representing a different category assigned to a particular region. The blending effect provides a high level of granularity for conveying nuanced information about a region.
  • FIG. 1 illustrates an embodiment of an apparatus to manage digital maps.
  • FIGS. 2A-D illustrate embodiments of a transition between 2D/3D maps.
  • FIG. 3 illustrates an embodiment of a region map.
  • FIG. 4 illustrates an embodiment of a centralized system for the apparatus.
  • FIG. 5 illustrates an embodiment of a distributed system for the apparatus.
  • FIG. 6 illustrates an embodiment of a first logic flow for the system of FIG. 1 .
  • FIG. 7 illustrates an embodiment of a second logic flow for the system of FIG. 1 .
  • FIG. 8 illustrates an embodiment of a third logic flow for the system of FIG. 1 .
  • FIG. 9 illustrates an embodiment of a fourth logic flow for the system of FIG. 1 .
  • FIG. 10 illustrates an embodiment of a computing architecture.
  • FIG. 11 illustrates an embodiment of a communications architecture.
  • Embodiments are generally directed to enhanced techniques to manage digital maps. Some embodiments are particularly directed to enhanced techniques to manage map information for a digital map in an efficient and effective manner to facilitate consumption by a user.
  • the map information may include intermediate views illustrating a transition between a three dimensional view of the digital map and a two dimensional view of the same map. This transition may be animated in order to enable a user to clearly perceive a spatially correct surface and an entire surface in an efficient manner.
  • the embodiments can improve affordability, scalability, modularity, extendibility, or interoperability for an operator, device or network.
  • Other advantages and use scenarios apply as well.
  • a procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
  • the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein which form part of one or more embodiments. Rather, the operations are machine operations. Useful machines for performing operations of various embodiments include general purpose digital computers or similar devices.
  • This apparatus may be specially constructed for the required purpose or it may comprise a general purpose computer as selectively activated or reconfigured by a computer program stored in the computer.
  • The procedures presented herein are not inherently related to a particular computer or other apparatus.
  • Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.
  • FIG. 1 illustrates a block diagram for an apparatus 100 .
  • the apparatus 100 may comprise a computer-implemented apparatus 100 having a software map application 120 comprising one or more components 122 - a .
  • Although the apparatus 100 shown in FIG. 1 has a limited number of elements in a certain topology, it may be appreciated that the apparatus 100 may include more or fewer elements in alternate topologies as desired for a given implementation.
  • the apparatus 100 may comprise the map application 120 .
  • the map application 120 may be generally arranged to manage a digital map 124 .
  • a map is a visual representation of an area.
  • the digital map 124 may comprise a digital or electronic form of a map.
  • the digital map 124 may be used to depict geography. Map information for the digital map 124 may be part of, or originate from, a geographic information system (GIS).
  • a GIS is the merging of cartography, statistical analysis, and computer science technology.
  • a GIS is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographical data. Other geoinformatic systems and/or data may be used as well.
  • Although maps are often described in terms of visualizing physical geographical locations, the digital map 124 may also be used to represent any space, real or imagined, such as brain mapping, DNA mapping and extraterrestrial mapping. Embodiments are not limited in this context.
  • the map information may be provided by a same electronic device implementing the apparatus 100 . In one embodiment, the map information may be provided by a different electronic device (e.g., a server) from the one implementing the apparatus 100 (e.g., a client).
  • the map application 120 may comprise any software application capable of creating, modifying, managing or otherwise using map information for the digital map 124 .
  • the map application 120 may comprise or be implemented as a stand-alone productivity application, or an add-in for a productivity application.
  • a productivity application may comprise a software application program designed to perform a specific set of functions for a knowledge worker.
  • a productivity application typically operates to create, modify, send, receive, or otherwise manage content for one or more documents. Examples for productivity applications may include without limitation a productivity suite of inter-related client applications, server applications and/or web services, designed for a particular operating system, such as a MICROSOFT® OFFICE productivity suite for MICROSOFT WINDOWS®, made by Microsoft Corporation, Redmond, Wash.
  • productivity applications may include without limitation MICROSOFT WORD, MICROSOFT EXCEL®, MICROSOFT POWERPOINT®, MICROSOFT OUTLOOK®, MICROSOFT ACCESS®, MICROSOFT INFOPATH®, MICROSOFT ONENOTE®, MICROSOFT PROJECT, MICROSOFT PUBLISHER, MICROSOFT SHAREPOINT® WORKSPACE, MICROSOFT VISIO®, MICROSOFT OFFICE INTERCONNECT, MICROSOFT OFFICE PICTURE MANAGER, MICROSOFT SHAREPOINT DESIGNER, and MICROSOFT LYNC.
  • server applications may include without limitation MICROSOFT SHAREPOINT SERVER, MICROSOFT LYNC SERVER, MICROSOFT OFFICE FORMS SERVER, MICROSOFT OFFICE GROOVE® SERVER, MICROSOFT OFFICE PROJECT SERVER, MICROSOFT OFFICE PROJECT PORTFOLIO SERVER, and MICROSOFT OFFICE PERFORMANCEPOINT® SERVER. It also is to be appreciated that embodiments may implement other types of applications in addition to productivity applications which are consistent with the described embodiments. The embodiments are not limited to these examples.
  • the map application 120 may be capable of communicating with a network device, such as a server providing network services, such as a web service.
  • Examples for web services may include without limitation MICROSOFT WINDOWS LIVE®, MICROSOFT OFFICE WEB APPLICATIONS, MICROSOFT OFFICE LIVE, MICROSOFT LIVE MEETING, MICROSOFT OFFICE PRODUCT WEB SITE, MICROSOFT UPDATE SERVER, and MICROSOFT OFFICE 365.
  • a map application 120 may comprise a map transition component 122 - 1 arranged to provide map views that can be smoothly transitioned between two dimensional (2D) and three dimensional (3D) views. For instance, a digital map may be switched between a three dimensional view of a sphere or globe and a two dimensional view of a plane or flat map utilizing a sequence of intermediate views 130 to simulate a smooth animated transition between views.
  • distortions may occur in the ratio between east-west distance (stretched by projection) and north-south distance. This may potentially be corrected by stretching the north-south distance to match the east-west stretch, but that may cause magnification of distances in all directions.
  • the map transition component 122 - 1 may compensate for these and other problems.
  • the map transition component 122 - 1 may enable an end user to see the data on both the three dimensional and the two dimensional view. Further, the map transition component 122 - 1 may enable a smooth animated transition between the two views using a progression of intermediate views 130 . Further, both views will maintain proper shapes, angles, and/or area of map items during or after transition. In addition, spatial distortions, such as distances between map items, may be reduced or effectively eliminated.
  • Other features and advantages of the map transition component 122 - 1 are described below with reference to FIG. 2 .
  • the map application 120 may also comprise a map color component 122 - 2 .
  • the map color component 122 - 2 may be arranged to provide map views with a range of colors that represent multiple categories for a given location.
  • the digital map 124 may provide regions with blended colors, with each color representing a different category assigned to a particular region.
  • the blending effect provides a high level of granularity for conveying nuanced information about a particular region.
  • the map color component 122 - 2 provides the capability of switching between shading options to represent multiple categories for a given map.
  • the map color component 122 - 2 also provides new shading options.
  • the map color component 122 - 2 further provides smart defaults for the various shading options for user convenience. Other features and advantages of the map color component 122 - 2 are described below with reference to FIG. 3 .
  • the map application 120 may further comprise or implement a map scheduler component 122 - 3 .
  • the map scheduler component 122 - 3 provides the capability of presenting a considerable number of time-bound categorized data points on a three dimensional view of a digital map 124 .
  • the map scheduler component 122 - 3 may assign work units to multiple processors and/or processor cores to efficiently render the digital map 124 .
  • the digital map 124 may present varying types of information in a same location at different times. In this case, a relative position of a data point needs to be determined for all data points sharing the same time and location. Work may be scheduled to a given processor among the set of processors to decrease an amount of time needed to calculate this relative position.
  • Other features and advantages of the map scheduler component 122 - 3 are described below with reference to FIG. 3 .
  • FIGS. 2A-D illustrate a transition 200 with various user interface views of the digital map 124 .
  • the map transition component 122 - 1 has at least two modes for rendering the digital map 124 . The first is a three dimensional view mode. The second is a plane view mode.
  • the map transition component 122 - 1 may animate transition between the two view modes with one or more intermediate views to provide a smooth folding/unfolding visual effect.
  • FIG. 2A depicts a three dimensional view 202 .
  • Since the digital map 124 is rendered as a globe or sphere, the three dimensional view may be referred to as a globe view or a spherical map view.
  • the map application 120 may present the three dimensional view 202 during the three dimensional view mode.
  • FIG. 2D illustrates a plane view 208 .
  • the map application 120 may present the plane view 208 during the plane view mode.
  • FIGS. 2B and 2C illustrate a pair of intermediate views 204, 206, respectively, that may be presented during transition between the three dimensional view 202 and the plane view 208.
  • the intermediate view 204 illustrates an initial unfurling of the sphere at time t1.
  • the intermediate view 206 illustrates a further unfurling of the sphere at time t2.
  • any number of intermediate views may be used to provide a given level of fidelity for a particular transition, such as 24 views per second, 30 views per second, and so forth.
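  • The frame pacing just described can be sketched as a generator of interpolation parameters, one per intermediate view. This is an illustrative sketch, not code from the patent; the function name, easing curve, and defaults are assumptions.

```python
def transition_frames(duration_s=1.0, fps=30):
    """Yield an interpolation parameter t for each intermediate view of a
    globe-to-flat-map transition: t=1.0 is the sphere, t=0.0 the plane.
    fps sets the fidelity (e.g., 24 or 30 views per second)."""
    n = max(1, round(duration_s * fps))
    for i in range(n + 1):
        u = i / n
        # smoothstep easing so the fold/unfold starts and ends gently
        yield 1.0 - u * u * (3.0 - 2.0 * u)
```

  • A renderer would draw one intermediate view per yielded t, so a one-second transition at 30 views per second produces 31 frames from globe to plane.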
  • the map transition component 122 - 1 may implement animations of transitions between 2D/3D view modes by utilizing a smooth family of intermediate surfaces between round globe and flat map, along with a well-behaved cartographic projection to each such surface.
  • S(t) is a portion of a sphere of radius 1/t, tangent to the flat map plane at its center.
  • S(1) is the unit sphere (e.g., round globe).
  • S(0) may be undefined, but the limit of S(t) as t approaches zero is the flat map plane, since the radius 1/t grows without bound.
  • the map transition component 122 - 1 renders the plane view 208 using a Mercator projection.
  • a Mercator projection has the following properties: (1) it preserves shapes and angles (e.g., it is mathematically conformal); (2) it maps parallels (e.g., lines of constant latitude) and meridians to horizontal and vertical lines on the plane, respectively; and (3) it uses a scale that is constant along any given parallel.
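  • As a sketch of the standard projection (assuming latitude and longitude in radians; not code from the patent), the Mercator mapping and its inverse can be written as:

```python
import math

def mercator(lat, lon):
    """Forward Mercator: meridians become vertical lines, parallels become
    horizontal lines, and scale is constant along any given parallel."""
    return lon, math.log(math.tan(math.pi / 4.0 + lat / 2.0))

def mercator_inverse(x, y):
    """Inverse Mercator, using the Gudermannian identity gd(y) = atan(sinh(y))."""
    return math.atan(math.sinh(y)), x
```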
  • A mapping M(t) from the unit sphere to the intermediate sphere S(t) may be introduced in such a way that parallels and meridians on S(t) correspond to parallels and meridians of the unit sphere, and the point (0,0) is fixed for all t.
  • The symbol m⁻¹ represents the inverse of the Mercator projection.
  • The mapping M(t) has the following properties: (1) it is conformal; (2) it maps parallels and meridians on the globe to parallels and meridians on S(t); and (3) it uses a scale that is constant along any meridian (e.g., the scale is 1 along the equator).
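  • One concrete way to realize such a composed mapping is sketched below: a globe point is sent through the Mercator plane and then inverse-projected onto the sphere S(t) of radius 1/t tangent to the plane at (0,0). This is an illustrative reconstruction under the stated definitions, not the patent's implementation; all names are assumptions.

```python
import math

def intermediate_position(lat, lon, t):
    """Map a unit-globe point (radians) onto the intermediate surface S(t).
    t=1 yields the unit globe; as t -> 0 the result approaches the flat
    Mercator map in the plane z=0, which is tangent to every S(t) at the origin."""
    # Mercator coordinates of the point (scale 1 along the equator).
    x, y = lon, math.log(math.tan(math.pi / 4.0 + lat / 2.0))
    if t <= 0.0:
        return (x, y, 0.0)  # flat-map limit
    r = 1.0 / t
    # Inverse Mercator on a sphere of radius r keeps the composed mapping
    # conformal and sends parallels/meridians to parallels/meridians.
    lon_t = t * x
    lat_t = math.atan(math.sinh(t * y))  # Gudermannian function
    # S(t): sphere of radius r centered at (0, 0, r), tangent to z=0 at the origin.
    px = r * math.sin(lon_t) * math.cos(lat_t)
    py = r * math.sin(lat_t)
    pz = r * (1.0 - math.cos(lon_t) * math.cos(lat_t))
    return (px, py, pz)
```

  • At t=1 this reproduces the unit globe, and for very small t the output approaches the point's flat Mercator coordinates, which is what makes the animated fold/unfold read as a single continuous surface.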
  • Visual indicia, or visuals, such as three dimensional columns or bubbles, may be presented on the digital map 124. A spherical map may utilize these visuals to represent a number of time-bound categorized data points at a particular geographic location.
  • FIG. 3 illustrates a region map 300 .
  • the region map 300 is a particular form of digital map 124 presenting regional information for an area, such as individual states within the United States of America. Red state/blue state election maps are a quintessential example.
  • the map color component 122 - 2 may assign colors without any shading. For instance, assume the map color component 122 - 2 needs to assign colors to the region map 300 utilizing a data set as shown in Table 1, as follows:
  • the data set of Table 1 provides locations representing the three states of New York, Florida and New Jersey.
  • the data set also provides a single category for each state.
  • the single category is political party affiliation, with values indicating either Democratic or Republican.
  • the map color component 122 - 2 may select a particular color to represent each state based on comparison of category values assigned to each state. For instance, assume a blue color gradient is assigned to the Democratic Party, and a red color gradient is assigned to the Republican Party. In this instance, the states of New York and New Jersey may be assigned a color blue, while the state of Florida may be assigned a color red. As there is a single category for each location, color shading within each color is not necessary to visually convey category information. A single color is sufficient for a viewer to understand whether a state has voted Democrat or Republican.
  • the map color component 122 - 2 may solve these and other problems by utilizing a color gradient having multiple shades of a given color.
  • a particular shade may reflect a particular category.
  • a particular shade may reflect a blending of multiple categories.
  • a combination of color and color shades within the color may be used to convey information from multiple categories assigned to a single location.
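  • A minimal sketch of such blending, assuming RGB tuples and one weight per category (e.g., a vote count or share); the function name and representation are illustrative:

```python
def blend_colors(weighted_colors):
    """Blend category colors into one shade for a region.
    weighted_colors: list of ((r, g, b), weight) pairs, one pair per
    category assigned to the location."""
    total = float(sum(w for _, w in weighted_colors))
    return tuple(
        round(sum(color[i] * w for color, w in weighted_colors) / total)
        for i in range(3)
    )
```

  • For example, blending pure red and pure blue with equal weights yields a purple shade, while skewing the weights toward one category shifts the region's color toward that category.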
  • the map color component 122 - 2 may generate or retrieve a color gradient for the digital map 124 or region map 300 .
  • the map color component 122 - 2 may assign varying shades of colors from the color gradient to the location in a way that represents information from multiple categories assigned to the location.
  • Region map 300 illustrates some examples of shades assigned from a color gradient having shades for the color grey.
  • the map color component 122 - 2 may select a shade of a color gradient based on a data set for a location. For instance, assume the map color component 122 - 2 needs to assign colors to the region map 300 representing states within the United States of America. Further assume the map color component 122 - 2 receives as input a data set as shown in Table 2, as follows:
  • the data set of Table 2 has three columns.
  • the first column includes data representing the various states of New Jersey and Pennsylvania.
  • the second and third columns include data representing multiple categories for each of the states of New Jersey and Pennsylvania.
  • Column 2 indicates a first category of political party affiliation.
  • Column 3 indicates a second category of a number of votes received for each political party in a recent election.
  • the map color component 122 - 2 may select a particular shade to represent each state based on comparison of values across the multiple categories assigned to each state. For instance, assume a blue color gradient is assigned to the Democratic Party, a red color gradient is assigned to the Republican Party, and a green color gradient is assigned to the Independent party. When the map color component 122 - 2 evaluates the first category, it may assign a color (R/B/G) to each of the values representing a political party. In this case, the first category indicates that there are 3 political parties for each state.
  • the map color component 122 - 2 may either: (1) select a blending of the three colors (R/B/G) to create a blended shade representing the presence of three political parties within a state; (2) select a color (R/B/G) representing a sub-region of each state associated with each political party; or (3) evaluate other categories prior to color assignment to the location.
  • the map color component 122 - 2 may evaluate the second category of number of votes for each political party.
  • the political party with a greatest number of votes for both states is the Democratic Party.
  • the map color component 122 - 2 may select a blue color gradient for color assignment.
  • simply assigning both states a same color blue (or shade of blue) would not convey the number of votes received by the other political parties.
  • the map color component 122 - 2 may compare values for number of votes received by the Republican Party, which in this case is 2 for each state. As that does not provide any differentiation between states, it may be eliminated from consideration in color selection.
  • the map color component 122 - 2 receives values of 7 and 3 for New Jersey and Pennsylvania, respectively. As this does provide differentiation, the map color component 122 - 2 may select different shades of blue to represent this difference. For instance, the map color component 122 - 2 may select a first shade of blue for New Jersey and a second shade of blue for Pennsylvania as both are predominantly democratic. Further, the map color component 122 - 2 may make the first shade lighter or darker than the second shade in order to represent the variation in number of Independent party votes. In this manner, the region map 300 may quickly provide a viewer information from multiple categories for a single location based exclusively on selective color shading, which in turn, corresponds to two or more of the multiple categories assigned to the location from the data set of Table 2.
  • a particular shade may be selected from a color gradient based on various factors. For instance, one end of the color gradient may represent a maximum value of a category range (or multiple category ranges), while the other end of the color gradient may represent a minimum value of the category range (or multiple category ranges). Intermediate shades along the color gradient may then be scaled based on the maximum and minimum values.
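  • The minimum/maximum scaling just described can be sketched as follows; the discrete gradient list and function name are assumptions for illustration:

```python
def shade_for_value(value, vmin, vmax, gradient):
    """Select a shade from a color gradient by linearly scaling value
    between the category range's minimum and maximum. gradient is an
    ordered list of shades, from the minimum end to the maximum end."""
    if vmax == vmin:
        return gradient[0]
    frac = (value - vmin) / float(vmax - vmin)
    frac = min(1.0, max(0.0, frac))  # clamp out-of-range values
    return gradient[round(frac * (len(gradient) - 1))]
```

  • With a four-step blue gradient, the New Jersey value of 7 from Table 2 would land at the darkest shade and the Pennsylvania value of 3 at the lightest, visually encoding the Independent-vote difference between the two states.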
  • When there are several categories at a location, the map application 120 could split a column visual into either stacked or clustered columns. Similarly for bubbles, the map application 120 can split the bubble into pie slices.
  • the map application 120 may further comprise or implement a map scheduler component 122 - 3 .
  • the map scheduler component 122 - 3 provides the capability of presenting a large number of time-bound categorized data points on a spherical view or an intermediate view of a digital map 124 .
  • the map scheduler component 122 - 3 may assign work units to multiple processors and/or processor cores to efficiently render the digital map 124 .
  • the digital map 124 may present varying types of information in a same location at different times. In this case, a relative position of a data point needs to be determined for all data points sharing the same time and location. Work may be scheduled to a given processor among the set of processors to decrease an amount of time needed to calculate this relative position.
  • the map scheduler component 122 - 3 may render millions of time-bound clustered graphics elements using one or more dedicated processors, such as a graphical processing unit (GPU).
  • a rendering system needs to determine which data points are visible for any given point in time. When multiple points are visible at the same location at the same point in time, the rendering system also needs to dynamically determine (e.g., for every rendered frame) a relative position of a data point among all data points sharing the same time and location.
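  • One way to sketch this grouping-and-scheduling step, using a Python thread pool as a stand-in for the multi-processor or GPU dispatch the text describes (all names are illustrative assumptions):

```python
from collections import defaultdict
from concurrent.futures import ThreadPoolExecutor

def assign_ordinals(item):
    """One work unit: order every data point sharing a (time, location)
    pair, giving each a stable relative position for rendering."""
    _key, point_ids = item
    return {pid: i for i, pid in enumerate(point_ids)}

def relative_positions(points, max_workers=4):
    """points: iterable of (time, location, point_id) tuples.
    Returns {point_id: relative position among all data points sharing
    the same time and location}. Each (time, location) group is an
    independent work unit, so the map over groups can be spread across
    workers (here threads; a process pool or GPU kernel could be
    substituted)."""
    groups = defaultdict(list)
    for t, loc, pid in points:
        groups[(t, loc)].append(pid)
    result = {}
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        for partial in pool.map(assign_ordinals, groups.items()):
            result.update(partial)
    return result
```

  • Because groups never share state, this decomposition is what lets the scheduler decrease the time needed to compute relative positions as the number of data points grows.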
  • Assume the map scheduler component 122 - 3 is given a data set as shown in Table 3, where the data points share a same location:
  • FIG. 4 illustrates a block diagram of a centralized system 400 .
  • the centralized system 400 may implement some or all of the structure and/or operations for the apparatus 100 in a single computing entity, such as entirely within a single device 420 .
  • the device 420 may comprise any electronic device capable of receiving, processing, and sending information for the apparatus 100 .
  • Examples of an electronic device may include without limitation an ultra-mobile device, a mobile device, a personal digital assistant (PDA), a mobile computing device, a smart phone, a telephone, a digital telephone, a cellular telephone, eBook readers, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a netbook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, game devices, television, digital television, set top box, wireless access point, base station, subscriber station, or combination thereof.
  • the device 420 may execute processing operations or logic for the apparatus 100 using a processing component 430 .
  • the processing component 430 may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, logic devices, components, processors, microprocessors, circuits, processor circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth.
  • Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, software development programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • the device 420 may execute communications operations or logic for the apparatus 100 using communications component 440 .
  • the communications component 440 may implement any well-known communications techniques and protocols, such as techniques suitable for use with packet-switched networks (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), circuit-switched networks (e.g., the public switched telephone network), or a combination of packet-switched networks and circuit-switched networks (with suitable gateways and translators).
  • the communications component 440 may include various types of standard communication elements, such as one or more communications interfaces, network interfaces, network interface cards (NIC), radios, wireless transmitters/receivers (transceivers), wired and/or wireless communication media, physical connectors, and so forth.
  • communication media 412 , 442 include wired communications media and wireless communications media.
  • wired communications media may include a wire, cable, metal leads, printed circuit boards (PCB), backplanes, switch fabrics, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, a propagated signal, and so forth.
  • wireless communications media may include acoustic, radio-frequency (RF) spectrum, infrared and other wireless media.
  • the device 420 may communicate with other devices 410 , 450 over a communications media 412 , 442 , respectively, using communications signals 414 , 444 , respectively, via the communications component 440 .
  • the devices 410 , 450 may be internal or external to the device 420 as desired for a given implementation.
  • the device 420 may implement the entire apparatus 100 to access a map database implemented by another device, such as a GIS 460 implemented by the device 450 .
  • the apparatus 100 may also render the digital map 124 with another device implementing some or all of apparatus 100 , such as the device 410 .
  • the digital map 124 may be communicated in any number of ways, such as through a messaging interface (e.g., email, short message service (SMS), multimedia message service (MMS), instant messaging (IM), and so forth), shared network storage space, peer-to-peer communications, web technologies (e.g., a web page), and other communications modalities.
  • the device 410 may also use apparatus 100 to communicate with the GIS 460 in a manner similar to the device 420 .
  • the device 420 may further implement other platform components common to a computing and/or communications device, such as described with reference to FIG. 11 .
  • FIG. 5 illustrates a block diagram of a distributed system 500 .
  • the distributed system 500 may distribute portions of the structure and/or operations for the apparatus 100 across multiple computing entities.
  • Examples of distributed system 500 may include without limitation a client-server architecture, a 3-tier architecture, an N-tier architecture, a tightly-coupled or clustered architecture, a peer-to-peer architecture, a master-slave architecture, a shared database architecture, and other types of distributed systems.
  • the embodiments are not limited in this context.
  • the distributed system 500 may comprise a client device 510 and a server device 550 .
  • the client device 510 and the server device 550 may be the same or similar to the client device 220 as described with reference to FIG. 2 .
  • the client device 510 and the server device 550 may each comprise a processing component 530 and a communications component 540 which are the same or similar to the processing component 430 and the communications component 440 , respectively, as described with reference to FIG. 4 .
  • the devices 510 , 550 may communicate over a communications media 512 using communications signals 514 via the communications components 540 .
  • the client device 510 may comprise or employ one or more client programs that operate to perform various methodologies in accordance with the described embodiments.
  • the client device 510 may implement a portion of the apparatus 100 , such as the map application 120 , for example.
  • the server device 550 may comprise or employ one or more server programs that operate to perform various methodologies in accordance with the described embodiments.
  • the server device 550 may implement a portion of the apparatus 100 , such as the GIS 460 , for example.
  • the distributed model may be suitable for sharing map information among multiple devices or users.
  • FIG. 6 illustrates an embodiment of a logic flow 600 .
  • the logic flow 600 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • a set of views pertaining to a digital map may include a three dimensional view, a plane view and a set of intermediate views corresponding to a visual transition between the three dimensional view and the plane view.
  • an intermediate view may refer to an intermediate sphere representing a map of geographic locations in which one example geographic location may be a region.
  • the logic flow 600 may refer to processing a control directive at block 604 .
  • the control directive causes a transition either from the three dimensional view to the plane view or vice versa.
  • Upon a control directive to transition to a flat map (e.g., from a sphere map), the logic flow 600 proceeds to block 606 A.
  • Upon a control directive to transition to a sphere map (e.g., from a flat map), the logic flow 600 proceeds to block 606 B.
  • the logic flow 600 may refer to presenting the three dimensional view at block 606 A.
  • the logic flow 600 may refer to presenting the plane view of the digital map at block 606 B.
  • the block 606 A may produce a rendering of the global map as a sphere; and the block 606 B may produce a rendering of the flat map as a plane.
  • the logic flow 600 may refer to presenting the set of intermediate views at block 608 .
  • the logic flow 600 may render a progression of intermediate views as intermediate spheres that increase or decrease in radius between two points in time such that the visual transition between the three dimensional view and the plane view is presented via animation.
  • the visual transition simulates the furling of the flat map into a globe or, vice versa, the unfurling of the globe into the flat map.
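  • As an illustrative sketch, the radius progression driving such an animation could be computed per frame as follows (linear interpolation is an assumption made here for simplicity; the description does not fix a particular transition rate):

```python
def intermediate_radii(start_radius, end_radius, frame_count):
    """Compute the sphere radius for each frame of the furl/unfurl
    animation by linear interpolation between a starting and ending
    radius. A production renderer would likely apply an easing curve
    for a smoother feel."""
    if frame_count < 2:
        return [end_radius]
    step = (end_radius - start_radius) / (frame_count - 1)
    return [start_radius + step * i for i in range(frame_count)]

# Furling a flat map (a large radius approximates a plane) into a unit globe:
radii = intermediate_radii(100.0, 1.0, 5)
```

Each radius in the sequence corresponds to one intermediate sphere, and rendering them in order yields the animated transition between the plane view and the three dimensional view.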
  • the logic flow 600 proceeds to either the block 606 A or 606 B, depending upon whether the control directive corresponded to a transition to the three dimensional view or the plane view, respectively.
  • the logic flow 600 returns to the block 604 and processes another control directive.
  • the logic flow 600 generally enables presentation of a view of a spatially correct surface and a view of an entire surface or at least a considerable portion thereof.
  • FIG. 7 illustrates an embodiment of a logic flow 700 .
  • the logic flow 700 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 700 processes data representing a spatially correct surface at block 702 .
  • a unit sphere provides a suitable model for mapping a spatially correct surface, such as a globe. It is appreciated that other example models are capable of representing a spatially correct surface in accordance with the embodiments described herein. For example, some models are configured to map spheroid objects while some models are designed for other objects, such as brains. Any one of these models can be modified to illustrate a transition between a three-dimensional view of the spatially correct surface and a two-dimensional view.
  • the logic flow 700 may process mapping data between the spatially correct surface and a substantially spherical coordinate system at block 704 .
  • Any appropriate coordinate system, such as a geographic coordinate system, may define locations on the spatially correct surface. Coordinates for those locations may be projected onto coordinates in the substantially spherical coordinate system.
  • One example substantially spherical coordinate system may define a spheroid with larger dimensions than the unit sphere.
  • the resulting surface data may no longer be represented as spatially correct.
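  • For illustration, projecting a geographic coordinate onto a spherical coordinate system of a chosen radius might be sketched as follows (a perfect sphere is assumed here; a geodetic model such as WGS 84 would use an ellipsoid instead):

```python
import math

def geographic_to_sphere(lat_deg, lon_deg, radius=1.0):
    """Map a geographic coordinate (in degrees) onto a sphere of the given
    radius, returning Cartesian (x, y, z). A unit radius models the
    spatially correct surface; a larger radius models an intermediate
    sphere used during the transition."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    return (radius * math.cos(lat) * math.cos(lon),
            radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat))
```

Repeating this mapping with progressively larger radius values yields surface data that is no longer spatially correct, matching the behavior described above.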
  • the logic flow 700 may generate a set of intermediate views at block 706 . Coordinates of the resulting surface data may be mapped to a second set of substantially spherical coordinates on a three dimensional surface (e.g., another spheroid) with larger dimensions than the spheroid. This process may be repeated until a set of coordinates for each intermediate view is determined. Presentation of these views may be accomplished by determining a transition rate for rendering the intermediate views on a display element. This transition rate may represent an aesthetically pleasing and smooth animation illustrating a transition between a view of the spatially correct surface and another view, such as a plane view.
  • FIG. 8 illustrates an embodiment of a logic flow 800 .
  • the logic flow 800 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 800 commences unit sphere processing at block 802 .
  • the logic flow 800 may be directed to generating a number of intermediate spheres from the unit sphere at block 804 . These intermediate spheres may correspond to different radius values as desired for the transition between views.
  • A number of mathematical transformations, known as projections, may be applied to coordinates of the unit sphere in order to generate data corresponding to a number of intermediate spheres.
  • the logic flow 800 may perform a projection from the unit sphere onto planar coordinates followed by an inverse of those coordinates onto another spherical coordinate system having a different radius from the unit sphere.
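  • A sketch of this projection-and-inverse step, using the spherical Mercator formulas as one example projection (function and variable names are illustrative, not taken from this description):

```python
import math

def mercator_forward(lat, lon, radius=1.0):
    """Mercator projection: (lat, lon) in radians on a sphere of the given
    radius -> planar coordinates (x, y)."""
    return radius * lon, radius * math.log(math.tan(math.pi / 4 + lat / 2))

def mercator_inverse(x, y, radius=1.0):
    """Inverse Mercator projection: planar (x, y) -> (lat, lon) in radians
    on a sphere of the given radius."""
    return 2 * math.atan(math.exp(y / radius)) - math.pi / 2, x / radius

# Project a point off the unit sphere onto the plane, then invert the planar
# coordinates onto an intermediate sphere with a larger radius:
lat, lon = math.radians(45.0), math.radians(-122.0)
x, y = mercator_forward(lat, lon, radius=1.0)
lat2, lon2 = mercator_inverse(x, y, radius=2.0)
```

Inverting with the same radius recovers the original point; inverting with a different radius yields coordinates on an intermediate sphere, which is the effect used to build the transition.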
  • the logic flow 800 may be directed to processing the intermediate spheres to illustrate a transition between views at block 806 .
  • the logic flow 800 may assign tasks related to the generation of the number of intermediate spheres to a plurality of processors.
  • assigning these tasks may include assigning work units to multiple processors and/or processor cores to efficiently render a progression of intermediate views of the intermediate spheres.
  • each view corresponds to a time value that also is mathematically related to the radius value. As the time value decreases toward zero, the radius value of a corresponding intermediate sphere increases, and vice versa.
  • the logic flow 800 may transform a view of a first intermediate sphere and a time value into a view of a second intermediate sphere having a radius greater in size than a radius of the first intermediate sphere.
  • each intermediate sphere along the progression of intermediate views increases or decreases in radius between points in time.
  • FIG. 9 illustrates an embodiment of a logic flow 900 .
  • the logic flow 900 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • the logic flow 900 performs a projection of geographic locations to a two dimensional map at block 902 .
  • a projection may refer to a mathematical projection or transformation between points in different coordinate systems.
  • a Mercator projection is one example projection between a global map and a flat map in which three dimensional geographic coordinates are transformed into two dimensional coordinates.
  • These geographic locations may correspond to coordinates in the three dimensional map's coordinate system. If a set of spherical coordinates refers to the global map's spherical coordinate system, the three dimensional surface is considered spatially correct.
  • the set of spherical coordinates may define a particular geographic location on a three dimensional surface.
  • these coordinates may be modified according to a desired radius value in one embodiment.
  • This radius value may refer to an intermediate sphere modeling a global map that has been unfurled or expanded.
  • the logic flow 900 may generate data to represent an inverse between the two dimensional map and an intermediate three dimensional map having a desired radius at block 904 .
  • This inverse may refer to a Mercator projection inverse in which the set of corresponding coordinates on the two dimensional map are mapped to a set of spherical coordinates on the intermediate three dimensional map.
  • the logic flow 900 may transform the data into spherical coordinates for the geographic locations at block 906 .
  • spherical coordinates may be converted into geographic locations.
  • a coloring assignment may be applied to the intermediate three dimensional map to differentiate the geographic locations.
  • Visuals may be placed upon these coordinates to indicate data points, including data points associated with different categories.
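  • One possible form of such a coloring assignment (the palette and category names are purely illustrative; this description does not prescribe particular categories or colors):

```python
# Hypothetical palette mapping data-point categories to RGB triples.
CATEGORY_COLORS = {
    "crime": (1.0, 0.0, 0.0),
    "housing": (0.0, 0.0, 1.0),
    "traffic": (1.0, 1.0, 0.0),
}
DEFAULT_COLOR = (0.5, 0.5, 0.5)

def color_for(category):
    """Return the RGB color used to differentiate a data point's category,
    falling back to a neutral gray for unknown categories."""
    return CATEGORY_COLORS.get(category, DEFAULT_COLOR)
```

A renderer could apply the returned color to the visual placed at each geographic location so that data points of different categories remain distinguishable on the intermediate three dimensional map.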
  • FIG. 10 illustrates an embodiment of an exemplary computing architecture 1000 suitable for implementing various embodiments as previously described.
  • the computing architecture 1000 may comprise or be implemented as part of an electronic device. Examples of an electronic device may include those described with reference to FIG. 4 , among others. The embodiments are not limited in this context.
  • a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer.
  • an application running on a server and the server can be a component.
  • One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers.
  • components may be communicatively coupled to each other by various types of communications media to coordinate operations.
  • the coordination may involve the uni-directional or bi-directional exchange of information.
  • the components may communicate information in the form of signals communicated over the communications media.
  • the information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal.
  • Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
  • the computing architecture 1000 includes various common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, power supplies, and so forth.
  • the embodiments are not limited to implementation by the computing architecture 1000 .
  • the computing architecture 1000 comprises a processing unit 1004 , a system memory 1006 and a system bus 1008 .
  • the processing unit 1004 can be any of various commercially available processors, including without limitation AMD® Athlon®, Duron® and Opteron® processors; ARM® application, embedded and secure processors; IBM® and Motorola® DragonBall® and PowerPC® processors; IBM and Sony® Cell processors; Intel® Celeron®, Core (2) Duo®, Itanium®, Pentium®, Xeon®, and XScale® processors; and similar processors. Dual microprocessors, multi-core processors, and other multi-processor architectures may also be employed as the processing unit 1004 .
  • the system bus 1008 provides an interface for system components including, but not limited to, the system memory 1006 to the processing unit 1004 .
  • the system bus 1008 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures.
  • Interface adapters may connect to the system bus 1008 via a slot architecture.
  • Example slot architectures may include without limitation Accelerated Graphics Port (AGP), Card Bus, (Extended) Industry Standard Architecture ((E)ISA), Micro Channel Architecture (MCA), NuBus, Peripheral Component Interconnect (Extended) (PCI(X)), PCI Express, Personal Computer Memory Card International Association (PCMCIA), and the like.
  • the computing architecture 1000 may comprise or implement various articles of manufacture.
  • An article of manufacture may comprise a computer-readable storage medium to store logic.
  • Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth.
  • Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like.
  • Embodiments may also be at least partly implemented as instructions contained in or on a non-transitory computer-readable medium, which may be read and executed by one or more processors to enable performance of the operations described herein.
  • the system memory 1006 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, an array of devices such as Redundant Array of Independent Disks (RAID) drives, solid state memory devices (e.g., USB memory, solid state drives (SSD)), and any other type of storage media suitable for storing information.
  • the system memory 1006 can include non-volatile memory 1010 and/or volatile memory 1012
  • the computer 1002 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal (or external) hard disk drive (HDD) 1014 , a magnetic floppy disk drive (FDD) 1016 to read from or write to a removable magnetic disk 1018 , and an optical disk drive 1020 to read from or write to a removable optical disk 1022 (e.g., a CD-ROM or DVD).
  • the HDD 1014 , FDD 1016 and optical disk drive 1020 can be connected to the system bus 1008 by a HDD interface 1024 , an FDD interface 1026 and an optical drive interface 1028 , respectively.
  • the HDD interface 1024 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • the drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth.
  • a number of program modules can be stored in the drives and memory units 1010 , 1012 , including an operating system 1030 , one or more application programs 1032 , other program modules 1034 , and program data 1036 .
  • the one or more application programs 1032 , other program modules 1034 , and program data 1036 can include, for example, the various applications and/or components of the system 100 .
  • a user can enter commands and information into the computer 1002 through one or more wire/wireless input devices, for example, a keyboard 1038 and a pointing device, such as a mouse 1040 .
  • Other input devices may include microphones, infra-red (IR) remote controls, radio-frequency (RF) remote controls, game pads, stylus pens, card readers, dongles, finger print readers, gloves, graphics tablets, joysticks, keyboards, retina readers, touch screens (e.g., capacitive, resistive, etc.), trackballs, trackpads, sensors, styluses, and the like.
  • input devices are often connected to the processing unit 1004 through an input device interface 1042 that is coupled to the system bus 1008 , but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
  • a monitor 1044 or other type of display device is also connected to the system bus 1008 via an interface, such as a video adaptor 1046 .
  • the monitor 1044 may be internal or external to the computer 1002 .
  • a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
  • the computer 1002 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1048 .
  • the remote computer 1048 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1002 , although, for purposes of brevity, only a memory/storage device 1050 is illustrated.
  • the logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1052 and/or larger networks, for example, a wide area network (WAN) 1054 .
  • LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
  • When used in a LAN networking environment, the computer 1002 is connected to the LAN 1052 through a wire and/or wireless communication network interface or adaptor 1056 .
  • the adaptor 1056 can facilitate wire and/or wireless communications to the LAN 1052 , which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1056 .
  • the computer 1002 can include a modem 1058 , or is connected to a communications server on the WAN 1054 , or has other means for establishing communications over the WAN 1054 , such as by way of the Internet.
  • the modem 1058 which can be internal or external and a wire and/or wireless device, connects to the system bus 1008 via the input device interface 1042 .
  • program modules depicted relative to the computer 1002 can be stored in the remote memory/storage device 1050 . It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
  • the computer 1002 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques).
  • the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices.
  • Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, n, etc.) to provide secure, reliable, fast wireless connectivity.
  • a Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
  • FIG. 11 illustrates a block diagram of an exemplary communications architecture 1100 suitable for implementing various embodiments as previously described.
  • the communications architecture 1100 includes various common communications elements, such as a transmitter, receiver, transceiver, radio, network interface, baseband processor, antenna, amplifiers, filters, power supplies, and so forth.
  • the embodiments are not limited to implementation by the communications architecture 1100 .
  • the communications architecture 1100 includes one or more clients 1102 and servers 1104 .
  • the clients 1102 may implement the client devices 420 , 510 .
  • the servers 1104 may implement the server device 550 .
  • the clients 1102 and the servers 1104 are operatively connected to one or more respective client data stores 1108 and server data stores 1110 that can be employed to store information local to the respective clients 1102 and servers 1104 , such as cookies and/or associated contextual information.
  • the clients 1102 and the servers 1104 may communicate information between each other using a communication framework 1106 .
  • the communications framework 1106 may implement any well-known communications techniques and protocols.
  • the communications framework 1106 may be implemented as a packet-switched network (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), a circuit-switched network (e.g., the public switched telephone network), or a combination of a packet-switched network and a circuit-switched network (with suitable gateways and translators).
  • the communications framework 1106 may implement various network interfaces arranged to accept, communicate, and connect to a communications network.
  • a network interface may be regarded as a specialized form of an input output interface.
  • Network interfaces may employ connection protocols including without limitation direct connect, Ethernet (e.g., thick, thin, twisted pair 10/100/1000 Base T, and the like), token ring, wireless network interfaces, cellular network interfaces, IEEE 802.11a-x network interfaces, IEEE 802.16 network interfaces, IEEE 802.20 network interfaces, and the like.
  • multiple network interfaces may be used to engage with various communications network types. For example, multiple network interfaces may be employed to allow for the communication over broadcast, multicast, and unicast networks.
  • a communications network may be any one and the combination of wired and/or wireless networks including without limitation a direct interconnection, a secured custom connection, a private network (e.g., an enterprise intranet), a public network (e.g., the Internet), a Personal Area Network (PAN), a Local Area Network (LAN), a Metropolitan Area Network (MAN), an Operating Missions as Nodes on the Internet (OMNI), a Wide Area Network (WAN), a wireless network, a cellular network, and other communications networks.
  • Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Remote Sensing (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Image Generation (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
  • Instructional Devices (AREA)
  • Color Image Communication Systems (AREA)
  • Navigation (AREA)

Abstract

Techniques to manage map information illustrating a transition between views of a digital map. The transition may refer to a visual transition from a three dimensional map view to a two dimensional map view or vice versa. For each map location, the map information may include a set of spherical coordinates that corresponds to intermediate map views. This set of spherical coordinates enables presentation of the visual transition to a user such that the user is able to switch between viewing spatially correct surface locations and substantially all surface locations. Other embodiments are described and claimed.

Description

    RELATED APPLICATION
  • This application claims priority to, and benefit of, U.S. Provisional Patent Application No. 61/876,100 titled “Techniques to Manage Map Information” filed on Sep. 10, 2013, the entirety of which is hereby incorporated by reference.
  • BACKGROUND
  • Digital maps are becoming a universal platform for conveying map information representing locations of people, places, objects and events. As more map information is presented on a digital map, it becomes necessary to ensure the map information is presented to a user in a meaningful way. Further, digital maps are becoming more interactive to allow a user to manipulate a digital map to view particular map items of interest. In addition, the sheer volume of map information consumes significant computing and communications resources. As a result, enhanced techniques are needed to manage and manipulate a digital map to efficiently convey map information.
  • SUMMARY
  • The following presents a simplified summary in order to provide a basic understanding of some novel embodiments described herein. This summary is not an extensive overview, and it is not intended to identify key/critical elements or to delineate the scope thereof. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
  • Embodiments are generally directed to enhanced techniques to manage digital maps. Some embodiments are particularly directed to enhanced techniques to manage map information for a digital map in an efficient and effective manner to facilitate consumption by a user.
  • In one embodiment, a map application may comprise a map transition component arranged to provide map views that can be smoothly transitioned between two dimensional (2D) and three dimensional (3D) views. For instance, a digital map may be switched between a 3D view of a globe and a 2D view of a flat map utilizing a series of intermediate views to simulate a smooth animated transition between views.
  • In one embodiment, the map transition component may be operative on the logic circuit to render a set of views of a digital map for presentation on a display element in which a three dimensional view comprises data representing a spatially correct surface and the intermediate views correspond to a transition between views, each intermediate view to comprise mapping data between the spatially correct surface and a substantially spherical coordinate system.
  • In one embodiment, a map application may comprise a map color component configured to cooperate with the map transition component during map view transitioning. An example map view displaying a range of colors that represent multiple categories for a given location may transition into a flatter map view (e.g., a 2D view) with a coloring assignment that reflects the flattening of map data. For instance, a digital map may provide regions with blended colors, with each color representing a different category assigned to a particular region. The blending effect provides a high level of granularity for conveying nuanced information about a region.
  • To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings. These aspects are indicative of the various ways in which the principles disclosed herein can be practiced and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. Other advantages and novel features will become apparent from the following detailed description when considered in conjunction with the drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 illustrates an embodiment of an apparatus to manage digital maps.
  • FIGS. 2A-D illustrate embodiments of a transition between 2D/3D maps.
  • FIG. 3 illustrates an embodiment of a region map.
  • FIG. 4 illustrates an embodiment of a centralized system for the apparatus.
  • FIG. 5 illustrates an embodiment of a distributed system for the apparatus.
  • FIG. 6 illustrates an embodiment of a first logic flow for the system of FIG. 1.
  • FIG. 7 illustrates an embodiment of a second logic flow for the system of FIG. 1.
  • FIG. 8 illustrates an embodiment of a third logic flow for the system of FIG. 1.
  • FIG. 9 illustrates an embodiment of a fourth logic flow for the system of FIG. 1.
  • FIG. 10 illustrates an embodiment of a computing architecture.
  • FIG. 11 illustrates an embodiment of a communications architecture.
  • DETAILED DESCRIPTION
  • Embodiments are generally directed to enhanced techniques to manage digital maps. Some embodiments are particularly directed to enhanced techniques to manage map information for a digital map in an efficient and effective manner to facilitate consumption by a user. The map information may include intermediate views illustrating a transition between a three dimensional view of the digital map and a two dimensional view of the same map. This transition may be animated in order to enable a user to clearly perceive a spatially correct surface and an entire surface in an efficient manner.
  • As a result, the embodiments can improve affordability, scalability, modularity, extendibility, or interoperability for an operator, device or network. Other advantages and use scenarios apply as well.
  • With general reference to notations and nomenclature used herein, the detailed descriptions which follow may be presented in terms of program procedures executed on a computer or network of computers. These procedural descriptions and representations are used by those skilled in the art to most effectively convey the substance of their work to others skilled in the art.
  • A procedure is here, and generally, conceived to be a self-consistent sequence of operations leading to a desired result. These operations are those requiring physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It proves convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. It should be noted, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities.
  • Further, the manipulations performed are often referred to in terms, such as adding or comparing, which are commonly associated with mental operations performed by a human operator. No such capability of a human operator is necessary, or desirable in most cases, in any of the operations described herein which form part of one or more embodiments. Rather, the operations are machine operations. Useful machines for performing operations of various embodiments include general purpose digital computers or similar devices.
  • Various embodiments also relate to apparatus or systems for performing these operations. This apparatus may be specially constructed for the required purpose or it may comprise a general purpose computer as selectively activated or reconfigured by a computer program stored in the computer. The procedures presented herein are not inherently related to a particular computer or other apparatus. Various general purpose machines may be used with programs written in accordance with the teachings herein, or it may prove convenient to construct more specialized apparatus to perform the required method steps. The required structure for a variety of these machines will appear from the description given.
  • Reference is now made to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding thereof. It may be evident, however, that the novel embodiments can be practiced without these specific details. In other instances, well known structures and devices are shown in block diagram form in order to facilitate a description thereof. The intention is to cover all modifications, equivalents, and alternatives consistent with the claimed subject matter.
  • FIG. 1 illustrates a block diagram for an apparatus 100. In one embodiment, the apparatus 100 may comprise a computer-implemented apparatus 100 having a software map application 120 comprising one or more components 122-a. Although the apparatus 100 shown in FIG. 1 has a limited number of elements in a certain topology, it may be appreciated that the apparatus 100 may include more or less elements in alternate topologies as desired for a given implementation.
  • It is worthy to note that “a” and “b” and “c” and similar designators as used herein are intended to be variables representing any positive integer. Thus, for example, if an implementation sets a value for a=5, then a complete set of components 122-a may include components 122-1, 122-2, 122-3, 122-4 and 122-5. The embodiments are not limited in this context.
  • The apparatus 100 may comprise the map application 120. The map application 120 may be generally arranged to manage a digital map 124. A map is a visual representation of an area. The digital map 124 may comprise a digital or electronic form of a map. The digital map 124 may be used to depict geography. Map information for the digital map 124 may be part of, or originate from, a geographic information system (GIS). A GIS is the merging of cartography, statistical analysis, and computer science technology. A GIS is a system designed to capture, store, manipulate, analyze, manage, and present all types of geographical data. Other geoinformatic systems and/or data may be used as well. Although some embodiments discuss maps in terms of visualizing physical geographical locations, the digital map 124 may also be used to represent any space, real or imagined, such as brain mapping, DNA mapping and extraterrestrial mapping. Embodiments are not limited in this context.
  • In one embodiment, the map information may be provided by a same electronic device implementing the apparatus 100. In one embodiment, the map information may be provided by a different electronic device (e.g., a server) from the one implementing the apparatus 100 (e.g., a client).
  • The map application 120 may comprise any software application capable of creating, modifying, managing or otherwise using map information for the digital map 124. In one embodiment, the map application 120 may comprise or be implemented as a stand-alone productivity application, or an add-in for a productivity application. A productivity application may comprise a software application program designed to perform a specific set of functions for a knowledge worker. A productivity application typically operates to create, modify, send, receive, or otherwise manage content for one or more documents. Examples for productivity applications may include without limitation a productivity suite of inter-related client applications, server applications and/or web services, designed for a particular operating system, such as a MICROSOFT® OFFICE productivity suite for MICROSOFT WINDOWS®, made by Microsoft Corporation, Redmond, Wash. Examples for productivity applications may include without limitation MICROSOFT WORD, MICROSOFT EXCEL®, MICROSOFT POWERPOINT®, MICROSOFT OUTLOOK®, MICROSOFT ACCESS®, MICROSOFT INFOPATH®, MICROSOFT ONENOTE®, MICROSOFT PROJECT, MICROSOFT PUBLISHER, MICROSOFT SHAREPOINT® WORKSPACE, MICROSOFT VISIO®, MICROSOFT OFFICE INTERCONNECT, MICROSOFT OFFICE PICTURE MANAGER, MICROSOFT SHAREPOINT DESIGNER, and MICROSOFT LYNC. Examples for server applications may include without limitation MICROSOFT SHAREPOINT SERVER, MICROSOFT LYNC SERVER, MICROSOFT OFFICE FORMS SERVER, MICROSOFT OFFICE GROOVE® SERVER, MICROSOFT OFFICE PROJECT SERVER, MICROSOFT OFFICE PROJECT PORTFOLIO SERVER, and MICROSOFT OFFICE PERFORMANCEPOINT® SERVER. It also is to be appreciated that embodiments may implement other types of applications in addition to productivity applications which are consistent with the described embodiments. The embodiments are not limited to these examples.
  • The map application 120 may be capable of communicating with a network device, such as a server providing network services, such as a web service. Examples for web services may include without limitation MICROSOFT WINDOWS LIVE®, MICROSOFT OFFICE WEB APPLICATIONS, MICROSOFT OFFICE LIVE, MICROSOFT LIVE MEETING, MICROSOFT OFFICE PRODUCT WEB SITE, MICROSOFT UPDATE SERVER, and MICROSOFT OFFICE 365.
  • In one embodiment, a map application 120 may comprise a map transition component 122-1 arranged to provide map views that can be smoothly transitioned between two dimensional (2D) and three dimensional (3D) views. For instance, a digital map may be switched between a three dimensional view of a sphere or globe and a two dimensional view of a plane or flat map utilizing a sequence of intermediate views 130 to simulate a smooth animated transition between views.
  • Viewing geographic data on a three dimensional map presents several challenges. One challenge is the ability to view map data in a spatially correct manner and also relative to other data points in one view. Representing the data on a sphere keeps it spatially correct, but it is difficult to view globe-wide data in a single view. In such cases, the two dimensional view of a flat surface may be desired as it allows presentation of all the map data in a single view. However, transforming the three dimensional view to a two dimensional view may spatially distort distances between data points, particularly towards the northern and southern poles. For instance, countries such as Iceland and Greenland may appear further apart in the two dimensional view than in the three dimensional view, or vice versa. Furthermore, aside from distance, distortions may occur in the ratio between east-west distance (stretched by projection) and north-south distance. This may potentially be corrected by stretching the north-south distance to match the east-west stretch, but that may cause magnification of distances in all directions.
  • The map transition component 122-1 may compensate for these and other problems. The map transition component 122-1 may enable an end user to see the data on both the three dimensional and the two dimensional view. Further, the map transition component 122-1 may enable a smooth animated transition between the two views using a progression of intermediate views 130. Further, both views will maintain proper shapes, angles, and/or area of map items during or after transition. In addition, spatial distortions, such as distances between map items, may be reduced or effectively eliminated. Other features and advantages of the map transition component 122-1 are described below with reference to FIG. 2.
  • The map application 120 may also comprise a map color component 122-2. The map color component 122-2 may be arranged to provide map views with a range of colors that represent multiple categories for a given location. For instance, the digital map 124 may provide regions with blended colors, with each color representing a different category assigned to a particular region. The blending effect provides a high level of granularity for conveying nuanced information about a particular region. The map color component 122-2 provides the capability of switching between shading options to represent multiple categories for a given map. The map color component 122-2 also provides new shading options. The map color component 122-2 further provides smart defaults for the various shading options for user convenience. Other features and advantages of the map color component 122-2 are described below with reference to FIG. 3.
  • The map application 120 may further comprise or implement a map scheduler component 122-3. The map scheduler component 122-3 provides the capability of presenting a considerable number of time-bound categorized data points on a three dimensional view of a digital map 124. The map scheduler component 122-3 may assign work units to multiple processors and/or processor cores to efficiently render the digital map 124. For instance, the digital map 124 may present varying types of information in a same location at different times. In this case, a relative position of a data point needs to be determined for all data points sharing the same time and location. Work may be scheduled to a given processor among the set of processors to decrease an amount of time needed to calculate this relative position. Other features and advantages of the map scheduler component 122-3 are described below with reference to FIG. 3.
  • FIGS. 2A-D illustrate a transition 200 with various user interface views of the digital map 124. The map transition component 122-1 has at least two modes for rendering the digital map 124. The first is a three dimensional view mode. The second is a plane view mode. The map transition component 122-1 may animate transition between the two view modes with one or more intermediate views to provide a smooth folding/unfolding visual effect.
  • FIG. 2A depicts a three dimensional view 202. Because, in this example, the digital map 124 is rendered as a globe or sphere, the three dimensional view may be referred to as a global or a spherical map view. The map application 120 may present the three dimensional view 202 during the three dimensional view mode.
  • FIG. 2D illustrates a plane view 208. The map application 120 may present the plane view 208 during the plane view mode.
  • FIGS. 2B, 2C illustrate a pair of intermediate views 204, 206, respectively, that may be presented during transition between the three dimensional view 202 and the plane view 208. The intermediate view 204 illustrates an initial unfurling of the sphere at time t1. The intermediate view 206 illustrates a further unfurling of the sphere at time t2. Although only two intermediate views 204, 206 are shown by way of example, any number of intermediate views may be used to provide a given level of fidelity for a particular transition, such as 24 views per second, 30 views per second, and so forth.
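  • As a hedged sketch (the patent does not specify frame timing or function names), the parameter t that drives the intermediate views may be sampled once per rendered frame, with the frame rate setting the fidelity of the animation:

```python
def transition_frames(duration_seconds, frames_per_second=30):
    """Sample the interpolation parameter t in (0, 1] for a globe-to-plane
    transition, one value per rendered frame.

    t = 1 corresponds to the globe; t never reaches 0, since the plane is
    only the limit of the intermediate surfaces as t approaches 0.
    """
    count = max(1, int(duration_seconds * frames_per_second))
    return [i / count for i in range(count, 0, -1)]
```

A one-second transition at 24 frames per second yields 24 intermediate values of t, starting at 1.0 (the sphere) and approaching, but never reaching, 0 (the plane).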
  • The map transition component 122-1 may implement animations of transitions between 2D/3D view modes by utilizing a smooth family of intermediate surfaces between round globe and flat map, along with a well-behaved cartographic projection to each such surface. A family of intermediate surfaces may be defined as a surface S(t) for every real number t between 0 and 1, with a map plane at t=0 and the unit sphere at t=1. For the map transition component 122-1, S(t) is a portion of a sphere of radius 1/t, tangent to the flat map plane at its center. S(1) is the unit sphere (e.g., round globe). S(0) may be undefined, but the limit
  • lim t→0 S(t)
  • is clearly the plane.
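  • The growth of the intermediate sphere's radius as t shrinks can be sketched as follows (an illustrative helper; the name is not from the patent):

```python
def intermediate_radius(t):
    """Radius of the intermediate sphere S(t), defined for t in (0, 1].

    S(1) is the unit sphere; as t approaches 0 the radius grows without
    bound, so the tangent portion of the sphere flattens toward the plane.
    """
    if not 0 < t <= 1:
        raise ValueError("t must be in (0, 1]")
    return 1.0 / t
```

For example, halfway through the transition (t = 0.5) the intermediate surface is a portion of a sphere of radius 2, tangent to the map plane at its center.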
  • The map transition component 122-1 renders the plane view 208 using a Mercator projection. A Mercator projection has the following properties: (1) it preserves shapes and angles (e.g., it is mathematically conformal); (2) it maps parallels (e.g., lines of constant latitude) and meridians to horizontal and vertical lines on the plane, respectively; and (3) it uses a scale that is constant along any given parallel.
  • If the scale along the equator is 1 (e.g., as a default), the Mercator projection is defined by x(longitude)=longitude, and y(latitude)=m(latitude), where:
  • m(φ) = log(tan(π/4 + φ/2))
  • All angles are in radians.
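  • A minimal sketch of the Mercator projection as defined above (function names are illustrative, not from the patent):

```python
import math

def mercator_y(phi):
    """Mercator latitude function: m(phi) = log(tan(pi/4 + phi/2)).

    phi is latitude in radians; the result is the vertical map coordinate.
    """
    return math.log(math.tan(math.pi / 4 + phi / 2))

def mercator_project(longitude, latitude):
    """Unit-scale Mercator: x = longitude, y = m(latitude); radians in."""
    return longitude, mercator_y(latitude)
```

Note that m is an odd function: the equator maps to y = 0, and northern and southern latitudes map to symmetric positive and negative y values.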
  • To define the mapping M(t) from the unit sphere to the intermediate sphere S(t), spherical coordinates (longitude_t, latitude_t) on S(t) may be introduced in a way that parallels and meridians on S(t) map to parallels and meridians of the unit sphere, and the point (0,0) is fixed for all t.
  • The mapping is then:

  • longitude_t = t · longitude

  • latitude_t = m⁻¹(t · m(latitude))
  • The symbol m⁻¹ represents the inverse of the Mercator projection m.
  • The following analogous properties justify viewing M(t) as a generalized Mercator projection: (1) the mapping is conformal; (2) it maps parallels and meridians on the globe to parallels and meridians on S(t); and (3) it uses a scale that is constant along any meridian (e.g., the scale is 1 along the equator).
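  • The generalized Mercator mapping M(t) above can be sketched directly from the two formulas, using the Gudermannian function 2·atan(eʸ) − π/2 as the inverse m⁻¹ (function names are illustrative):

```python
import math

def m(phi):
    """Mercator latitude function; phi in radians."""
    return math.log(math.tan(math.pi / 4 + phi / 2))

def m_inv(y):
    """Inverse Mercator (the Gudermannian function): 2*atan(e^y) - pi/2."""
    return 2.0 * math.atan(math.exp(y)) - math.pi / 2

def map_to_intermediate(longitude, latitude, t):
    """Generalized Mercator M(t): spherical coordinates on S(t) for a point
    on the unit sphere. M(1) is the identity and (0, 0) is fixed for all t.
    """
    return t * longitude, m_inv(t * m(latitude))
```

At t = 1 the mapping returns the original coordinates, and for every t the origin (0, 0) stays fixed, matching the two anchor properties stated above.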
  • In addition, the placing of visual indicia or visuals, such as three dimensional columns or bubbles, may need a smoothly varying local coordinate system at each point on the three dimensional surface. A spherical map may utilize these visuals to represent a number of time-bound categorized data points at a particular geographic location.
  • Based upon the spherical coordinate system, the surface normal at the image of the point (longitude, latitude) is:
      • (cos latitude_t cos longitude_t, cos latitude_t sin longitude_t, sin latitude_t)
        The north and east pointing directions are the tangent vectors of the meridian and parallel at any point. These vectors may form the local coordinate system suitable for placing one or more of the example visuals described herein.
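  • The surface normal and the north/east tangent directions can be computed from the spherical coordinates as follows (an illustrative sketch; the vector layout and names are assumptions):

```python
import math

def local_frame(lon_t, lat_t):
    """Unit surface normal plus east/north tangent vectors at spherical
    coordinates (lon_t, lat_t); angles in radians.

    The three vectors are mutually orthogonal, forming a local coordinate
    system suitable for placing columns or bubbles on the surface.
    """
    cl, sl = math.cos(lat_t), math.sin(lat_t)
    co, so = math.cos(lon_t), math.sin(lon_t)
    normal = (cl * co, cl * so, sl)
    east = (-so, co, 0.0)              # tangent along the parallel
    north = (-sl * co, -sl * so, cl)   # tangent along the meridian
    return normal, east, north
```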
  • FIG. 3 illustrates a region map 300. The region map 300 is a particular form of digital map 124 presenting regional information for an area, such as individual states within the United States of America. Red state/blue state election maps are a quintessential example.
  • Choosing how to color the region map 300 becomes more difficult when the data includes multiple categories for a location. For example, if a data set includes Democratic, Republican, and Independent votes for each state, then there may be more than one way to shade each state. For example, a state that was won by Republicans may be assigned a full red color (without shading) regardless of a margin of victory. On the other hand, a state that was won by Republicans may be assigned a lighter shade of red if Republicans only won by a narrow margin (e.g., shading based on category value within location).
  • When a location is assigned a single category, the map color component 122-2 may assign colors without any shading. For instance, assume the map color component 122-2 needs to assign colors to the region map 300 utilizing a data set as shown in Table 1, as follows:
  • TABLE 1
    State Party
    NY D
    FL R
    NJ D

    The data set of Table 1 provides locations representing the three states of New York, Florida and New Jersey. The data set also provides a single category for each state. The single category is political party affiliation, with values indicating either Democratic or Republican.
  • The map color component 122-2 may select a particular color to represent each state based on comparison of category values assigned to each state. For instance, assume a blue color gradient is assigned to the Democratic Party, and a red color gradient is assigned to the Republican Party. In this instance, the states of New York and New Jersey may be assigned a color blue, while the state of Florida may be assigned a color red. As there is a single category for each location, color shading within each color is not necessary to visually convey category information. A single color is sufficient for a viewer to understand whether a state has voted Democrat or Republican.
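  • The single-category assignment described above can be sketched as a direct lookup (the data structure and names are hypothetical, for illustration only):

```python
# Hypothetical party-to-color mapping; the patent does not prescribe one.
PARTY_COLORS = {"D": "blue", "R": "red"}

def assign_colors(rows):
    """Map each (state, party) pair from a Table 1-style data set to a
    single unshaded color."""
    return {state: PARTY_COLORS[party] for state, party in rows}

# The data set of Table 1.
table1 = [("NY", "D"), ("FL", "R"), ("NJ", "D")]
colors = assign_colors(table1)
```

With a single category per location, each state resolves to exactly one color with no shading required.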
  • When a single geographic location is assigned multiple categories, however, it becomes difficult for a single color to visually convey separate category values for each category assigned to a location. The map color component 122-2 may solve these and other problems by utilizing a color gradient having multiple shades of a given color. In some cases, a particular shade may reflect a particular category. In other cases, a particular shade may reflect a blending of multiple categories. In either case, a combination of color and color shades within the color may be used to convey information from multiple categories assigned to a single location.
  • Prior to shade selection, the map color component 122-2 may generate or retrieve a color gradient for the digital map 124 or region map 300. The map color component 122-2 may assign varying shades of colors from the color gradient to the location in a way that represents information from multiple categories assigned to the location. Region map 300 illustrates some examples of shades assigned from a color gradient having shades for the color grey.
  • The map color component 122-2 may select a shade of a color gradient based on a data set for a location. For instance, assume the map color component 122-2 needs to assign colors to the region map 300 representing states within the United States of America. Further assume the map color component 122-2 receives as input a data set as shown in Table 2, as follows:
  • TABLE 2
    State Party # Votes
    NJ Ind 7
    NJ Rep 2
    NJ Dem 10
    PA Ind 3
    PA Rep 2
    PA Dem 10

    The data set of Table 2 has three columns. The first column includes data representing the various states of New Jersey and Pennsylvania. The second and third columns include data representing multiple categories for each of the states of New Jersey and Pennsylvania. Column 2 indicates a first category of political party affiliation. Column 3 indicates a second category of a number of votes received for each political party in a recent election.
  • The map color component 122-2 may select a particular shade to represent each state based on comparison of values across the multiple categories assigned to each state. For instance, assume a blue color gradient is assigned to the Democratic Party, a red color gradient is assigned to the Republican Party, and a green color gradient is assigned to the Independent party. When the map color component 122-2 evaluates the first category, it may assign a color (R/B/G) to each of the values representing a political party. In this case, the first category indicates that there are 3 political parties for each state. As such, the map color component 122-2 may either: (1) select a blending of the three colors (R/B/G) to create a blended shade representing the presence of three political parties within a state; (2) select a color (R/B/G) representing a sub-region of each state associated with each political party; or (3) evaluate other categories prior to color assignment to the location.
  • Assume the map color component 122-2 is configured for the third option. The map color component 122-2 may evaluate the second category of number of votes for each political party. In this case, the political party with a greatest number of votes for both states is the Democratic Party. As such, the map color component 122-2 may select a blue color gradient for color assignment. However, simply assigning both states a same color blue (or shade of blue) would not convey the number of votes received by the other political parties. To represent this information, the map color component 122-2 may compare values for number of votes received by the Republican Party, which in this case is 2 for each state. As that does not provide any differentiation between states, it may be eliminated from consideration in color selection. In comparing values for number of votes received by the Independent party, the map color component 122-2 receives values of 7 and 3 for New Jersey and Pennsylvania, respectively. As this does provide differentiation, the map color component 122-2 may select different shades of blue to represent this difference. For instance, the map color component 122-2 may select a first shade of blue for New Jersey and a second shade of blue for Pennsylvania as both are predominantly democratic. Further, the map color component 122-2 may make the first shade lighter or darker than the second shade in order to represent the variation in number of Independent party votes. In this manner, the region map 300 may quickly provide a viewer information from multiple categories for a single location based exclusively on selective color shading, which in turn, corresponds to two or more of the multiple categories assigned to the location from the data set of Table 2.
  • A particular shade may be selected from a color gradient based on various factors. For instance, one end of the color gradient may represent a maximum value of a category range (or multiple category ranges), while the other end of the color gradient may represent a minimum value of the category range (or multiple category ranges). Intermediate shades along the color gradient may then be scaled based on the maximum and minimum values.
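  • One way to sketch this scaling (an assumption; the patent does not prescribe a formula or gradient values):

```python
def select_shade(value, minimum, maximum, shades):
    """Pick a shade from a light-to-dark gradient list, scaled so the
    gradient's ends correspond to the category range's minimum and
    maximum values."""
    if maximum == minimum:
        return shades[0]
    fraction = (value - minimum) / (maximum - minimum)
    index = min(int(fraction * len(shades)), len(shades) - 1)
    return shades[index]

# Hypothetical four-step blue gradient, light to dark.
BLUE_GRADIENT = ["#E6F0FF", "#99C2FF", "#4D94FF", "#0066FF"]
```

For the Table 2 example, Independent vote counts of 7 (New Jersey) and 3 (Pennsylvania) scaled over a shared range would land on different shades of blue, conveying the variation between the two predominantly Democratic states.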
  • One challenge with region charts is that it is difficult to split up a region within the region map 300. In column charts, when there are several categories at a location, the map application 120 could split a column into either stacked or clustered segments. Similarly, for bubbles, the map application 120 can split a bubble into pie slices.
  • As previously described, the map application 120 may further comprise or implement a map scheduler component 122-3. The map scheduler component 122-3 provides the capability of presenting a large number of time-bound categorized data points on a spherical view or an intermediate view of a digital map 124. The map scheduler component 122-3 may assign work units to multiple processors and/or processor cores to efficiently render the digital map 124. For instance, the digital map 124 may present varying types of information in a same location at different times. In this case, a relative position of a data point needs to be determined for all data points sharing the same time and location. Work may be scheduled to a given processor among the set of processors to decrease an amount of time needed to calculate this relative position.
  • The map scheduler component 122-3 may render millions of time-bound clustered graphics elements using one or more dedicated processors, such as a graphical processing unit (GPU). In order to render several time-bound data points that share the same location in space, a rendering system needs to determine which data points are visible for any given point in time. When multiple points are visible at the same location at the same point in time, the rendering system also needs to dynamically determine (e.g., for every rendered frame) a relative position of a data point among all data points sharing the same time and location.
  • For instance, assume the map scheduler component 122-3 is given a data set as shown in Table 3 where the data points share a same location:
  • TABLE 3
    Values Category Timestamp
    3 Yellow 1
    4 Green 1
    7 Yellow 2
    2 Green 3
    1 Red 3
  • Various frames associated with the data set of Table 3 need to be rendered, one for each time stamp.
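  • A minimal sketch of the per-frame visibility and relative-position step described above (field names and the ordering rule are assumptions, not from the patent):

```python
def layout_frame(points, timestamp):
    """Return the data points visible at `timestamp`, each assigned a
    relative position (stacking slot) among points sharing that time
    and location. Sorting by category gives a deterministic order."""
    visible = [p for p in points if p["timestamp"] == timestamp]
    visible.sort(key=lambda p: p["category"])
    return [dict(p, position=i) for i, p in enumerate(visible)]

# The data points of Table 3 (all share one location).
table3 = [
    {"value": 3, "category": "Yellow", "timestamp": 1},
    {"value": 4, "category": "Green",  "timestamp": 1},
    {"value": 7, "category": "Yellow", "timestamp": 2},
    {"value": 2, "category": "Green",  "timestamp": 3},
    {"value": 1, "category": "Red",    "timestamp": 3},
]
```

At time stamp 1, two points share the location and receive stacking slots 0 and 1; at time stamp 2, only one point is visible. In the architecture described above, this per-frame computation is the work that the map scheduler component 122-3 would distribute across processors.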
  • FIG. 4 illustrates a block diagram of a centralized system 400. The centralized system 400 may implement some or all of the structure and/or operations for the apparatus 100 in a single computing entity, such as entirely within a single device 420.
  • The device 420 may comprise any electronic device capable of receiving, processing, and sending information for the apparatus 100. Examples of an electronic device may include without limitation an ultra-mobile device, a mobile device, a personal digital assistant (PDA), a mobile computing device, a smart phone, a telephone, a digital telephone, a cellular telephone, eBook readers, a handset, a one-way pager, a two-way pager, a messaging device, a computer, a personal computer (PC), a desktop computer, a laptop computer, a notebook computer, a netbook computer, a handheld computer, a tablet computer, a server, a server array or server farm, a web server, a network server, an Internet server, a work station, a mini-computer, a main frame computer, a supercomputer, a network appliance, a web appliance, a distributed computing system, multiprocessor systems, processor-based systems, consumer electronics, programmable consumer electronics, game devices, television, digital television, set top box, wireless access point, base station, subscriber station, mobile subscriber center, radio network controller, router, hub, gateway, bridge, switch, machine, or combination thereof. The embodiments are not limited in this context.
  • The device 420 may execute processing operations or logic for the apparatus 100 using a processing component 430. The processing component 430 may comprise various hardware elements, software elements, or a combination of both. Examples of hardware elements may include devices, logic devices, components, processors, microprocessors, circuits, processor circuits, circuit elements (e.g., transistors, resistors, capacitors, inductors, and so forth), integrated circuits, application specific integrated circuits (ASIC), programmable logic devices (PLD), digital signal processors (DSP), field programmable gate array (FPGA), memory units, logic gates, registers, semiconductor device, chips, microchips, chip sets, and so forth. Examples of software elements may include software components, programs, applications, computer programs, application programs, system programs, software development programs, machine programs, operating system software, middleware, firmware, software modules, routines, subroutines, functions, methods, procedures, software interfaces, application program interfaces (API), instruction sets, computing code, computer code, code segments, computer code segments, words, values, symbols, or any combination thereof. Determining whether an embodiment is implemented using hardware elements and/or software elements may vary in accordance with any number of factors, such as desired computational rate, power levels, heat tolerances, processing cycle budget, input data rates, output data rates, memory resources, data bus speeds and other design or performance constraints, as desired for a given implementation.
  • The device 420 may execute communications operations or logic for the apparatus 100 using communications component 440. The communications component 440 may implement any well-known communications techniques and protocols, such as techniques suitable for use with packet-switched networks (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), circuit-switched networks (e.g., the public switched telephone network), or a combination of packet-switched networks and circuit-switched networks (with suitable gateways and translators). The communications component 440 may include various types of standard communication elements, such as one or more communications interfaces, network interfaces, network interface cards (NIC), radios, wireless transmitters/receivers (transceivers), wired and/or wireless communication media, physical connectors, and so forth. By way of example, and not limitation, communication media 412, 442 include wired communications media and wireless communications media. Examples of wired communications media may include a wire, cable, metal leads, printed circuit boards (PCB), backplanes, switch fabrics, semiconductor material, twisted-pair wire, co-axial cable, fiber optics, a propagated signal, and so forth. Examples of wireless communications media may include acoustic, radio-frequency (RF) spectrum, infrared and other wireless media.
  • The device 420 may communicate with other devices 410, 450 over a communications media 412, 442, respectively, using communications signals 414, 444, respectively, via the communications component 440. The devices 410, 450 may be internal or external to the device 420 as desired for a given implementation.
  • As shown in FIG. 4, the device 420 may implement the entire apparatus 100 to access a map database implemented by another device, such as a GIS 460 implemented by the device 450. The apparatus 100 may also render the digital map 124 with another device implementing some or all of apparatus 100, such as the device 410. The digital map 124 may be communicated in any number of ways, such as through a messaging interface (e.g., email, short message service (SMS), multimedia message service (MMS), instant messaging (IM), and so forth), shared network storage space, peer-to-peer communications, web technologies (e.g., a web page), and other communications modalities. The device 410 may also use apparatus 100 to communicate with the GIS 460 in a manner similar to the device 420. The device 420 may further implement other platform components common to a computing and/or communications device, such as described with reference to FIG. 11.
  • FIG. 5 illustrates a block diagram of a distributed system 500. The distributed system 500 may distribute portions of the structure and/or operations for the apparatus 100 across multiple computing entities. Examples of distributed system 500 may include without limitation a client-server architecture, a 3-tier architecture, an N-tier architecture, a tightly-coupled or clustered architecture, a peer-to-peer architecture, a master-slave architecture, a shared database architecture, and other types of distributed systems. The embodiments are not limited in this context.
  • The distributed system 500 may comprise a client device 510 and a server device 550. In general, the client device 510 and the server device 550 may be the same or similar to the client device 220 as described with reference to FIG. 2. For instance, the client device 510 and the server device 550 may each comprise a processing component 530 and a communications component 540 which are the same or similar to the processing component 430 and the communications component 440, respectively, as described with reference to FIG. 4. In another example, the devices 510, 550 may communicate over a communications media 512 using communications signals 514 via the communications components 540.
  • The client device 510 may comprise or employ one or more client programs that operate to perform various methodologies in accordance with the described embodiments. In one embodiment, for example, the client device 510 may implement a portion of the apparatus 100, such as the map application 120, for example.
  • The server device 550 may comprise or employ one or more server programs that operate to perform various methodologies in accordance with the described embodiments. In one embodiment, for example, the server device 550 may implement a portion of the apparatus 100, such as the GIS 460, for example. The distributed model may be suitable for sharing map information among multiple devices or users.
  • FIG. 6 illustrates an embodiment of a logic flow 600. The logic flow 600 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • In the illustrated embodiment shown in FIG. 6, the logic flow 600 is directed towards processing a set of views at block 602. For example, a set of views pertaining to a digital map may include a three dimensional view, a plane view and a set of intermediate views corresponding to a visual transition between the three dimensional view and the plane view. At a particular point in time in the visual transition, an intermediate view may refer to an intermediate sphere representing a map of geographic locations in which one example geographic location may be a region.
  • The logic flow 600 may refer to processing a control directive at block 604. The control directive causes a transition from the three dimensional view to the plane view or vice versa. For example, upon receipt of a control directive to transition to a sphere map (e.g., from a flat map), the logic flow 600 proceeds to block 606A. As another example, upon receipt of a control directive to transition to a flat map (e.g., from a sphere map), the logic flow 600 proceeds to block 606B.
  • The logic flow 600 may refer to presenting the three dimensional view at block 606A. The logic flow 600 may refer to presenting the plane view of the digital map at block 606B. For example, the block 606A may produce a rendering of the global map as a sphere, and the block 606B may produce a rendering of the flat map as a plane.
  • The logic flow 600 may refer to presenting the set of intermediate views at block 608. For example, the logic flow 600 may render a progression of intermediate views as intermediate spheres that increase or decrease in radius between two points in time such that the visual transition between the three dimensional view and the plane view is presented via animation. In some embodiments, the visual transition simulates the unfurling of the flat map into a globe or, vice versa, the furling of the globe into the flat map.
  • The logic flow 600 proceeds to either the block 606A or 606B, depending upon whether the control directive corresponded to a transition to the three dimensional view or the plane view, respectively. The logic flow 600 returns to the block 604 and processes another control directive. The logic flow 600 generally enables presentation of a view of a spatially correct surface and a view of an entire surface, or at least a considerable portion thereof.
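The control loop of logic flow 600 can be sketched in a few lines (a hypothetical Python sketch; the directive names and view labels are assumptions, and rendering is reduced to recording which view is presented):

```python
def logic_flow_600(directives):
    """Return the sequence of presented views for a list of control
    directives ('to_flat' or 'to_sphere'), starting at the sphere map."""
    view = "sphere"                        # block 602: initial 3D view
    frames = [view]
    for directive in directives:           # block 604: control directive
        target = "plane" if directive == "to_flat" else "sphere"
        if target != view:
            frames.append("intermediate")  # block 608: animated transition
            view = target
        frames.append(view)                # block 606A (sphere) / 606B (plane)
    return frames

print(logic_flow_600(["to_flat", "to_sphere"]))
# → ['sphere', 'intermediate', 'plane', 'intermediate', 'sphere']
```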
  • FIG. 7 illustrates an embodiment of a logic flow 700. The logic flow 700 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • In the illustrated embodiment shown in FIG. 7, the logic flow 700 processes data representing a spatially correct surface at block 702. A unit sphere, as one example, provides a suitable model for mapping a spatially correct surface, such as a globe. It is appreciated that other example models are capable of representing a spatially correct surface in accordance with the embodiments described herein. For example, some models are configured to map spheroid objects while some models are designed for other objects, such as brains. Any one of these models can be modified to illustrate a transition between a three-dimensional view of the spatially correct surface and a two-dimensional view.
  • The logic flow 700 may process mapping data between the spatially correct surface and a substantially spherical coordinate system at block 704. Any appropriate coordinate system, such as a geographic coordinate system, may define locations on the spatially correct surface. Coordinates for those locations may be projected onto coordinates in the substantially spherical coordinate system. One example substantially spherical coordinate system may define a spheroid with larger dimensions than the unit sphere. In one embodiment, by mapping the spatially correct surface onto this coordinate system, the resulting surface data may no longer be represented as spatially correct.
  • The logic flow 700 may generate a set of intermediate views at block 706. Coordinates of the resulting surface data may be mapped to a second set of substantially spherical coordinates on a three dimensional surface (e.g., another spheroid) with larger dimensions than the spheroid. This process may be repeated until a set of coordinates for each intermediate view is determined. Presentation of these views may be accomplished by determining a transition rate for rendering the intermediate views on a display element. This transition rate may represent an aesthetically pleasing and smooth animation illustrating a transition between a view of the spatially correct surface and another view, such as a plane view.
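The repeated re-mapping onto successively larger spheroids and the transition rate of block 706 can be illustrated with a short sketch (hypothetical names; linear radius spacing and a fixed animation duration are assumptions, not values from the disclosure):

```python
def intermediate_radii(r_start=1.0, r_end=40.0, views=8):
    """Radii of the successively larger spheroids, one per intermediate
    view. Linear spacing is an assumption; any easing curve would do."""
    step = (r_end - r_start) / (views - 1)
    return [r_start + step * i for i in range(views)]

def transition_schedule(radii, duration_s=1.2):
    """Pair each intermediate view with a presentation time, giving a
    constant transition rate across the whole animation."""
    dt = duration_s / (len(radii) - 1)
    return [(round(i * dt, 6), r) for i, r in enumerate(radii)]

# One (time, radius) pair per intermediate view:
schedule = transition_schedule(intermediate_radii())
```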
  • FIG. 8 illustrates an embodiment of a logic flow 800. The logic flow 800 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • In the illustrated embodiment shown in FIG. 8, the logic flow 800 commences unit sphere processing at block 802. The logic flow 800 may be directed to generating a number of intermediate spheres from the unit sphere at block 804. These intermediate spheres may correspond to different radius values as desired for the transition between views. As mentioned herein, a number of mathematical transformations, known as projections, may be applied onto coordinates of the unit sphere in order to generate data corresponding to a number of intermediate spheres. For example, the logic flow 800 may perform a projection from the unit sphere onto planar coordinates followed by an inverse projection of those coordinates onto another spherical coordinate system having a different radius from the unit sphere.
  • The logic flow 800 may be directed to processing the intermediate spheres to illustrate a transition between views at block 806. In one embodiment, the logic flow 800 may assign tasks related to the generation of the number of intermediate spheres to a plurality of processors. By way of example, assigning these tasks may include assigning work units to multiple processors and/or processor cores to efficiently render a progression of intermediate views of the intermediate spheres. In one embodiment, each view corresponds to a time value that also is mathematically related to the radius value. As the time value decreases toward zero, the radius value of a corresponding intermediate sphere increases, and vice versa. In one embodiment, the logic flow 800 may transform a view of a first intermediate sphere and a time value into a view of a second intermediate sphere having a radius greater in size than a radius of the first intermediate sphere. Hence, each intermediate sphere along the progression of intermediate views increases or decreases in radius between points in time.
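The inverse relationship between the time value and the radius value can be illustrated with one possible mapping (the 1/t relation and the clamp value are assumptions for illustration, not the patent's formula):

```python
def radius_for_time(t, t_min=0.05):
    """Radius of the intermediate sphere for a time value t in (0, 1].
    t = 1 corresponds to the unit sphere; as t decreases toward zero the
    radius grows, locally flattening the visible surface. The 1/t
    relation is an assumed example of a mathematically inverse mapping."""
    t = max(t, t_min)   # clamp so the radius stays finite as t -> 0
    return 1.0 / t

print(radius_for_time(0.5))  # → 2.0
```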
  • FIG. 9 illustrates an embodiment of a logic flow 900. The logic flow 900 may be representative of some or all of the operations executed by one or more embodiments described herein.
  • In the illustrated embodiment shown in FIG. 9, the logic flow 900 performs a projection of geographic locations to a two dimensional map at block 902. Such a projection may refer to a mathematical projection or transformation between points in different coordinate systems. A Mercator projection is one example projection between a global map and a flat map in which three dimensional geographic coordinates are transformed into two dimensional coordinates. To illustrate an embodiment of the present disclosure, consider the following example adaptation of the Mercator projection. These geographic locations may correspond to coordinates in the three dimensional map's coordinate system. If a set of spherical coordinates refers to the global map's spherical coordinate system, the three dimensional surface is considered spatially correct. For example, the set of spherical coordinates may define a particular geographic location on a three dimensional surface. When the logic flow 900 performs the mathematical projection onto a set of corresponding coordinates on the two dimensional map at block 902, these coordinates may be modified according to a desired radius value, in one embodiment. This radius value may refer to an intermediate sphere modeling a global map that has been unfurled or expanded.
  • The logic flow 900 may generate data to represent an inverse between the two dimensional map and an intermediate three dimensional map having a desired radius at block 904. This inverse, for example, may refer to a Mercator projection inverse in which the set of corresponding coordinates on the two dimensional map are mapped to a set of spherical coordinates on the intermediate three dimensional map.
  • The logic flow 900 may transform the data into spherical coordinates for the geographic locations at block 906. For example, spherical coordinates may be converted into geographic locations. A coloring assignment may be applied to the intermediate three dimensional map to differentiate the geographic locations. Visuals may be placed upon these coordinates to indicate data points, including data points associated with different categories.
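Blocks 902 through 906 can be sketched as a forward Mercator projection followed by its inverse onto an intermediate sphere of a desired radius (a minimal Python sketch using the standard spherical Mercator formulas; the sample location and radius value are assumptions):

```python
import math

def geo_to_mercator(lat_deg, lon_deg):
    """Block 902: Mercator projection of a geographic location (in
    degrees) onto two dimensional map coordinates (unit-radius model)."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    return lam, math.log(math.tan(math.pi / 4.0 + phi / 2.0))

def mercator_to_sphere(x, y, radius):
    """Blocks 904/906: inverse Mercator onto an intermediate sphere of
    the desired radius, then conversion of the resulting spherical
    coordinates into three dimensional Cartesian coordinates."""
    lon = x
    lat = 2.0 * math.atan(math.exp(y)) - math.pi / 2.0
    return (radius * math.cos(lat) * math.cos(lon),
            radius * math.cos(lat) * math.sin(lon),
            radius * math.sin(lat))

# Re-project a point on the globe onto a larger intermediate sphere:
x, y = geo_to_mercator(47.6, -122.3)      # hypothetical location
point = mercator_to_sphere(x, y, radius=3.0)
```

Varying `radius` over the values generated for the intermediate views yields the unfurling (or furling) progression described above.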
  • FIG. 10 illustrates an embodiment of an exemplary computing architecture 1000 suitable for implementing various embodiments as previously described. In one embodiment, the computing architecture 1000 may comprise or be implemented as part of an electronic device. Examples of an electronic device may include those described with reference to FIG. 4, among others. The embodiments are not limited in this context.
  • As used in this application, the terms “system” and “component” are intended to refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution, examples of which are provided by the exemplary computing architecture 1000. For example, a component can be, but is not limited to being, a process running on a processor, a processor, a hard disk drive, multiple storage drives (of optical and/or magnetic storage medium), an object, an executable, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be a component. One or more components can reside within a process and/or thread of execution, and a component can be localized on one computer and/or distributed between two or more computers. Further, components may be communicatively coupled to each other by various types of communications media to coordinate operations. The coordination may involve the uni-directional or bi-directional exchange of information. For instance, the components may communicate information in the form of signals communicated over the communications media. The information can be implemented as signals allocated to various signal lines. In such allocations, each message is a signal. Further embodiments, however, may alternatively employ data messages. Such data messages may be sent across various connections. Exemplary connections include parallel interfaces, serial interfaces, and bus interfaces.
  • The computing architecture 1000 includes various common computing elements, such as one or more processors, multi-core processors, co-processors, memory units, chipsets, controllers, peripherals, interfaces, oscillators, timing devices, video cards, audio cards, multimedia input/output (I/O) components, power supplies, and so forth. The embodiments, however, are not limited to implementation by the computing architecture 1000.
  • As shown in FIG. 10, the computing architecture 1000 comprises a processing unit 1004, a system memory 1006 and a system bus 1008. The processing unit 1004 can be any of various commercially available processors, including without limitation an AMD® Athlon®, Duron® and Opteron® processors; ARM® application, embedded and secure processors; IBM® and Motorola® DragonBall® and PowerPC® processors; IBM and Sony® Cell processors; Intel® Celeron®, Core (2) Duo®, Itanium®, Pentium®, Xeon®, and XScale® processors; and similar processors. Dual microprocessors, multi-core processors, and other multi-processor architectures may also be employed as the processing unit 1004.
  • The system bus 1008 provides an interface for system components including, but not limited to, the system memory 1006 to the processing unit 1004. The system bus 1008 can be any of several types of bus structure that may further interconnect to a memory bus (with or without a memory controller), a peripheral bus, and a local bus using any of a variety of commercially available bus architectures. Interface adapters may connect to the system bus 1008 via a slot architecture. Example slot architectures may include without limitation Accelerated Graphics Port (AGP), Card Bus, (Extended) Industry Standard Architecture ((E)ISA), Micro Channel Architecture (MCA), NuBus, Peripheral Component Interconnect (Extended) (PCI(X)), PCI Express, Personal Computer Memory Card International Association (PCMCIA), and the like.
  • The computing architecture 1000 may comprise or implement various articles of manufacture. An article of manufacture may comprise a computer-readable storage medium to store logic. Examples of a computer-readable storage medium may include any tangible media capable of storing electronic data, including volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of logic may include executable computer program instructions implemented using any suitable type of code, such as source code, compiled code, interpreted code, executable code, static code, dynamic code, object-oriented code, visual code, and the like. Embodiments may also be at least partly implemented as instructions contained in or on a non-transitory computer-readable medium, which may be read and executed by one or more processors to enable performance of the operations described herein.
  • The system memory 1006 may include various types of computer-readable storage media in the form of one or more higher speed memory units, such as read-only memory (ROM), random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, polymer memory such as ferroelectric polymer memory, ovonic memory, phase change or ferroelectric memory, silicon-oxide-nitride-oxide-silicon (SONOS) memory, magnetic or optical cards, an array of devices such as Redundant Array of Independent Disks (RAID) drives, solid state memory devices (e.g., USB memory, solid state drives (SSD)), and any other type of storage media suitable for storing information. In the illustrated embodiment shown in FIG. 10, the system memory 1006 can include non-volatile memory 1010 and/or volatile memory 1012. A basic input/output system (BIOS) can be stored in the non-volatile memory 1010.
  • The computer 1002 may include various types of computer-readable storage media in the form of one or more lower speed memory units, including an internal (or external) hard disk drive (HDD) 1014, a magnetic floppy disk drive (FDD) 1016 to read from or write to a removable magnetic disk 1018, and an optical disk drive 1020 to read from or write to a removable optical disk 1022 (e.g., a CD-ROM or DVD). The HDD 1014, FDD 1016 and optical disk drive 1020 can be connected to the system bus 1008 by a HDD interface 1024, an FDD interface 1026 and an optical drive interface 1028, respectively. The HDD interface 1024 for external drive implementations can include at least one or both of Universal Serial Bus (USB) and IEEE 1394 interface technologies.
  • The drives and associated computer-readable media provide volatile and/or nonvolatile storage of data, data structures, computer-executable instructions, and so forth. For example, a number of program modules can be stored in the drives and memory units 1010, 1012, including an operating system 1030, one or more application programs 1032, other program modules 1034, and program data 1036. In one embodiment, the one or more application programs 1032, other program modules 1034, and program data 1036 can include, for example, the various applications and/or components of the system 100.
  • A user can enter commands and information into the computer 1002 through one or more wire/wireless input devices, for example, a keyboard 1038 and a pointing device, such as a mouse 1040. Other input devices may include microphones, infra-red (IR) remote controls, radio-frequency (RF) remote controls, game pads, stylus pens, card readers, dongles, finger print readers, gloves, graphics tablets, joysticks, keyboards, retina readers, touch screens (e.g., capacitive, resistive, etc.), trackballs, trackpads, sensors, styluses, and the like. These and other input devices are often connected to the processing unit 1004 through an input device interface 1042 that is coupled to the system bus 1008, but can be connected by other interfaces such as a parallel port, IEEE 1394 serial port, a game port, a USB port, an IR interface, and so forth.
  • A monitor 1044 or other type of display device is also connected to the system bus 1008 via an interface, such as a video adaptor 1046. The monitor 1044 may be internal or external to the computer 1002. In addition to the monitor 1044, a computer typically includes other peripheral output devices, such as speakers, printers, and so forth.
  • The computer 1002 may operate in a networked environment using logical connections via wire and/or wireless communications to one or more remote computers, such as a remote computer 1048. The remote computer 1048 can be a workstation, a server computer, a router, a personal computer, portable computer, microprocessor-based entertainment appliance, a peer device or other common network node, and typically includes many or all of the elements described relative to the computer 1002, although, for purposes of brevity, only a memory/storage device 1050 is illustrated. The logical connections depicted include wire/wireless connectivity to a local area network (LAN) 1052 and/or larger networks, for example, a wide area network (WAN) 1054. Such LAN and WAN networking environments are commonplace in offices and companies, and facilitate enterprise-wide computer networks, such as intranets, all of which may connect to a global communications network, for example, the Internet.
  • When used in a LAN networking environment, the computer 1002 is connected to the LAN 1052 through a wire and/or wireless communication network interface or adaptor 1056. The adaptor 1056 can facilitate wire and/or wireless communications to the LAN 1052, which may also include a wireless access point disposed thereon for communicating with the wireless functionality of the adaptor 1056.
  • When used in a WAN networking environment, the computer 1002 can include a modem 1058, or is connected to a communications server on the WAN 1054, or has other means for establishing communications over the WAN 1054, such as by way of the Internet. The modem 1058, which can be internal or external and a wire and/or wireless device, connects to the system bus 1008 via the input device interface 1042. In a networked environment, program modules depicted relative to the computer 1002, or portions thereof, can be stored in the remote memory/storage device 1050. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers can be used.
  • The computer 1002 is operable to communicate with wire and wireless devices or entities using the IEEE 802 family of standards, such as wireless devices operatively disposed in wireless communication (e.g., IEEE 802.11 over-the-air modulation techniques). This includes at least Wi-Fi (or Wireless Fidelity), WiMax, and Bluetooth™ wireless technologies, among others. Thus, the communication can be a predefined structure as with a conventional network or simply an ad hoc communication between at least two devices. Wi-Fi networks use radio technologies called IEEE 802.11x (a, b, g, n, etc.) to provide secure, reliable, fast wireless connectivity. A Wi-Fi network can be used to connect computers to each other, to the Internet, and to wire networks (which use IEEE 802.3-related media and functions).
  • FIG. 11 illustrates a block diagram of an exemplary communications architecture 1100 suitable for implementing various embodiments as previously described. The communications architecture 1100 includes various common communications elements, such as a transmitter, receiver, transceiver, radio, network interface, baseband processor, antenna, amplifiers, filters, power supplies, and so forth. The embodiments, however, are not limited to implementation by the communications architecture 1100.
  • As shown in FIG. 11, the communications architecture 1100 comprises one or more clients 1102 and servers 1104. The clients 1102 may implement the client device 510. The servers 1104 may implement the server device 550. The clients 1102 and the servers 1104 are operatively connected to one or more respective client data stores 1108 and server data stores 1110 that can be employed to store information local to the respective clients 1102 and servers 1104, such as cookies and/or associated contextual information.
  • The clients 1102 and the servers 1104 may communicate information between each other using a communication framework 1106. The communications framework 1106 may implement any well-known communications techniques and protocols. The communications framework 1106 may be implemented as a packet-switched network (e.g., public networks such as the Internet, private networks such as an enterprise intranet, and so forth), a circuit-switched network (e.g., the public switched telephone network), or a combination of a packet-switched network and a circuit-switched network (with suitable gateways and translators).
  • The communications framework 1106 may implement various network interfaces arranged to accept, communicate, and connect to a communications network. A network interface may be regarded as a specialized form of an input output interface. Network interfaces may employ connection protocols including without limitation direct connect, Ethernet (e.g., thick, thin, twisted pair 10/100/1000 Base T, and the like), token ring, wireless network interfaces, cellular network interfaces, IEEE 802.11a-x network interfaces, IEEE 802.16 network interfaces, IEEE 802.20 network interfaces, and the like. Further, multiple network interfaces may be used to engage with various communications network types. For example, multiple network interfaces may be employed to allow for the communication over broadcast, multicast, and unicast networks. Should processing requirements dictate greater speed and capacity, distributed network controller architectures may similarly be employed to pool, load balance, and otherwise increase the communicative bandwidth required by clients 1102 and the servers 1104. A communications network may be any one or a combination of wired and/or wireless networks including without limitation a direct interconnection, a secured custom connection, a private network (e.g., an enterprise intranet), a public network (e.g., the Internet), a Personal Area Network (PAN), a Local Area Network (LAN), a Metropolitan Area Network (MAN), an Operating Missions as Nodes on the Internet (OMNI), a Wide Area Network (WAN), a wireless network, a cellular network, and other communications networks.
  • Some embodiments may be described using the expression “one embodiment” or “an embodiment” along with their derivatives. These terms mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment. Further, some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. These terms are not necessarily intended as synonyms for each other. For example, some embodiments may be described using the terms “connected” and/or “coupled” to indicate that two or more elements are in direct physical or electrical contact with each other. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
  • It is emphasized that the Abstract of the Disclosure is provided to allow a reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as a separate embodiment. In the appended claims, the terms “including” and “in which” are used as the plain-English equivalents of the respective terms “comprising” and “wherein,” respectively. Moreover, the terms “first,” “second,” “third,” and so forth, are used merely as labels, and are not intended to impose numerical requirements on their objects.
  • What has been described above includes examples of the disclosed architecture. It is, of course, not possible to describe every conceivable combination of components and/or methodologies, but one of ordinary skill in the art may recognize that many further combinations and permutations are possible. Accordingly, the novel architecture is intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims.

Claims (20)

1. An apparatus, comprising:
a logic circuit; and
a map transition component operative on the logic circuit to render a set of views of a digital map for presentation on a display element in which a three dimensional view comprises data representing a spatially correct surface and at least one intermediate view corresponds to a transition between views, each intermediate view to comprise mapping data between the spatially correct surface and a substantially spherical coordinate system.
2. The apparatus of claim 1, wherein the map transition component is operative to generate the mapping data based upon a projection between one set of spherical coordinates and another set of spherical coordinates.
3. The apparatus of claim 2, wherein the set of spherical coordinates and the other set of spherical coordinates correspond to different points in time in the transition.
4. The apparatus of claim 3, wherein corresponding coordinates of the set of spherical coordinates and the other set of spherical coordinates are represented by a color.
5. The apparatus of claim 1, wherein the at least one intermediate view corresponds to at least a portion of a transition between the three dimensional view and a two dimensional view of the digital map.
6. The apparatus of claim 1, wherein the map transition component is operative on the logic circuit to modify a projection from the spatially correct surface to a two dimensional surface using a time value.
7. The apparatus of claim 1, wherein the set of views comprises a sequence of views corresponding to an animated transition from a view representing the digital map as a three dimensional surface to a view representing the digital map as a two dimensional surface.
8. The apparatus of claim 1, wherein the map transition component is operative to determine a transition rate associated with presenting the set of views.
9. The apparatus of claim 1, wherein the map transition component is operative to render a view of a surface in which at least one region of the surface corresponds to a color of a range of colors.
10. The apparatus of claim 1, wherein the map transition component is operative to render a number of time-bound categorized data points on a view of the set of views.
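Claims 2 and 3 above describe mapping data derived from a projection between two sets of spherical coordinates corresponding to different points in time in the transition. As an illustrative sketch only (the claims do not specify an interpolation scheme, and the function name is invented here), such a per-frame mapping between a start and an end spherical coordinate could look like:

```python
import math

def interpolate_spherical(start, end, t):
    """Linearly interpolate between two (lat, lon) spherical coordinates
    in radians, taking the shorter way around in longitude.
    A simplified stand-in for the claimed 'mapping data based upon a
    projection between one set of spherical coordinates and another.'"""
    lat0, lon0 = start
    lat1, lon1 = end
    # Wrap the longitude difference into [-pi, pi) so the path is the short arc.
    dlon = (lon1 - lon0 + math.pi) % (2 * math.pi) - math.pi
    return (lat0 + t * (lat1 - lat0), lon0 + t * dlon)
```

With `t` swept from 0 to 1 at the transition rate of claim 8, each returned coordinate pair defines one intermediate view.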
11. A computer-implemented method, comprising:
processing a unit sphere representing a three dimensional view of a digital map;
generating, by circuitry, a number of intermediate spheres from the unit sphere in which each intermediate sphere corresponds to a different radius; and
processing the number of intermediate spheres to illustrate a transition between the three dimensional view and a plane view.
12. The method of claim 11, comprising assigning tasks related to the generation of the number of intermediate spheres to a plurality of processors.
13. The method of claim 11, comprising processing a control directive to switch between the three dimensional view and the plane view, the plane view depicting substantially all map data as a two-dimensional surface.
14. The method of claim 11, comprising projecting data represented by the unit sphere into data representing the plane view of the digital map, and using a desired radius of an intermediate sphere to transform the data representing the plane view into data representing a view in a substantially spherical coordinate system.
15. The method of claim 11, comprising generating a progression of intermediate views corresponding to the number of intermediate spheres in which each intermediate sphere increases or decreases in radius between points in time along the progression.
16. The method of claim 11, comprising transforming a view of a first intermediate sphere and a time value into a view of a second intermediate sphere having a radius greater in size than a radius of the first intermediate sphere.
17. The method of claim 16, wherein the radius of the second intermediate sphere is computed from the time value.
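Claims 11 and 15–17 describe a progression of intermediate spheres whose radii are computed from time values and increase or decrease monotonically along the transition. A minimal Python sketch of one such radius schedule; the exponential form and the bounds `r_min`/`r_max` are illustrative assumptions, not taken from the claims:

```python
def radius_from_time(t, r_min=1.0, r_max=50.0):
    """Hypothetical schedule: radius grows monotonically as t goes 0 -> 1,
    so each intermediate sphere is larger than the one before it."""
    return r_min * (r_max / r_min) ** t

def intermediate_radii(n):
    """A progression of n intermediate spheres between the unit sphere
    (t = 0) and a nearly flat view (t = 1)."""
    return [radius_from_time(i / (n - 1)) for i in range(n)]
```

Sweeping `t` in the opposite direction yields the plane-to-globe transition; the claims leave the exact schedule open.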
18. At least one computer-readable storage medium comprising instructions that, when executed, cause a system to:
perform a projection of geographic locations to a two dimensional map;
generate data to represent an inverse between the two dimensional map and locations on a three dimensional map having a desired radius; and
transform the data into spherical coordinates on the three-dimensional map that correspond to the geographic locations.
19. The computer-readable storage medium of claim 18, comprising instructions that when executed cause the system to:
perform a Mercator projection between a unit sphere and the two-dimensional map;
modify data representing the Mercator projection according to time values that correspond to a sequence of intermediate spheres, wherein the sequence of intermediate spheres represents a visual transition between the unit sphere and the two-dimensional map; and
perform an inverse Mercator projection between the two-dimensional map and each intermediate sphere.
20. The computer-readable storage medium of claim 18, comprising instructions that when executed cause the system to:
assign colors representing categories to each intermediate sphere.
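Claim 19 recites a forward Mercator projection from the unit sphere to the two-dimensional map, followed by an inverse Mercator projection onto each intermediate sphere. A sketch of those two projections in Python (the function names are invented for illustration, and the radius parameter `r` stands in for the radii of the claimed intermediate spheres):

```python
import math

def mercator_forward(lat, lon):
    """Project a point on the unit sphere (radians) onto the plane map."""
    return lon, math.log(math.tan(math.pi / 4 + lat / 2))

def mercator_inverse(x, y, r=1.0):
    """Inverse Mercator onto a sphere of radius r. With r = 1 this exactly
    undoes mercator_forward; larger r yields the progressively flatter
    intermediate spheres of the claimed transition."""
    lat = 2 * math.atan(math.exp(y / r)) - math.pi / 2
    lon = x / r
    return lat, lon
```

Round-tripping with `r = 1` recovers the input coordinates; applying `mercator_inverse` with each radius in the intermediate-sphere sequence produces the sequence of views between the globe and the flat map.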
US14/251,813 2013-09-10 2014-04-14 Techniques to manage map information illustrating a transition between views Abandoned US20150070356A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
US14/251,813 US20150070356A1 (en) 2013-09-10 2014-04-14 Techniques to manage map information illustrating a transition between views
CN201480058945.7A CN105917384A (en) 2013-09-10 2014-09-09 Techniques to manage map information illustrating a transition between views
PCT/US2014/054679 WO2015038506A1 (en) 2013-09-10 2014-09-09 Techniques to manage map information illustrating a transition between views
EP14767264.6A EP3044763A1 (en) 2013-09-10 2014-09-09 Techniques to manage map information illustrating a transition between views

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361876100P 2013-09-10 2013-09-10
US14/251,813 US20150070356A1 (en) 2013-09-10 2014-04-14 Techniques to manage map information illustrating a transition between views

Publications (1)

Publication Number Publication Date
US20150070356A1 true US20150070356A1 (en) 2015-03-12

Family

ID=52625146

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/244,381 Active 2034-07-15 US9905043B2 (en) 2013-09-10 2014-04-03 Techniques to generate digital maps
US14/247,580 Abandoned US20150070379A1 (en) 2013-09-10 2014-04-08 Techniques to manage color representations for a digital map
US14/251,813 Abandoned US20150070356A1 (en) 2013-09-10 2014-04-14 Techniques to manage map information illustrating a transition between views

Family Applications Before (2)

Application Number Title Priority Date Filing Date
US14/244,381 Active 2034-07-15 US9905043B2 (en) 2013-09-10 2014-04-03 Techniques to generate digital maps
US14/247,580 Abandoned US20150070379A1 (en) 2013-09-10 2014-04-08 Techniques to manage color representations for a digital map

Country Status (6)

Country Link
US (3) US9905043B2 (en)
EP (3) EP3044763A1 (en)
CN (3) CN105531742A (en)
AR (1) AR097623A1 (en)
TW (1) TW201519183A (en)
WO (3) WO2015038506A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108694245A (en) * 2018-05-15 2018-10-23 东软集团股份有限公司 Multidimensional data display methods, device, readable storage medium storing program for executing and electronic equipment
CN109791554A (en) * 2016-08-12 2019-05-21 艾奎菲股份有限公司 System and method for automatically generating the metadata for media document
CN112001988A (en) * 2019-05-27 2020-11-27 珠海金山办公软件有限公司 Animation effect generation method and device
US11398078B2 (en) 2017-03-15 2022-07-26 Elbit Systems Ltd. Gradual transitioning between two-dimensional and three-dimensional augmented reality images

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9922426B2 (en) * 2016-01-25 2018-03-20 Google Llc Reducing latency in presenting map interfaces at client devices
US11593342B2 (en) 2016-02-01 2023-02-28 Smartshift Technologies, Inc. Systems and methods for database orientation transformation
US10585655B2 (en) 2016-05-25 2020-03-10 Smartshift Technologies, Inc. Systems and methods for automated retrofitting of customized code objects
US10110781B2 (en) * 2016-06-23 2018-10-23 Adobe Systems Incorporated Restoring the appearance of scans of damaged physical documents
US10089103B2 (en) 2016-08-03 2018-10-02 Smartshift Technologies, Inc. Systems and methods for transformation of reporting schema
EP3519774B1 (en) * 2016-09-29 2022-10-26 TomTom Traffic B.V. Methods and systems for generating parking related data
CN108133454B (en) * 2016-12-01 2021-06-08 阿里巴巴集团控股有限公司 Space geometric model image switching method, device and system and interaction equipment
US10678842B2 (en) * 2017-03-21 2020-06-09 EarthX, Inc. Geostory method and apparatus
US10681120B2 (en) * 2017-07-25 2020-06-09 Uber Technologies, Inc. Load balancing sticky session routing
US10528343B2 (en) * 2018-02-06 2020-01-07 Smartshift Technologies, Inc. Systems and methods for code analysis heat map interfaces
US10698674B2 (en) 2018-02-06 2020-06-30 Smartshift Technologies, Inc. Systems and methods for entry point-based code analysis and transformation
US10740075B2 (en) 2018-02-06 2020-08-11 Smartshift Technologies, Inc. Systems and methods for code clustering analysis and transformation
US10936803B2 (en) * 2018-04-02 2021-03-02 Microsoft Technology Licensing, Llc Aggregation and processing of hierarchical data for display of interactive user interface chart elements
CN110716682A (en) * 2018-07-13 2020-01-21 上海擎感智能科技有限公司 Map custom color matching method and system, storage medium and terminal
CN111695045B (en) * 2019-03-14 2023-08-11 北京嘀嘀无限科技发展有限公司 Thermodynamic diagram display and thermal data notification method and device
CN111694908A (en) * 2019-03-15 2020-09-22 丰图科技(深圳)有限公司 Data storage method, device and storage medium
CN111045767B (en) * 2019-11-26 2024-01-26 北京金和网络股份有限公司 Method and system for displaying map labels on map based on flexible data configuration
US11442969B2 (en) * 2020-04-24 2022-09-13 Capital One Services, Llc Computer-based systems configured for efficient entity resolution for database merging and reconciliation
CN115220615A (en) * 2022-07-29 2022-10-21 深圳华创电科技术有限公司 Situation interaction system based on geographic information system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040162675A1 (en) * 2001-11-01 2004-08-19 Dennis Moon System presenting meteorological information using a browser interface
US20090160859A1 (en) * 2007-12-20 2009-06-25 Steven Horowitz Systems and methods for presenting visualizations of media access patterns
US20120110501A1 (en) * 2010-11-03 2012-05-03 Samsung Electronics Co. Ltd. Mobile terminal and screen change control method based on input signals for the same
US20120233573A1 (en) * 2011-03-07 2012-09-13 Sas Institute Inc. Techniques to present hierarchical information using orthographic projections
US8471847B1 (en) * 2012-02-29 2013-06-25 Google Inc. Use of constructed three-dimensional geometries to efficiently represent time varying Cartesian data
US20140163885A1 (en) * 2012-12-07 2014-06-12 Caterpillar Inc. Terrain map updating system
US8994719B1 (en) * 2011-04-20 2015-03-31 Google Inc. Matching views between a two-dimensional geographical image and a three-dimensional geographical image

Family Cites Families (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5781704C1 (en) * 1996-10-11 2002-07-16 Environmental Criminology Res Expert system method of performing crime site analysis
US8332247B1 (en) * 1997-06-12 2012-12-11 G. William Bailey Methods and systems for optimizing network travel costs
US6941515B1 (en) * 1999-09-01 2005-09-06 Vertigoxmedia Inc. Method and apparatus for controlling a graphics engine
CA2387054A1 (en) * 1999-10-15 2001-04-26 Dekalb Genetics Corporation Methods and systems for plant performance analysis
US20020198760A1 (en) * 2001-06-05 2002-12-26 Carpenter John E. Demographic data development and processing
US20050187814A1 (en) * 2004-02-20 2005-08-25 Hector Yanez Voter strategically targeted analyzing and reporting system
US7583273B2 (en) 2005-03-02 2009-09-01 Avenza Systems, Inc. Method and system for transforming spatial data
US8850011B2 (en) 2005-04-21 2014-09-30 Microsoft Corporation Obtaining and displaying virtual earth images
US7933929B1 (en) 2005-06-27 2011-04-26 Google Inc. Network link for providing dynamic data layer in a geographic information system
US7933897B2 (en) * 2005-10-12 2011-04-26 Google Inc. Entity display priority in a distributed geographic information system
US8788431B1 (en) * 2005-10-26 2014-07-22 Movoto Llc Methods and apparatus for graphical analysis and display of real property information
GB0609037D0 (en) * 2006-05-06 2006-06-14 Guaranteed Markets Ltd Apparatus and method for intervention in electronic markets
US20080051994A1 (en) * 2006-08-28 2008-02-28 Microsoft Corporation Representation and display of geographical popularity data
US20080221978A1 (en) * 2007-02-26 2008-09-11 Samuel Richard I Microscale geospatial graphic analysis of voter characteristics for precise voter targeting
US8185122B2 (en) 2007-03-21 2012-05-22 Metropcs Wireless, Inc. Method for creating a cellular telephone infrastructure
US9477732B2 (en) * 2007-05-23 2016-10-25 Oracle International Corporation Filtering for data visualization techniques
CN101051314A (en) * 2007-05-18 2007-10-10 上海众恒信息产业有限公司 Data analysis result display method and device combined with electronic map
US8605786B2 (en) * 2007-09-04 2013-12-10 The Regents Of The University Of California Hierarchical motion vector processing method, software and devices
US8638327B2 (en) 2007-11-14 2014-01-28 Microsoft Corporation Tiled projections for planar processing of round earth data
US20110295669A1 (en) * 2008-05-30 2011-12-01 Jonathan Stiebel Internet-Assisted Systems and Methods for Building a Customer Base for Musicians
US20110179066A1 (en) * 2008-06-20 2011-07-21 Business Intelligence Solutions Safe B.V. Methods, apparatus and systems for data visualization and related applications
CN101726302B (en) * 2008-10-15 2013-02-13 高德信息技术有限公司 Map display method and guidance terminal
CN101673304A (en) * 2008-10-23 2010-03-17 中国科学院地理科学与资源研究所 Spatial visualization system of statistical population information and method thereof
US8745086B2 (en) * 2008-12-05 2014-06-03 New BIS Safe Luxco S.á.r.l. Methods, apparatus and systems for data visualization and related applications
US20100225644A1 (en) * 2009-03-05 2010-09-09 Navteq North America, Llc Method and System for Transitioning Between Views in a Traffic Report
CN101545776B (en) * 2009-05-05 2011-09-28 东南大学 Method for obtaining digital photo orientation elements based on digital map
US8294710B2 (en) 2009-06-02 2012-10-23 Microsoft Corporation Extensible map with pluggable modes
US8504512B2 (en) * 2009-12-02 2013-08-06 Microsoft Corporation Identifying geospatial patterns from device data
US8935714B2 (en) * 2010-12-30 2015-01-13 Verizon Patent And Licensing Inc. Interactive user-prediction of content
CN102096713A (en) * 2011-01-29 2011-06-15 广州都市圈网络科技有限公司 Grid-based two-dimensional or three-dimensional map matching method and system
US8706407B2 (en) * 2011-03-30 2014-04-22 Nokia Corporation Method and apparatus for generating route exceptions
CN102855237A (en) * 2011-06-29 2013-01-02 北京天一众合科技股份有限公司 Map generation method and map generation system
GB201115369D0 (en) 2011-09-06 2011-10-19 Gooisoft Ltd Graphical user interface, computing device, and method for operating the same
US8864562B2 (en) * 2011-12-18 2014-10-21 Patrick Daly Online political prediction game
US8944719B2 (en) 2012-11-09 2015-02-03 Caterpillar Paving Products Inc. Tracking of machine system movements in paving machine
US10672008B2 (en) * 2012-12-06 2020-06-02 Jpmorgan Chase Bank, N.A. System and method for data analytics

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Brainerd et al., "Interactive map projections and distortion," Computers & Geosciences 27 (2001), pages 299-314 *
Knippers, "Map Projection," Department of Geo-information Processing, August 2009, http://kartoweb.itc.nl/geometrics/map%20projections/mappro.html *

Also Published As

Publication number Publication date
TW201519183A (en) 2015-05-16
WO2015038511A1 (en) 2015-03-19
EP3044701A1 (en) 2016-07-20
WO2015038508A3 (en) 2015-05-07
EP3044762A2 (en) 2016-07-20
WO2015038506A1 (en) 2015-03-19
AR097623A1 (en) 2016-04-06
US20150070383A1 (en) 2015-03-12
US9905043B2 (en) 2018-02-27
EP3044763A1 (en) 2016-07-20
CN105531702A (en) 2016-04-27
WO2015038508A2 (en) 2015-03-19
CN105917384A (en) 2016-08-31
EP3044701B1 (en) 2017-08-30
CN105531742A (en) 2016-04-27
US20150070379A1 (en) 2015-03-12

Similar Documents

Publication Publication Date Title
US20150070356A1 (en) Techniques to manage map information illustrating a transition between views
CN106462997B (en) Mixing between street view and earth view
Resch et al. Web-based 4D visualization of marine geo-data using WebGL
US11967015B2 (en) Neural rendering
CN105144243B (en) data visualization method and system
CN109102560A (en) Threedimensional model rendering method and device
WO2020134436A1 (en) Method for generating animated expression and electronic device
CN113658316B (en) Rendering method and device of three-dimensional model, storage medium and computer equipment
Webster High poly to low poly workflows for real-time rendering
CN106605211A (en) Render-Time Linking of Shaders
CN107403461B (en) Sampling apparatus and method for generating random sampling distributions using random rasterization
US8457426B1 (en) Method and apparatus for compressing a document using pixel variation information
CN104123003A (en) Content sharing method and device
CN115713585A (en) Texture image reconstruction method and device, computer equipment and storage medium
Marques et al. Efficient quadrature rules for illumination integrals: From quasi Monte Carlo to Bayesian Monte Carlo
JP7352032B2 (en) Video generation method, apparatus, electronic device and computer readable storage medium
CN114820988A (en) Three-dimensional modeling method, device, equipment and storage medium
CN110992438B (en) Picture editing method and device
Jobst et al. 3D city model visualization with cartography-oriented design
Shannon et al. Graphemes: self-organizing shape-based clustered structures for network visualisations
US20220327757A1 (en) Method and apparatus for generating dynamic video of character, electronic device and storage medium
CN117649478B (en) Model training method, image processing method and electronic equipment
Blenkhorn GPU-accelerated Rendering of Atmospheric Glories
CN112306616B (en) Loading display processing method and device, computer equipment and storage medium
Yoon et al. CAMAR mashup: empowering end-user participation in U-VR environment

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WOOD, IAN;KALLAY, MICHAEL;DA VEIGA, ALEXANDRE;SIGNING DATES FROM 20140317 TO 20140411;REEL/FRAME:032664/0890

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034747/0417

Effective date: 20141014

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:039025/0454

Effective date: 20141014

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION