US20140347383A1 - Map rendering using interpolation of style parameters across zoom levels - Google Patents
- Publication number
- US20140347383A1 (U.S. application Ser. No. 14/456,872)
- Authority
- US
- United States
- Prior art keywords
- style
- style parameters
- zoom level
- magnification
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3667—Display of a road map
- G01C21/367—Details, e.g. road map scale, orientation, zooming, illumination, level of detail, scrolling of road map or positioning of current position marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/36—Level of detail
Definitions
- the present disclosure relates to map rendering systems, such as electronic map display systems, and more specifically to a map rendering system that renders elements of map features using interpolated style parameter values across different zoom levels.
- Digital maps are found in and may be displayed by a wide variety of devices, including mobile phones, car navigation systems, hand-held global positioning system (GPS) units, computers, and many websites. Although digital maps are easy to view and to use from an end-user's perspective, creating a digital map is a difficult task and can be a time-consuming process.
- every digital map begins with storing, in a map database, a set of raw data corresponding to millions of streets and intersections and other features to be displayed as part of a map.
- the raw map data that is stored in the map database and that is used to generate digital map images is derived from a variety of sources, with each source typically providing different amounts and types of information. This map data must therefore be compiled and stored in the map database before being accessed by map display or map rendering applications and hardware.
- There are, of course, different manners of digitally rendering map images (referred to as digital map images) based on map data stored in a map database.
- One method of rendering a map image is to store map images within the map database as sets of rasterized or pixelated images made up of numerous pixel data points, with each pixel data point including properties defining how a particular pixel in an image is to be displayed on an electronic display device. While this type of map data is relatively easy to create and store, the map rendering technique using this data typically requires a large amount of storage space for comprehensive digital map images, and it is difficult to manipulate the digital map images as displayed on a display device in many useful ways.
- vector image data is typically used in high-resolution and fast-moving imaging systems, such as those associated with gaming systems, and in particular three-dimensional gaming systems.
- vector image data includes data that defines specific image objects or elements (also referred to as primitives) to be displayed as part of an image via an image display device.
- image elements or primitives may be, for example, individual roads, text labels (e.g., map or street labels), areas, text boxes, buildings, points of interest markers, terrain features, bike paths, etc.
- Each image element is generally made up of, or drawn as, a set of one or more triangles (of different sizes, shapes, colors, fill patterns, etc.), with each triangle including three vertices interconnected by lines.
- the image database stores a set of vertex data points, with each vertex data point defining a particular vertex of one of the triangles making up the image element.
- each vertex data point includes data pertaining to a two-dimensional or a three-dimensional position of the vertex (in an X, Y or an X, Y, Z coordinate system, for example) and various vertex attributes defining properties of the vertex, such as color properties, fill properties, line width properties for lines emanating from the vertex, etc.
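A vertex data point of this kind might be modeled as follows. This is an illustrative sketch only: the field names, types, and defaults are assumptions, not the patent's actual data layout.

```python
from dataclasses import dataclass

@dataclass
class VertexDataPoint:
    # Position in a two- or three-dimensional reference space.
    x: float
    y: float
    z: float = 0.0
    # Vertex attributes; names and defaults are illustrative only.
    color: tuple = (0, 0, 0)   # RGB color property
    line_width: float = 1.0    # width of lines emanating from this vertex
    style_ref: int = 0         # reference into a style lookup table

# A triangle of an image element is three such vertex data points.
triangle = [VertexDataPoint(0.0, 0.0),
            VertexDataPoint(1.0, 0.0),
            VertexDataPoint(0.0, 1.0)]
```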
- an image shader is a set of software instructions used primarily to calculate rendering effects on graphics hardware with a high degree of flexibility.
- Image shaders are well known in the art and various types of image shaders are available in various application programming interfaces (APIs) provided by, for example, OpenGL and Direct3D, to define special shading functions.
- image shaders are simple programs written in a high-level programming language, for example, that describe or determine the traits of either a vertex or a pixel.
- Vertex shaders, for example, define the traits (e.g., position, texture coordinates, colors, etc.) of a vertex, while pixel or fragment shaders define the traits (color, z-depth and alpha value) of a pixel.
- a vertex shader is called for each vertex in an image element or primitive so that, for each vertex input into the vertex shader, the vertex shader produces one (updated) vertex output.
- Each vertex output by the vertex shader is then rendered as a series of pixels onto a block of memory that will eventually be sent to a display screen.
- a computer-implemented method for rendering a map on a display device includes determining a first set of style parameters for a first feature of a map at a first zoom level, a second set of style parameters for a second zoom level, and a third set of style parameters for a third zoom level, where the third set of style parameters is determined by interpolating the first and second sets of style parameters.
- the method then renders or displays the first feature of the map in a viewing window based on the interpolated third set of style parameters.
- the method further determines a fourth set of style parameters based on a fourth zoom level and determines the third set of style parameters based on the first, second and fourth set of style parameters.
- the method further determines whether to retrieve the fourth set of style parameters based on a current bandwidth, a current processor capacity, or a resolution setting.
- the method includes using a linear interpolation process and/or a polynomial interpolation process.
- style attribute information may be stored as a style lookup table and the method may determine style attribute tables that are associated with the first, second, third and/or fourth zoom levels.
- the method may determine to retrieve only a subset of available style parameters for a given zoom level based on any combination of an interpolation attribute, a priority attribute, a current bandwidth, or a current processor capacity.
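The interpolation step in the method above can be sketched with a simple linear scheme. The dictionary-of-floats representation of a style parameter set is an assumption made for illustration, not the patent's storage format:

```python
def interpolate_styles(params_a, zoom_a, params_b, zoom_b, zoom):
    """Linearly interpolate numeric style parameters between two zoom levels.

    params_a and params_b map parameter names to numeric values known to be
    appropriate at zoom_a and zoom_b respectively.
    """
    if zoom_b == zoom_a:
        return dict(params_a)
    t = (zoom - zoom_a) / (zoom_b - zoom_a)
    return {k: params_a[k] + t * (params_b[k] - params_a[k]) for k in params_a}

# Road width and outline width known at zoom levels 0 and 3;
# derive a third set of values for the intermediate zoom level 2.
z0 = {"road_width": 2.0, "outline_width": 0.5}
z3 = {"road_width": 8.0, "outline_width": 2.0}
z2 = interpolate_styles(z0, 0, z3, 3, 2)  # road_width ~ 6.0, outline_width ~ 1.5
```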
- a computer device in another embodiment, includes a communications network interface, one or more processors, one or more memories coupled to the one or more processors and a display device coupled to the one or more processors.
- the one or more memories include computer executable instructions that are executed on the processor to determine a first view of a map surface defined by a first magnification of the surface of the map, where a first zoom level of the first view corresponds to the first magnification.
- the computer executable instructions are executed to retrieve a first set of style parameters for a first feature of the map surface, the first set of style parameters corresponding to the first zoom level.
- the computer executable instructions are executed to determine a second view of the map surface defined by a second magnification of the map surface, where a second zoom level of the second view corresponds to the second magnification.
- the computer executable instructions are executed to retrieve a second set of style parameters for the first feature of the map surface, the second set of style parameters corresponding to a third zoom level, where the third zoom level corresponds to a third magnification different from the first and the second magnification.
- the computer executable instructions are executed to determine a third set of style parameters for the first feature of the map surface at the second zoom level based on interpolating the first set of style parameters and the second set of style parameters when the second zoom level is between the first and third zoom level.
- the computer executable instructions are executed to render the first feature in the second view at the second zoom level using the third set of style parameters.
- the computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on a style attribute setting associated with each of the style parameters indicating whether the style parameter is designated for interpolation.
- the computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on a style priority attribute associated with each of the style parameters.
- the computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on whether a current processor capacity is above a threshold.
- the computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on whether a current available bandwidth is above a threshold.
- the computer device executes the computer executable instructions to retrieve a fourth set of style parameters for the first feature of the map surface when a resolution setting is increased, the fourth set of style parameters corresponding to a fourth zoom level, wherein the fourth zoom level corresponds to a fourth magnification different from the first, the second, and the third magnification.
- the computer device executes the computer executable instructions to determine the third set of style parameters for the first feature of the map surface at the second zoom level based on interpolating at least the first set of style parameters at the first magnification, the second set of style parameters at the second magnification, and the fourth set of style parameters at the fourth magnification, wherein the third magnification is different from the first, second, and fourth magnification.
- a computer device in another embodiment, includes a communications network interface, one or more processors, one or more memories coupled to the one or more processors and a display device coupled to the one or more processors.
- the one or more memories include computer executable instructions that are executed on the processor to determine a first view of a map surface defined by a first magnification of the surface of the map, and where a first zoom level of the first view corresponds to the first magnification.
- the computer executable instructions are executed to retrieve a first set of style parameters for a first feature of the map surface, the first set of style parameters corresponding to a zoom level closest to the first zoom level.
- the computer executable instructions are executed to determine if the first set of style parameters corresponds to the first zoom level.
- the computer executable instructions are executed to render a first feature in the first view at the first zoom level using the first set of style parameters if the first set of style parameters corresponds to the first zoom level.
- the computer executable instructions are executed to retrieve a second set of style parameters for the first feature of the map surface if the first set of style parameters do not correspond to the first zoom level, the second set of style parameters corresponding to a third zoom level, where the third zoom level corresponds to a third magnification different from the first and the second magnification and where the first zoom level is between the second and the third zoom level.
- the computer executable instructions are executed to determine a third set of style parameters for the first feature of the map surface at the first zoom level based on interpolating the first and the second set of style parameters.
- the computer executable instructions are executed to render the first feature in the first view at the first zoom level using the third set of style parameters.
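Taken together, the retrieval-and-fallback logic of this embodiment can be sketched as follows, assuming (hypothetically) that available style tables are keyed by zoom level and that linear interpolation is applied between the two nearest bracketing levels:

```python
def styles_for_zoom(available, zoom):
    """Return style parameters for `zoom`.

    `available` maps a zoom level to a {parameter: value} dict. If a table
    corresponding exactly to `zoom` exists it is used directly; otherwise the
    two nearest bracketing levels are retrieved and linearly interpolated.
    The data layout is an illustrative assumption.
    """
    if zoom in available:
        return dict(available[zoom])
    levels = sorted(available)
    lower = max(l for l in levels if l < zoom)
    upper = min(l for l in levels if l > zoom)
    t = (zoom - lower) / (upper - lower)
    a, b = available[lower], available[upper]
    return {k: a[k] + t * (b[k] - a[k]) for k in a}
```

Rendering then proceeds with whichever set this function returns, matching the exact-match and interpolated branches of the embodiment.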
- FIG. 1 is a high-level block diagram of a map imaging system that implements communications between a map database stored in a server and one or more map image rendering devices, according to an embodiment.
- FIG. 2 is a high level block diagram of an image rendering engine used to render map images using map vector data, according to an embodiment.
- FIG. 3A is a data diagram illustrating a set of vector data in the form of vertex data points encoded using a vertex style attribute, according to an embodiment.
- FIG. 3B is a first texture map in the form of a style lookup table that defines vertex style attribute values for each of a number of different styles and which is used in the image rendering engine of FIG. 2 to resolve vertex style attributes based on a style reference, according to an embodiment.
- FIG. 4 illustrates a zoom scale including multiple zoom levels, according to an embodiment.
- FIG. 5A illustrates a graph of style parameter values and zoom level without interpolation.
- FIG. 5B illustrates a graph of style parameter values and zoom level with linear interpolation, according to an embodiment.
- FIG. 5C illustrates a graph of style parameter values and zoom level with polynomial interpolation, according to an embodiment.
- FIG. 6 illustrates a process flow diagram of a method that may be used to render a map surface using the described interpolation techniques, according to an embodiment.
- FIG. 7 illustrates a process flow diagram for retrieving style attribute data and for interpolating the style data for rendering a feature of a map, according to an embodiment.
- FIG. 8 illustrates a process flow for handling additional data sets of style parameters, according to an embodiment.
- FIG. 9 illustrates a process for determining what style parameters to include in a reduced style attribute data set, according to an embodiment.
- the present application generally relates to techniques for rendering map features during zooming operations of a viewing window.
- a graphics or image rendering system such as a map image rendering system, may receive map data for a given set of zoom levels, where the map data includes style parameter values for various features of a map surface, where the style parameter values correspond to a particular zoom level.
- the techniques may interpolate at least some of the style parameter values from the received map data to provide style parameter values over a range of zoom levels and the map image rendering system may render a viewing window at a zoom level based on the interpolated style parameters.
- a map-related imaging system 10 includes a map database 12 stored in a server 14 or in multiple servers located at, for example, a central site or at various different spaced apart sites, and also includes multiple map client devices 16 , 18 , 20 , and 22 , each of which stores and implements a map rendering device or a map rendering engine.
- the map client devices 16 - 22 may be connected to the server 14 via any hardwired or wireless communication network 25 , including for example a hardwired or wireless local area network (LAN), metropolitan area network (MAN) or wide area network (WAN), the Internet, or any combination thereof.
- the map client devices 16 - 22 may be, for example, mobile phone devices ( 18 ), computers such as a laptop, tablet, desktop or other suitable types of computers ( 16 , 20 ), or components of other imaging systems such as components of automobile navigation systems ( 22 ), etc.
- the client devices 16 - 22 may be communicatively connected to the server 14 via any suitable communication system, such as any publicly available and/or privately owned communication network, including those that use hardwired based communication structure, such as telephone and cable hardware, and/or wireless communication structure, such as wireless communication networks, including for example, wireless LANs and WANs, satellite and cellular phone communication systems, etc.
- the map database 12 may store any desired types or kinds of map data including raster image map data and vector image map data.
- the image rendering systems described herein are best suited for use with vector image data which defines or includes a series of vertices or vertex data points for each of numerous sets of image objects, elements or primitives within an image to be displayed.
- each of the image objects defined by the vector data will have a plurality of vertices associated therewith and these vertices will be used to display a map related image object to a user via one or more of the client devices 16 - 22 .
- each of the client devices 16 - 22 includes an image rendering engine having one or more processors 30 , one or more memories 32 , a display device 34 , and in many cases a rasterizer or graphics card 36 which are generally programmed and interconnected in known manners to implement or to render graphics (images) on the associated display device 34 .
- the display device 34 for any particular client device 16 - 22 may be any type of electronic display device such as a liquid crystal display (LCD), a light emitting diode (LED) display, a plasma display, a cathode ray tube (CRT) display, or any other type of known or suitable electronic display.
- the map-related imaging system 10 of FIG. 1 operates such that a user, at one of the client devices 16 - 22 , opens or executes a map application (not shown in FIG. 1 ) that operates to communicate with and obtain map information or map related data from the map database 12 via the server 14 , and that then displays or renders a map image based on the received map data.
- the map application may allow the user to view different geographical portions of the map data stored in the map database 12 , to zoom in or zoom out on a particular geographical location, to rotate, spin or change the two-dimensional or three-dimensional viewing angle of the map being displayed, etc.
- each of the client devices 16 - 22 downloads map data in the form of vector data from the map database 12 and processes that vector data using one or more image shaders to render an image on the associated display device 34 .
- the image rendering system 40 of FIG. 2 includes two processors 30 a and 30 b, two memories 32 a and 32 b, a user interface 34 and a rasterizer 36 .
- the processor 30 b, the memory 32 b and the rasterizer 36 are disposed on a separate graphics card (denoted below the horizontal line), although this need not be the case in all embodiments.
- a single processor may be used instead.
- the image rendering system 40 includes a network interface 42 , a communications and storage routine 43 and one or more map applications 48 having map display logic therein stored on the memory 32 a, which may be executed on the processor 30 a (e.g., which may be a central processing unit (CPU)).
- one or more image shaders in the form of, for example, vertex shaders 44 and fragment shaders 46 are stored on the memory 32 b and are executed on the processor 30 b.
- the memories 32 a and 32 b may include either or both volatile and non-volatile memory and the routines and shaders are executed on the processors 30 a and 30 b to provide the functionality described below.
- the network interface 42 includes any well known software and/or hardware components that operate to communicate with, for example, the server 14 of FIG. 1 via a hardwired or wireless communications network to obtain image data in the form of vector data for use in creating an image display on the user interface or display device 34 .
- the image rendering device 40 also includes a data memory 49 , which may be a buffer or volatile memory for example, that stores vector data received from the map database 12 , the vector data including any number of vertex data points and one or more lookup tables as will be described in more detail.
- the map logic of the map application 48 executes on the processor 30 a to determine the particular image data needed for display to a user via the display device 34 using, for example, user input, GPS signals, prestored logic or programming, etc.
- the display or map logic of the application 48 interacts with the map database 12 , using the communications routine 43 , by communicating with the server 14 through the network interface 42 to obtain map data, preferably in the form of vector data or compressed vector data from the map database 12 .
- This vector data is returned via the network interface 42 and may be decompressed and stored in the data memory 49 by the routine 43 .
- the data downloaded from the map database 12 may be a compact, structured, or otherwise optimized version of the ultimate vector data to be used, and the map application 48 may operate to transform the downloaded vector data into specific vertex data points using the processor 30 a.
- the image data sent from the server 14 includes vector data generally defining data for each of a set of vertices associated with a number of different image elements or image objects to be displayed on the screen 34 and possibly one or more lookup tables which will be described in more detail below.
- the lookup tables may be sent in, may be decoded to be in, or may be generated by the map application 48 to be in the form of vector texture maps, which are known types of data files typically defining a particular texture or color field (pixel values) to be displayed as part of an image created using vector graphics.
- the vector data for each image element or image object may include multiple vertices associated with one or more triangles making up the particular element or object of an image. Each such triangle includes three vertices (defined by vertex data points) and each vertex data point has vertex data associated therewith.
- each vertex data point includes vertex location data defining a two-dimensional or a three-dimensional position or location of the vertex in a reference or virtual space, as well as an attribute reference.
- Each vertex data point may additionally include other information, such as an object type identifier that identifies the type of image object with which the vertex data point is associated.
- the attribute reference, referred to herein as a style reference or as a feature reference, references or points to a location or a set of locations in one or more of the lookup tables downloaded and stored in the data memory 49 .
- FIG. 3A illustrates an embodiment of map data that may be sent to a client device, such as device 40 of FIG. 2 , for processing, according to an embodiment.
- map data contains location data for a vertex, an object type, and a style attribute(s) for the vertex.
- a set of one or more of the vertices may comprise an image object or feature of a map, such as a road or building.
- the style attributes may be sent for each vertex or may reference a style lookup table such as that illustrated in FIG. 3B , which can be used to decode a style reference from FIG. 3A into a complete set of one or more style attribute parameter values, according to an embodiment.
- Style parameters may include a fill color (e.g., for area objects), an outline color, an outline width, an outline dashing pattern and an indication of whether to use rounded end caps (e.g., for road objects), an interior color, an interior width, an interior dashing pattern, and interior rounded end caps (e.g., for road objects), a text color and a text outline color (e.g., for text objects), an arrow color, an arrow width, an arrow dashing pattern (e.g., for arrow objects), a text box fill color and a set of text box outline properties (e.g., for text box objects) to name but a few.
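A style lookup table of the kind shown in FIG. 3B can be sketched as a mapping from a vertex's style reference to a full row of attribute values. The concrete references, colors, and column names below are invented for illustration:

```python
# Hypothetical style lookup table: style reference -> style attribute values.
STYLE_TABLE = {
    0: {"fill_color": "#c8c8c8", "outline_color": "#404040", "outline_width": 1.0},
    1: {"fill_color": "#ffe0a0", "outline_color": "#805000", "outline_width": 2.0},
}

def resolve_style(style_ref):
    """Decode a style reference carried on a vertex (cf. FIG. 3A) into the
    complete set of style attribute values it points to."""
    return STYLE_TABLE[style_ref]
```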
- the techniques for rendering a map involve determining a viewing window of a map surface at a first zoom level and determining a set of style parameters for a plurality of zoom levels (a set comprises one or more elements).
- a viewing window is to be rendered at the first zoom level
- the techniques described herein may interpolate across the plurality of style parameters to provide style parameter values appropriate for the first zoom level.
- a zoom level generally corresponds to a magnification which is used, in part, to define a displayable area of a map surface within a viewing window.
- a magnification of the viewing window may correspond with a scale for which the map surface is rendered or drawn. For example, where magnification or scale is expressed as a ratio such as 1:1,000, one of any unit of measurement on the viewing window may correspond exactly or approximately to 1,000 actual units. When the viewing window size is measured in inches, the distance scale may translate an inch of the viewing window to a length of 1,000 inches of actual distance.
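The scale arithmetic is straightforward; a helper like the following (an illustrative sketch, not part of the disclosure) converts a distance measured on the viewing window into ground distance for a 1:N scale:

```python
def ground_distance(window_distance, scale_denominator):
    """Real-world distance covered by `window_distance` on a map drawn
    at scale 1:scale_denominator (both expressed in the same unit)."""
    return window_distance * scale_denominator

# At 1:1,000, one inch on the viewing window spans 1,000 inches of ground.
```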
- Some computerized maps allow users to zoom in or zoom out of a map surface, where a zoom level generally corresponds to a magnification of the viewing window that displays the map surface.
- computer mapping applications may display only certain map features that can be resolved by eye at a certain zoom level or magnification (corresponding to a distance to the object), while excluding other map features that normally cannot be seen by eye at the same zoom level without the aid of a magnification device.
- increasing a zoom level of a viewing window may not only enlarge features already displayed on a map, but also cause the mapping application to draw additional features of the map.
- FIG. 4 illustrates a zoom scale including multiple zoom levels increasing from left to right, according to an embodiment. As discussed, each incremental zoom level generally corresponds to a particular magnification level. However, not all zoom levels involve acquisition of additional, usually higher resolution map data for rendering. FIG. 4 illustrates that zoom levels 402 are zoom levels that involve additional retrieval of map data and additional processing of that map data. Zoom levels 404 involve magnification but do not involve additional retrieval of map data and additional processing of that map data.
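The distinction between data-bearing zoom levels and magnification-only levels can be sketched as follows; the particular levels chosen are hypothetical stand-ins for the zoom levels 402 of FIG. 4:

```python
DATA_ZOOM_LEVELS = (0, 3, 6)  # hypothetical levels at which map data is retrieved

def needs_fetch(zoom):
    """True if rendering at `zoom` involves retrieving additional map data."""
    return zoom in DATA_ZOOM_LEVELS

def data_level_at_or_below(zoom):
    """Highest data-bearing zoom level not exceeding `zoom`; its style
    attributes would be reused for magnification-only levels above it."""
    return max(l for l in DATA_ZOOM_LEVELS if l <= zoom)
```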
- map vector data may comprise location data and/or style attribute data for a set of vertices.
- the map vector data may be segregated or organized by zoom level, similar to that illustrated in FIG. 4 , and style attribute data may be provided along with the map data at the particular zoom levels.
- style information may simply be drawn from existing or previous style attributes from other zoom levels. Generally, this may produce abrupt changes in a visual aspect of a feature as a viewing window changes, for example when the zoom level changes.
- FIG. 5A illustrates a graph of style parameter values and zoom level.
- Map data may be available for retrieval and rendering at zoom level 0.
- the map data may include style attribute data that corresponds with and is appropriate for zoom level 0 magnification.
- where the style attribute data corresponds to a width of a road, the style attribute value available for zoom level 0 may have the road width scaled appropriately for that zoom level.
- where additional data for zoom levels 1 and 2 is not available, some mapping systems may simply use the available style data from zoom level 0 for zoom levels 1 and 2.
- the same road width value may be used for zoom levels 1 and 2 as for level 0.
- Additional map data may be available for zoom level 3 that includes style data that corresponds with and is appropriate for zoom level 3 magnification. Again, where zoom level data is not available for zoom levels 4 and 5, previously retrieved and available style data may be used for that zoom range. Abrupt style changes may occur between certain zoom levels, e.g., between zoom level 2 and 3 as illustrated in FIG. 5A .
- One method of remedying these abrupt changes is to provide style parameters for each zoom level.
- this can be costly in terms of retrieval bandwidth (such as network bandwidth between two or more computers or intra device communication bandwidth between a processor and a local memory) and processor capacity.
- Graphics cards (having dedicated graphics processing units), such as those described in this application, are designed to specifically and efficiently process graphics calculations which often involve mathematical interpolations.
- the techniques described herein may take advantage of client processing efficiencies (e.g., the interpolation calculation resources of a graphics card) when adjusting style parameters.
- a CPU such as the CPU 30 a ( FIG. 2 ) may be utilized to interpolate style parameters.
- FIG. 5B illustrates a linear interpolation of style parameter values between the zoom levels 0 and 3, which are zoom levels in which map data is available and where the style values of those zoom levels are appropriate for their respective magnifications, according to an embodiment.
- a smoother transition line can be made in parameter values as compared to FIG. 5A , for example.
- FIG. 5C illustrates a more complex interpolation of three or more points (including map data at zoom level 6) that can provide a smoother style function over a range of magnifications or zoom levels, according to an embodiment.
- the interpolation of FIG. 5C may involve a polynomial interpolation function, for example.
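The interpolation described for FIG. 5B can be sketched as follows. This is a minimal illustrative sketch, not code from the patent; the function name, the example road-width values, and the choice of zoom levels 0 and 3 are assumptions used only to mirror the figures.

```python
def lerp_style(zoom, z0, v0, z1, v1):
    """Linearly interpolate a scalar style value (e.g., a road width in
    pixels) between two zoom levels at which real style data exists."""
    t = (zoom - z0) / (z1 - z0)  # 0.0 at z0, 1.0 at z1
    return v0 + t * (v1 - v0)

# Style data available only at zoom levels 0 and 3 (cf. FIG. 5B):
road_width_z0, road_width_z3 = 2.0, 8.0

# Intermediate zoom levels get smoothly interpolated widths instead of
# an abrupt jump between zoom levels 2 and 3 (cf. FIG. 5A):
widths = [lerp_style(z, 0, road_width_z0, 3, road_width_z3) for z in range(4)]
```

A polynomial interpolation over three or more data points, as in FIG. 5C, would replace `lerp_style` with a higher-order fit but would leave the surrounding flow unchanged.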
- while rendering richer style aspects of map features may be accomplished by sending more data (i.e., style attribute data) from a server to a client for every view change (e.g., zoom level change), having a graphics processor card perform the bulk of the calculations for adjusting styles at a client device may save both bandwidth and download time. It may also decrease overall system processing time, since specialized graphics cards can process style adjustments more efficiently at a client than, for example, a server serving a plurality of clients can.
- an aspect of the techniques is determining a reduced set of style parameters that may be retrieved by a client for applying interpolation processing, for example via a graphics card/processor.
- FIG. 6 illustrates a process flow diagram or flow chart of a method, routine, or process 600 that may be used to render a map surface using a suitable interpolation technique, such as interpolation techniques disclosed herein, according to an embodiment.
- a block 602 may determine a viewing window state with respect to a map surface to be displayed. This viewing window state may include a viewing window magnification as well as a viewing window size, a viewing window position, and a viewing window direction for a view of a map surface. This determination may be made by receiving an input from a user of a computer device.
- the user may input a particular longitude, latitude, and altitude, as well as a zoom level corresponding to a magnification level and a viewing angle (i.e., a viewing direction).
- the determination may be made based on a pre-stored or pre-determined value for an initial rendering of a map location (e.g., an initial view of a city landmark or popular city feature) or may be based on pre-stored settings that are based on user preferences.
- a determination may be made in response to a user input indicating a pan action (e.g., a selection of an arrow indicating a pan in a particular direction, a swipe, etc.), a selection of a map feature or map point (e.g., via a touch screen press, a mouse click, etc.), etc.
- a block 604 may determine a closest zoom level which has associated style attribute data that is appropriate for that zoom level. Determining a closest zoom level may be accomplished in any programmatic manner. For example, a closest zoom level may be determined by querying a lookup table that provides information on available zoom levels and that indicates what zoom levels are associated with style attribute data that is appropriate for that zoom level. In situations in which map data is always provided with attribute data for particular zoom levels, the determination may be based on querying for a prior lower zoom level or a subsequent higher zoom level that initiates a retrieval of additional data.
- a block 606 may determine whether a current viewing window zoom level (as determined to be part of the viewing state of block 602 ) is the same as the closest zoom level of block 604 .
- block 604 may not always select the closest zoom level that contains associated style attribute data for the zoom level. Instead, block 604 may choose the second closest, third closest, etc. This may be the case when, for example, the closest zoom level that has associated style attribute data is not available.
- a block 610 may determine at least two zoom levels containing map data including style attribute data, where the at least two zoom levels define a range that includes the viewing window zoom level. In other words, the block 610 may determine two additional zoom levels such that the viewing window zoom level is between the two additional zoom levels.
- a closest zoom level of block 604 may be used as a first zoom level for block 610 .
- a second zoom level may then be determined by searching a closest zoom level in an opposite direction from the viewing window zoom level of block 602 . For example, if block 604 determined a zoom level closest to the viewing window zoom level at a subsequent higher zoom level, block 610 may determine a prior lower zoom level as a second zoom level, and vice versa.
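The bracketing search of blocks 604 and 610 can be sketched as below. This is an illustrative sketch under the assumption that the set of zoom levels with appropriate style data is known to the client (e.g., via the lookup table mentioned earlier); the function name is hypothetical.

```python
def bracket_zoom(view_zoom, styled_zooms):
    """Return two zoom levels that have style attribute data and that
    bracket the viewing-window zoom level (cf. blocks 604/610).
    `styled_zooms` lists the zoom levels for which style data scaled
    to that zoom level is available."""
    lower = max((z for z in styled_zooms if z <= view_zoom), default=None)
    upper = min((z for z in styled_zooms if z >= view_zoom), default=None)
    return lower, upper

# Style data exists at zoom levels 0, 3, and 6 (cf. FIG. 5C):
lo_hi = bracket_zoom(4, [0, 3, 6])      # viewing window between levels
exact = bracket_zoom(3, [0, 3, 6])      # exact hit: no interpolation needed
```

When the two returned levels are equal (an exact hit), block 606's test succeeds and rendering can proceed without interpolation.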
- a block 612 may interpolate the style parameter values of the two zoom levels to produce interpolated style attribute data for use in rendering the map display.
- a block 614 may, based on the interpolated style attribute data of the block 612 , render the map surface for the viewing window at the viewing window magnification/zoom level.
- block 612 may include determining an interpolation parameter, and utilizing the interpolation parameter to interpolate the style attribute data.
- the interpolation parameter may generally represent a degree of a particular style attribute corresponding to a first zoom level of the two zoom levels versus a degree of the particular style attribute corresponding to a second zoom level of the two zoom levels.
- the interpolation parameter is determined based on the viewing window zoom level.
- the interpolation parameter may correspond to 50% of a particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level.
- the interpolation parameter may correspond to 75% of the particular style attribute corresponding to the first zoom level and 25% of the particular style attribute corresponding to the second zoom level.
- the interpolation parameter may be determined and varied over time.
- the interpolation parameter may be varied over a time period so that the interpolated style parameter changes gradually over time from 100% of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100% of the particular style attribute corresponding to the second zoom level.
- the map display may be “animated” so that a feature, e.g., a road width, initially is displayed corresponding to a first style parameter value (e.g., corresponding to the first zoom level), and gradually is changed over time so that the feature is eventually displayed corresponding to a second style parameter value (e.g., corresponding to the second zoom level).
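The interpolation-parameter blending of block 612, including the time-varying "animated" variant, can be sketched as follows. This is a minimal sketch, not the patent's implementation; the function names and the four-frame animation schedule are assumptions for illustration.

```python
def interp_param(view_zoom, z_first, z_second):
    """Interpolation parameter (cf. block 612): the fraction of the
    second zoom level's style attribute to blend in, based on where the
    viewing-window zoom level sits between the two zoom levels."""
    return (view_zoom - z_first) / (z_second - z_first)

def blend(v_first, v_second, t):
    """Weighted blend of the two style attribute values."""
    return (1.0 - t) * v_first + t * v_second

# Halfway between the two zoom levels -> a 50%/50% blend:
t = interp_param(2.0, 1, 3)
halfway = blend(10.0, 20.0, t)

# Animated variant: advance t over several frames so a feature (e.g.,
# a road width) changes gradually from the first style value to the
# second rather than jumping.
frames = [blend(10.0, 20.0, k / 4) for k in range(5)]
```

Varying `t` per frame is what produces the gradual 100% → 50/50 → 100% progression described above.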
- the map surface may initially be rendered with style data of one of the zoom levels (e.g., a closest zoom level) until a trigger is activated.
- a trigger may be an event in which a current viewing window zoom level is changed to exceed a threshold zoom level, for example.
- blocks 612 and block 614 may be repeated over time so that interpolated style attributes are gradually changed over time (e.g., from the first zoom level to the second zoom level) and the map surface is “animated” so that the rendering of map features changes gradually over time.
- FIG. 7 illustrates a process flow 700 for a mapping application, according to an embodiment.
- An initial zoom level (e.g., at an initial mapping application startup) may be pre-determined to be a zoom level for which a complete set of map data, including style attribute data, is designated.
- an initial set of style parameters is first retrieved at a block 702 and used to render the map at block 704 .
- the process of FIG. 6 may be executed so that the rendering of block 704 is based on style attribute data that corresponds with its zoom level (via a previous interpolation).
- the mapping application may initiate a view change at block 706 , such as a zooming change, and the mapping application may determine whether additional map data is to be retrieved at block 708 .
- Block 708 may determine that additional zoom level style data is needed if the style attribute data used to render the viewing window at block 704 corresponds to data provided for a single zoom level (i.e., the zoom level of the initial viewing window) without prior interpolation (i.e., only one data point exists). In a case where the initial viewing window is rendered based on a prior interpolation, block 708 may determine whether the new zoom level of block 706 is within the prior interpolation range.
- style data from the prior interpolation can be used to provide appropriate style data for the new zoom level of block 706 .
- block 708 may still determine to retrieve additional data. This may be the case in a situation in which greater interpolation accuracy is required (to be discussed further below).
- some view changes may not affect style parameters (e.g., a change in viewing window direction); zoom changes, however, usually do affect some style parameters. Also, not all of the available style parameters may be affected by a view change, and thus sometimes only a subset of style attributes may be retrieved.
- the mapping application determines and retrieves an additional amount of map data at block 710 that includes at least one additional style parameter value (e.g., for the reduced set of style parameters). The process then determines a new set of style parameter values at a block 712 for the view change by interpolating the newly retrieved style parameter values of block 710 and the previous style parameter values of block 702. Because interpolation needs at least two data points, the minimum amount of additional data retrieved at block 710 is at least one additional style parameter value.
- the viewing window may be re-rendered at block 714 based on the interpolated style data.
- blocks 712 and 714 may be repeated, and an interpolation parameter utilized at block 712 is changed over time so that the interpolated style parameter changes gradually over time from 100%, for example, of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100%, for example, of the particular style attribute corresponding to the second zoom level.
- extrapolation based on existing style attribute data points may be used to adjust a style parameter.
- extrapolation may be a default process of adjusting a style parameter when no attribute data exists that is appropriate for a given zoom level and the given zoom level is outside the range of style attribute data points that can be used for interpolation.
- FIG. 8 illustrates a process flow 800 for handling additional data sets of style parameters, according to an embodiment.
- additional data sets correspond to additional attribute data of other zoom levels.
- Block 802 determines an initial viewing window state having an initial zoom level and corresponding initial magnification.
- Block 804 may determine a minimum number of style attribute data sets (e.g., how many zoom levels of style attribute data) to retrieve for an initial rendering of the viewing window.
- When a viewing window zoom level is chosen (e.g., by a user) that coincides with a zoom level having available style data scaled for that zoom level (i.e., the style data corresponds to or is appropriate for a feature rendered at that zoom level), the viewing window may be rendered without interpolation and without a second style attribute data set.
- two style attribute data sets corresponding to two different zoom levels surrounding the chosen zoom level may be retrieved and interpolated to provide the style parameter data for the chosen zoom level.
- at least one of the two style attribute data sets is retrieved prior to determining the viewing window (block 802 ).
- a style attribute data set for a zoom level 3 may have been retrieved previously when, at block 802 , a viewing window corresponding to a zoom level 4 is determined.
- a style attribute data set for both the zoom level 3 and a zoom level 6 may have been retrieved previously when, at block 802 , a viewing window corresponding to a zoom level 4 is determined.
- a block 806 may determine, depending on a number of factors, whether to retrieve, for an initial view rendering, style attribute data that includes more than the minimum two data sets of style parameter values used in performing interpolation. When this additional data is retrieved at one time, interpolation calculations may be performed for an entire range of zoom levels at one time. Additional style attribute data (i.e., additional sets of style parameter values) corresponding to zoom levels beyond what is minimally needed (e.g., to perform a basic interpolation) may be retrieved and processed depending on factors such as the type of style parameter, the frequency or likelihood of additional view changes in the near future, the degree of accuracy desired for interpolating a style parameter, etc. It should be noted that these factors may be interrelated.
- the techniques may involve some indication that additional map data points/style attribute data is available.
- the client may be provided information about what zoom levels are associated with additional map data and/or what zoom levels have additional map data for retrieval (map data that includes additional style attribute data).
- a block 808 may interpolate the sets of style parameters determined/retrieved in blocks 804 and 806 .
- a block 810 may then render a feature of a map surface using the interpolated style attribute values.
- blocks 808 and 810 may be repeated and an interpolation parameter utilized at block 808 is changed over time so that the interpolated style parameter changes gradually over time from 100%, for example, of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100%, for example, of the particular style attribute corresponding to the second zoom level.
- Increasing the number of data points for interpolation generally increases the accuracy of the interpolation.
- the described method and system may require that additional data points that define a range of zoom levels be interpolated in advance for the range so that a transition anywhere within the range is more accurate for the entire range.
- the tradeoff with increasing interpolation accuracy may often be additional bandwidth and processing capacity needed for retrieving and/or processing the additional data points.
- a level of interpolation accuracy may depend on a resolution parameter that may be set by a user or automatically set by the mapping application.
- Processor capacity for performing the interpolation calculations may be considered. For example, a current capacity of the processor that is to perform the interpolation (e.g., the processor of the graphics card) may be checked against a threshold.
- decisions of whether to download a plurality of additional style data points may depend more on bandwidth and/or latency considerations associated with retrieving the additional style data points from the server.
- bandwidth considerations may depend on checking whether a current bandwidth/time-to-download for retrieving style attribute data is above a threshold. For example, the client may check a current download rate of a retrieval process to determine whether to retrieve additional style parameters. This may also depend on the amount of style data that may be requested.
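The block-806 decision, gated by the accuracy, processor-capacity, and bandwidth considerations above, can be sketched as below. The function name, the thresholds, and the "accuracy" knob are all illustrative assumptions, not values from the patent.

```python
def datasets_to_fetch(bandwidth_bps, proc_headroom, accuracy):
    """Sketch of the block-806 decision: how many zoom levels' worth of
    style attribute data sets to retrieve beyond the two-set minimum
    needed for a basic interpolation."""
    MIN_SETS = 2                    # interpolation needs at least two points
    extra = 0
    if accuracy == "high":
        extra += 2                  # more data points -> smoother, more accurate fit
    if bandwidth_bps < 1_000_000:   # slow link: skip the optional data
        extra = 0
    if proc_headroom < 0.25:        # busy processor: skip the optional data
        extra = 0
    return MIN_SETS + extra

ample = datasets_to_fetch(5_000_000, 0.8, "high")   # fetch extra sets
starved = datasets_to_fetch(500_000, 0.8, "high")   # fall back to minimum
```

The tradeoff noted above is explicit here: the extra sets cost bandwidth and processing up front in exchange for interpolation accuracy across the whole zoom range.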
- FIG. 9 illustrates a process for determining what style parameters to include in each data set, according to an embodiment.
- a system may provide a complete set of style attribute values whenever any style attribute value is retrieved and/or requested. For example, a complete style attribute lookup table may be retrieved each time map data is retrieved and rendered for a new zoom level. Alternatively, only a subset of style parameters available for retrieval may be retrieved, requested, or processed. Using the described techniques herein, a subset or reduced set of style parameters may be determined based on whether the style parameters can be interpolated and how those style parameters may be interpolated.
- FIG. 9 illustrates a process flow 900 for determining a reduced set of style parameters that may be subject to interpolation functions, according to an embodiment.
- a set of style parameters may be retrieved. In some embodiments, this may include a complete set of all style parameters (this may be a listing of all style parameters) that can be used by a mapping application. In some situations, this may represent only a subset.
- a style parameter is selected for processing.
- Block 906 may determine whether a style parameter is designated for interpolation. This determination may be performed based on an associated flag, attribute, or other indicator associated with the style parameter indicating whether the style parameter is suitable for interpolation or can be subject to interpolation. The flag may be set by a map application designer based on aesthetic considerations of the map. Some style parameters do not represent values that can be interpolated or extrapolated. For example, where a style parameter value is selected from a set of fixed values, no interpolation may be possible or appropriate.
- style parameters may not be suitable for interpolation because an interpolation function used by the graphics processor may not be suitable for interpolating the values of a particular style parameter.
- a linear interpolation may be implemented.
- other interpolations, such as exponential interpolation, may be more suitable for calculating, for example, a curve that the graphics processor is not programmed to compute.
- the method and system may simply determine that a particular style parameter will not be interpolated, in which case a default value may be used for the style parameter value in lieu of interpolation.
- Block 908 may determine whether a style parameter can be interpolated based on a current viewing or rendering condition. Some style parameters may not be subject to interpolation because of a current viewing condition, such as a current zoom level range. For example, certain style parameters may only be relevant at higher zoom levels than lower zoom levels. In those cases, the style parameters may not be retrieved and interpolated until a zoom level adjustment to the higher zoom level is initiated.
- Block 910 may determine whether a style parameter is to be interpolated based on a priority parameter. Similar to the interpolation indication, the priority parameter may be an associated flag, attribute, or other indicator associated with the style parameter indicating a priority value. This priority parameter may be used by the mapping system to determine a set of high priority style parameters to retrieve and process over low priority style parameters. This may be the case when a current condition of the mapping application requires reduced data retrieval and/or processing due to processor load. For example, where the processor is overloaded or backed up (the processor capacity is low or below a threshold), low priority style parameters may not be retrieved and interpolated to reduce processor workload.
- Block 912 may determine whether high and/or low priority style parameters (based on the priority attribute) will be retrieved and processed based on a current bandwidth of a retrieval path, channel, or connection. For example, block 912 may determine not to retrieve and process low priority style parameters when bandwidth is below a threshold.
- Block 914 determines whether there is data sufficient for interpolating the style parameter. Interpolation generally requires at least two data points.
- Block 916 may select or not select the style parameter for retrieval and/or processing based on blocks 904-914.
- Blocks 904 - 914 may be repeated to determine a reduced set of style parameters for retrieval and processing.
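The filtering of blocks 904-916 can be sketched as a simple pass over candidate style parameters. This is a hypothetical sketch: the field names (`interpolatable`, `min_zoom`, `priority`, `data_points`), the priority cutoff, and the example parameters are all assumptions introduced for illustration.

```python
from dataclasses import dataclass

@dataclass
class StyleParam:
    name: str
    interpolatable: bool  # block 906: flag set by the map designer
    min_zoom: int         # block 908: only relevant at/above this zoom level
    priority: int         # block 910: higher value = higher priority
    data_points: int      # block 914: interpolation needs at least two

def reduced_set(params, view_zoom, low_bandwidth):
    """Keep only the style parameters that can and should be
    interpolated under current conditions (cf. blocks 904-916)."""
    keep = []
    for p in params:
        if not p.interpolatable:              # block 906
            continue
        if view_zoom < p.min_zoom:            # block 908
            continue
        if low_bandwidth and p.priority < 5:  # blocks 910/912
            continue
        if p.data_points < 2:                 # block 914
            continue
        keep.append(p.name)                   # block 916
    return keep

params = [
    StyleParam("road_width",   True,  0, 9, 2),
    StyleParam("dash_pattern", False, 0, 9, 2),  # fixed values: not interpolatable
    StyleParam("label_halo",   True, 10, 2, 2),  # only relevant when zoomed in
]
selected = reduced_set(params, view_zoom=4, low_bandwidth=False)
```

Additional determination blocks, as the text notes, would simply become further `continue` conditions in the loop.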
- any suitable subset of the blocks may be implemented in any suitable order by a number of different devices (e.g., client or server) and remain consistent with the method and system described herein. Moreover, additional determination blocks may be added to refine the filtering of style parameters subject to interpolation processing.
- the network 25 may include but is not limited to any combination of a LAN, a MAN, a WAN, a mobile, a wired or wireless network, a private network, or a virtual private network.
- While only four client devices are illustrated in FIG. 1 to simplify and clarify the description, it is understood that any number of client computers or display devices are supported and can be in communication with the server 14.
- Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules.
- a hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner.
- In various embodiments, one or more computer systems (e.g., a standalone, client, or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- a hardware module may be implemented mechanically or electronically.
- a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations.
- a hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- the term hardware should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein.
- Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time.
- For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times.
- Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware and software modules can provide information to, and receive information from, other hardware and/or software modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware or software modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware or software modules. In embodiments in which multiple hardware modules or software are configured or instantiated at different times, communications between such hardware or software modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware or software modules have access. For example, one hardware or software module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware or software module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware and software modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- processors may be temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions.
- the modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.
- the one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
- the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
- an “algorithm” or a “routine” is a self-consistent sequence of operations or similar processing leading to a desired result.
- algorithms, routines and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine.
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the terms “coupled” and “connected” along with their derivatives.
- some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
- the term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other.
- the embodiments are not limited in this context.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Abstract
A viewing window of a map surface is determined, at a certain zoom level corresponding to the magnification of the map surface. A first set of style parameters for applying to a feature of the map surface, where the feature is described in a vector format using several interconnected vertices, is determined. The first set of style parameters corresponds to a first zoom level of the viewing window, and the first zoom level corresponds to a first magnification. A second set of style parameters for the feature is also determined, where the second set of style parameters corresponds to a second zoom level of the viewing window, and where the second zoom level corresponds to a second magnification. A third set of style parameters for displaying the feature is determined by interpolating between the first set of style parameters and the second set of style parameters.
Description
- The present application is a continuation of U.S. application Ser. No. 13/625,722, entitled “Map Rendering Using Interpolation of Style Parameters Across Zoom Levels,” filed on Sep. 24, 2012, now U.S. Pat. No. 8,803,901, which is a continuation of U.S. application Ser. No. 13/247,637, entitled “Map Rendering Using Interpolation of Style Parameters Across Zoom Levels,” filed on Sep. 28, 2011, now U.S. Pat. No. 8,274,524. Both applications are hereby incorporated by reference herein in their entirety.
- The present disclosure relates to map rendering systems, such as electronic map display systems, and more specifically to a map rendering system that renders elements of map features using interpolated style parameter values across different zoom levels.
- Digital maps are found in and may be displayed by a wide variety of devices, including mobile phones, car navigation systems, hand-held global positioning system (GPS) units, computers, and many websites. Although digital maps are easy to view and to use from an end-user's perspective, creating a digital map is a difficult task and can be a time-consuming process. In particular, every digital map begins with storing, in a map database, a set of raw data corresponding to millions of streets and intersections and other features to be displayed as part of a map. The raw map data that is stored in the map database and that is used to generate digital map images is derived from a variety of sources, with each source typically providing different amounts and types of information. This map data must therefore be compiled and stored in the map database before being accessed by map display or map rendering applications and hardware.
- There are, of course, different manners of digitally rendering map images (referred to as digital map images) based on map data stored in a map database. One method of rendering a map image is to store map images within the map database as sets of rasterized or pixelated images made up of numerous pixel data points, with each pixel data point including properties defining how a particular pixel in an image is to be displayed on an electronic display device. While this type of map data is relatively easy to create and store, the map rendering technique using this data typically requires a large amount of storage space for comprehensive digital map images, and it is difficult to manipulate the displayed digital map images in many useful ways.
- Another, more flexible methodology of rendering images uses what is traditionally called vector image data. Vector image data is typically used in high-resolution and fast-moving imaging systems, such as those associated with gaming systems, and in particular three-dimensional gaming systems. Generally speaking, vector image data (or vector data) includes data that defines specific image objects or elements (also referred to as primitives) to be displayed as part of an image via an image display device. In the context of a map image, such image elements or primitives may be, for example, individual roads, text labels (e.g., map or street labels), areas, text boxes, buildings, points of interest markers, terrain features, bike paths, etc. Each image element is generally made up or drawn as a set of one or more triangles (of different sizes, shapes, colors, fill patterns, etc.), with each triangle including three vertices interconnected by lines. Thus, for any particular image element, the image database stores a set of vertex data points, with each set of vertex data points defining a particular vertex of one of the triangles making up the image element. Generally speaking, each vertex data point includes data pertaining to a two-dimensional or a three-dimensional position of the vertex (in an X, Y or an X, Y, Z coordinate system, for example) and various vertex attributes defining properties of the vertex, such as color properties, fill properties, line width properties for lines emanating from the vertex, etc.
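To make the vertex-and-triangle organization described above concrete, here is a minimal, hypothetical sketch in Python; the class and field names are illustrative only and are not the patent's actual data layout:

```python
from dataclasses import dataclass, field

@dataclass
class Vertex:
    """A vertex data point: a 2-D/3-D position plus style attributes."""
    x: float
    y: float
    z: float = 0.0                                   # optional third dimension
    attributes: dict = field(default_factory=dict)   # color, fill, line width, ...

@dataclass
class ImageElement:
    """An image element (e.g., a road) drawn as a set of triangles."""
    kind: str
    triangles: list   # each triangle is a 3-tuple of Vertex

# A one-triangle road fragment with per-vertex style attributes.
road = ImageElement(
    kind="road",
    triangles=[(
        Vertex(0.0, 0.0, attributes={"color": "#888888", "line_width": 2}),
        Vertex(1.0, 0.0, attributes={"color": "#888888", "line_width": 2}),
        Vertex(0.5, 1.0, attributes={"color": "#888888", "line_width": 2}),
    )],
)
assert all(len(tri) == 3 for tri in road.triangles)
```
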
- During the image rendering process, the vertices defined for various image elements of an image to be rendered are provided to and are processed in one or more image shaders which operate in conjunction with a graphics processing unit (GPU), such as a graphics card or a rasterizer, to produce a two-dimensional image on a display screen. Generally speaking, an image shader is a set of software instructions used primarily to calculate rendering effects on graphics hardware with a high degree of flexibility. Image shaders are well known in the art and various types of image shaders are available in various application programming interfaces (APIs) provided by, for example, OpenGL and Direct3D, to define special shading functions. Basically, image shaders are simple programs in a high level programming language, for example, that describe or determine the traits of either a vertex or a pixel. Vertex shaders, for example, define the traits (e.g., position, texture coordinates, colors, etc.) of a vertex, while pixel or fragment shaders define the traits (color, z-depth and alpha value) of a pixel. A vertex shader is called for each vertex in an image element or primitive so that, for each vertex input into the vertex shader, the vertex shader produces one (updated) vertex output. Each vertex output by the vertex shader is then rendered as a series of pixels onto a block of memory that will eventually be sent to a display screen.
- Unfortunately, there are certain image processing actions, such as changing colors, fill properties, line widths, etc. of image objects or elements on a displayed map image that require the downloading of new vector data (with new properties or vertex attributes) from the map database or from the application or system creating the vector data points. As a result, it may be relatively time consuming to simply change style properties of rendered images, such as the colors or fill patterns of image elements (such as roads, for example), the line widths used to display image element outlines, text, etc., because doing so means that new vector data, with the new vertex attributes (style attributes) defining these new properties, must be sent to the rendering engine from the map database or other application. As a result, it is still relatively difficult to change or modify vector images created using vector data to perform simple visual modifications of features within the images themselves at the image rendering device, such as changing properties of map elements like line widths, color properties, etc.
- A computer-implemented method for rendering a map on a display device includes determining a first set of style parameters for a feature of a map at a first zoom level, a second set of style parameters for a second zoom level, and a third set of style parameters for a third zoom level, where the third set of style parameters is determined by interpolating between the first and second sets of style parameters. The method then renders or displays the feature of the map in a viewing window based on the interpolated third set of style parameters. The method may further determine a fourth set of style parameters for a fourth zoom level and determine the third set of style parameters based on the first, second, and fourth sets of style parameters. The method may further determine whether to retrieve the fourth set of style parameters based on a current bandwidth, a current processor capacity, or a resolution setting. The method may use a linear interpolation process and/or a polynomial interpolation process.
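As a hedged illustration of the interpolation summarized above, the following Python sketch linearly interpolates numeric style parameters between two styled zoom levels; the function and parameter names are hypothetical:

```python
def interpolate_styles(params_a, zoom_a, params_b, zoom_b, zoom):
    """Linearly interpolate numeric style parameters between two zoom levels.

    params_a / params_b map style-parameter names to numeric values at
    zoom levels zoom_a / zoom_b; zoom is the level being rendered,
    with zoom_a <= zoom <= zoom_b.
    """
    t = (zoom - zoom_a) / (zoom_b - zoom_a)
    # Interpolate only parameters present in both sets.
    return {name: params_a[name] + t * (params_b[name] - params_a[name])
            for name in params_a.keys() & params_b.keys()}

# Road width is 2 px at zoom 0 and 8 px at zoom 3; zoom 2 lands at 6 px.
third_set = interpolate_styles({"road_width": 2.0}, 0, {"road_width": 8.0}, 3, 2)
assert abs(third_set["road_width"] - 6.0) < 1e-9
```
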
- In one embodiment, style attribute information may be stored as a style lookup table and the method may determine style attribute tables that are associated with the first, second, third and/or fourth zoom levels.
- In one embodiment, the method may determine to retrieve only a subset of available style parameters for a given zoom level based on any combination of an interpolation attribute, a priority attribute, a current bandwidth, or a current processor capacity.
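A minimal sketch of such subset selection, assuming hypothetical "interpolate" and "priority" attributes on each style parameter; the thresholds and record layout are invented for illustration:

```python
def select_style_subset(params, bandwidth_ok, cpu_ok, min_priority=1):
    """Pick the style parameters worth retrieving for interpolation.

    params maps a parameter name to a record with an 'interpolate' flag
    and a 'priority' value (higher = more important). When bandwidth or
    processor capacity is constrained, only sufficiently high-priority
    parameters designated for interpolation are kept.
    """
    constrained = not (bandwidth_ok and cpu_ok)
    subset = {}
    for name, rec in params.items():
        if not rec["interpolate"]:
            continue  # not designated for interpolation at all
        if constrained and rec["priority"] < min_priority:
            continue  # drop low-priority parameters under constraint
        subset[name] = rec
    return subset

params = {
    "road_width":   {"interpolate": True,  "priority": 2},
    "outline_dash": {"interpolate": True,  "priority": 0},
    "text_color":   {"interpolate": False, "priority": 2},
}
# Under constrained bandwidth, only high-priority interpolable parameters remain.
assert set(select_style_subset(params, bandwidth_ok=False, cpu_ok=True)) == {"road_width"}
```
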
- In another embodiment, a computer device includes a communications network interface, one or more processors, one or more memories coupled to the one or more processors and a display device coupled to the one or more processors. The one or more memories include computer executable instructions that are executed on the processor to determine a first view of a map surface defined by a first magnification of the surface of the map, where a first zoom level of the first view corresponds to the first magnification. The computer executable instructions are executed to retrieve a first set of style parameters for a first feature of the map surface, the first set of style parameters corresponding to the first zoom level. The computer executable instructions are executed to determine a second view of the map surface defined by a second magnification of the map surface, where a second zoom level of the second view corresponds to the second magnification. The computer executable instructions are executed to retrieve a second set of style parameters for the first feature of the map surface, the second set of style parameters corresponding to a third zoom level, where the third zoom level corresponds to a third magnification different from the first and the second magnification. The computer executable instructions are executed to determine a third set of style parameters for the first feature of the map surface at the second zoom level based on interpolating the first set of style parameters and the second set of style parameters when the second zoom level is between the first and third zoom level. The computer executable instructions are executed to render the first feature in the second view at the second zoom level using the third set of style parameters.
- The computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on a style attribute setting associated with each of the style parameters indicating whether the style parameter is designated for interpolation.
- The computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on a style priority attribute associated with each of the style parameters.
- The computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on whether a current processor capacity is above a threshold.
- The computer device executes the computer executable instructions to retrieve the first and the second set of style parameters as a subset of style parameters available for retrieval based on whether a current available bandwidth is above a threshold.
- The computer device executes the computer executable instructions to retrieve a fourth set of style parameters for the first feature of the map surface when a resolution setting is increased, the fourth set of style parameters corresponding to a fourth zoom level, wherein the fourth zoom level corresponds to a fourth magnification different from the first, the second, and the third magnification. The computer device executes the computer executable instructions to determine the third set of style parameters for the first feature of the map surface at the second zoom level based on interpolating at least the first set of style parameters at the first magnification, the second set of style parameters at the second magnification, and the fourth set of style parameters at the fourth magnification, wherein the third magnification is different from the first, second, and fourth magnification.
- In another embodiment, a computer device includes a communications network interface, one or more processors, one or more memories coupled to the one or more processors and a display device coupled to the one or more processors. The one or more memories include computer executable instructions that are executed on the processor to determine a first view of a map surface defined by a first magnification of the surface of the map, and where a first zoom level of the first view corresponds to the first magnification. The computer executable instructions are executed to retrieve a first set of style parameters for a first feature of the map surface, the first set of style parameters corresponding to a zoom level closest to the first zoom level. The computer executable instructions are executed to determine if the first set of style parameters corresponds to the first zoom level. The computer executable instructions are executed to render a first feature in the first view at the first zoom level using the first set of style parameters if the first set of style parameters corresponds to the first zoom level. The computer executable instructions are executed to retrieve a second set of style parameters for the first feature of the map surface if the first set of style parameters do not correspond to the first zoom level, the second set of style parameters corresponding to a third zoom level, where the third zoom level corresponds to a third magnification different from the first and the second magnification and where the first zoom level is between the second and the third zoom level. The computer executable instructions are executed to determine a third set of style parameters for the first feature of the map surface at the first zoom level based on interpolating the first and the second set of style parameters. The computer executable instructions are executed to render the first feature in the first view at the first zoom level using the third set of style parameters.
-
FIG. 1 is a high-level block diagram of a map imaging system that implements communications between a map database stored in a server and one or more map image rendering devices, according to an embodiment. -
FIG. 2 is a high level block diagram of an image rendering engine used to render map images using map vector data, according to an embodiment. -
FIG. 3A is a data diagram illustrating a set of vector data in the form of vertex data points encoded using a vertex style attribute, according to an embodiment. -
FIG. 3B is a first texture map in the form of a style lookup table that defines vertex style attribute values for each of a number of different styles and which is used in the image rendering engine of FIG. 2 to resolve vertex style attributes based on a style reference, according to an embodiment. -
FIG. 4 illustrates a zoom scale including multiple zoom levels, according to an embodiment. -
FIG. 5A illustrates a graph of style parameter values and zoom level without interpolation. -
FIG. 5B illustrates a graph of style parameter values and zoom level with linear interpolation, according to an embodiment. -
FIG. 5C illustrates a graph of style parameter values and zoom level with polynomial interpolation, according to an embodiment. -
FIG. 6 illustrates a process flow diagram of a method that may be used to render a map surface using the described interpolation techniques, according to an embodiment. -
FIG. 7 illustrates a process flow diagram for retrieving style attribute data and for interpolating the style data for rendering a feature of a map, according to an embodiment. -
FIG. 8 illustrates a process flow for handling additional data sets of style parameters, according to an embodiment. -
FIG. 9 illustrates a process for determining which style parameters to include in a reduced style attribute data set, according to an embodiment. -
The present application generally relates to techniques for rendering map features during zooming operations of a viewing window. A graphics or image rendering system, such as a map image rendering system, may receive map data for a given set of zoom levels, where the map data includes style parameter values for various features of a map surface and the style parameter values correspond to particular zoom levels. The techniques may interpolate at least some of the style parameter values from the received map data to provide style parameter values over a range of zoom levels, and the map image rendering system may render a viewing window at a given zoom level based on the interpolated style parameters.
- Referring now to
FIG. 1, a map-related imaging system 10, according to an embodiment, includes a map database 12 stored in a server 14 or in multiple servers located at, for example, a central site or at various different spaced-apart sites, and also includes multiple map client devices 16, 18, 20, and 22, each connected to the server 14 via any hardwired or wireless communication network 25, including for example a hardwired or wireless local area network (LAN), metropolitan area network (MAN) or wide area network (WAN), the Internet, or any combination thereof. The map client devices 16-22 may be, for example, mobile phone devices (18), computers such as laptop, tablet, desktop or other suitable types of computers (16, 20), or components of other imaging systems such as components of automobile navigation systems (22), etc. Moreover, the client devices 16-22 may be communicatively connected to the server 14 via any suitable communication system, such as any publicly available and/or privately owned communication network, including those that use hardwired based communication structure, such as telephone and cable hardware, and/or wireless communication structure, such as wireless communication networks, including for example, wireless LANs and WANs, satellite and cellular phone communication systems, etc. - The
map database 12 may store any desired types or kinds of map data including raster image map data and vector image map data. However, the image rendering systems described herein are best suited for use with vector image data, which defines or includes a series of vertices or vertex data points for each of numerous sets of image objects, elements or primitives within an image to be displayed. Generally speaking, each of the image objects defined by the vector data will have a plurality of vertices associated therewith, and these vertices will be used to display a map related image object to a user via one or more of the client devices 16-22. As will also be understood, each of the client devices 16-22 includes an image rendering engine having one or more processors 30, one or more memories 32, a display device 34, and in many cases a rasterizer or graphics card 36, which are generally programmed and interconnected in known manners to implement or to render graphics (images) on the associated display device 34. The display device 34 for any particular client device 16-22 may be any type of electronic display device such as a liquid crystal display (LCD), a light emitting diode (LED) display, a plasma display, a cathode ray tube (CRT) display, or any other type of known or suitable electronic display. - Generally speaking, the map-related
imaging system 10 of FIG. 1 operates such that a user, at one of the client devices 16-22, opens or executes a map application (not shown in FIG. 1) that operates to communicate with and obtain map information or map related data from the map database 12 via the server 14, and that then displays or renders a map image based on the received map data. The map application may allow the user to view different geographical portions of the map data stored in the map database 12, to zoom in or zoom out on a particular geographical location, to rotate, spin or change the two-dimensional or three-dimensional viewing angle of the map being displayed, etc. More particularly, when rendering a map image on a display device or a display screen 34 using the system described below, each of the client devices 16-22 downloads map data in the form of vector data from the map database 12 and processes that vector data using one or more image shaders to render an image on the associated display device 34. - Referring now to
FIG. 2, an image generation or imaging rendering device 40, according to an embodiment, associated with or implemented by one of the client devices 16-22 is illustrated in more detail. The image rendering system 40 of FIG. 2 includes two processors 30 a and 30 b, two memories 32 a and 32 b, a user interface 34 and a rasterizer 36. In this case, the processor 30 b, the memory 32 b and the rasterizer 36 are disposed on a separate graphics card (denoted below the horizontal line), although this need not be the case in all embodiments. For example, in other embodiments, a single processor may be used instead. In addition, the image rendering system 40 includes a network interface 42, a communications and storage routine 43 and one or more map applications 48 having map display logic therein, stored on the memory 32 a, which may be executed on the processor 30 a (e.g., which may be a central processing unit (CPU)). Likewise one or more image shaders in the form of, for example, vertex shaders 44 and fragment shaders 46 are stored on the memory 32 b and are executed on the processor 30 b. The memories 32 a and 32 b are coupled to the processors 30 a and 30 b, and the network interface 42 includes any well known software and/or hardware components that operate to communicate with, for example, the server 14 of FIG. 1 via a hardwired or wireless communications network to obtain image data in the form of vector data for use in creating an image display on the user interface or display device 34. The image rendering device 40 also includes a data memory 49, which may be a buffer or volatile memory for example, that stores vector data received from the map database 12, the vector data including any number of vertex data points and one or more lookup tables as will be described in more detail. - During operation, the map logic of the
map application 48 executes on the processor 30 to determine the particular image data needed for display to a user via the display device 34 using, for example, user input, GPS signals, prestored logic or programming, etc. The display or map logic of the application 48 interacts with the map database 12, using the communications routine 43, by communicating with the server 14 through the network interface 42 to obtain map data, preferably in the form of vector data or compressed vector data, from the map database 12. This vector data is returned via the network interface 42 and may be decompressed and stored in the data memory 49 by the routine 43. In particular, the data downloaded from the map database 12 may be a compact, structured, or otherwise optimized version of the ultimate vector data to be used, and the map application 48 may operate to transform the downloaded vector data into specific vertex data points using the processor 30 a. In one embodiment, the image data sent from the server 14 includes vector data generally defining data for each of a set of vertices associated with a number of different image elements or image objects to be displayed on the screen 34, and possibly one or more lookup tables which will be described in more detail below. If desired, the lookup tables may be sent in, may be decoded to be in, or may be generated by the map application 48 to be in, the form of vector texture maps, which are known types of data files typically defining a particular texture or color field (pixel values) to be displayed as part of an image created using vector graphics. More particularly, the vector data for each image element or image object may include multiple vertices associated with one or more triangles making up the particular element or object of an image. Each such triangle includes three vertices (defined by vertex data points) and each vertex data point has vertex data associated therewith.
In one embodiment, each vertex data point includes vertex location data defining a two-dimensional or a three-dimensional position or location of the vertex in a reference or virtual space, as well as an attribute reference. Each vertex data point may additionally include other information, such as an object type identifier that identifies the type of image object with which the vertex data point is associated. The attribute reference, referred to herein as a style reference or as a feature reference, references or points to a location or a set of locations in one or more of the lookup tables downloaded and stored in the data memory 49. -
FIG. 3A illustrates an embodiment of map data that may be sent to a client device, such as device 40 of FIG. 2, for processing, according to an embodiment. As FIG. 3A illustrates, map data contains location data for a vertex, an object type, and style attribute(s) for the vertex. A set of one or more of the vertices may comprise an image object or feature of a map, such as a road or building. The style attributes may be sent for each vertex or may reference a style lookup table, such as that illustrated in FIG. 3B, that can be used to decode a style reference from FIG. 3A into a complete set of one or more style attribute parameter values, according to an embodiment. - Style parameters may include a fill color (e.g., for area objects), an outline color, an outline width, an outline dashing pattern and an indication of whether to use rounded end caps (e.g., for road objects), an interior color, an interior width, an interior dashing pattern, and interior rounded end caps (e.g., for road objects), a text color and a text outline color (e.g., for text objects), an arrow color, an arrow width, an arrow dashing pattern (e.g., for arrow objects), a text box fill color and a set of text box outline properties (e.g., for text box objects), to name but a few. Of course, different ones of the vertex style attributes provided may be applicable or relevant to only a subset of image objects, and thus the vertex style data points associated with a particular type of image object may only refer to a subset of the vertex attributes listed for each style.
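The style-reference lookup described above can be sketched as follows; the table contents, keys, and field names are invented for illustration and are not the actual tables of FIG. 3B:

```python
# Hypothetical style lookup table in the spirit of FIG. 3B: a style
# reference carried with each vertex resolves to a full attribute set.
STYLE_TABLE = {
    0: {"fill_color": "#d0d0d0", "outline_color": "#606060", "outline_width": 1.0},
    1: {"interior_color": "#ffffff", "interior_width": 4.0, "rounded_caps": True},
}

def resolve_style(vertex):
    """Decode a vertex's style reference into concrete attribute values."""
    return STYLE_TABLE[vertex["style_ref"]]

# A vertex data point: location, object type, and a style reference.
road_vertex = {"x": 10.0, "y": 4.0, "object_type": "road", "style_ref": 1}
style = resolve_style(road_vertex)
assert style["interior_width"] == 4.0 and style["rounded_caps"]
```

Keeping only a small reference in each vertex and resolving the full attribute set from a shared table is what lets many vertices share one style definition.
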
- Generally speaking, the techniques for rendering a map involve determining a viewing window of a map surface at a first zoom level and determining a set of style parameters for a plurality of zoom levels (a set comprises one or more elements). When a viewing window is to be rendered at the first zoom level, the techniques described herein may interpolate across the plurality of style parameters to provide style parameter values appropriate for the first zoom level.
- A zoom level generally corresponds to a magnification which is used, in part, to define a displayable area of a map surface within a viewing window. A magnification of the viewing window may correspond to a scale at which the map surface is rendered or drawn. For example, where magnification or scale is expressed as a ratio such as 1:1,000, one unit of measurement in the viewing window may correspond, exactly or approximately, to 1,000 actual units. When the viewing window size is measured in inches, a distance scale may translate an inch of the viewing window to a length of 1,000 miles (or kilometers).
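The ratio-scale arithmetic above amounts to a one-line conversion; a small illustrative sketch (function name assumed):

```python
def ground_distance(window_distance, scale_denominator):
    """Convert a distance measured in the viewing window to a real-world
    distance in the same units, for a ratio scale of 1:scale_denominator."""
    return window_distance * scale_denominator

# At 1:1,000, one unit in the viewing window covers 1,000 actual units.
assert ground_distance(1, 1_000) == 1_000
assert ground_distance(2.5, 1_000) == 2_500
```
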
- Some computerized maps allow users to zoom in or zoom out of a map surface, where a zoom level generally corresponds to a magnification of the viewing window that displays the map surface. Unlike a paper map that displays all possible map surface data in one fixed rendering, computer mapping applications may only display certain map features that can be resolved by eye at a certain zoom level or magnification (corresponding to a distance to the object) while excluding other map features that normally cannot be seen by eye at the same zoom level without the aid of a magnification device. In these computer mapping applications, increasing a zoom level of a viewing window may not only enlarge features already displayed on a map, but also cause the mapping application to draw additional features of the map.
- Not all zoom level increases, however, warrant drawing additional map features. For example, where an increase in magnification between a first and a second zoom level of a mapping application is relatively small, existing features of the map surface may simply be magnified without any additional feature data being rendered. This viewing difference between magnification of existing objects and magnification plus additional rendering of map data may be demonstrated as follows. When increasing magnification of a map surface without retrieving additional zoom level data tiles, a building represented as a block (or square) will simply be displayed as an enlarged block (or square). When increasing magnification and retrieving additional zoom level data, the same building may be displayed as an enlarged block but with additional sub-features such as windows, columns, doors, etc.
-
FIG. 4 illustrates a zoom scale including multiple zoom levels increasing from left to right, according to an embodiment. As discussed, each incremental zoom level generally corresponds to a particular magnification level. However, not all zoom levels involve acquisition of additional, usually higher-resolution map data for rendering. FIG. 4 illustrates that zoom levels 402 involve retrieval and processing of additional map data, while zoom levels 404 involve magnification only, without retrieval or processing of additional map data. - As discussed above, map vector data may comprise location data and/or style attribute data for a set of vertices. The map vector data may be segregated or organized by zoom level, similar to that illustrated in
FIG. 4, and style attribute data may be provided along with the map data at particular zoom levels. When style information is not provided for a zoom level, style information may simply be drawn from existing or previously retrieved style attributes for other zoom levels. However, this may produce abrupt changes in the visual appearance of a feature as the viewing window changes, for example when the zoom level changes. -
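One way to sketch the distinction between magnification-only zoom changes and zoom changes that cross a data-bearing level (in the spirit of levels 402 and 404 in FIG. 4); the specific levels used here are assumptions for illustration:

```python
# Zoom levels at which new, higher-resolution map data is available for
# retrieval; the particular levels are assumed for this example.
DATA_ZOOM_LEVELS = {0, 3, 6}

def requires_retrieval(new_zoom, current_zoom):
    """True if zooming from current_zoom to new_zoom crosses a level that
    has additional map data, so retrieval (not just magnification) is needed."""
    lo, hi = sorted((current_zoom, new_zoom))
    return any(lo < level <= hi for level in DATA_ZOOM_LEVELS)

assert requires_retrieval(2, 1) is False   # magnification only
assert requires_retrieval(4, 2) is True    # crosses level 3: fetch new data
```
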
FIG. 5A illustrates a graph of style parameter values versus zoom level without interpolation. Map data may be available for retrieval and rendering at zoom level 0. The map data may include style attribute data that corresponds with and is appropriate for zoom level 0 magnification. For example, where a style attribute corresponds to the width of a road, the style attribute value available for zoom level 0 may have the road width scaled appropriately for that zoom level. When additional data for zoom levels 1 and 2 is not available, a mapping application may reuse the style attribute data of zoom level 0 for zoom levels 1 and 2, even though those style values are only appropriate for level 0. Additional map data may be available for zoom level 3 that includes style data that corresponds with and is appropriate for zoom level 3 magnification. Again, where zoom level data is not available for zoom levels 4 and 5, previously retrieved and available style data may be used for that zoom range. Abrupt style changes may thus occur between certain zoom levels, e.g., between zoom levels 2 and 3, as illustrated in FIG. 5A. - One method of remedying these abrupt changes is to provide style parameters for each zoom level. However, this can be costly in terms of retrieval bandwidth (such as network bandwidth between two or more computers, or intra-device communication bandwidth between a processor and a local memory) and processor capacity. Graphics cards (having dedicated graphics processing units), such as those described in this application, are designed to specifically and efficiently process graphics calculations, which often involve mathematical interpolation. In some embodiments, client processing efficiencies (e.g., the interpolation calculation resources of a graphics card) may be utilized for interpolating certain style parameters for rendering an image of a map when a viewing window changes from one zoom level to another. In other embodiments, a CPU, such as the
CPU 30 a (FIG. 2 ) may be utilized to interpolate style parameters. -
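The step-function behavior of FIG. 5A, where levels without their own style data fall back to the nearest lower styled level, might be sketched as follows (the sample style values are assumed):

```python
# Assumed style data, available only at zoom levels 0 and 3 (cf. FIG. 5A).
STYLED_LEVELS = {0: {"road_width": 2.0}, 3: {"road_width": 8.0}}

def style_without_interpolation(zoom):
    """FIG. 5A behavior: reuse the style of the nearest zoom level at or
    below `zoom` that has style data, producing a step function."""
    best = max(level for level in STYLED_LEVELS if level <= zoom)
    return STYLED_LEVELS[best]

# Levels 1 and 2 fall back to level 0's style; the jump at level 3 is abrupt.
assert style_without_interpolation(2)["road_width"] == 2.0
assert style_without_interpolation(3)["road_width"] == 8.0
```
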
FIG. 5B illustrates a linear interpolation of style parameter values between the zoom levels for which style data is available, according to an embodiment. As illustrated in FIG. 5B, a smoother transition can be made in parameter values as compared to FIG. 5A, for example. FIG. 5C illustrates a more complex interpolation of three or more points (including map data at zoom level 6) that can provide a smoother style function over a range of magnifications or zoom levels, according to an embodiment. The interpolation of FIG. 5C may involve a polynomial interpolation function, for example. - In some embodiments, while rendering richer style aspects of map features may be accomplished by sending more data (i.e., style attribute data) from a server to a client for every view change (e.g., zoom level change), having a graphics processor card perform the bulk of the calculations for adjusting styles at a client device may save both bandwidth and download time, and may decrease overall system processing time, since specialized graphics cards can process style adjustments more efficiently at a client than, for example, a server can for a plurality of clients. Thus, an aspect of the techniques is determining a reduced set of style parameters that may be retrieved by a client for interpolation processing, for example via a graphics card/processor. In other embodiments, however, a CPU, such as the
CPU 30a (FIG. 2) may be utilized to interpolate style parameters. -
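As an illustrative sketch (not part of the claimed embodiments), the difference between the stepped style function of FIG. 5A and the linearly interpolated style function of FIG. 5B may be expressed as follows; the road-width values and the zoom levels used here are hypothetical.

```python
def style_value_step(zoom, style_points):
    """FIG. 5A behavior sketch: hold the style value of the nearest lower
    zoom level that has style data, producing abrupt jumps between levels."""
    eligible = [z for z in style_points if z <= zoom]
    nearest = max(eligible) if eligible else min(style_points)
    return style_points[nearest]

def style_value_linear(zoom, style_points):
    """FIG. 5B behavior sketch: linearly interpolate between the two zoom
    levels with available style data that bracket the requested zoom."""
    zooms = sorted(style_points)
    if zoom <= zooms[0]:
        return style_points[zooms[0]]
    if zoom >= zooms[-1]:
        return style_points[zooms[-1]]
    for lo, hi in zip(zooms, zooms[1:]):
        if lo <= zoom <= hi:
            t = (zoom - lo) / (hi - lo)
            return (1 - t) * style_points[lo] + t * style_points[hi]

# Hypothetical road widths with style data available only at zoom levels 0, 3, and 6.
road_width = {0: 1.0, 3: 4.0, 6: 16.0}
```

For a viewing window at zoom level 1.5, the stepped function holds the zoom level 0 width, while the interpolated function yields an intermediate width.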
FIG. 6 illustrates a process flow diagram or flow chart of a method, routine, or process 600 that may be used to render a map surface using a suitable interpolation technique, such as the interpolation techniques disclosed herein, according to an embodiment. A block 602 may determine a viewing window state with respect to a map surface to be displayed. This viewing window state may include a viewing window magnification as well as a viewing window size, a viewing window position, and a viewing window direction for a view of a map surface. This determination may be made by receiving an input from a user of a computer device. For example, the user may input a particular longitude, latitude, and altitude, as well as a zoom level corresponding to a magnification level and a viewing angle (i.e., a viewing direction). In some embodiments, the determination may be made based on a pre-stored or pre-determined value for an initial rendering of a map location (e.g., an initial view of a city landmark or popular city feature) or may be based on pre-stored settings that reflect user preferences. As another example, a determination may be made in response to a user input indicating a pan action (e.g., a selection of an arrow indicating a pan in a particular direction, a swipe, etc.), a selection of a map feature or map point (e.g., via a touch screen press, a mouse click, etc.), etc. - A
block 604 may determine a closest zoom level that has associated style attribute data appropriate for that zoom level. Determining a closest zoom level may be accomplished in any programmatic manner. For example, a closest zoom level may be determined by querying a lookup table that provides information on available zoom levels and that indicates which zoom levels are associated with style attribute data appropriate for those zoom levels. In situations in which map data is always provided with attribute data for particular zoom levels, the determination may be based on querying for a prior lower zoom level or a subsequent higher zoom level that initiates a retrieval of additional data. A block 606 may determine whether a current viewing window zoom level (as determined to be part of the viewing state of block 602) is the same as the closest zoom level of block 604. If the current viewing window zoom level is the same as the closest zoom level, then the style attribute data for the current zoom level may be used to render the viewing window at block 608 without interpolation, since the current zoom level already has data appropriate for that zoom level. In one embodiment, block 604 may not select the closest zoom level that has associated style attribute data. Instead, block 604 may choose a second closest, third closest, etc. This may be the case when, for example, the closest zoom level that has associated style attribute data is not available. - If the viewing window zoom level is not the same as the closest zoom level determined at
block 604, then a block 610 may determine at least two zoom levels containing map data, including style attribute data, where the at least two zoom levels define a range that includes the viewing window zoom level. In other words, the block 610 may determine two additional zoom levels such that the viewing window zoom level is between them. The closest zoom level of block 604 may be used as a first zoom level for block 610. A second zoom level may then be determined by searching for a closest zoom level in the opposite direction from the viewing window zoom level of block 602. For example, if block 604 determined a zoom level closest to the viewing window zoom level at a subsequent higher zoom level, block 610 may determine a prior lower zoom level as the second zoom level, and vice versa. - In one embodiment, a
block 612 may interpolate the style parameter values of the two zoom levels to produce interpolated style attribute data for use in rendering the map display. A block 614 may, based on the interpolated style attribute data of the block 612, render the map surface for the viewing window at the viewing window magnification/zoom level. In an embodiment, block 612 may include determining an interpolation parameter and utilizing the interpolation parameter to interpolate the style attribute data. The interpolation parameter may generally represent a degree of a particular style attribute corresponding to a first zoom level of the two zoom levels versus a degree of the particular style attribute corresponding to a second zoom level of the two zoom levels. In an embodiment, the interpolation parameter is determined based on the viewing window zoom level. As an illustrative example, if the viewing window zoom level is halfway between the first zoom level and the second zoom level, the interpolation parameter may correspond to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level. As another illustrative example, if the viewing window zoom level is one quarter of the way from the first zoom level to the second zoom level, the interpolation parameter may correspond to 75% of the particular style attribute corresponding to the first zoom level and 25% of the particular style attribute corresponding to the second zoom level. - In another embodiment, the interpolation parameter may be determined and varied over time.
As an illustrative example, if the viewing window zoom level is changed from the first zoom level to the second zoom level, the interpolation parameter may be varied over a time period so that the interpolated style parameter changes gradually over time from 100% of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100% of the particular style attribute corresponding to the second zoom level. For example, the map display may be “animated” so that a feature, e.g., a road width, initially is displayed corresponding to a first style parameter value (e.g., corresponding to the first zoom level), and gradually is changed over time so that the feature is eventually displayed corresponding to a second style parameter value (e.g., corresponding to the second zoom level). Thus, in an embodiment, the map surface may initially be rendered with style data of one of the zoom levels (e.g., a closest zoom level) until a trigger is activated. A trigger may be an event in which a current viewing window zoom level is changed to exceed a threshold zoom level, for example. When the trigger occurs, blocks 612 and 614 may be repeated over time so that interpolated style attributes are gradually changed (e.g., from the first zoom level to the second zoom level) and the map surface is “animated” so that the rendering of map features changes gradually over time.
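The interpolation parameter and the time-varying (“animated”) blending described above may be sketched as follows; the function names and the step-count parameter are illustrative assumptions rather than elements of the disclosed system.

```python
def interpolation_parameter(view_zoom, first_zoom, second_zoom):
    """Fraction of the way from the first to the second zoom level:
    0.0 means 100% of the first level's style, 1.0 means 100% of the second's."""
    return (view_zoom - first_zoom) / (second_zoom - first_zoom)

def blend_style(t, first_value, second_value):
    """Block 612 sketch: weight the two style attribute values by the
    interpolation parameter t."""
    return (1 - t) * first_value + t * second_value

def animate_style(first_value, second_value, steps):
    """Vary the interpolation parameter over a series of time steps so a
    feature's style changes gradually from the first zoom level's value
    to the second's, instead of jumping."""
    return [blend_style(i / steps, first_value, second_value)
            for i in range(steps + 1)]
```

A viewing window one quarter of the way between the two zoom levels yields t = 0.25, i.e., a 75%/25% blend of the two style attribute values, matching the illustrative example above.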
-
FIG. 7 illustrates a process flow 700 for a mapping application, according to an embodiment. An initial zoom level, e.g., at an initial mapping application startup, may be pre-determined to be a zoom level for which a complete set of map data, including style attribute data, is designated. In this case, an initial set of style parameters is first retrieved at a block 702 and used to render the map at block 704. In one embodiment, when the initial zoom level does not include style attribute data appropriate for that zoom level, the process of FIG. 6 may be executed so that the rendering of block 704 is based on style attribute data that corresponds with its zoom level (via a previous interpolation). The mapping application may initiate a view change at block 706, such as a zooming change, and the mapping application may determine whether additional map data is to be retrieved at block 708. Block 708 may determine that additional zoom level style data is needed if the style attribute data used to render the viewing window at block 704 corresponds to data provided for a single zoom level (i.e., the zoom level of the initial viewing window) without prior interpolation (i.e., only one data point exists). In a case where the initial view is rendered based on a prior interpolation, block 708 may determine whether the new zoom level of block 706 is within the prior interpolation range. If the new zoom level is within the prior interpolation range, then additional style data may not be needed and the process may end at block 716. In this case, style data from the prior interpolation (based on at least two style attribute data points) can be used to provide appropriate style data for the new zoom level of block 706. In a different embodiment, even though the new zoom level is within a prior interpolation range, block 708 may still determine to retrieve additional data. 
This may be the case in a situation in which greater interpolation accuracy is required (to be discussed further below). - It should be noted that some view changes may not affect style parameters (e.g., a change in viewing window direction). Zoom changes, however, usually do affect some style parameters. Also, not all of the available style parameters may be affected by a view change; thus, sometimes only a subset of style attributes may be retrieved. - If additional map data is needed for rendering, the mapping application determines and retrieves an additional amount of map data at
block 710 that includes at least one additional style parameter value (e.g., for the reduced set of style parameters). The process then determines a new set of style parameter values at a block 712 for the view change by interpolating the newly retrieved style parameter values of block 710 and the previous style parameter values of block 702. Because interpolation needs at least two data points, the minimum amount of additional data retrieved at block 710 is at least one additional style parameter value. The viewing window may be re-rendered at block 714 based on the interpolated style data. In an embodiment, blocks 712 and 714 may be repeated and an interpolation parameter utilized at block 712 is changed over time so that the interpolated style parameter changes gradually over time from 100%, for example, of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100%, for example, of the particular style attribute corresponding to the second zoom level. - In some embodiments, when a determined, selected, or current zoom level is outside the range of interpolation data, extrapolation based on existing style attribute data points may be used to adjust a style parameter. In some of these embodiments, extrapolation may be a default process of adjusting a style parameter when no attribute data exists that is appropriate for a given zoom level and the given zoom level is outside the range of style attribute data points that can be used for interpolation.
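A minimal sketch of the block 708 decision and of the extrapolation fallback described above is given below; the function and parameter names are illustrative assumptions, and a simple linear style value is assumed for the extrapolation.

```python
def needs_more_style_data(new_zoom, retrieved_zooms):
    """Block 708 sketch: more style data is needed when fewer than two
    styled data points have been retrieved (interpolation is impossible)
    or when the new zoom level falls outside the prior interpolation range."""
    if len(retrieved_zooms) < 2:
        return True
    return not (min(retrieved_zooms) <= new_zoom <= max(retrieved_zooms))

def extrapolate_style(zoom, z0, v0, z1, v1):
    """Fallback for zoom levels outside the range of available style
    attribute data points: extend the line through the two nearest points
    (z0, v0) and (z1, v1)."""
    slope = (v1 - v0) / (z1 - z0)
    return v0 + slope * (zoom - z0)
```

For example, with style data retrieved for zoom levels 0 and 3, a new zoom level of 2 needs no additional retrieval, while a new zoom level of 5 does (or may be served by extrapolation).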
- While the described method and system are illustrated for interpolating style parameters over zooming ranges, other view changes that require a style parameter value to be adjusted may be implemented in a similar manner using the described method and system. For example, where style changes are to be adjusted for changes in viewing direction, the same interpolation method and system may be used to provide values for the style aspects over the viewing window direction change.
- Because retrieving and processing map data incurs a bandwidth cost and a processing cost, receiving too much style parameter data may negate the savings of interpolating at the client. Additional modifications to the described method and system may thus function to reduce the number of style attributes that are retrieved in a set of style parameters (i.e., the number of elements in the set) as well as limit the number of sets of style parameters needed to perform interpolation calculations.
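The bandwidth-cost and processing-cost considerations above may be reduced to a simple gating check performed before additional sets of style parameter values are requested; the threshold names and values below are illustrative assumptions, not elements of the disclosure.

```python
def should_fetch_extra_datasets(download_rate, min_rate, processor_load, max_load):
    """Retrieve sets of style parameter values beyond the minimum two
    needed for interpolation only when the current download rate clears
    a threshold and the interpolating processor has spare capacity."""
    return download_rate >= min_rate and processor_load <= max_load
```

A client with ample bandwidth but an overloaded processor (or vice versa) would defer the extra retrieval under this check.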
-
FIG. 8 illustrates a process flow 800 for handling additional data sets of style parameters, according to an embodiment. Where style attribute data is associated with particular zoom levels, additional data sets correspond to additional attribute data of other zoom levels. Block 802 determines an initial viewing window state having an initial zoom level and corresponding initial magnification. Block 804 may determine a minimum number of style attribute data sets (e.g., how many zoom levels of style attribute data) to retrieve for an initial rendering of the viewing window. When a viewing window zoom level is chosen (e.g., by a user) that coincides with a zoom level having available style data scaled for that zoom level (i.e., the style data corresponds to or is appropriate for a feature rendered at that zoom level), the viewing window may be rendered without interpolation and without a second style attribute data set. Alternatively, two style attribute data sets corresponding to two different zoom levels surrounding the chosen zoom level may be retrieved and interpolated to provide the style parameter data for the chosen zoom level. In an embodiment, at least one of the two style attribute data sets is retrieved prior to determining the viewing window (block 802). For example, a style attribute data set for a zoom level 3 may have been retrieved previously when, at block 802, a viewing window corresponding to a zoom level 4 is determined. As another example, style attribute data sets for both the zoom level 3 and a zoom level 6 may have been retrieved previously when, at block 802, a viewing window corresponding to a zoom level 4 is determined. - A
block 806 may determine, depending on a number of factors, whether to retrieve, for an initial view rendering, style attribute data that includes more than the minimum two data sets of style parameter values used in performing interpolation. When this additional data is retrieved at one time, interpolation calculations may be performed for an entire range of zoom levels at one time. Additional style attribute data (i.e., additional sets of style parameter values) corresponding to zoom levels beyond what is minimally needed (e.g., to perform a basic interpolation) may be retrieved and processed depending on factors such as a type of style parameter, a frequency or likelihood of additional view changes in the near future, a degree of accuracy desired for interpolating a style parameter, etc. It should be noted that these factors may be interrelated. Further, when determining whether to retrieve and interpolate additional data points, the techniques may involve some indication that additional map data points/style attribute data are available. For example, the client may be provided information about which zoom levels are associated with additional map data and/or which zoom levels have additional map data for retrieval (map data that includes additional style attribute data). - A
block 808 may interpolate the sets of style parameters determined/retrieved in blocks 804 and 806. A block 810 may then render a feature of a map surface using the interpolated style attribute values. In an embodiment, blocks 808 and 810 may be repeated and an interpolation parameter utilized at block 808 is changed over time so that the interpolated style parameter changes gradually over time from 100%, for example, of a particular style attribute corresponding to the first zoom level, to 50% of the particular style attribute corresponding to the first zoom level and 50% of the particular style attribute corresponding to the second zoom level, until finally becoming 100%, for example, of the particular style attribute corresponding to the second zoom level. - Increasing the number of data points for interpolation generally increases the accuracy of the interpolation. For example, in addition to rendering a smooth change between a first and second zoom level using two interpolation points, the described method and system may require that additional data points defining a range of zoom levels be interpolated in advance so that a transition anywhere within the range is more accurate. - The tradeoff with increasing interpolation accuracy may often be the additional bandwidth and processing capacity needed for retrieving and/or processing the additional data points. A level of interpolation accuracy may depend on a resolution parameter that may be set by a user or automatically set by the mapping application. Processor capacity for performing interpolation calculations may also be considered. For example, a current processor capacity may be checked against a threshold. Where the processor that is to perform the interpolation (e.g., the processor of the graphics card) is generally efficient in performing interpolation mathematics, decisions of whether to download a plurality of additional style data points may depend more on bandwidth and/or latency considerations associated with retrieving the additional style data points from the server. In some embodiments, bandwidth considerations may depend on checking whether a current bandwidth/time-to-download for retrieving style attribute data is above a threshold. For example, the client may check a current download rate of a retrieval process to determine whether to retrieve additional style parameters. This may also depend on the amount of style data that may be requested. - While the process of
FIG. 8 determines how many style attribute data sets to retrieve, FIG. 9 illustrates a process for determining which style parameters to include in each data set, according to an embodiment. Generally, a system may provide a complete set of style attribute values whenever any style attribute value is retrieved and/or requested. For example, a complete style attribute lookup table may be retrieved each time map data is retrieved and rendered for a new zoom level. Alternatively, only a subset of the style parameters available for retrieval may be retrieved, requested, or processed. Using the techniques described herein, a subset or reduced set of style parameters may be determined based on whether the style parameters can be interpolated and how those style parameters may be interpolated. -
FIG. 9 illustrates a process flow 900 for determining a reduced set of style parameters that may be subject to interpolation functions, according to an embodiment. At block 902, a set of style parameters may be retrieved. In some embodiments, this may include a complete set of all style parameters (e.g., a listing of all style parameters) that can be used by a mapping application. In some situations, this may represent only a subset. At block 904, a style parameter is selected for processing. - Some style parameters cannot be, or are chosen not to be, interpolated.
Block 906 may determine whether a style parameter is designated for interpolation. This determination may be performed based on a flag, attribute, or other indicator associated with the style parameter that indicates whether the style parameter is suitable for interpolation or can be subject to interpolation. The flag may be set by a map application designer based on aesthetic considerations of the map. Some style parameters do not represent values that can be interpolated or extrapolated. For example, where a style parameter value is selected from a set of fixed values, no interpolation may be possible or appropriate. - Some style parameters may not be suitable for interpolation because an interpolation function used by the graphics processor may not be suitable for interpolating the values of a particular style parameter. Depending on the capabilities of the processor that is to calculate the interpolation, sometimes only a linear interpolation may be implemented, while other interpolations, such as exponential interpolations, may be more suitable for calculating, for example, a curve that the graphics processor is not programmed to compute. In this case, the method and system may simply determine that a particular style parameter will not be interpolated, in which case a default value may be used for the style parameter value in lieu of interpolation.
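As an illustrative sketch of the block 906 determination, a style parameter whose value is selected from a set of fixed values (e.g., a dash pattern) may be flagged as non-interpolatable and resolved to a default value; the field names below are hypothetical, not taken from the disclosure.

```python
def resolve_style_value(param, t):
    """Blend the parameter's two zoom-level values by interpolation
    parameter t when it is flagged interpolatable; otherwise fall back
    to its default value (no interpolation attempted)."""
    if not param["interpolatable"]:
        return param["default"]
    v0, v1 = param["values"]  # values at the two bracketing zoom levels
    return (1 - t) * v0 + t * v1

# Illustrative parameters: a numeric road width vs. a fixed-choice dash pattern.
width_param = {"interpolatable": True, "values": (4.0, 16.0), "default": 4.0}
dash_param = {"interpolatable": False, "values": None, "default": "solid"}
```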
-
Block 908 may determine whether a style parameter can be interpolated based on a current viewing or rendering condition. Some style parameters may not be subject to interpolation because of a current viewing condition, such as a current zoom level range. For example, certain style parameters may be relevant only at higher zoom levels. In those cases, the style parameters may not be retrieved and interpolated until a zoom level adjustment to the higher zoom level is initiated. -
Block 910 may determine whether a style parameter is to be interpolated based on a priority parameter. Similar to the interpolation indication, the priority parameter may be a flag, attribute, or other indicator associated with the style parameter that indicates a priority value. This priority parameter may be used by the mapping system to prefer a set of high priority style parameters for retrieval and processing over low priority style parameters. This may be the case when a current condition of the mapping application requires reduced data retrieval and/or processing due to processor load. For example, where the processor is overloaded or backed up (i.e., available processor capacity is low or below a threshold), low priority style parameters may not be retrieved and interpolated, to reduce processor workload. -
Block 912 may determine whether high and/or low priority style parameters (based on the priority attribute) will be retrieved and processed based on a current bandwidth of a retrieval path, channel, or connection. For example, block 912 may determine not to retrieve and process low priority style parameters when bandwidth is below a threshold. -
Block 914 determines whether there is sufficient data for interpolating the style parameter. Interpolation generally requires at least two data points. -
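Taken together, the determinations of blocks 904-914 act as a filter that yields the reduced set of style parameters selected for retrieval and processing at block 916; the following sketch uses illustrative field names and conditions that are not part of the disclosure.

```python
def reduced_style_set(params, current_zoom, processor_ok, bandwidth_ok):
    """Filter a candidate list of style parameters down to the reduced
    set that will be retrieved and interpolated (blocks 904-916 sketch)."""
    selected = []
    for p in params:                         # block 904: select a parameter
        if not p["interpolatable"]:          # block 906: designated for interpolation?
            continue
        if current_zoom < p["min_zoom"]:     # block 908: relevant under current viewing condition?
            continue
        if p["priority"] == "low" and not (processor_ok and bandwidth_ok):
            continue                         # blocks 910/912: drop low priority under load
        if len(p["data_points"]) < 2:        # block 914: enough data points to interpolate?
            continue
        selected.append(p["name"])           # block 916: select for retrieval/processing
    return selected
```

Repeating the checks over every candidate parameter produces the reduced set; further determination blocks could be added as additional `continue` conditions.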
Block 916 may select or not select the style parameter for retrieval and/or processing based on the determinations of blocks 904-914. Blocks 904-914 may be repeated to determine a reduced set of style parameters for retrieval and processing. - Any suitable subset of the blocks may be implemented in any suitable order by a number of different devices (e.g., client or server) and remain consistent with the method and system described herein. Moreover, additional determination blocks may be added to refine the filtering of style parameters subject to interpolation processing. - Throughout this specification, plural instances may implement components, operations, or structures described as a single instance. Although individual operations of one or more methods are illustrated and described as separate operations, one or more of the individual operations may be performed concurrently, and nothing requires that the operations be performed in the order illustrated. Structures and functionality presented as separate components in example configurations may be implemented as a combined structure or component. Similarly, structures and functionality presented as a single component may be implemented as separate components. These and other variations, modifications, additions, and improvements fall within the scope of the subject matter herein.
- For example, the
network 25 may include but is not limited to any combination of a LAN, a MAN, a WAN, a mobile network, a wired or wireless network, a private network, or a virtual private network. Moreover, while only four client devices are illustrated in FIG. 1 to simplify and clarify the description, it is understood that any number of client computers or display devices are supported and can be in communication with the server 14. - Additionally, certain embodiments are described herein as including logic or a number of components, modules, or mechanisms. Modules may constitute either software modules (e.g., code embodied on a machine-readable medium or in a transmission signal) or hardware modules. A hardware module is a tangible unit capable of performing certain operations and may be configured or arranged in a certain manner. In example embodiments, one or more computer systems (e.g., a standalone, client or server computer system) or one or more hardware modules of a computer system (e.g., a processor or a group of processors) may be configured by software (e.g., an application or application portion) as a hardware module that operates to perform certain operations as described herein.
- In various embodiments, a hardware module may be implemented mechanically or electronically. For example, a hardware module may comprise dedicated circuitry or logic that is permanently configured (e.g., as a special-purpose processor, such as a field programmable gate array (FPGA) or an application-specific integrated circuit (ASIC)) to perform certain operations. A hardware module may also comprise programmable logic or circuitry (e.g., as encompassed within a general-purpose processor or other programmable processor) that is temporarily configured by software to perform certain operations. It will be appreciated that the decision to implement a hardware module mechanically, in dedicated and permanently configured circuitry, or in temporarily configured circuitry (e.g., configured by software) may be driven by cost and time considerations.
- Accordingly, the term hardware should be understood to encompass a tangible entity, be that an entity that is physically constructed, permanently configured (e.g., hardwired), or temporarily configured (e.g., programmed) to operate in a certain manner or to perform certain operations described herein. Considering embodiments in which hardware modules are temporarily configured (e.g., programmed), each of the hardware modules need not be configured or instantiated at any one instance in time. For example, where the hardware modules comprise a general-purpose processor configured using software, the general-purpose processor may be configured as respective different hardware modules at different times. Software may accordingly configure a processor, for example, to constitute a particular hardware module at one instance of time and to constitute a different hardware module at a different instance of time.
- Hardware and software modules can provide information to, and receive information from, other hardware and/or software modules. Accordingly, the described hardware modules may be regarded as being communicatively coupled. Where multiple of such hardware or software modules exist contemporaneously, communications may be achieved through signal transmission (e.g., over appropriate circuits and buses) that connect the hardware or software modules. In embodiments in which multiple hardware modules or software are configured or instantiated at different times, communications between such hardware or software modules may be achieved, for example, through the storage and retrieval of information in memory structures to which the multiple hardware or software modules have access. For example, one hardware or software module may perform an operation and store the output of that operation in a memory device to which it is communicatively coupled. A further hardware or software module may then, at a later time, access the memory device to retrieve and process the stored output. Hardware and software modules may also initiate communications with input or output devices, and can operate on a resource (e.g., a collection of information).
- The various operations of example methods described herein may be performed, at least partially, by one or more processors that are temporarily configured (e.g., by software) or permanently configured to perform the relevant operations. Whether temporarily or permanently configured, such processors may constitute processor-implemented modules that operate to perform one or more operations or functions. The modules referred to herein may, in some example embodiments, comprise processor-implemented modules.
- Similarly, the methods or routines described herein may be at least partially processor-implemented. For example, at least some of the operations of a method may be performed by one or more processors or processor-implemented hardware modules. The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the processor or processors may be located in a single location (e.g., within a home environment, an office environment, or a server farm), while in other embodiments the processors may be distributed across a number of locations.
- The one or more processors may also operate to support performance of the relevant operations in a “cloud computing” environment or as a “software as a service” (SaaS). For example, at least some of the operations may be performed by a group of computers (as examples of machines including processors), these operations being accessible via a network (e.g., the Internet) and via one or more appropriate interfaces (e.g., application program interfaces (APIs)).
- The performance of certain of the operations may be distributed among the one or more processors, not only residing within a single machine, but deployed across a number of machines. In some example embodiments, the one or more processors or processor-implemented modules may be located in a single geographic location (e.g., within a home environment, an office environment, or a server farm). In other example embodiments, the one or more processors or processor-implemented modules may be distributed across a number of geographic locations.
- Some portions of this specification are presented in terms of algorithms or symbolic representations of operations on data stored as bits or binary digital signals within a machine memory (e.g., a computer memory). These algorithms or symbolic representations are examples of techniques used by those of ordinary skill in the data processing arts to convey the substance of their work to others skilled in the art. As used herein, an “algorithm” or a “routine” is a self-consistent sequence of operations or similar processing leading to a desired result. In this context, algorithms, routines and operations involve physical manipulation of physical quantities. Typically, but not necessarily, such quantities may take the form of electrical, magnetic, or optical signals capable of being stored, accessed, transferred, combined, compared, or otherwise manipulated by a machine. It is convenient at times, principally for reasons of common usage, to refer to such signals using words such as “data,” “content,” “bits,” “values,” “elements,” “symbols,” “characters,” “terms,” “numbers,” “numerals,” or the like. These words, however, are merely convenient labels and are to be associated with appropriate physical quantities.
- Unless specifically stated otherwise, discussions herein using words such as “processing,” “computing,” “calculating,” “determining,” “presenting,” “displaying,” or the like may refer to actions or processes of a machine (e.g., a computer) that manipulates or transforms data represented as physical (e.g., electronic, magnetic, or optical) quantities within one or more memories (e.g., volatile memory, non-volatile memory, or a combination thereof), registers, or other machine components that receive, store, transmit, or display information.
- As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still cooperate or interact with each other. The embodiments are not limited in this context.
- As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- In addition, the words “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the description. This description should be read to include one or at least one, and the singular also includes the plural unless it is obvious that otherwise is meant.
- Still further, the figures depict preferred embodiments of a map rendering system for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
- Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs for a system and a process for rendering map or other types of images using the principles disclosed herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
Claims (20)
1. A method for rendering a map on a display device, the method comprising:
determining, by one or more processing devices, a viewing window of a map surface, at a certain zoom level corresponding to the magnification of the map surface;
determining, by the one or more processing devices, a first set of style parameters for applying to a feature of the map surface, wherein the feature is described in a vector format using a plurality of interconnected vertices, wherein the first set of style parameters corresponds to a first zoom level of the viewing window, and wherein the first zoom level corresponds to a first magnification;
determining, by the one or more processing devices, a second set of style parameters for the feature, wherein the second set of style parameters corresponds to a second zoom level of the viewing window, and wherein the second zoom level corresponds to a second magnification;
determining, by the one or more processing devices, a third set of style parameters by interpolating between the first set of style parameters and the second set of style parameters, including interpolating at least color and outline width; and
displaying the feature using the third set of style parameters.
2. The method of claim 1 , wherein the third set of style parameters corresponds with a third zoom level having a third magnification between the first magnification and the second magnification.
3. The method of claim 1 , wherein interpolating between the first set of style parameters and the second set of style parameters includes interpolating an outline dashing pattern.
4. The method of claim 1 , wherein interpolating the color includes interpolating text color.
5. The method of claim 1 , wherein interpolating the color includes interpolating interior color.
6. The method of claim 1 , wherein determining the third set of style parameters includes using an interpolation parameter to interpolate between at least the first set of style parameters at the first magnification and the second set of style parameters at the second magnification, the method further comprising:
changing the interpolation parameter over time; and
repeating the acts of i) determining the third set of style parameters, and ii) displaying the feature using the third set of style parameters, in response to changing the interpolation parameter over time.
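The interpolation recited in claims 1–6 can be illustrated with a minimal sketch. This is not the patented implementation; the function and parameter names (`interpolate_style`, `style_lo`, `zoom_lo`, and the example style dictionaries) are hypothetical, and linear interpolation is assumed as the simplest case.

```python
def lerp(a, b, t):
    """Linear interpolation between scalars a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_style(style_lo, style_hi, zoom, zoom_lo, zoom_hi):
    """Derive a third set of style parameters for a zoom level lying
    between two zoom levels with stored styles.

    style_lo / style_hi map parameter names to values at zoom_lo / zoom_hi.
    Colors are (r, g, b) tuples interpolated channel by channel; scalar
    parameters such as outline width are interpolated directly.
    """
    # Interpolation parameter derived from the current magnification.
    t = (zoom - zoom_lo) / (zoom_hi - zoom_lo)
    blended = {}
    for name in style_lo:
        lo, hi = style_lo[name], style_hi[name]
        if isinstance(lo, tuple):  # e.g. fill or text color
            blended[name] = tuple(lerp(a, b, t) for a, b in zip(lo, hi))
        else:  # e.g. outline width
            blended[name] = lerp(lo, hi, t)
    return blended

# Hypothetical styles stored for zoom levels 10 and 11.
style_z10 = {"fill_color": (255, 240, 200), "outline_width": 1.0}
style_z11 = {"fill_color": (255, 220, 160), "outline_width": 3.0}
mid = interpolate_style(style_z10, style_z11, 10.5, 10, 11)
```

Re-running `interpolate_style` as `t` changes over time, and redisplaying the feature with each result, corresponds to the repetition recited in claim 6.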
7. The method of claim 1 , wherein the first set of style parameters and the second set of style parameters are stored in a lookup table associated with the plurality of vertices.
8. The method of claim 7 , wherein the first and the second set of style parameters are located in a first and a second lookup table, respectively, each of the lookup tables associated with a different zoom level at a different magnification.
9. The method of claim 1 , wherein determining the first set and the second set of style parameters includes retrieving only a subset of style parameters available for retrieval based on a style attribute setting associated with each style parameter indicating whether the style parameter is designated for interpolation.
10. The method of claim 1 , wherein determining the first set and the second set of style parameters includes retrieving only a subset of the style parameters available for retrieval based on a priority attribute associated with each style parameter and whether a current available bandwidth is above a threshold.
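The selective retrieval recited in claims 9 and 10 can likewise be sketched. This is an illustrative assumption of how such per-parameter attributes might be represented, not the patented code; the attribute names (`interpolate`, `priority`) and the table layout are hypothetical.

```python
def select_parameters(style_table, bandwidth, bandwidth_threshold):
    """Return only the style parameters worth retrieving.

    Each entry carries two per-parameter attributes: a flag marking the
    parameter as designated for interpolation (claim 9), and a priority
    used to skip low-priority parameters when the currently available
    bandwidth falls below a threshold (claim 10).
    """
    low_bandwidth = bandwidth < bandwidth_threshold
    selected = {}
    for name, entry in style_table.items():
        if not entry["interpolate"]:
            continue  # not designated for interpolation; do not retrieve
        if low_bandwidth and entry["priority"] == "low":
            continue  # drop low-priority parameters on a slow link
        selected[name] = entry["value"]
    return selected

# Hypothetical per-parameter attributes for one zoom level's style table.
table = {
    "fill_color":    {"value": (255, 220, 160), "interpolate": True,  "priority": "high"},
    "outline_width": {"value": 2.0,             "interpolate": True,  "priority": "low"},
    "font_family":   {"value": "Roboto",        "interpolate": False, "priority": "high"},
}
fast = select_parameters(table, bandwidth=10e6, bandwidth_threshold=1e6)
slow = select_parameters(table, bandwidth=0.5e6, bandwidth_threshold=1e6)
```

On the fast link both interpolable parameters are retrieved; on the slow link the low-priority outline width is dropped.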
11. A computer device comprising:
a communications network interface;
one or more processors;
a display device coupled to the one or more processors;
one or more non-transitory memories, coupled to the one or more processors, storing instructions that, when executed on the one or more processors, cause the computer device to:
determine a viewing window of a map surface, at a certain zoom level corresponding to the magnification of the map surface,
determine a first set of style parameters for applying to a feature of the map surface, wherein the feature is described in a vector format using a plurality of interconnected vertices, wherein the first set of style parameters corresponds to a first zoom level of the viewing window, and wherein the first zoom level corresponds to a first magnification;
determine a second set of style parameters for the feature, wherein the second set of style parameters corresponds to a second zoom level of the viewing window, and wherein the second zoom level corresponds to a second magnification;
determine a third set of style parameters by interpolating between the first set of style parameters and the second set of style parameters, including interpolating at least color and outline width; and
display the feature using the third set of style parameters.
12. The computer device of claim 11 , wherein the third set of style parameters corresponds with a third zoom level having a third magnification between the first magnification and the second magnification.
13. The computer device of claim 11 , wherein to interpolate between the first set of style parameters and the second set of style parameters, the instructions cause the computer device to interpolate an outline dashing pattern.
14. The computer device of claim 11 , wherein the instructions cause the computer device to interpolate text color.
15. The computer device of claim 11 , wherein the instructions cause the computer device to interpolate interior color.
16. The computer device of claim 11 , wherein to determine the third set of style parameters, the instructions cause the computer device to use an interpolation parameter to interpolate between at least the first set of style parameters at the first magnification and the second set of style parameters at the second magnification, and wherein the instructions further cause the computer device to:
change the interpolation parameter over time; and
repeat the acts of i) determining the third set of style parameters, and ii) displaying the feature using the third set of style parameters, in response to changing the interpolation parameter over time.
17. The computer device of claim 11 , wherein the first set of style parameters and the second set of style parameters are stored in a lookup table associated with the plurality of vertices.
18. The computer device of claim 17 , wherein the first and the second set of style parameters are located in a first and a second lookup table, respectively, each of the lookup tables associated with a different zoom level at a different magnification.
19. The computer device of claim 11 , wherein to determine the first set and the second set of style parameters, the instructions cause the computer device to retrieve only a subset of style parameters available for retrieval based on a style attribute setting associated with each style parameter indicating whether the style parameter is designated for interpolation.
20. The computer device of claim 11 , wherein to determine the first set and the second set of style parameters, the instructions cause the computer device to retrieve only a subset of the style parameters available for retrieval based on a priority attribute associated with each style parameter and whether a current available bandwidth is above a threshold.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/456,872 US20140347383A1 (en) | 2011-09-28 | 2014-08-11 | Map rendering using interpolation of style parameters across zoom levels |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/247,637 US8274524B1 (en) | 2011-09-28 | 2011-09-28 | Map rendering using interpolation of style parameters across zoom levels |
US13/625,722 US8803901B1 (en) | 2011-09-28 | 2012-09-24 | Map rendering using interpolation of style parameters across zoom levels |
US14/456,872 US20140347383A1 (en) | 2011-09-28 | 2014-08-11 | Map rendering using interpolation of style parameters across zoom levels |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/625,722 Continuation US8803901B1 (en) | 2011-09-28 | 2012-09-24 | Map rendering using interpolation of style parameters across zoom levels |
Publications (1)
Publication Number | Publication Date |
---|---|
US20140347383A1 true US20140347383A1 (en) | 2014-11-27 |
Family
ID=46846336
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/247,637 Active US8274524B1 (en) | 2011-09-28 | 2011-09-28 | Map rendering using interpolation of style parameters across zoom levels |
US13/625,722 Active US8803901B1 (en) | 2011-09-28 | 2012-09-24 | Map rendering using interpolation of style parameters across zoom levels |
US14/456,872 Abandoned US20140347383A1 (en) | 2011-09-28 | 2014-08-11 | Map rendering using interpolation of style parameters across zoom levels |
Family Applications Before (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/247,637 Active US8274524B1 (en) | 2011-09-28 | 2011-09-28 | Map rendering using interpolation of style parameters across zoom levels |
US13/625,722 Active US8803901B1 (en) | 2011-09-28 | 2012-09-24 | Map rendering using interpolation of style parameters across zoom levels |
Country Status (1)
Country | Link |
---|---|
US (3) | US8274524B1 (en) |
Families Citing this family (35)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8274524B1 (en) | 2011-09-28 | 2012-09-25 | Google Inc. | Map rendering using interpolation of style parameters across zoom levels |
US8612491B2 (en) | 2011-10-25 | 2013-12-17 | The United States Of America, As Represented By The Secretary Of The Navy | System and method for storing a dataset of image tiles |
US10013474B2 (en) | 2011-10-25 | 2018-07-03 | The United States Of America, As Represented By The Secretary Of The Navy | System and method for hierarchical synchronization of a dataset of image tiles |
US8635021B2 (en) | 2012-05-04 | 2014-01-21 | Google Inc. | Indicators for off-screen content |
US9052197B2 (en) | 2012-06-05 | 2015-06-09 | Apple Inc. | Providing navigation instructions while device is in locked mode |
US9886794B2 (en) | 2012-06-05 | 2018-02-06 | Apple Inc. | Problem reporting in maps |
US10156455B2 (en) | 2012-06-05 | 2018-12-18 | Apple Inc. | Context-aware voice guidance |
US10176633B2 (en) | 2012-06-05 | 2019-01-08 | Apple Inc. | Integrated mapping and navigation application |
US9418672B2 (en) | 2012-06-05 | 2016-08-16 | Apple Inc. | Navigation application with adaptive instruction text |
US8983778B2 (en) | 2012-06-05 | 2015-03-17 | Apple Inc. | Generation of intersection information by a mapping service |
US9997069B2 (en) | 2012-06-05 | 2018-06-12 | Apple Inc. | Context-aware voice guidance |
US9482296B2 (en) | 2012-06-05 | 2016-11-01 | Apple Inc. | Rendering road signs during navigation |
US9489754B2 (en) * | 2012-06-06 | 2016-11-08 | Apple Inc. | Annotation of map geometry vertices |
US8928698B2 (en) * | 2012-06-10 | 2015-01-06 | Apple Inc. | Compression of road features in map tiles |
US9305330B2 (en) | 2012-10-25 | 2016-04-05 | Microsoft Technology Licensing, Llc | Providing images with zoomspots |
WO2015038039A1 (en) * | 2013-09-10 | 2015-03-19 | Telefonaktiebolaget L M Ericsson (Publ) | Method and monitoring centre for monitoring occurrence of an event |
USD781318S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
US9934222B2 (en) | 2014-04-22 | 2018-04-03 | Google Llc | Providing a thumbnail image that follows a main image |
USD780777S1 (en) | 2014-04-22 | 2017-03-07 | Google Inc. | Display screen with graphical user interface or portion thereof |
USD781317S1 (en) | 2014-04-22 | 2017-03-14 | Google Inc. | Display screen with graphical user interface or portion thereof |
US9972121B2 (en) | 2014-04-22 | 2018-05-15 | Google Llc | Selecting time-distributed panoramic images for display |
US10067950B1 (en) * | 2014-06-25 | 2018-09-04 | Google Llc | Systems and methods for efficiently organizing map styling information |
US9684425B2 (en) | 2014-08-18 | 2017-06-20 | Google Inc. | Suggesting a target location upon viewport movement |
US9672656B1 (en) * | 2015-12-16 | 2017-06-06 | Google Inc. | Variable level-of-detail map rendering |
RU2632128C1 (en) | 2016-04-04 | 2017-10-02 | Общество С Ограниченной Ответственностью "Яндекс" | Method and system of downloading image fragments to client device |
RU2632150C1 (en) | 2016-04-04 | 2017-10-02 | Общество С Ограниченной Ответственностью "Яндекс" | Method and system of downloading the image to the customer's device |
US10474340B2 (en) | 2016-08-18 | 2019-11-12 | Mapbox, Inc. | Providing graphical indication of label boundaries in digital maps |
US20180181576A1 (en) * | 2016-12-22 | 2018-06-28 | Mapbox, Inc. | Real-Time Transmittal Of Optimized Map Vector Tile Data |
US10198413B2 (en) | 2016-12-30 | 2019-02-05 | Dropbox, Inc. | Image annotations in collaborative content items |
US11113855B2 (en) | 2017-11-01 | 2021-09-07 | Mapbox, Inc. | Expression interpretation engine for computer map visualizations |
US10460495B1 (en) | 2018-05-23 | 2019-10-29 | Mapbox, Inc. | Efficient label insertion and collision handling |
US10489954B1 (en) * | 2018-05-23 | 2019-11-26 | Mapbox, Inc. | Efficient duplicate label handling |
CN110781324B (en) * | 2019-08-31 | 2022-07-19 | 中国科学院电子学研究所苏州研究院 | Symbol library based on three-dimensional plotting system |
CN110807135B (en) * | 2019-10-14 | 2023-04-11 | 车智互联(北京)科技有限公司 | Data processing method, thermodynamic diagram generation method and device |
CN112330769B (en) * | 2020-12-14 | 2023-08-22 | 智道网联科技(北京)有限公司 | Method and device for generating dotted line texture and electronic equipment |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6028584A (en) * | 1997-08-29 | 2000-02-22 | Industrial Technology Research Institute | Real-time player for panoramic imaged-based virtual worlds |
US20040109004A1 (en) * | 2002-12-09 | 2004-06-10 | Bastos Rui M. | Depth-of-field effects using texture lookup |
US20050206657A1 (en) * | 2004-03-17 | 2005-09-22 | Arcas Blaise A Y | Methods and apparatus for navigating an image |
US20050223311A1 (en) * | 1996-09-30 | 2005-10-06 | Interland, Inc. | Hypermedia authoring and publishing system |
US20060138238A1 (en) * | 2004-12-23 | 2006-06-29 | University Of Washington | Methods of driving a scanning beam device to achieve high frame rates |
US20060267982A1 (en) * | 2003-03-05 | 2006-11-30 | Seadragon Software, Inc. | System and method for exact rendering in a zooming user interface |
US20090263026A1 (en) * | 2008-04-18 | 2009-10-22 | Google Inc. | Content item placement |
US7839926B1 (en) * | 2000-11-17 | 2010-11-23 | Metzger Raymond R | Bandwidth management and control |
US20110074811A1 (en) * | 2009-09-25 | 2011-03-31 | Apple Inc. | Map Layout for Print Production |
US20120206469A1 (en) * | 2011-02-15 | 2012-08-16 | Tudor Hulubei | Efficient pre-computing of simplified vector data for rendering at multiple zoom levels |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5613048A (en) | 1993-08-03 | 1997-03-18 | Apple Computer, Inc. | Three-dimensional image synthesis using view interpolation |
US5798770A (en) | 1995-03-24 | 1998-08-25 | 3Dlabs Inc. Ltd. | Graphics rendering system with reconfigurable pipeline sequence |
US5701405A (en) | 1995-06-21 | 1997-12-23 | Apple Computer, Inc. | Method and apparatus for directly evaluating a parameter interpolation function used in rendering images in a graphics system that uses screen partitioning |
US6016151A (en) | 1997-09-12 | 2000-01-18 | Neomagic Corp. | 3D triangle rendering by texture hardware and color software using simultaneous triangle-walking and interpolation for parallel operation |
KR100362817B1 (en) | 1997-10-27 | 2002-11-30 | 마쯔시다덴기산교 가부시키가이샤 | Three-dimensional map display device, model transforming data used therein, device for creating three-dimensional polygon data or three-dimensional image data, navigation device for performing display on the basis of data thereof, three-dimensional map display method, and storage medium for model transforming data |
JP4559555B2 (en) | 1999-03-16 | 2010-10-06 | 株式会社日立製作所 | 3D map display method and navigation apparatus |
US6888544B2 (en) | 2000-03-17 | 2005-05-03 | Hewlett-Packard Development Company, L.P. | Apparatus for and method of rendering 3D objects with parametric texture maps |
US6424933B1 (en) | 2000-03-17 | 2002-07-23 | Vicinity Corporation | System and method for non-uniform scaled mapping |
EP1282855B1 (en) | 2000-03-17 | 2011-10-12 | Microsoft Corporation | System and method for abstracting and visualizing a route map |
US6822650B1 (en) | 2000-06-19 | 2004-11-23 | Microsoft Corporation | Formatting object for modifying the visual attributes of visual objects to reflect data values |
JP3992227B2 (en) | 2002-04-26 | 2007-10-17 | パイオニア株式会社 | 3D information display device |
KR100520707B1 (en) | 2003-10-20 | 2005-10-17 | 엘지전자 주식회사 | Method for displaying multi-level text data in three dimensional map |
ES2698264T3 (en) * | 2006-02-10 | 2019-02-01 | Freedom Scientific Inc | Stylization and replacement of text sensitive to content applicable to the entire system |
US7948500B2 (en) * | 2007-06-07 | 2011-05-24 | Nvidia Corporation | Extrapolation of nonresident mipmap data using resident mipmap data |
US8806331B2 (en) | 2009-07-20 | 2014-08-12 | Interactive Memories, Inc. | System and methods for creating and editing photo-based projects on a digital network |
US8655632B2 (en) | 2009-09-03 | 2014-02-18 | Schlumberger Technology Corporation | Gridless geological modeling |
US8566020B2 (en) | 2009-12-01 | 2013-10-22 | Nokia Corporation | Method and apparatus for transforming three-dimensional map objects to present navigation information |
US8560600B2 (en) * | 2011-09-26 | 2013-10-15 | Google Inc. | Managing map elements using aggregate feature identifiers |
US8274524B1 (en) | 2011-09-28 | 2012-09-25 | Google Inc. | Map rendering using interpolation of style parameters across zoom levels |
- 2011-09-28: US 13/247,637 filed; issued as US 8,274,524 B1 (Active)
- 2012-09-24: US 13/625,722 filed; issued as US 8,803,901 B1 (Active)
- 2014-08-11: US 14/456,872 filed; published as US 2014/0347383 A1 (Abandoned)
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9542724B1 (en) * | 2013-07-09 | 2017-01-10 | Google Inc. | Systems and methods for stroke rendering on digital maps |
US9495767B2 (en) | 2014-05-15 | 2016-11-15 | Google Inc. | Indexed uniform styles for stroke rendering |
WO2016138259A1 (en) * | 2015-02-25 | 2016-09-01 | Environmental Systems Research Institute (ESRI) | Systems and methods for smart cartography |
US10431122B2 (en) | 2015-02-25 | 2019-10-01 | Environmental Systems Research Institute (ESRI) | Systems and methods for smart cartography |
US20210272234A1 (en) * | 2016-12-13 | 2021-09-02 | Palantir Technologies Inc. | Zoom-adaptive data granularity to achieve a flexible high-performance interface for a geospatial mapping system |
US11663694B2 (en) * | 2016-12-13 | 2023-05-30 | Palantir Technologies Inc. | Zoom-adaptive data granularity to achieve a flexible high-performance interface for a geospatial mapping system |
CN112579719A (en) * | 2020-12-18 | 2021-03-30 | 国网福建省电力有限公司经济技术研究院 | High-voltage line multi-loop display method based on power map and storage medium |
Also Published As
Publication number | Publication date |
---|---|
US8803901B1 (en) | 2014-08-12 |
US8274524B1 (en) | 2012-09-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8803901B1 (en) | Map rendering using interpolation of style parameters across zoom levels | |
US8237745B1 (en) | Label positioning technique to reduce crawling during zoom activities | |
US8243102B1 (en) | Derivative-based selection of zones for banded map display | |
US8400453B2 (en) | Rendering a text image following a line | |
US9811879B2 (en) | Keeping map labels consistent across multiple zoom levels | |
US8970583B1 (en) | Image space stylization of level of detail artifacts in a real-time rendering engine | |
KR102001191B1 (en) | Rendering a text image following a line | |
US8730258B1 (en) | Anti-aliasing of straight lines within a map image | |
US20130093750A1 (en) | Use of banding to optimize map rendering in a three-dimensional tilt view | |
US9093006B2 (en) | Image shader using style attribute references | |
US20150130788A1 (en) | Visualize the obscure object in 3d space | |
US9495767B2 (en) | Indexed uniform styles for stroke rendering | |
US9721363B2 (en) | Encoding polygon data for fast retrieval and rendering | |
EP2766876B1 (en) | Use of banding to optimize map rendering in a three-dimensional tilt view | |
US8760451B2 (en) | Rendering a text image using texture map character center encoding with character reference encoding | |
US20150130845A1 (en) | Out-of-viewpoint indicators for relevant map features | |
US9092907B2 (en) | Image shader using two-tiered lookup table for implementing style attribute references | |
JP7086180B2 (en) | Dynamic styling of digital maps | |
US9275481B2 (en) | Viewport-based contrast adjustment for map features | |
US8976188B1 (en) | Optimized data communication system and method for an image rendering system | |
US9911205B1 (en) | Visual continuity for arbitrary length stipple patterns | |
US20130002679A1 (en) | Rendering a text image using texture map character center encoding |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |