US20120256919A1 - Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods - Google Patents
- Publication number
- US20120256919A1 (application US 13/527,184)
- Authority
- US
- United States
- Prior art keywords
- geospatial
- display
- scene
- data
- structures
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T15/00—3D [Three Dimensional] image rendering
        - G06T15/04—Texture mapping
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T15/00—3D [Three Dimensional] image rendering
        - G06T15/10—Geometric effects
          - G06T15/20—Perspective computation
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
        - G06T17/05—Geographic models
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T19/00—Manipulating 3D models or images for computer graphics
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Graphics (AREA)
- General Physics & Mathematics (AREA)
- Geometry (AREA)
- Software Systems (AREA)
- Remote Sensing (AREA)
- Computing Systems (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Image Generation (AREA)
Abstract
A geospatial data system may include at least one geospatial database containing three-dimensional (3D) geospatial structure data and geospatial texture data associated with the geospatial 3D structure data. At least one geospatial data access device may also be included and comprise a display and a processor cooperating therewith for communicating remotely with the at least one geospatial database to retrieve and display a scene on the display based upon the 3D structure data and the geospatial texture data associated therewith. The geospatial data access device(s) may further comprise at least one user input device cooperating with the processor for permitting user selection of a point-of-view (POV) within the scene on the display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on the display. The processor may selectively retrieve geospatial texture data based upon the revealed portions and not the obscured portions.
Description
- The present invention relates to the field of modeling systems, and, more particularly, to geospatial modeling systems and related methods.
- Topographical models of geographical areas may be used for many applications. For example, topographical models may be used in flight simulators and for planning military missions. Furthermore, topographical models of man-made structures (e.g., cities) may be extremely helpful in applications such as cellular antenna placement, urban planning, disaster preparedness and analysis, and mapping, for example.
- Various types and methods for making topographical models are presently being used. One common topographical model is the digital elevation map (DEM). A DEM is a sampled matrix representation of a geographical area which may be generated in an automated fashion by a computer. In a DEM, coordinate points are made to correspond with a height value. DEMs are typically used for modeling terrain where the transitions between different elevations (e.g., valleys, mountains, etc.) are generally smooth from one to the next. That is, DEMs typically model terrain as a plurality of curved surfaces, and any discontinuities therebetween are thus “smoothed” over. Thus, in a typical DEM no distinct objects are present on the terrain.
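- For illustration only, the sketch below shows a DEM in the sense described above: a regular grid whose (column, row) posts map to map coordinates and each carry a single height value. The grid spacing, origin, and heights are hypothetical, and the code is not taken from the patent.

```python
# Minimal illustration (not from the patent) of a DEM as a sampled matrix of
# elevation versus position. Grid spacing, origin, and heights are made up.

GRID_SPACING_M = 30.0                       # hypothetical 30 m post spacing
ORIGIN_EASTING, ORIGIN_NORTHING = 500_000.0, 3_000_000.0

# Each entry is the terrain height (metres) at one grid post.
elevations = [
    [12.0, 12.5, 13.1, 14.0],
    [12.2, 12.9, 13.8, 15.2],
    [12.6, 13.4, 14.9, 16.7],
    [13.0, 14.1, 16.0, 18.3],
]

def height_at(col, row):
    """Sampled height for a grid post (no interpolation, no distinct objects)."""
    return elevations[row][col]

def post_position(col, row):
    """Map a grid index back to a map coordinate (easting, northing)."""
    return (ORIGIN_EASTING + col * GRID_SPACING_M,
            ORIGIN_NORTHING + row * GRID_SPACING_M)

if __name__ == "__main__":
    print(post_position(2, 1), height_at(2, 1))   # (500060.0, 3000030.0) 13.8
```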
- One particularly advantageous 3D site modeling product is RealSite® from the present Assignee Harris Corp. RealSite® may be used to register overlapping images of a geographical area of interest, and extract high resolution DEMs using stereo and nadir view techniques. RealSite® provides a semi-automated process for making three-dimensional (3D) topographical models of geographical areas, including cities, which have accurate textures and structure boundaries. Moreover, RealSite® models are geospatially accurate. That is, the location of any given point within the model corresponds to an actual location in the geographical area with very high accuracy. The data used to generate RealSite® models may include aerial and satellite photography, electro-optical, infrared, and light detection and ranging (LIDAR).
- Another advantageous approach for generating 3D site models is set forth in U.S. Pat. No. 6,654,690 to Rahmes et al., which is also assigned to the present Assignee and is hereby incorporated herein in its entirety by reference. This patent discloses an automated method for making a topographical model of an area including terrain and buildings thereon based upon randomly spaced data of elevation versus position. The method includes processing the randomly spaced data to generate gridded data of elevation versus position conforming to a predetermined position grid, processing the gridded data to distinguish building data from terrain data, and performing polygon extraction for the building data to make the topographical model of the area including terrain and buildings thereon.
- Nonetheless, topographical models are no longer reserved for advanced modeling systems such as those discussed above. Various Internet service providers such as Google™ and Microsoft® are looking to provide access to 3D topographical models over the Internet that show users how a city or location appears in as much realism as possible. This may advantageously help increase a user's awareness of a given area and provide an exploratory environment. Such companies are striving to provide environments that are easier to use, more realistic and ultimately more useful. Improving the user experience involves increasing the quality of the 3D environment in terms of better terrain, more highly detailed city/building models, and higher resolution imagery of the terrain and buildings.
- However, one significant challenge is that, while the terrain and models are quite small in terms of their geometries or structure, the imagery and textures used to enhance the basic models are typically very large. Over a high-speed network, such as that found within most corporate networks, downloading models and textures from a local network server is relatively fast and therefore not particularly problematic. Over the Internet, however, downloading these quantities of data can be extremely slow and significantly diminish user experience because of the relatively limited bandwidth available.
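- To put rough numbers on this, the quick calculation below contrasts the transfer time of a small geometry file with that of a full-resolution texture over a slow link. The sizes and link rate are hypothetical, chosen only to mirror the orders of magnitude mentioned elsewhere in this document.

```python
LINK_RATE_BYTES_PER_S = 16 * 1024          # a slow connection of roughly 16 kB/sec
GEOMETRY_BYTES = 50 * 1024                 # hypothetical building geometry, ~50 kB
TEXTURE_BYTES = 5 * 1024 * 1024            # hypothetical full-resolution texture, ~5 MB

def transfer_seconds(num_bytes, rate=LINK_RATE_BYTES_PER_S):
    """Idealized transfer time, ignoring protocol overhead and latency."""
    return num_bytes / rate

if __name__ == "__main__":
    print(f"geometry: {transfer_seconds(GEOMETRY_BYTES):.1f} s")   # ~3.1 s
    print(f"texture:  {transfer_seconds(TEXTURE_BYTES):.1f} s")    # ~320.0 s
```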
- Currently, several network-enabled 3D viewers exist that permit users to view models from a network or Internet server. These viewers include Google™ Earth, Microsoft® VirtualEarth, and NASA WorldWind. All viewers share the ability to view untextured building models with some varying degree of textured terrain. Textured models tend to be very rudimentary. Microsoft® VirtualEarth attempts to apply textures over its models, but the delay can be so long as to become unacceptable to users.
- Various approaches have been developed for remotely accessing terrain data. One example is set forth in U.S. Pat. No. 6,496,189 to Yaron et al. This patent discloses a method of providing data blocks describing three-dimensional terrain to a renderer. The data blocks belong to a hierarchical structure which includes blocks at a plurality of different resolution layers. The method includes receiving from the renderer one or more coordinates in the terrain along with indication of a respective resolution layer, providing the renderer with a first data block which includes data corresponding to the coordinate(s) from a local memory, and downloading from a remote server one or more additional data blocks which include data corresponding to the coordinate(s) if the provided block from the local memory is not at the indicated resolution layer. Despite the existence of such approaches, further advancements may be desirable for remotely retrieving and displaying large amounts of geospatial data.
- In view of the foregoing background, it is therefore an object of the present invention to provide a system and related methods for efficiently retrieving and displaying geospatial data.
- This and other objects, features, and advantages are provided by a geospatial data system that may include at least one geospatial database containing three-dimensional (3D) geospatial structure data, and also containing geospatial texture data associated with the geospatial 3D structure data. The system may further include at least one geospatial data access device, which may comprise a display and a processor cooperating therewith for communicating remotely with the at least one geospatial database to retrieve and display a scene on the display based upon the 3D structure data and the geospatial texture data associated therewith. Moreover, the at least one geospatial data access device may further comprise at least one user input device cooperating with the processor for permitting user selection of a point-of-view (POV) within the scene on the display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on the display. Further, the processor may advantageously selectively retrieve geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display.
- More particularly, the geospatial texture data contained in the at least one geospatial database may be retrievable in successive additive layers of resolution, and the processor may therefore retrieve and display the geospatial texture data in successive additive layers of resolution in the scene on the display. Furthermore, the processor may prioritize retrieval and display of successive additive layers of resolution of associated geospatial texture data to different 3D geospatial structures within the scene on the display. By way of example, the processor may prioritize based upon relative distances of the 3D geospatial structures within the scene on the display. Also, the processor may prioritize based upon different relative areas of the 3D geospatial structures within the scene on the display.
- The geospatial data system may further comprise a communications channel coupling the at least one geospatial database and the geospatial data access device. The communications channel may have a capacity insufficient to carry within a predetermined time all of the associated geospatial texture data for the 3D geospatial structures within the scene on the display. By way of example, the communications channel may comprise the Internet. Additionally, the at least one geospatial database and the at least one geospatial data access device may communicate using a streaming wavelet-based imagery compression protocol, such as the JPEG 2000 Interactive Protocol, for example.
- A related geospatial data access method aspect may include storing 3D geospatial structure data and geospatial texture data associated with the geospatial 3D structure data in at least one geospatial database. The method may further include remotely retrieving the 3D structure data and the geospatial texture data associated therewith from the at least one geospatial database. Additionally, a scene is displayed on a display based upon the retrieved 3D structure data and the geospatial texture data associated therewith and also based upon a user selection of a point-of-view (POV) within the scene on the display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on the display. In particular, remotely retrieving may further comprise selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display.
- A related computer-readable medium is also provided having computer-executable instructions for causing a computer to perform steps including remotely retrieving three-dimensional (3D) structure data and geospatial texture data associated therewith from at least one geospatial database, and displaying a scene on a display based upon the retrieved 3D structure data and the geospatial texture data associated therewith and also based upon a user selection of a point-of-view (POV) within the scene on the display. The POV may advantageously determine revealed portions and obscured portions of 3D geospatial structures within the scene on the display. Remotely retrieving may further comprise selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display.
- FIG. 1 is a schematic block diagram of a geospatial data system in accordance with the invention.
- FIGS. 2 and 3 are schematic block diagrams of the geospatial data system of FIG. 1 in greater detail for a JPEG 2000 implementation.
- FIG. 4 is a series of geospatial texture images illustrating progressive texture data rendering of the system of FIG. 1.
- FIGS. 5A-5C are another series of geospatial texture images also illustrating progressive texture data rendering of the system of FIG. 1.
- FIG. 6 is a system flow diagram illustrating method aspects of the invention.
- FIG. 7 is a schematic block diagram of an alternative embodiment of the system of FIG. 1.
- The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout, and prime notation is used to indicate similar elements in alternate embodiments.
- Referring initially to FIGS. 1-6, a geospatial data system 30 and associated methods are now described. The system 30 illustratively includes one or more geospatial data storage devices 31 containing three-dimensional (3D) geospatial structure data, and also containing geospatial texture data associated with the geospatial 3D structure data and being retrievable in successive additive layers of resolution. As used herein, “structure” data includes man-made (e.g., buildings, bridges, etc.) data, and the 3D geospatial structure data may be in the form of a DEM, such as a tiled triangulated irregular network (T-TIN), for example. The geospatial texture data may be optical (i.e., image) data, for example, that is used to overlay or texture the DEM, etc., to make the image appear more realistic, as will be appreciated by those skilled in the art. In the example of FIG. 2, the geospatial data storage device 31 is implemented in an Internet model library server 39, as will be appreciated by those skilled in the art.
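- As a rough sketch of the split described above between lightweight structure data and separately retrievable texture layers, the following pairs each 3D structure with a texture reference whose additive quality layers can be fetched independently. The record layout, field names, and example values are illustrative assumptions, not the patent's file formats.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TextureRef:
    """Reference to texture imagery stored server-side in additive quality layers."""
    image_id: str
    total_layers: int
    layers_loaded: int = 0     # how many additive layers the client holds so far

@dataclass
class GeospatialStructure:
    """One man-made structure: small geometry footprint, separately fetched texture."""
    structure_id: str
    vertices: List[Tuple[float, float, float]]   # e.g. extruded footprint corners
    faces: List[Tuple[int, int, int]]            # triangles as indices into vertices
    texture: Optional[TextureRef] = None

# Hypothetical example: a box-like building whose facade texture has four layers.
building = GeospatialStructure(
    structure_id="bldg-001",
    vertices=[(0, 0, 0), (20, 0, 0), (20, 10, 0), (0, 10, 0),
              (0, 0, 30), (20, 0, 30), (20, 10, 30), (0, 10, 30)],
    faces=[(0, 1, 5), (0, 5, 4), (1, 2, 6), (1, 6, 5)],   # two facades, triangulated
    texture=TextureRef(image_id="facade_bldg-001.jp2", total_layers=4),
)
print(building.texture.layers_loaded)   # 0: geometry can display before any texture
```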
- The system further illustratively includes one or more geospatial data access devices 32 for remotely accessing the geospatial data storage device(s) 31, such as via a wide area network 33, which in the illustrated embodiment is the Internet. The geospatial access device 32 illustratively includes a display 34 and a processor 35, such as the central processing unit (CPU) of a personal computer (PC) or Macintosh computer, for example, although other types of processors (workstations, personal digital assistant (PDA) devices, laptops, etc.) may also be used. In the example illustrated in FIG. 2, the geospatial access device 32 is an Internet-enabled device.
- Generally speaking, the processor 35 runs a viewer program 60 that cooperates with the display 34 for communicating remotely with the geospatial data storage device 31 to retrieve and display a scene on the display based upon the 3D structure data and the geospatial texture data associated therewith. As discussed above, retrieving high volumes of geospatial texture data over a relatively limited bandwidth communications channel, such as the Internet (compared to a local high speed network connection, for example), can make rendering of a geospatial scene or model on the display 34 very cumbersome and frustrating for the user. Stated alternatively, the communications channel (e.g., the Internet) may have a capacity insufficient to carry within a predetermined time (i.e., the time the processor 35 could otherwise render the scene) all of the associated geospatial texture data for the 3D geospatial structures within the scene on the display 34.
- Typically, the transfer of 3D geospatial structure data will be relatively fast due to its smaller file size (e.g., on the order of kilobytes), and it can therefore be sent and displayed substantially immediately upon request from the geospatial data access device 32. On the other hand, the geospatial texture data can be on the order of several megabytes or larger, for example, which delays the rendering of the geometry because the processor 35 otherwise waits until all data is retrieved to begin the rendering process.
- Rather than compromise the geospatial texture data (and thus the ultimate image) by reducing the resolution, or using smaller synthetic textures that can provide false or misleading images, the geospatial texture data is advantageously retrieved and displayed in successive additive layers 36a-36d of resolution (i.e., it is “streamed” in layers). This may advantageously make the user experience more interactive, as model textures progressively sharpen while the user navigates through a geospatial model/scene, as will be appreciated by those skilled in the art.
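- The sketch below isolates the additive-layer idea: the client keeps a running texture that sharpens as each successive layer arrives and is added to what is already shown. The layer schedule and function names are assumptions for illustration; a real system would stream JPEG 2000 codestream increments rather than plain arrays of numbers.

```python
# Illustrative additive layers: each layer carries a correction that is added to
# the texture reconstructed so far (a stand-in for JPEG 2000 quality layers).

def make_layers(full_image, num_layers):
    """Split an image (rows of pixel values) into layers that sum to it.
    Layer 1 is a coarse approximation; later layers carry the remaining detail."""
    layers, approx = [], [[0.0] * len(row) for row in full_image]
    for k in range(1, num_layers + 1):
        # Hypothetical schedule: after layer k the image is k/num_layers "complete".
        target = [[p * k / num_layers for p in row] for row in full_image]
        layers.append([[t - a for t, a in zip(trow, arow)]
                       for trow, arow in zip(target, approx)])
        approx = target
    return layers

def stream_and_display(layers):
    """Accumulate layers as they 'arrive', re-displaying after each one."""
    shown = [[0.0] * len(layers[0][0]) for _ in layers[0]]
    for i, layer in enumerate(layers, start=1):
        shown = [[s + d for s, d in zip(srow, drow)]
                 for srow, drow in zip(shown, layer)]
        print(f"after layer {i}: {shown}")     # stand-in for re-texturing the model
    return shown

if __name__ == "__main__":
    image = [[80.0, 120.0], [160.0, 200.0]]
    assert stream_and_display(make_layers(image, num_layers=4)) == image
```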
- More particularly, within the past several years, a wavelet-based imagery compression technology known as JPEG 2000 has been established and standardized that decreases the data required for a given image. Part 9 of this specification, known as the JPEG 2000 Interactive Protocol (JPIP), enables imagery streaming and is hereby incorporated herein in its entirety by reference. In the satellite imagery markets, this technique may allow users to effectively browse images that are several gigabytes in size over connections as slow as 16 kB/sec.
- Applicants have discovered that applying the JPIP technique to model textures effectively enhances the user experience by reducing the amount of data necessary to texture a model at varying resolutions. Streaming textures in this manner takes advantage of the more efficient and interactive protocol noted above, and is a different approach from the current method of downloading full-resolution textures (or multiple textures of varying resolutions).
- In accordance with one embodiment, the effective user experience may include loading of untextured models, followed by textured models that progressively increase in resolution as the user approaches buildings or other objects within the scene (i.e., changes the point-of-view (POV)). In other embodiments, the viewer program may use whichever texture is available, and the user might not ever see an untextured model. For example, if the client software requests both the structure and the texture data and the texture stream arrives first, the user would not see the untextured model. The viewer program will typically display the scene from an initial (startup) viewpoint (Block 61), and the user can change the POV using any suitable user input device, such as the illustrated keyboard 38, a mouse, joystick, etc. (Block 62). Objects that are farther away are only rendered using lower resolutions of the image (known as quality layers within the JPEG 2000 file), at Blocks 63-64, as discussed further below. As the user moves closer to a structure(s) (i.e., zooms in the POV), the structure/geometry data therefor is retrieved and displayed (Blocks 65-67), which may initially be without texture (or with only a first layer of texture). Successive additive layers of texture are then streamed in to improve the scene or model's appearance and displayed accordingly, as will be discussed further below. This technique may advantageously be leveraged over networks of modest bandwidth and, in effect, makes very efficient use of network resources. As will be discussed further below, the additional texture data to be streamed may advantageously be selected based upon a position or relative distance of a structure within the scene, and/or based upon whether the data is revealed (i.e., visible) in the scene.
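- A simplified sketch of the viewer flow just described follows: after a POV change, newly needed geometry is fetched first, and each structure then receives only as many additive texture layers as its viewing distance justifies. The distance bands, layer count, and callback names are illustrative assumptions, not the blocks of FIG. 6 themselves.

```python
import math

MAX_LAYERS = 4
DISTANCE_BANDS_M = [50.0, 200.0, 800.0]     # hypothetical: <=50 m earns 4 layers, etc.

def layers_for_distance(distance_m):
    """More additive quality layers for nearer structures, fewer for distant ones."""
    for i, limit in enumerate(DISTANCE_BANDS_M):
        if distance_m <= limit:
            return MAX_LAYERS - i
    return 1                                # far away: coarsest layer only

def viewer_step(pov, structures, fetch_geometry, fetch_texture_layer):
    """One pass after the user changes the POV: geometry first, then texture layers."""
    for s in structures:
        d = math.dist(pov, s["position"])
        if not s["geometry_loaded"]:
            fetch_geometry(s)               # small payload, sent essentially at once
            s["geometry_loaded"] = True
        while s["layers_loaded"] < layers_for_distance(d):
            fetch_texture_layer(s, s["layers_loaded"] + 1)   # stream next layer
            s["layers_loaded"] += 1

if __name__ == "__main__":
    scene = [{"position": (10.0, 0.0, 0.0), "geometry_loaded": False, "layers_loaded": 0},
             {"position": (900.0, 0.0, 0.0), "geometry_loaded": False, "layers_loaded": 0}]
    viewer_step((0.0, 0.0, 0.0), scene,
                fetch_geometry=lambda s: print("geometry for", s["position"]),
                fetch_texture_layer=lambda s, k: print("layer", k, "for", s["position"]))
```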
- A system 30′ implemented using JPIP is illustrated in FIG. 3. In this embodiment, geospatial texture data layers 36a′-36d′ are stored in a data storage device 31′ on the server 39′ in a JPEG 2000 format that is arranged in a manner that permits efficient streaming by a JPIP streaming module 41′. As the rendering program on the processor 35′ requests textures, a JPIP module 40′ translates the requests into JPIP requests. Responses are returned in successive additive layers 36a′-36d′, and each layer is converted to a texture.
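- The JPIP request syntax itself is not reproduced here; the sketch below only illustrates, with hypothetical types, how a client-side module might translate a renderer's request for a texture at a given quality into a request for just the layers it does not yet hold.

```python
from dataclasses import dataclass
from typing import Dict, Optional

@dataclass(frozen=True)
class TextureRequest:
    image_id: str          # which texture image the renderer wants
    quality_layers: int    # how many additive layers it wants in total

@dataclass(frozen=True)
class StreamRequest:
    """Stand-in for a protocol request limited to a range of quality layers."""
    target: str
    first_layer: int
    last_layer: int

def translate(req: TextureRequest, layers_cached: Dict[str, int]) -> Optional[StreamRequest]:
    """Ask the server only for the layers the client does not already hold."""
    have = layers_cached.get(req.image_id, 0)
    if have >= req.quality_layers:
        return None                                  # nothing to fetch; reuse cache
    return StreamRequest(req.image_id, have + 1, req.quality_layers)

if __name__ == "__main__":
    cache = {"facade_bldg-001.jp2": 1}               # one layer already displayed
    print(translate(TextureRequest("facade_bldg-001.jp2", 3), cache))
    # StreamRequest(target='facade_bldg-001.jp2', first_layer=2, last_layer=3)
```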
- A JPIP-aware model viewer can make successive texture requests, each time resulting in sharper and sharper textures, as seen in FIG. 4. JPEG 2000 files may be encoded using profiles that produce quality layers. In FIG. 4, each of the layers 36a-36d represents a different JPEG 2000 quality layer. Each quality layer contains a portion of each pixel's information, and each successive layer adds to the previous ones to provide progressively sharper pixels until the final layer contains the remaining information to complete the full resolution image, as shown. Another example is shown in FIGS. 5A-5C, in which three successive additive layers result in the illustrated buildings 51 going from having an obscured surface with little window or picture definition (51c) to the well-defined buildings 51a having relatively crisp window delineation and a visible image of whales on the side of one of the buildings.
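- As a loose analogy (not the actual JPEG 2000 quality-layer mechanics), the sketch below treats each of four layers as delivering two more bits of an 8-bit pixel value, so the reconstructed value approaches the full-precision pixel as layers accumulate.

```python
# Toy per-pixel view of "each quality layer contains a portion of each pixel's
# information": each of four layers delivers two more bits of an 8-bit pixel.
# This is only an analogy for embedded quality-layer coding, not real JPEG 2000.

BITS_PER_LAYER = 2
TOTAL_BITS = 8

def layer_contributions(pixel):
    """Contribution of each layer to one 8-bit pixel, most significant bits first."""
    parts = []
    for layer in range(TOTAL_BITS // BITS_PER_LAYER):
        shift = TOTAL_BITS - BITS_PER_LAYER * (layer + 1)
        mask = ((1 << BITS_PER_LAYER) - 1) << shift
        parts.append(pixel & mask)
    return parts

def progressive_values(pixel):
    """The pixel value as reconstructed after 1, 2, 3, then 4 layers."""
    running, seen = 0, []
    for part in layer_contributions(pixel):
        running += part
        seen.append(running)
    return seen

if __name__ == "__main__":
    print(progressive_values(203))   # 203 = 0b11001011 -> [192, 192, 200, 203]
```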
- Referring additionally to FIG. 7, in accordance with another advantageous aspect, models/scenes that are farther away from the user need only receive lower resolution textures, and the user is advantageously not burdened with downloading unnecessary texture data. That is, the processor 35″ may advantageously prioritize retrieval and display of successive additive layers of resolution of geospatial texture data to different 3D geospatial structures within the scene on the display 34″ (Blocks 68-72). By way of example, the processor 35″ may prioritize based upon relative distances of the 3D geospatial structures within the scene on the display, and/or based upon different relative areas of the 3D geospatial structures within the scene on the display. Thus, for example, buildings/terrain that are closer in the scene would receive more successive additive layers of resolution than buildings/terrain that are farther away in the scene.
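- One simple way to realize such prioritization is a score that favors nearby structures covering more of the screen, as in the sketch below; the formula and field names are illustrative assumptions rather than anything specified by the patent.

```python
def priority(distance_m, screen_area_px):
    """Higher score = fetch this structure's next texture layer sooner.
    Nearer structures and structures covering more screen area win; the +1
    avoids division by zero. The weighting is an illustrative assumption."""
    return screen_area_px / (1.0 + distance_m)

def next_fetch_order(structures):
    """Order structures so the most prominent ones receive additive layers first."""
    return sorted(structures,
                  key=lambda s: priority(s["distance_m"], s["screen_area_px"]),
                  reverse=True)

if __name__ == "__main__":
    scene = [
        {"name": "far tower",   "distance_m": 900.0, "screen_area_px": 4_000.0},
        {"name": "near office", "distance_m":  60.0, "screen_area_px": 90_000.0},
        {"name": "mid block",   "distance_m": 250.0, "screen_area_px": 25_000.0},
    ]
    print([s["name"] for s in next_fetch_order(scene)])
    # ['near office', 'mid block', 'far tower']
```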
- Moreover, as will be appreciated by those skilled in the art, as the user selects a given POV within the scene, the POV will determine revealed portions (e.g., front of buildings) and obscured portions (e.g., back of buildings) of 3D geospatial structures and/or terrain within the scene on the display. Further, the processor 35″ may advantageously selectively retrieve geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display 34″. Thus, further bandwidth savings are provided by not downloading portions of the scene that are not going to be displayed on the display 34″ anyway from the given POV.
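- A minimal sketch of one way to decide which facade textures are revealed from the current POV is a back-face test on each face normal, shown below. Occlusion of one building by another, which the text above also contemplates, is not handled here, and all names and coordinates are hypothetical.

```python
def subtract(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def is_revealed(face_center, face_normal, eye):
    """Treat a face as revealed when its outward normal points toward the eye."""
    return dot(face_normal, subtract(eye, face_center)) > 0.0

def textures_to_request(faces, eye):
    """Request texture layers only for faces the current POV can actually see."""
    return [f["texture_id"] for f in faces
            if is_revealed(f["center"], f["normal"], eye)]

if __name__ == "__main__":
    eye = (0.0, -100.0, 10.0)    # hypothetical POV south of a building
    faces = [
        {"texture_id": "south_facade", "center": (10.0, 0.0, 15.0),  "normal": (0.0, -1.0, 0.0)},
        {"texture_id": "north_facade", "center": (10.0, 10.0, 15.0), "normal": (0.0, 1.0, 0.0)},
    ]
    print(textures_to_request(faces, eye))   # ['south_facade']
```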
- The invention may also be embodied in a computer-readable medium having computer-executable instructions for causing a computer, such as the processor 35, to perform the steps/operations set forth above, as will be appreciated by those skilled in the art.
- Many modifications and other embodiments of the invention will come to the mind of one skilled in the art having the benefit of the teachings presented in the foregoing descriptions and the associated drawings. Therefore, it is understood that the invention is not to be limited to the specific embodiments disclosed, and that modifications and embodiments are intended to be included within the scope of the appended claims.
Claims (24)
1. A geospatial data system comprising:
at least one geospatial database containing three-dimensional (3D) geospatial structure data, and containing geospatial texture data associated with the geospatial 3D structure data; and
at least one geospatial data access device comprising a display and a processor cooperating therewith for communicating remotely with said at least one geospatial database to retrieve and display a scene on said display based upon the 3D structure data and the geospatial texture data associated therewith;
said at least one geospatial data access device further comprising at least one user input device cooperating with said processor for permitting user selection of a point-of-view (POV) within the scene on said display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on said display, said processor selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on said display.
2. The geospatial data system of claim 1 wherein the geospatial texture data contained in said at least one geospatial database is retrievable in successive additive layers of resolution; and wherein said processor retrieves and displays the geospatial texture data in successive additive layers of resolution in the scene on said display.
3. The geospatial data system of claim 1 wherein said processor prioritizes retrieval and display of successive additive layers of resolution of associated geospatial texture data to different 3D geospatial structures within the scene on said display.
4. The geospatial data system of claim 3 wherein said processor prioritizes based upon relative distances of the 3D geospatial structures within the scene on said display.
5. The geospatial data system of claim 3 wherein said processor prioritizes based upon different relative areas of the 3D geospatial structures within the scene on said display.
6. The geospatial data system of claim 1 further comprising a communications channel coupling said at least one geospatial database and said geospatial data access device; and wherein said communications channel has a capacity insufficient to carry within a predetermined time all of the associated geospatial texture data for the 3D geospatial structures within the scene on said display.
7. The geospatial data system of claim 6 wherein said communications channel comprises the Internet.
8. The geospatial data system of claim 1 wherein said at least one geospatial database and said at least one geospatial data access device communicate using a streaming wavelet-based imagery compression protocol.
9. The geospatial data system of claim 8 wherein the streaming wavelet-based imagery compression protocol comprises the JPEG 2000 Interactive Protocol.
10. A geospatial data access device for accessing at least one geospatial database containing three-dimensional (3D) geospatial structure data, and also containing geospatial texture data associated with the geospatial 3D structure data, the geospatial data access device comprising:
a display;
a processor cooperating with said display for communicating remotely with the at least one geospatial database to retrieve and display a scene on said display based upon the 3D structure data and the geospatial texture data associated therewith; and
at least one user input device cooperating with said processor for permitting user selection of a point-of-view (POV) within the scene on said display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on said display;
said processor selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on said display.
11. The geospatial data access device of claim 10 wherein the geospatial texture data contained in the at least one geospatial database is retrievable in successive additive layers of resolution; and wherein said processor retrieves and displays the geospatial texture data in successive additive layers of resolution in the scene on said display.
12. The geospatial data access device of claim 10 wherein said processor prioritizes retrieval and display of successive additive layers of resolution of associated geospatial texture data to different 3D geospatial structures within the scene on said display.
13. The geospatial data access device of claim 12 wherein said processor prioritizes based upon relative distances of the 3D geospatial structures within the scene on said display.
14. The geospatial data access device of claim 12 wherein said processor prioritizes based upon different relative areas of the 3D geospatial structures within the scene on said display.
15. A geospatial data access method comprising:
storing three-dimensional (3D) geospatial structure data and geospatial texture data associated with the geospatial 3D structure data in at least one geospatial database;
remotely retrieving the 3D structure data and the geospatial texture data associated therewith from the at least one geospatial database; and
displaying a scene on a display based upon the retrieved 3D structure data and the geospatial texture data associated therewith and also based upon a user selection of a point-of-view (POV) within the scene on the display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on the display;
wherein remotely retrieving further comprises selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display.
16. The method of claim 15 wherein the geospatial texture data contained in the at least one geospatial database is retrievable in successive additive layers of resolution; and wherein remotely retrieving and displaying comprise remotely retrieving and displaying the geospatial texture data in successive additive layers of resolution in the scene on the display.
17. The method of claim 15 further comprising prioritizing retrieval and display of successive additive layers of resolution of associated geospatial texture data to different 3D geospatial structures within the scene on the display.
18. The method of claim 17 wherein prioritizing comprises prioritizing based upon relative distances of the 3D geospatial structures within the scene on the display.
19. The method of claim 17 wherein prioritizing comprises prioritizing based upon different relative areas of the 3D geospatial structures within the scene on the display.
20. A computer-readable medium having computer-executable instructions for causing a computer to perform steps comprising:
remotely retrieving three-dimensional (3D) structure data and geospatial texture data associated therewith from at least one geospatial database; and
displaying a scene on a display based upon the retrieved 3D structure data and the geospatial texture data associated therewith and also based upon a user selection of a point-of-view (POV) within the scene on the display with the POV determining revealed portions and obscured portions of 3D geospatial structures within the scene on the display;
wherein remotely retrieving further comprises selectively retrieving geospatial texture data based upon the revealed portions and not the obscured portions of the 3D geospatial structures within the scene on the display.
21. The computer-readable medium of claim 20 wherein the geospatial texture data contained in the at least one geospatial database is retrievable in successive additive layers of resolution; and wherein remotely retrieving and displaying comprise remotely retrieving and displaying the geospatial texture data in successive additive layers of resolution in the scene on the display.
22. The computer-readable medium of claim 20 further comprising prioritizing retrieval and display of successive additive layers of resolution of associated geospatial texture data to different 3D geospatial structures within the scene on the display.
23. The computer-readable medium of claim 22 wherein prioritizing comprises prioritizing based upon relative distances of the 3D geospatial structures within the scene on the display.
24. The computer-readable medium of claim 22 wherein prioritizing comprises prioritizing based upon different relative areas of the 3D geospatial structures within the scene on the display.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/527,184 US20120256919A1 (en) | 2007-08-30 | 2012-06-19 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/847,510 US8212807B2 (en) | 2007-08-30 | 2007-08-30 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
US13/527,184 US20120256919A1 (en) | 2007-08-30 | 2012-06-19 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/847,510 Continuation US8212807B2 (en) | 2007-08-30 | 2007-08-30 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120256919A1 true US20120256919A1 (en) | 2012-10-11 |
Family
ID=39967437
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/847,510 Expired - Fee Related US8212807B2 (en) | 2007-08-30 | 2007-08-30 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
US13/527,184 Abandoned US20120256919A1 (en) | 2007-08-30 | 2012-06-19 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/847,510 Expired - Fee Related US8212807B2 (en) | 2007-08-30 | 2007-08-30 | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods |
Country Status (9)
Country | Link |
---|---|
US (2) | US8212807B2 (en) |
EP (1) | EP2195784A1 (en) |
JP (1) | JP2010537349A (en) |
KR (1) | KR20100047889A (en) |
CN (1) | CN101802875B (en) |
BR (1) | BRPI0815289A2 (en) |
CA (1) | CA2697554A1 (en) |
TW (1) | TW200926056A (en) |
WO (1) | WO2009032697A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10066925B2 (en) | 2016-02-02 | 2018-09-04 | The Boeing Company | Point cloud processing apparatus and method |
US11170568B2 (en) | 2020-01-23 | 2021-11-09 | Rockwell Collins, Inc. | Photo-realistic image generation using geo-specific data |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8812990B2 (en) * | 2009-12-11 | 2014-08-19 | Nokia Corporation | Method and apparatus for presenting a first person world view of content |
US8749580B1 (en) * | 2011-08-12 | 2014-06-10 | Google Inc. | System and method of texturing a 3D model from video |
US9311748B2 (en) | 2013-02-20 | 2016-04-12 | Google Inc. | Method and system for generating and storing data objects for multi-resolution geometry in a three dimensional model |
JP6353053B2 (en) | 2014-02-03 | 2018-07-04 | ジョンソン コントロールズ テクノロジー カンパニーJohnson Controls Technology Company | Multi-pulse constant voltage transformer for variable speed drive in cooling device applications |
DE102014007914A1 (en) * | 2014-05-27 | 2015-12-03 | Elektrobit Automotive Gmbh | Graphing roads and routes using hardware tessellation |
US11775545B1 (en) | 2020-09-23 | 2023-10-03 | Amazon Technologies, Inc. | Cloud-based database for spatial data lifecycle management |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2610752B1 (en) | 1987-02-10 | 1989-07-21 | Sagem | METHOD FOR REPRESENTING THE PERSPECTIVE IMAGE OF A FIELD AND SYSTEM FOR IMPLEMENTING SAME |
US5490240A (en) | 1993-07-09 | 1996-02-06 | Silicon Graphics, Inc. | System and method of generating interactive computer graphic images incorporating three dimensional textures |
US5432871A (en) | 1993-08-04 | 1995-07-11 | Universal Systems & Technology, Inc. | Systems and methods for interactive image data acquisition and compression |
AU682268B2 (en) | 1994-05-27 | 1997-09-25 | Raytheon Company | Low latency update of graphic objects in an air traffic control display |
US5566073A (en) | 1994-07-11 | 1996-10-15 | Margolin; Jed | Pilot aid using a synthetic environment |
US5613051A (en) | 1994-12-21 | 1997-03-18 | Harris Corp. | Remote image exploitation display system and method |
US5646677A (en) | 1995-02-23 | 1997-07-08 | Motorola, Inc. | Method and apparatus for interactively viewing wide-angle images from terrestrial, space, and underwater viewpoints |
US5760783A (en) | 1995-11-06 | 1998-06-02 | Silicon Graphics, Inc. | Method and system for providing texture using a selected portion of a texture map |
US6111583A (en) | 1997-09-29 | 2000-08-29 | Skyline Software Systems Ltd. | Apparatus and method for three-dimensional terrain rendering |
US6496189B1 (en) | 1997-09-29 | 2002-12-17 | Skyline Software Systems Ltd. | Remote landscape display and pilot training |
JP2001229402A (en) | 2000-02-16 | 2001-08-24 | Mitsubishi Electric Corp | Device and method for three-dimensional image display and computer-readable recording medium with recorded program making computer, implement the method |
US6484101B1 (en) * | 2000-08-16 | 2002-11-19 | Imagelinks, Inc. | 3-dimensional interactive image modeling system |
US6985929B1 (en) * | 2000-08-31 | 2006-01-10 | The United States Of America As Represented By The Secretary Of The Navy | Distributed object-oriented geospatial information distribution system and method thereof |
US7127453B1 (en) * | 2000-10-31 | 2006-10-24 | Ncr Corp. | Gathering data from a database for display |
US6654690B2 (en) | 2001-04-05 | 2003-11-25 | Harris Corporation | Automated method for making a topographical model and related system |
JP2003044879A (en) | 2001-08-02 | 2003-02-14 | Mitsubishi Electric Corp | Method and device for generating three-dimensional data, program for executing this method on computer, method and device for transmitting three-dimensional data, and program for executing this method on computer |
US7225207B1 (en) * | 2001-10-10 | 2007-05-29 | Google Inc. | Server for geospatially organized flat file data |
US7324695B2 (en) | 2002-03-18 | 2008-01-29 | Seimens Medical Solutions Usa, Inc. | Prioritized image visualization from scalable compressed data |
US7373612B2 (en) * | 2002-10-21 | 2008-05-13 | Battelle Memorial Institute | Multidimensional structured data visualization method and apparatus, text visualization method and apparatus, method and apparatus for visualizing and graphically navigating the world wide web, method and apparatus for visualizing hierarchies |
US7116833B2 (en) | 2002-12-23 | 2006-10-03 | Eastman Kodak Company | Method of transmitting selected regions of interest of digital video data at selected resolutions |
US7930434B2 (en) | 2003-03-05 | 2011-04-19 | Microsoft Corporation | System and method for managing communication and/or storage of image data |
US7254271B2 (en) | 2003-03-05 | 2007-08-07 | Seadragon Software, Inc. | Method for encoding and serving geospatial or other vector data as images |
FR2852128A1 (en) | 2003-03-07 | 2004-09-10 | France Telecom | METHOD FOR MANAGING THE REPRESENTATION OF AT LEAST ONE MODELIZED 3D SCENE |
US7353114B1 (en) | 2005-06-27 | 2008-04-01 | Google Inc. | Markup language for an interactive geographic information system |
US7554539B2 (en) | 2005-07-27 | 2009-06-30 | Balfour Technologies Llc | System for viewing a collection of oblique imagery in a three or four dimensional virtual scene |
JP4672493B2 (en) | 2005-09-08 | 2011-04-20 | 三菱電機株式会社 | 3D graphic display device and 3D graphic display method |
CN100489851C (en) * | 2006-04-29 | 2009-05-20 | 上海杰图软件技术有限公司 | Method for establishing panorama electronic map service |
US7643673B2 (en) * | 2006-06-12 | 2010-01-05 | Google Inc. | Markup language for interactive geographic information system |
- 2007
  - 2007-08-30 US US11/847,510 patent/US8212807B2/en not_active Expired - Fee Related
- 2008
  - 2008-08-27 CN CN2008801046541A patent/CN101802875B/en not_active Expired - Fee Related
  - 2008-08-27 JP JP2010523115A patent/JP2010537349A/en active Pending
  - 2008-08-27 KR KR1020107005176A patent/KR20100047889A/en not_active Application Discontinuation
  - 2008-08-27 EP EP08798809A patent/EP2195784A1/en not_active Withdrawn
  - 2008-08-27 WO PCT/US2008/074477 patent/WO2009032697A1/en active Application Filing
  - 2008-08-27 BR BRPI0815289-6A2A patent/BRPI0815289A2/en not_active IP Right Cessation
  - 2008-08-27 CA CA2697554A patent/CA2697554A1/en not_active Abandoned
  - 2008-08-29 TW TW097133320A patent/TW200926056A/en unknown
- 2012
  - 2012-06-19 US US13/527,184 patent/US20120256919A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
TW200926056A (en) | 2009-06-16 |
US20090058854A1 (en) | 2009-03-05 |
JP2010537349A (en) | 2010-12-02 |
KR20100047889A (en) | 2010-05-10 |
WO2009032697A1 (en) | 2009-03-12 |
CA2697554A1 (en) | 2009-03-12 |
US8212807B2 (en) | 2012-07-03 |
BRPI0815289A2 (en) | 2015-02-03 |
CN101802875B (en) | 2012-04-18 |
CN101802875A (en) | 2010-08-11 |
EP2195784A1 (en) | 2010-06-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8379016B2 (en) | Geospatial data system for selectively retrieving and displaying geospatial texture data in successive additive layers of resolution and related methods | |
US20120256919A1 (en) | Geospatial data system for selectively retrieving and displaying geospatial texture data based upon user-selected point-of-view and related methods | |
US9024947B2 (en) | Rendering and navigating photographic panoramas with depth information in a geographic information system | |
US9424373B2 (en) | Site modeling using image data fusion | |
US9454847B2 (en) | System and method of indicating transition between street level images | |
KR20100013059A (en) | 3 dimensional geographical information client apparatus, server apparatus and 3 dimensional geographical information system having the same | |
CN110832278A (en) | Rendering map data using a description of grid differences | |
Devaux et al. | A web-based 3D mapping application using WebGL allowing interaction with images, point clouds and models | |
Rau et al. | A cost-effective strategy for multi-scale photo-realistic building modeling and web-based 3-D GIS applications in real estate | |
Yu et al. | A hybrid system of expanding 2D GIS into 3D space | |
KR20130137076A (en) | Device and method for providing 3d map representing positon of interest in real time | |
Zheng et al. | Pervasive Views: Area exploration and guidance using extended image media | |
US20240371079A1 (en) | Face-Oriented Geometry Streaming | |
WO2023224627A1 (en) | Face-oriented geometry streaming | |
Yu et al. | Pervasive Views: Area Exploration and Guidance Using Extended Image Media | |
Sierikov et al. | INTERACTIVE ELEVATION MAPVISUALIZATION WITH SELECTIVE 3D RENDERING ON THE WEB |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |