CN105931284B - Fusion method and device of three-dimensional texture TIN data and large scene data


Info

Publication number
CN105931284B
Authority
CN
China
Prior art keywords
data
tin
dimensional texture
texture
large scene
Prior art date
Legal status
Active
Application number
CN201610228555.0A
Other languages
Chinese (zh)
Other versions
CN105931284A (en
Inventor
李英成
耿中元
王恩泉
刘洪岐
任丽艳
Current Assignee
Zhongxing New Map (beijing) Remote Sensing Technology Co Ltd
Original Assignee
Zhongxing New Map (beijing) Remote Sensing Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Zhongxing New Map (beijing) Remote Sensing Technology Co Ltd filed Critical Zhongxing New Map (beijing) Remote Sensing Technology Co Ltd
Priority to CN201610228555.0A
Publication of CN105931284A
Application granted
Publication of CN105931284B
Status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering

Abstract

The invention provides a method and a device for fusing three-dimensional texture TIN data with large scene data. The method comprises the following steps: performing oblique photography on a set area to obtain three-dimensional texture TIN data corresponding to the set area; according to geometric data included in the three-dimensional texture TIN data, fusing the three-dimensional texture TIN data with large scene data corresponding to the set area in spatial position through coordinate conversion; and performing image fusion of the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data. Coordinate conversion transforms the three-dimensional texture TIN data into the same coordinate system as the large scene data and eliminates the deformation caused by the earth's curvature, so the spatial position fusion is more accurate and its error smaller. Image fusion through texture rendering makes the fusion of the spatial data in texture detail more complete and avoids visualization problems such as patch occlusion and holes in the subsequent display process.

Description

Fusion method and device of three-dimensional texture TIN data and large scene data
Technical Field
The invention relates to the technical field of spatial data fusion and display, and in particular to a method and a device for fusing three-dimensional texture TIN data with large scene data.
Background
In a digital earth large scene, a specific region is often captured by oblique photography to obtain high-precision three-dimensional texture TIN (Triangulated Irregular Network) data for that region. The obtained TIN data then need to be fused with the large scene data of the digital earth before the fused data are displayed to the user.
Currently, in the related art, when three-dimensional texture TIN data and large scene data are fused, a first connection point of a triangular patch adjoining the three-dimensional texture TIN data is determined from the large scene data, and a second connection point of a triangular patch adjoining the large scene data is determined from the three-dimensional texture TIN data. The second connection point in the three-dimensional texture TIN data is connected with the first connection point in the large scene data to fuse the two data sets; the fused data are then visualized and displayed to the user.
However, the three-dimensional texture TIN data and the large scene data differ in precision and error. Directly fusing the high-precision three-dimensional texture TIN data into the low-precision large scene data yields very low fusion accuracy and very large errors. Moreover, the patches of the three-dimensional texture TIN data and the patches of the large scene terrain can occlude each other during subsequent display, and holes can appear when browsing from different angles.
Disclosure of Invention
In view of this, an object of the embodiments of the present invention is to provide a method and an apparatus for fusing three-dimensional texture TIN data with large scene data, which eliminate the deformation caused by the earth's curvature, improve the fusion accuracy in spatial position, and reduce the fusion error. In image fusion, the fusion of the spatial data in texture detail is more complete, and visualization problems such as patch occlusion and holes in the subsequent display process are avoided.
In a first aspect, an embodiment of the present invention provides a method for fusing three-dimensional texture TIN data and large scene data, where the method includes:
performing oblique photography on a set area to obtain three-dimensional texture TIN data corresponding to the set area;
according to geometric data included in the three-dimensional texture TIN data, carrying out spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set area through coordinate conversion;
and performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data.
With reference to the first aspect, an embodiment of the present invention provides a first possible implementation manner of the first aspect, where the performing, according to geometric data included in the three-dimensional texture TIN data, spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set area through coordinate transformation includes:
determining the type of the three-dimensional texture TIN data according to a preset range threshold;
when the three-dimensional texture TIN data is determined to be of the first type, converting a reference plane of the three-dimensional texture TIN data into a reference plane of large scene data corresponding to the set area through coordinate conversion, and converting the converted three-dimensional texture TIN data into a geographic coordinate system;
and when the three-dimensional texture TIN data is determined to be of the second type, carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphics conversion mode, and converting the matched three-dimensional texture TIN data into the geographic coordinate system.
With reference to the first aspect, an embodiment of the present invention provides a second possible implementation manner of the first aspect, where the performing, according to the geometric data and the texture data included in the three-dimensional texture TIN data, image-wise fusion on the three-dimensional texture TIN data and the large scene data through texture rendering includes:
generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data;
and overlaying the TDOM on a terrain texture corresponding to the large scene data through texture rendering.
With reference to the first aspect, an embodiment of the present invention provides a third possible implementation manner of the first aspect, where the method further includes:
and fusing the three-dimensional texture TIN data and the large scene data, and performing visual processing on fused data corresponding to the viewpoint position according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
With reference to the second possible implementation manner of the first aspect, an embodiment of the present invention provides a fourth possible implementation manner of the first aspect, where the generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data includes:
setting a rendering window according to the range of the TDOM to be generated;
acquiring slice data intersected with the rendering window from the three-dimensional texture TIN data;
and coloring the slice data through a graphic processing program, and rendering the slice data through a parallel projection rendering mode to obtain the TDOM corresponding to the three-dimensional texture TIN.
In a second aspect, an embodiment of the present invention provides a device for fusing three-dimensional texture TIN data and large scene data, where the device includes:
the oblique photography module is used for carrying out oblique photography on a set area to obtain three-dimensional texture TIN data corresponding to the set area;
the spatial position fusion module is used for carrying out spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set area through coordinate conversion according to geometric data included in the three-dimensional texture TIN data;
and the image fusion module is used for performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data which are included in the three-dimensional texture TIN data.
With reference to the second aspect, an embodiment of the present invention provides a first possible implementation manner of the second aspect, where the spatial position fusion module includes:
the determining unit is used for determining the type of the three-dimensional texture TIN data according to a preset range threshold;
the first conversion unit is used for converting a reference surface of the three-dimensional texture TIN data into a reference surface of large scene data corresponding to the set area through coordinate conversion and converting the converted three-dimensional texture TIN data into a geographic coordinate system when the three-dimensional texture TIN data is determined to be of the first type;
and the second conversion unit is used for carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphical conversion mode when the three-dimensional texture TIN data is determined to be of the second type, and converting the matched three-dimensional texture TIN data into the geographic coordinate system.
With reference to the second aspect, an embodiment of the present invention provides a second possible implementation manner of the second aspect, where the image fusion module includes:
the generating unit is used for generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data;
and the overlaying unit is used for overlaying the TDOM on the terrain texture corresponding to the large scene data through texture rendering.
With reference to the second aspect, an embodiment of the present invention provides a third possible implementation manner of the second aspect, where the apparatus further includes:
and the visualization module is used for fusing the three-dimensional texture TIN data with the large scene data and then visualizing the fused data corresponding to the viewpoint position according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
With reference to the second possible implementation manner of the second aspect, an embodiment of the present invention provides a fourth possible implementation manner of the second aspect, where the generating unit includes:
a setting subunit, configured to set a rendering window according to a range of the TDOM to be generated;
an obtaining subunit, configured to obtain, from the three-dimensional texture TIN data, slice data that intersects with the rendering window;
and the rendering subunit is used for performing coloring processing on the slice data through a graphics processing program and rendering the slice data through a parallel projection rendering mode to obtain the TDOM corresponding to the three-dimensional texture TIN.
In the method and the device provided by the embodiments of the invention, according to the geometric data included in the three-dimensional texture TIN data, the three-dimensional texture TIN data and the large scene data corresponding to the set area are fused in spatial position through coordinate conversion; and image fusion of the three-dimensional texture TIN data and the large scene data is performed through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data. By transforming the three-dimensional texture TIN data into the same coordinate system as the large scene data, the invention eliminates the deformation caused by the earth's curvature, so the fusion precision in spatial position is higher and the error smaller. Image fusion through texture rendering makes the fusion of the spatial data in texture detail more complete and avoids visualization problems such as patch occlusion and holes in the subsequent display process.
In order to make the aforementioned and other objects, features and advantages of the present invention comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present invention and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 shows a flowchart of a fusion method of three-dimensional texture TIN data and large scene data according to embodiment 1 of the present invention;
fig. 2 is a schematic structural diagram illustrating a device for fusing three-dimensional texture TIN data and large scene data according to embodiment 2 of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present invention without making any creative effort, shall fall within the protection scope of the present invention.
Considering that the three-dimensional texture TIN data and the large scene data differ in precision and error, directly fusing the high-precision three-dimensional texture TIN data into the low-precision large scene data, as the related art does, yields very low fusion accuracy and very large errors. Moreover, the patches of the three-dimensional texture TIN data and the patches of the large scene terrain can occlude each other during subsequent display, and holes can appear when browsing from different angles. On this basis, the embodiments of the invention provide a method and a device for fusing three-dimensional texture TIN data with large scene data, described below by way of example.
Example 1
Referring to fig. 1, an embodiment of the present invention provides a method for fusing three-dimensional texture TIN data and large scene data. The method specifically comprises the following steps:
step 101: and carrying out oblique photography on the set area to obtain three-dimensional texture TIN data corresponding to the set area.
The set area may be a specific geographic area. The three-dimensional texture TIN data corresponding to the set area are obtained by photographing the set area with oblique photography technology and then processing the captured three-dimensional data with software such as Street Factory or PhotoScan.
The three-dimensional texture TIN data comprise geometric data, texture data and scene organization information. In the embodiment of the invention, the three-dimensional texture TIN data corresponding to the set area are organized and stored in subdirectories by LOD (Levels of Detail), with the slice data of each LOD level stored in its own subdirectory. The slice data in each level correspond to a different data resolution. For example, if three-dimensional texture TIN data with 2 levels are generated, the level with the highest, clearest data resolution is generated first, and the other level is then derived from it through a patch-reduction (mesh simplification) algorithm.
In the embodiment of the present invention, when the three-dimensional texture TIN data corresponding to the set area are stored in LOD subdirectories, the data are stored in a preset format, which may be the OSGB (OpenSceneGraph Binary) format. Within each subdirectory, the geometric data included in the slice data can be stored in a custom format, the texture data in a format such as JPG (JPEG image format), and the scene organization information in a format such as XML (Extensible Markup Language).
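As an illustration of this layered storage, the following Python sketch indexes OSGB slice files by LOD level; the directory layout and the "_L<n>" naming token are assumptions chosen for the example, not prescribed by the invention.

```python
from pathlib import Path

def index_tin_slices(root):
    """Group OSGB slice files under an oblique-photography data root by LOD
    level. The layout (one subdirectory per tile, an "_L<n>" token in the
    file name encoding the level) is a common convention assumed here."""
    levels = {}
    for osgb in Path(root).rglob("*.osgb"):
        tokens = osgb.stem.split("_")
        lod = next((int(t[1:]) for t in tokens
                    if t.startswith("L") and t[1:].isdigit()), 0)
        levels.setdefault(lod, []).append(osgb)
    return levels

# e.g. index_tin_slices("Data/") -> {17: [...osgb paths...], 18: [...], ...}
```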
Three-dimensional texture TIN data corresponding to the set area are obtained through the operation of step 101 and stored in layers in LOD subdirectories; they are then fused into the large scene data of the digital earth corresponding to the set area through the following operations of steps 102 and 103.
Step 102: and according to the geometric data included in the three-dimensional texture TIN data, carrying out spatial position fusion on the three-dimensional texture TIN data and the large scene data corresponding to the set area through coordinate conversion.
Since the earth's surface has a certain curvature while the curvature of the three-dimensional texture TIN data captured over a set area by oblique photography differs greatly from it, the TIN data must be converted into the geographic coordinate system of the large scene data through coordinate conversion in order to improve the fusion accuracy between the TIN data and the large scene data of the digital earth. This eliminates the fusion deformation caused by the difference between the earth's curvature and the curvature of the three-dimensional texture TIN data and improves the fusion accuracy.
The specific fusion process for performing spatial fusion described above is as follows:
determining the type of the three-dimensional texture TIN data according to a preset range threshold; when the three-dimensional texture TIN data is determined to be of the first type, converting a reference plane of the three-dimensional texture TIN data into a reference plane of large scene data corresponding to a set area through coordinate conversion, and converting the converted three-dimensional texture TIN data into a geographic coordinate system; and when the three-dimensional texture TIN data is determined to be of the second type, carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphical conversion mode, and converting the matched three-dimensional texture TIN data into a geographic coordinate system.
The preset range threshold is the preset extent flown by the aerial survey aircraft, that is, the area covered by its flight lines; it may be, for example, 100 or 200 square kilometers. The type of the three-dimensional texture TIN data is determined against this threshold: if the captured range of the TIN data is larger than the preset range threshold, the data are of the first type, namely large-range three-dimensional texture TIN data; if the captured range is smaller than or equal to the threshold, the data are of the second type, namely small-range three-dimensional texture TIN data. The larger the captured range, the more significant the influence of the earth's curvature; the smaller the range, the weaker that influence.
When the type of the three-dimensional texture TIN data is determined to be the first type, the data are converted from a projected coordinate system into a geocentric Cartesian (spatial rectangular) coordinate system. In that Cartesian coordinate system, the reference surface of the three-dimensional texture TIN data is converted into the reference surface of the large scene data corresponding to the set area using a Bursa-Wolf (seven-parameter) transformation, and the converted three-dimensional texture TIN data are then transformed into the geographic coordinate system.
When the type of the three-dimensional texture TIN data is determined to be the second type, the data cover a small range and the influence of the earth's curvature is not obvious. Therefore, instead of the complex coordinate transformation used for large-range TIN data, translation, rotation and scaling in a graphical transformation mode are applied directly: the adjustment parameters for translating, rotating and scaling the three-dimensional texture TIN data are tuned so that the TIN data match the same-name (homonymous) feature points in the large scene data corresponding to the set area, after which the TIN data are converted into the geographic coordinate system.
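The two registration paths can be sketched as follows, in Python with pyproj; the EPSG codes, the threshold value, and the zeroed Bursa-Wolf parameters are illustrative placeholders for surveyed values, and the small-angle rotation convention may differ between datum pairs:

```python
import numpy as np
from pyproj import Transformer  # pip install pyproj

RANGE_THRESHOLD_KM2 = 100.0  # hypothetical preset range threshold

def bursa_wolf(xyz, dx=0.0, dy=0.0, dz=0.0, rx=0.0, ry=0.0, rz=0.0,
               scale_ppm=0.0):
    """Seven-parameter (Bursa-Wolf) datum shift between geocentric
    Cartesian frames; small-angle approximation, rotations in radians."""
    m = 1.0 + scale_ppm * 1e-6
    rot = np.array([[1.0,  rz, -ry],
                    [-rz, 1.0,  rx],
                    [ry,  -rx, 1.0]])
    return m * rot.dot(xyz) + np.array([dx, dy, dz])

def register_tin(vertices, area_km2, src_crs="EPSG:32650"):
    """vertices: (N, 3) projected x, y, height coordinates of the TIN."""
    if area_km2 > RANGE_THRESHOLD_KM2:
        # First type: projected -> geocentric Cartesian -> reference-surface
        # shift -> geographic (lon, lat, h).
        to_ecef = Transformer.from_crs(src_crs, "EPSG:4978", always_xy=True)
        ecef = np.column_stack(to_ecef.transform(*vertices.T))
        # Zero parameters act as identity; substitute the surveyed values.
        shifted = np.apply_along_axis(bursa_wolf, 1, ecef)
        to_geo = Transformer.from_crs("EPSG:4978", "EPSG:4979", always_xy=True)
        return np.column_stack(to_geo.transform(*shifted.T))
    # Second type: small area, so match homonymous feature points with a
    # graphical translate/rotate/scale adjustment instead (omitted here).
    raise NotImplementedError("small-range similarity matching not shown")
```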
In the embodiment of the present invention, for large-range three-dimensional texture TIN data, the scheduling parameters of the LOD levels in the scene organization information corresponding to the TIN data also need to be converted into the geographic coordinate system in the same way as the TIN data themselves, and then into the East-North-Up (ENU) coordinate system. For small-range three-dimensional texture TIN data, when coordinate conversion is performed by graphical translation, rotation and scaling, the LOD scheduling parameters need not be modified and the original parameters are used directly for scheduling.
After the three-dimensional texture TIN data are converted into the geographic coordinate system in this way, their curvature is the same as that of the large scene data corresponding to the set area; the two data sets are fused in spatial position, and the fusion accuracy is greatly improved.
After the fusion of the three-dimensional texture TIN data and the large scene data at the spatial position is realized through the operation of step 102, the image fusion needs to be performed through the operation of step 103, so that the complete fusion of the three-dimensional texture TIN data and the large scene data in the set area can be finally realized.
Step 103: and performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data.
The specific fusion process of the image fusion comprises the following steps:
generating a TDOM (True Digital Orthophoto Map) corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data; and overlaying the TDOM on the terrain texture corresponding to the large scene data through texture rendering.
In the embodiment of the present invention, the TDOM corresponding to the three-dimensional texture TIN may be generated by the following method, which specifically includes:
setting a rendering window according to the range of the TDOM to be generated; acquiring the slice data intersecting the rendering window from the three-dimensional texture TIN data; and shading the slice data with a graphics processing program and rendering them by parallel projection to obtain the TDOM corresponding to the three-dimensional texture TIN.
Because the TDOM to be generated covers a large range with many pixels, while the number of pixels that graphics hardware can render at once is limited, the invention adopts block (tiled) rendering, which both fits the rendering capability of the graphics hardware and improves rendering efficiency.
During block rendering, the geospatial size of a TDOM block, i.e. its length and width, is first set according to the resolution of the texture data of the three-dimensional texture TIN and the pixel count of the rendering window; the geospatial range of each TDOM block is then set according to the geospatial range of the three-dimensional texture TIN data and the geospatial size of a block; finally, the range of the rendering window is set according to the geospatial range of the TDOM block to be generated.
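For concreteness, the block layout computation might look like the following sketch; the function and parameter names, and the metre-based units, are illustrative:

```python
import math

def tdom_blocks(extent, texture_res, window_px):
    """Yield geographic extents of TDOM blocks covering `extent`.
    extent = (xmin, ymin, xmax, ymax) in the target CRS; texture_res is the
    ground resolution of the TIN textures (m/pixel); window_px = (width,
    height) of the rendering window in pixels."""
    bw = window_px[0] * texture_res  # geographic width of one block
    bh = window_px[1] * texture_res  # geographic height of one block
    xmin, ymin, xmax, ymax = extent
    for j in range(math.ceil((ymax - ymin) / bh)):
        for i in range(math.ceil((xmax - xmin) / bw)):
            yield (xmin + i * bw,
                   ymin + j * bh,
                   min(xmin + (i + 1) * bw, xmax),
                   min(ymin + (j + 1) * bh, ymax))
```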
The range of the rendering window is the geospatial range of the TDOM block to be generated. It is used to compute the matrices employed when generating the TDOM, namely the model-view matrix (ModelView Matrix) and the projection matrix (Projection Matrix).
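The matrices for one block can be set up as in the following sketch: a standard OpenGL-convention orthographic projection with a nadir view. The matrices are written row-major here (transpose when uploading to OpenGL), and the near/far values are left to the caller:

```python
import numpy as np

def ortho_matrix(xmin, xmax, ymin, ymax, znear, zfar):
    """OpenGL-convention orthographic (parallel) projection that maps the
    block extent onto clip space, as used when rendering one TDOM block."""
    return np.array([
        [2.0 / (xmax - xmin), 0.0, 0.0, -(xmax + xmin) / (xmax - xmin)],
        [0.0, 2.0 / (ymax - ymin), 0.0, -(ymax + ymin) / (ymax - ymin)],
        [0.0, 0.0, -2.0 / (zfar - znear), -(zfar + znear) / (zfar - znear)],
        [0.0, 0.0, 0.0, 1.0],
    ])

def nadir_modelview(eye_height):
    """Model-view matrix looking straight down the -Z axis; with Z up in
    the local frame, a pure translation places the camera above the scene."""
    mv = np.eye(4)
    mv[2, 3] = -eye_height
    return mv
```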
In the embodiment of the invention, the geometric data and texture data converted into the geographic coordinate system during fusion are rendered with parallel projection. The three-dimensional texture TIN data are organized and stored in LOD subdirectories, and the slice data within the subdirectories are organized in a quadtree with two-level indexes. Block rendering can therefore follow the quadtree organization of the slice data in each subdirectory, which greatly improves rendering efficiency.
When a block is rendered, the slice data of the low-resolution LOD level are rendered first; after one level has been rendered, the Z-buffer (depth buffer) is cleared, and the slice data of the next, higher-resolution LOD level are rendered. This is repeated until all LOD levels have been rendered, after which all block images are composited into the TDOM image. This rendering order ensures that, even where the coverage of different LOD levels is not fully consistent, every pixel of the TDOM is generated from the data of the highest-resolution LOD level available.
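A sketch of this per-block, coarse-to-fine level loop follows (PyOpenGL; the draw_slices callback, the lod_levels objects with a resolution attribute, and the surrounding GL context are assumed to exist):

```python
from OpenGL.GL import glClear, GL_DEPTH_BUFFER_BIT  # pip install PyOpenGL

def render_block(block_extent, lod_levels, draw_slices):
    """Render one TDOM block coarse-to-fine. `lod_levels` carry a
    `resolution` attribute in m/pixel (larger value = coarser level);
    `draw_slices` issues the draw calls for one level within the block."""
    for level in sorted(lod_levels, key=lambda lv: lv.resolution,
                        reverse=True):          # coarsest level first
        draw_slices(level, block_extent)
        glClear(GL_DEPTH_BUFFER_BIT)  # let the next, finer level overdraw
```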
During rendering, graphics rasterization is used first: a parallel projection matrix sets the projection direction perpendicular to the ground, and the model-view matrix (ModelView Matrix) and projection matrix (Projection Matrix) are set according to the texture resolution of the three-dimensional texture TIN data and the range of the rendering window. Next, according to the range of the rendering window and the subdirectory index, the subdirectories intersecting the rendering window are screened from the subdirectories of the three-dimensional texture TIN data; the slice data index is then used to select the intersecting slice data within the selected subdirectories, and their geometric data and texture data are loaded into memory.
A GPU (Graphics Processing Unit) program writes colors as pixel values into a frame buffer (Frame Buffer). Because an RGBA (Red Green Blue Alpha) four-channel texture object is bound to the frame buffer before rendering, the generated TDOM resides in that texture object after rendering; the rendered result is then read from video memory into main memory, that is, the TDOM in the texture object is read from the video memory into memory and saved to the hard disk.
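Reading a finished block back from video memory might look like this PyOpenGL sketch; creation of the FBO and its RGBA texture attachment is assumed to have happened before rendering:

```python
import numpy as np
from OpenGL.GL import (glBindFramebuffer, glReadPixels,
                       GL_FRAMEBUFFER, GL_RGBA, GL_UNSIGNED_BYTE)

def read_block(fbo, width, height):
    """Copy the rendered RGBA TDOM block from the given framebuffer object
    into a numpy array in main memory, ready to be written to disk."""
    glBindFramebuffer(GL_FRAMEBUFFER, fbo)
    raw = glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE)
    return np.frombuffer(raw, dtype=np.uint8).reshape(height, width, 4)
```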
In an embodiment of the present invention, an LRU (Least Recently Used) policy is used to maintain the three-dimensional texture TIN data cached in memory: when the memory cache is full, the TIN data that have gone unused the longest are deleted from the cache.
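A minimal version of such an LRU cache; the capacity and the key scheme are illustrative:

```python
from collections import OrderedDict

class SliceCache:
    """Least-recently-used cache for TIN slice data held in memory."""
    def __init__(self, capacity=256):
        self.capacity = capacity
        self._items = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None
        self._items.move_to_end(key)        # mark as most recently used
        return self._items[key]

    def put(self, key, slice_data):
        self._items[key] = slice_data
        self._items.move_to_end(key)
        while len(self._items) > self.capacity:
            self._items.popitem(last=False)  # evict least recently used
```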
In the embodiment of the invention, after the TDOM corresponding to the three-dimensional texture TIN data is generated, the TDOM is sliced. The slices of the clearest level directly reuse the block images generated when the TDOM was rendered: the block images intersecting a slice are loaded into video memory as textures, the slice is obtained by parallel projection rendering, and the loaded block images are cached.
In the embodiment of the invention, a hardware-accelerated bilinear filtering algorithm merges and downscales the slices of one level to generate the slices of the next coarser level, and this is repeated until all slices are finished, which accelerates the TDOM slicing process.
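On the CPU side, the per-level merge amounts to mosaicking four child slices and halving them with bilinear resampling, as in the following Pillow sketch (the text above describes the GPU-accelerated equivalent; the tile size is illustrative):

```python
from PIL import Image  # pip install Pillow

def parent_slice(c00, c10, c01, c11, tile_px=256):
    """Build one parent-level TDOM slice from its four children:
    mosaic the children, then downscale 2x with bilinear filtering."""
    merged = Image.new("RGBA", (2 * tile_px, 2 * tile_px))
    merged.paste(c00, (0, 0))
    merged.paste(c10, (tile_px, 0))
    merged.paste(c01, (0, tile_px))
    merged.paste(c11, (tile_px, tile_px))
    return merged.resize((tile_px, tile_px), Image.BILINEAR)
```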
In an embodiment of the present invention, the TDOM may be generated from the geometric data and texture data of the three-dimensional texture TIN data in the manner described above; the generated TDOM contains the color of every detailed position in the set area. The generated TDOM is overlaid on the terrain texture corresponding to the large scene data of the set area through the Render To Texture technique: the TDOM slices corresponding to the set area are fused with the DOM (Digital Orthophoto Map) corresponding to the large scene data, and the fused texture is then draped over the terrain corresponding to the large scene data through texture mapping, so that the three-dimensional texture TIN data and the large scene data are fused in texture and color detail and the fusion accuracy is improved.
The fusion of the three-dimensional texture TIN data and the large scene data of the set area is achieved through the operations of steps 101 to 103. When the user then needs to browse imagery of the set area, the spatial image of the set area can be displayed by scheduling and visualizing the fused spatial data.
After the three-dimensional texture TIN data and the large scene data are fused, the fused data corresponding to the viewpoint position are visualized according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
Data scheduling and visualization of the three-dimensional texture TIN are performed according to the viewpoint position, the positions of the three-dimensional texture TIN slices and the resolution of the TIN data. When scheduling large-range three-dimensional texture TIN data, the LOD scheduling parameters in the scene organization information must also be converted into the geographic coordinate system in the same way as the TIN data themselves, and then into the East-North-Up (ENU) coordinate system. For small-range TIN data handled by graphical translation, rotation and scaling, the LOD scheduling parameters need not be modified and the original parameters are used directly for scheduling.
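One plausible scheduling heuristic, matching a level's ground resolution to the on-screen pixel footprint at the current viewing distance, is sketched below; this heuristic is an illustration, not the patent's exact scheduling rule:

```python
import math

def select_lod(viewpoint, slice_center, level_resolutions, px_angle):
    """Choose the LOD level whose ground resolution best matches the
    ground footprint of one screen pixel at the current viewing distance.
    level_resolutions: m/pixel per level, finest first; px_angle: angular
    size of one screen pixel in radians."""
    dist = math.dist(viewpoint, slice_center)
    footprint = dist * px_angle              # approx. metres per pixel
    return min(range(len(level_resolutions)),
               key=lambda i: abs(level_resolutions[i] - footprint))
```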
The embodiment of the invention provides an effective method for fusing three-dimensional texture TIN data with digital earth large scene data and establishes a unified theoretical framework for digital-earth-oriented fusion display of three-dimensional texture TIN data. The method guarantees both fusion efficiency and fusion effect, striking a balance between the two.
In the embodiment of the invention, according to the geometric data included in the three-dimensional texture TIN data, the three-dimensional texture TIN data and the large scene data corresponding to the set area are fused in spatial position through coordinate conversion; and image fusion of the three-dimensional texture TIN data and the large scene data is performed through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data. By transforming the three-dimensional texture TIN data into the same coordinate system as the large scene data, the invention eliminates the deformation caused by the earth's curvature, so the fusion precision in spatial position is higher and the error smaller. Image fusion through texture rendering makes the fusion of the spatial data in texture detail more complete and avoids visualization problems such as patch occlusion and holes in the subsequent display process.
Example 2
Referring to fig. 2, an embodiment of the present invention provides a device for fusing three-dimensional texture TIN data and large scene data, where the device is configured to execute the method for fusing three-dimensional texture TIN data and large scene data provided in embodiment 1. The device specifically includes:
the oblique photography module 201 is configured to perform oblique photography on the set area to obtain three-dimensional texture TIN data corresponding to the set area;
the spatial position fusion module 202 is configured to perform spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set region through coordinate transformation according to geometric data included in the three-dimensional texture TIN data;
and the image fusion module 203 is configured to perform image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data.
The spatial position fusion module 202 includes:
the determining unit is used for determining the type of the three-dimensional texture TIN data according to a preset range threshold;
the first conversion unit is used for converting a reference surface of the three-dimensional texture TIN data into a reference surface of large scene data corresponding to a set area through coordinate conversion when the three-dimensional texture TIN data is determined to be of a first type, and converting the converted three-dimensional texture TIN data into a geographic coordinate system;
and the second conversion unit is used for carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphical conversion mode when the three-dimensional texture TIN data is determined to be of the second type, and converting the matched three-dimensional texture TIN data into a geographic coordinate system.
The image fusion module 203 includes:
the generating unit is used for generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data;
and the overlaying unit is used for overlaying the TDOM on the terrain texture corresponding to the large scene data through texture rendering.
In an embodiment of the present invention, the apparatus further includes:
and the visualization module is used for fusing the three-dimensional texture TIN data with the large scene data and then carrying out visualization processing on fused data corresponding to the viewpoint position according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
The generating unit includes:
a setting subunit, configured to set a rendering window according to a range of the TDOM to be generated;
the acquisition subunit is used for acquiring slice data intersected with the rendering window from the three-dimensional texture TIN data;
and the rendering subunit is used for performing coloring processing on the slice data through a graphic processing program and rendering the slice data through a parallel projection rendering mode to obtain the TDOM corresponding to the three-dimensional texture TIN data.
In the embodiment of the invention, according to the geometric data included in the three-dimensional texture TIN data, the three-dimensional texture TIN data and the large scene data corresponding to the set area are fused in spatial position through coordinate conversion; and image fusion of the three-dimensional texture TIN data and the large scene data is performed through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data. By transforming the three-dimensional texture TIN data into the same coordinate system as the large scene data, the invention eliminates the deformation caused by the earth's curvature, so the fusion precision in spatial position is higher and the error smaller. Image fusion through texture rendering makes the fusion of the spatial data in texture detail more complete and avoids visualization problems such as patch occlusion and holes in the subsequent display process.
The fusion device for three-dimensional texture TIN data and large scene data provided by the embodiment of the invention may be specific hardware on a device, or software or firmware installed on a device. For convenience and brevity of description, the specific operations of the system, apparatus and units described above may all refer to the corresponding processes in the method embodiments above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (6)

1. A method for fusing three-dimensional texture TIN data and large scene data is characterized by comprising the following steps:
performing oblique photography on a set area to obtain three-dimensional texture TIN data corresponding to the set area;
according to geometric data included in the three-dimensional texture TIN data, carrying out spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set area through coordinate conversion;
according to the geometric data and the texture data included in the three-dimensional texture TIN data, performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering;
the performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data included in the three-dimensional texture TIN data comprises:
generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data;
overlaying the TDOM on a terrain texture corresponding to the large scene data through texture rendering;
the generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data comprises the following steps:
setting a rendering window according to the range of the TDOM to be generated;
acquiring slice data intersected with the rendering window from the three-dimensional texture TIN data;
and coloring the slice data through a graphic processing program, and rendering the slice data through a parallel projection rendering mode to obtain the TDOM corresponding to the three-dimensional texture TIN.
2. The method according to claim 1, wherein the spatially fusing the three-dimensional texture TIN data and the large scene data corresponding to the set area by coordinate transformation according to the geometric data included in the three-dimensional texture TIN data comprises:
determining the type of the three-dimensional texture TIN data according to a preset range threshold;
when the three-dimensional texture TIN data is determined to be of the first type, converting a reference plane of the three-dimensional texture TIN data into a reference plane of large scene data corresponding to the set area through coordinate conversion, and converting the converted three-dimensional texture TIN data into a geographic coordinate system;
and when the three-dimensional texture TIN data is determined to be of the second type, carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphics conversion mode, and converting the matched three-dimensional texture TIN data into the geographic coordinate system.
3. The method of claim 1, further comprising:
and fusing the three-dimensional texture TIN data and the large scene data, and performing visual processing on fused data corresponding to the viewpoint position according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
4. An apparatus for fusing three-dimensional texture TIN data and large scene data, the apparatus comprising:
the oblique photography module is used for carrying out oblique photography on a set area to obtain three-dimensional texture TIN data corresponding to the set area;
the spatial position fusion module is used for carrying out spatial position fusion on the three-dimensional texture TIN data and large scene data corresponding to the set area through coordinate conversion according to geometric data included in the three-dimensional texture TIN data;
the image fusion module is used for performing image fusion on the three-dimensional texture TIN data and the large scene data through texture rendering according to the geometric data and the texture data which are included in the three-dimensional texture TIN data;
the image fusion module comprises:
the generating unit is used for generating a true digital orthophoto map TDOM corresponding to the three-dimensional texture TIN according to the geometric data and the texture data included in the three-dimensional texture TIN data;
the overlaying unit is used for overlaying the TDOM on a terrain texture corresponding to the large scene data through texture rendering;
the generation unit includes:
a setting subunit, configured to set a rendering window according to a range of the TDOM to be generated;
an obtaining subunit, configured to obtain, from the three-dimensional texture TIN data, slice data that intersects with the rendering window;
and the rendering subunit is used for performing coloring processing on the slice data through a graphics processing program and rendering the slice data through a parallel projection rendering mode to obtain the TDOM corresponding to the three-dimensional texture TIN.
5. The apparatus of claim 4, wherein the spatial location fusion module comprises:
the determining unit is used for determining the type of the three-dimensional texture TIN data according to a preset range threshold;
the first conversion unit is used for converting a reference surface of the three-dimensional texture TIN data into a reference surface of large scene data corresponding to the set area through coordinate conversion and converting the converted three-dimensional texture TIN data into a geographic coordinate system when the three-dimensional texture TIN data is determined to be of the first type;
and the second conversion unit is used for carrying out same-name feature point matching on the three-dimensional texture TIN data in a graphical conversion mode when the three-dimensional texture TIN data is determined to be of the second type, and converting the matched three-dimensional texture TIN data into the geographic coordinate system.
6. The apparatus of claim 4, further comprising:
and the visualization module is used for fusing the three-dimensional texture TIN data with the large scene data and then visualizing the fused data corresponding to the viewpoint position according to the viewpoint position and the resolution of the three-dimensional texture TIN data.
CN201610228555.0A 2016-04-13 2016-04-13 Fusion method and device of three-dimensional texture TIN data and large scene data Active CN105931284B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610228555.0A CN105931284B (en) 2016-04-13 2016-04-13 Fusion method and device of three-dimensional texture TIN data and large scene data

Publications (2)

Publication Number Publication Date
CN105931284A CN105931284A (en) 2016-09-07
CN105931284B true CN105931284B (en) 2019-12-31

Family

ID=56838158

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610228555.0A Active CN105931284B (en) 2016-04-13 2016-04-13 Fusion method and device of three-dimensional texture TIN data and large scene data

Country Status (1)

Country Link
CN (1) CN105931284B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108665536B (en) * 2018-05-14 2021-07-09 广州市城市规划勘测设计研究院 Three-dimensional and live-action data visualization method and device and computer readable storage medium
CN108717729A (en) * 2018-05-25 2018-10-30 武汉大学 A kind of online method for visualizing of landform multi-scale TIN of the Virtual earth
CN110097624B (en) * 2019-05-07 2023-08-25 洛阳众智软件科技股份有限公司 Method and device for generating three-dimensional data LOD simplified model
CN112652046B (en) * 2020-12-18 2024-03-22 完美世界(重庆)互动科技有限公司 Game picture generation method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101887595A (en) * 2009-05-14 2010-11-17 武汉如临其境科技创意有限公司 Three-dimensional digital earth-space data organizing and rendering method based on quad-tree index
CN104361628A (en) * 2014-11-27 2015-02-18 南宁市界围工程咨询有限公司 Three-dimensional real scene modeling system based on aviation oblique photograph measurement
CN105354832A (en) * 2015-10-10 2016-02-24 西南林业大学 Method for automatically registering mountainous area satellite image to geographical base map

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Renato Pajarola et al., "QuadTIN: Quadtree based Triangulated Irregular Networks," Proceedings IEEE Visualization 2002, 2002 (entire document). *
Li Yuan (李媛) et al., "Three-dimensional model texture mapping method constrained by local-area surface consistency" (局部区域表面一致性约束的三维模型纹理映射方法), GIS空间站, http://www.gissky.net/Article/3459.htm, Feb. 28, 2015 (sections 1-3, Fig. 38). *

Also Published As

Publication number Publication date
CN105931284A (en) 2016-09-07

Similar Documents

Publication Publication Date Title
US11551418B2 (en) Image rendering of laser scan data
CN108564527B (en) Panoramic image content completion and restoration method and device based on neural network
US20230053462A1 (en) Image rendering method and apparatus, device, medium, and computer program product
CN110956673A (en) Map drawing method and device
CN105931284B (en) Fusion method and device of three-dimensional texture TIN data and large scene data
EP2209092A1 (en) Method for unified visualisation of heterogeneous datasets
US10607409B2 (en) Synthetic geotagging for computer-generated images
CN110969691B (en) WebGL-based photographic data scheduling method and system
US10319062B2 (en) Rendering map data using descriptions of raster differences
JP7390497B2 (en) Image processing methods, apparatus, computer programs, and electronic devices
EP3655928B1 (en) Soft-occlusion for computer graphics rendering
CN110503718B (en) Three-dimensional engineering model lightweight display method
CN115546377B (en) Video fusion method and device, electronic equipment and storage medium
CN110852952B (en) Large-scale terrain real-time drawing method based on GPU
CN114782648A (en) Image processing method, image processing device, electronic equipment and storage medium
CN111026891A (en) Map bottom map integration method
CN108268138A (en) Processing method, device and the electronic equipment of augmented reality
CN112330806B (en) Visual synthesis method and system based on low-power-consumption hardware platform
CN113902832A (en) Flood three-dimensional dynamic evolution and rendering method and device and electronic equipment
CN113495933A (en) Vector tile display method and system
CN111639149A (en) Ocean data visualization method and device
CN114140593B (en) Digital earth and panorama fusion display method and device
CN115496829A (en) Method and device for quickly manufacturing local high-definition image map based on webpage
CN112825198B (en) Mobile tag display method, device, terminal equipment and readable storage medium
WO2020261392A1 (en) Learning device, object detection device, and learning method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant