CN114140593A - Digital earth and panorama fusion display method and device - Google Patents


Info

Publication number
CN114140593A
CN114140593A (application CN202111463394.0A)
Authority
CN
China
Prior art keywords
digital earth
matrix
viewpoint
curved surface
data
Prior art date
Legal status
Granted
Application number
CN202111463394.0A
Other languages
Chinese (zh)
Other versions
CN114140593B (en)
Inventor
高宏建
Current Assignee
Beijing Morning Power Technology Co ltd
Original Assignee
Beijing Morning Power Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Morning Power Technology Co ltd filed Critical Beijing Morning Power Technology Co ltd
Priority to CN202111463394.0A
Publication of CN114140593A
Application granted
Publication of CN114140593B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tesselation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/005 General purpose rendering architectures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention provides a method and a device for fusion display of a digital earth and a panorama. The method comprises the following steps: constructing a digital earth; constructing a rotation body curved surface mesh; dynamically loading a panorama, mapping the panorama as a texture onto the rotation body curved surface mesh, rendering the rotation body curved surface on the digital earth, and moving the center point of the rotation body curved surface to the actual position where the panorama was shot; and moving the viewpoint of the digital earth to the center point of the rotation body curved surface and adjusting a transparency control variable to gradually show or hide the rotation body curved surface, thereby realizing the fusion display of the digital earth and the panorama. The invention can seamlessly fuse a panorama created from live-action photos into the digital earth in situ, so that the display effect of the digital earth combines the advantages of the panorama and the digital orthophoto. This solves the problem of insufficient display of local detail on the digital earth, improves the all-round detail display capability of the local digital earth, shows the all-round live-action information of a designated position in quasi-real time, and improves the real-time display capability of local detail information on the digital earth.

Description

Digital earth and panorama fusion display method and device
Technical Field
The invention relates to the technical field of digital earth simulation of a three-dimensional geographic information system, in particular to a method and a device for fusion display of a digital earth and a panorama.
Background
The digital earth is a digital model constructed from digital orthophoto maps (DOM) and digital elevation models (DEM); it can display global landforms in a three-dimensional roaming manner and manage and display various geographic information elements. In digital earth applications, geographic information data such as digital orthophotos are massive, their processing and update cycles are long, and their real-time performance is poor. In some applications, a local DOM or digital surface model (DSM) can be acquired quickly by unmanned-aerial-vehicle remote sensing and displayed on the digital earth, but this approach still suffers from poor real-time performance, the inability to display in three dimensions, and the inability to show an all-round real scene. A panorama, constructed from all-round live-action photos, can show the real scene of a given time and place and can be obtained in quasi-real time, but its coverage is limited: it cannot provide large-scale landform features and lacks global context. Existing panorama display methods cannot be seamlessly integrated with the digital earth for display, and cannot realize gradual-change fusion and in-situ comparison between the real scene and the digital earth.
Disclosure of Invention
To address the problems of existing digital earth and panorama data display, the invention aims to provide a method and a device for fusion display of the digital earth and the panorama, realizing seamless integrated display of the panorama on the digital earth, improving the digital earth's local all-round detail display capability and real-time display capability, and improving the quality and effect of information acquisition based on the digital earth.
In order to achieve the purpose, the invention adopts the following technical scheme:
one aspect of the invention provides a method for fusing and displaying a digital earth and a panoramic image, which comprises the following steps:
constructing a digital earth;
constructing a rotation body curved surface grid;
dynamically loading a panoramic image, mapping the panoramic image to the surface of a rotating body curved surface grid by taking the panoramic image as a texture, three-dimensionally rendering the rotating body curved surface on a digital earth, and moving the center point of the rotating body curved surface to the actual position for shooting the panoramic image; and
and under the condition that the position of the digital earth viewpoint changes, displaying the panoramic image based on the projection transformation relation between the digital earth viewpoint and the center point of the surface of the revolution body, so that the digital earth and the panoramic image are displayed in a fusion mode when the digital earth viewpoint is transformed to the center point of the surface of the revolution body.
In some embodiments of the present invention, the step of constructing the digital earth comprises: creating or obtaining DOM and DEM base geographic data, which comprise DEM tile data and DOM tile data; managing the DEM and DOM tile data according to the position of the digital earth viewpoint; and loading the DEM and DOM tile data based on a created tile shader and performing three-dimensional rendering of the digital earth.
In some embodiments of the invention, the step of constructing the rotation volumetric surface mesh comprises: constructing vertex position data of the revolution body curved surface; constructing revolution body curved surface mesh index data based on the data of the vertex positions of the revolution body curved surface; and setting vertex texture coordinates of the rotated volume surface based on the rotated volume surface mesh index data.
In some embodiments of the invention, the body surface of revolution is obtained by a 360 degree rotation of the characteristic curve about a vertical axis.
In some embodiments of the present invention, the dynamically loading the panorama, mapping the panorama as a texture to a surface of a mesh of a rotated body surface, rendering the rotated body surface on the digital globe, and moving a center point of the rotated body surface to an actual position where the panorama is shot, comprises: dynamically loading the panoramic image, and calculating the starting point and the range of texture coordinates of the panoramic image based on the shooting parameters of the panoramic image; performing three-dimensional rendering of the rotated body surface based on the vertex position, the index, the texture coordinate data, the panoramic picture, the starting point of the texture coordinate of the panoramic picture and the range data of the rotated body surface; and calculating a rotation body position transformation matrix according to longitude, latitude and altitude data in the shooting parameters of the panoramic image, and moving the rotation body curved surface to the corresponding position of the digital earth.
In some embodiments of the invention, the rotation body position transformation matrix is obtained by the following matrix transformation operations: translating the unit matrix to the geocentric rectangular coordinate position of the rotation body curved surface center point to obtain a first matrix; rotating the first matrix around a vertical axis by the longitude value to obtain a second matrix; rotating the second matrix around a specific horizontal axis by the latitude value to obtain a third matrix; and scaling the third matrix in equal proportion to obtain the transformation matrix.
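The matrix pipeline just described can be sketched in plain Python. This is an illustrative sketch only: the function names, the choice of Y as the vertical axis and X as the horizontal rotation axis, and the column-vector convention are assumptions, and the ECEF (geocentric rectangular) position of the surface center is taken as given rather than derived from longitude/latitude/altitude.

```python
import math

def mat_identity():
    return [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_translate(x, y, z):
    m = mat_identity()
    m[0][3], m[1][3], m[2][3] = x, y, z
    return m

def mat_rotate_y(deg):  # rotation about the vertical (Y) axis
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s, 0.0], [0.0, 1.0, 0.0, 0.0],
            [-s, 0.0, c, 0.0], [0.0, 0.0, 0.0, 1.0]]

def mat_rotate_x(deg):  # rotation about a horizontal (X) axis
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[1.0, 0.0, 0.0, 0.0], [0.0, c, -s, 0.0],
            [0.0, s, c, 0.0], [0.0, 0.0, 0.0, 1.0]]

def mat_scale(k):
    m = mat_identity()
    m[0][0] = m[1][1] = m[2][2] = k
    return m

def rotation_body_transform(ecef_xyz, lon_deg, lat_deg, scale):
    m = mat_translate(*ecef_xyz)           # first matrix: translate the unit matrix
    m = mat_mul(m, mat_rotate_y(lon_deg))  # second matrix: rotate by longitude
    m = mat_mul(m, mat_rotate_x(lat_deg))  # third matrix: rotate by latitude
    return mat_mul(m, mat_scale(scale))    # scale in equal proportion
```

Right-multiplying by the scale matrix last means the scaling acts on the rotation body's local coordinates, which matches enlarging the unit rotation body at render time.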
In some embodiments of the present invention, in a case that a position of a digital earth viewpoint changes, the displaying the panorama based on a projective transformation relationship between the digital earth viewpoint and a center point of the rotated body curved surface, so that the digital earth and the panorama are displayed in a fusion manner when the digital earth viewpoint is transformed to the center point of the rotated body curved surface, includes: calculating a digital earth viewpoint transformation matrix according to the position of the rotation body and the digital earth viewpoint angle; the digital earth viewpoint is transformed to a rotation body curved surface central point based on the digital earth viewpoint transformation matrix, and the digital earth and the panoramic image are displayed in a fusion mode; the method further comprises the following steps: when the digital earth viewpoint is converted to the center point of the revolution body curved surface, the digital earth and the rendered revolution body are displayed on the display interface.
In some embodiments of the invention, the digital earth viewpoint transformation matrix is obtained based on a matrix transformation operation as follows: translating the unit matrix to a geocentric rectangular coordinate system position which is rotated to the center point of the body curved surface to obtain a first matrix; rotating the first matrix around a vertical axis by a longitude value to obtain a second matrix; rotating the second matrix around a specific horizontal axis to obtain a third matrix; calculating a pitch transformation matrix according to the pitch angle of the current viewpoint direction, and obtaining a fourth matrix based on the third matrix and the pitch transformation matrix; and calculating an azimuth transformation matrix according to the azimuth angle of the current viewpoint direction, and obtaining the digital earth viewpoint transformation matrix based on the fourth matrix and the azimuth transformation matrix.
In some embodiments of the invention, the method further comprises: and controlling the hiding and fusion display of the panoramic image on the surface of the revolution body through a transparency control variable.
In some embodiments of the invention, the method further comprises displaying the fusion of the digital earth and the panorama on a virtual display device.
In another aspect of the present invention, there is provided a digital earth and panorama fusion display apparatus, which comprises a processor and a memory, wherein the memory stores computer instructions and the processor is configured to execute the computer instructions stored in the memory; when the computer instructions are executed by the processor, the apparatus implements the steps of the method as described above.
In a further aspect of the invention, a computer storage medium is also provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method as set forth above.
The method and the device for fusion display of the digital earth and the panorama can seamlessly fuse a panorama created from live-action photos into the digital earth in situ, so that the display effect of the digital earth combines the advantages of the panorama and the digital orthophoto. This solves the problem of insufficient display of local detail on the digital earth and improves the all-round detail display capability of the local digital earth; even when the digital earth's base data is updated slowly, the all-round live-action information of a designated position can be displayed in quasi-real time, improving the real-time display capability of local detail information on the digital earth.
Additional advantages, objects, and features of the invention will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
It will be appreciated by those skilled in the art that the objects and advantages that can be achieved with the present invention are not limited to the specific details set forth above, and that these and other objects that can be achieved with the present invention will be more clearly understood from the detailed description that follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a schematic flow chart of a method for displaying a digital earth and a panorama in a fused manner according to an embodiment of the present invention;
FIG. 2 is a schematic diagram illustrating the operation steps of a method for displaying a digital earth and a panorama in a fused manner according to an embodiment of the present invention;
FIG. 3 is a schematic representation of a characteristic curve for constructing a revolution in accordance with an embodiment of the present invention;
FIG. 4 is a schematic diagram illustrating the construction of a revolution solid surface index according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a rendering effect of a rotated volumetric surface mesh on a digital globe according to an embodiment of the present invention;
FIG. 6 is a schematic view of a panoramic picture according to an embodiment of the present invention;
FIG. 7 is a diagram illustrating the effect of a digital earth viewpoint outside a rotator according to an embodiment of the present invention;
FIG. 8 is a diagram illustrating the effect of a digital earth viewpoint on the center point of a rotation body according to an embodiment of the present invention;
FIG. 9 is a schematic diagram illustrating the fusion effect of the digital earth and the panorama according to an embodiment of the present invention;
fig. 10 is a schematic diagram illustrating the display effect of the digital earth and the panorama fused on the VR headset according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the following embodiments and accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but the present invention is not limited thereto.
It should be noted that, in order to avoid obscuring the present invention with unnecessary details, only the structures and/or processing steps closely related to the scheme according to the present invention are shown in the drawings, and other details not so relevant to the present invention are omitted.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, elements, steps or components, but does not preclude the presence or addition of one or more other features, elements, steps or components.
In the embodiments of the invention, a fusion scheme for the digital earth and the panorama is proposed that realizes seamless fusion of the two, solving the prior art's inability to provide all-round stereoscopic display and to show the real scene.
Fig. 1 is a schematic flow chart of a method for fusion display of a digital earth and a panorama according to an embodiment of the present invention. As an example, the method may be implemented on the basis of the Open Graphics Library (OpenGL) under the Qt application development framework, so the detailed description uses Qt OpenGL variable and function names; however, the present invention is not limited thereto. As shown in Fig. 1, the method for fusion display of the digital earth and the panorama comprises the following steps S110 to S140:
step S110, constructing a digital earth.
More specifically, as an example, in this step, a digital earth is constructed based on the DOM and the DEM.
A DEM typically represents the three-dimensional form of the earth's surface digitally as a regular grid: a data set of the plane coordinates (X, Y) and elevations (Z) of regular grid points over a given range. The grid spacing of a DEM matches its elevation accuracy and forms a regular grid series. The DEM is a digital representation of landform; from it, information such as contour lines and slope maps can be derived, and it can be overlaid with DOM or other thematic data for terrain-related analysis. It is also the base data for producing DOM.
The DOM is a digital orthophoto data set (e.g. derived from satellite imagery) generated by applying pixel-by-pixel projection-difference correction, based on the DEM, to scanned digital aerial photographs and similar imagery, and then clipping according to the sheet layout and extent of the national basic-scale topographic map. A DOM image combines the geometric accuracy of a map with the visual characteristics of an image.
In an embodiment of the present invention, the step of constructing the digital earth based on the DOM and the DEM may specifically include:
and step S111, preparing DOM and DEM basic geographic data.
The DOM and DEM base geographic data may include DEM parcel tiles and DOM parcel tile data, and thus preparation of the DOM and DEM base geographic data may include creating or obtaining base geographic data parcel tiles.
The parcel tile is the smallest organization unit for reading, saving and rendering data of the digital earth, and the tile data mainly comprises DEM tile data and DOM tile data. In one embodiment, a pyramid of tiles may be constructed according to the quadtree principle, with the tile files organized and managed by level, with a number name assigned to each tile file. For example, the named earth DOM and DEM tile files may be organized according to the projection slicing rules of WGS84 (World Geodetic System 1984), with 2 tile files at level 1, 8 tile files at level 2, 32 tile files at level 3, and so on to form a tile quadtree structure, which is merely an example and the invention is not limited thereto.
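The tile counts quoted above follow directly from the quadtree rule. A minimal sketch (the `child_tiles` numbering scheme is hypothetical, not the WGS84 naming convention itself):

```python
def tiles_at_level(level):
    """Number of tile files at a pyramid level, assuming the WGS84-style
    slicing described above: 2 root tiles at level 1, and each tile
    subdivided into 4 children at the next level."""
    return 2 * 4 ** (level - 1)

def child_tiles(level, index):
    """Quadtree children of tile `index` at `level` (hypothetical
    numbering: children of tile i are 4i .. 4i+3 at the next level)."""
    return [(level + 1, 4 * index + k) for k in range(4)]
```

For example, `tiles_at_level(3)` reproduces the 32 tile files stated for level 3.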
In another embodiment of the invention, the tile data of the land area which is already created as the basic geographic data can also be directly obtained from the predetermined DOM and DEM resource supplier.
Step S112, the tile data is managed according to the digital earth viewpoint position.
After preparing the DOM and DEM base geographic data, the digital earth's initial viewpoint longitude, latitude, and altitude may be set. The tile center point is determined from the viewpoint's longitude and latitude; the displayable tile level and range are determined from the viewpoint altitude. The tile files currently to be loaded are then determined from the tile center point, level, and range.
Since digital earth tile data is massive data and a computer has limited memory, it is necessary to dynamically manage the tile data. The management of tile data includes, for example, dynamic loading, caching of tile data for a viewpoint area range, and deletion of tile data that does not need to be displayed.
For example, when loading a tile file, the embodiment of the present invention first retrieves the current tile cache, and if the tile file data to be loaded does not exist in the cache, loads the tile file from the specified location of the computer and caches the tile file. And if the tile file data to be loaded exists in the cache, directly reading the tile file from the cache. If the tile cache is full, the non-display area tile data is deleted.
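The load-or-cache logic of this paragraph can be sketched as follows. This is a minimal illustration: the class and method names are invented, and a real implementation would track the display area and eviction order more carefully.

```python
class TileCache:
    """Sketch of the tile-cache policy described above: cache hit reads
    directly; miss loads from disk and caches; when the cache is full,
    tiles outside the current display area are deleted."""

    def __init__(self, capacity, loader):
        self.capacity = capacity
        self.loader = loader          # e.g. reads a tile file from disk
        self.tiles = {}               # tile id -> tile data

    def get(self, tile_id, visible_ids):
        if tile_id in self.tiles:     # cache hit: read from the cache
            return self.tiles[tile_id]
        if len(self.tiles) >= self.capacity:
            # cache full: delete non-display-area tile data
            for tid in [t for t in self.tiles if t not in visible_ids]:
                del self.tiles[tid]
        data = self.loader(tile_id)   # load from the specified location
        self.tiles[tile_id] = data
        return data
```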
Step S113: loading the DEM and DOM tile data into the computer's video memory based on the created tile shader, and completing the three-dimensional rendering of the digital earth through a programmable rendering pipeline.
More specifically, the DOM and DEM tile data to be displayed are first processed in computer memory: a vertex array and an index array are created from the DEM tile data, and mapping textures are created from the DOM tile data. A tile shader is then created and used to load the created vertex, index, and texture data into the computer's video memory. On the GPU, the tile shader comprises a vertex shader and a fragment shader, and rendering is performed vertex by vertex and pixel by pixel through the vertex-shader and fragment-shader stages of the rendering pipeline.
In an alternative embodiment of the present invention, the tile shader may also be a pre-created tile shader.
Since the step S110 can be implemented by using the existing means for constructing the digital earth, the step is not described in detail. The present invention is not limited to the implementation of the present step by existing means, and may be implemented by other means for constructing a digital earth, which may come in the future.
Step S120, constructing a rotation volumetric surface mesh.
In an embodiment of the invention, the unit revolution body can be constructed as a basic revolution body curved surface, and the revolution body can be enlarged in equal proportion during three-dimensional rendering. The characteristic curve of the revolution body may be a circle, an ellipse, or a curve having any other shape. In one embodiment, a combined curve of a circle and an ellipse is used, as shown in FIG. 3, for applications where the panorama center is at a lower elevation, but the invention is not so limited. In fig. 3, the XY coordinate data of each data point of the characteristic curve may be stored in an array, such as a "baserotartcurve" array, for a total of 17 data points, and the 17 data points are only examples, and the present invention is not limited thereto.
When the characteristic curve is rotated from 0 to 360 degrees and the coordinate data of its 17 data points is sampled every 10 degrees, a rotation body curved surface mesh with 17 × 360/10 = 612 distinct grid points (data points) is obtained. Sampling the 0-to-360-degree interval at 10-degree steps including both endpoints (the 360-degree azimuth data duplicates the 0-degree azimuth data) gives the mesh 37 × 17 = 629 grid points in total, also referred to as vertices.
The data of the rotated volume surface mesh includes two parts, namely vertex data and index data, and therefore, in an embodiment of the present invention, constructing the rotated volume surface mesh specifically includes the following steps S121 to S123:
and step S121, constructing vertex position data of the revolution solid curved surface.
The rotated volumetric surface vertex data may include vertex position data and vertex texture coordinates. The rotation body can be obtained by rotating the characteristic curve by 360 degrees around a specified axis (such as the Y axis, but not limited thereto), and the vertex position data of the curved surface of the rotation body is the grid coordinate data forming the surface of the rotation body.
In an embodiment of the present invention, the calculation process of the vertex position data is as follows: the characteristic curve of the rotation body is rotated around the Y axis from 0 degree until 360 degrees, and a set of coordinate data is taken every predetermined degree (for example, 10 degrees), and a total of 37 sets of coordinate data are taken. Specifically, the X value of a data point on the characteristic curve is multiplied by the sine value and the cosine value of the rotation angle respectively to obtain the X value and the Z value of a space rectangular coordinate system of the data point at different rotation angles, and the Y value is unchanged, so that three-dimensional coordinate data corresponding to 629 grid points is obtained and serves as vertex position data of the surface of the rotating body.
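The vertex computation can be sketched in Python. This is an illustrative sketch: the storage order (each curve point as a row of 37 angle samples) is an assumption chosen to match the 37 × 17 grid described for FIG. 4, and `curve_xy` stands in for the characteristic-curve array.

```python
import math

def build_vertices(curve_xy, step_deg=10):
    """Rotate the characteristic curve about the Y axis from 0 to 360
    degrees inclusive.  For a curve point (x, y) at angle a:
    X = x * sin(a), Z = x * cos(a), Y unchanged."""
    vertices = []
    for x, y in curve_xy:                     # 17 curve points (rows)
        for a in range(0, 361, step_deg):     # 37 angles per row
            rad = math.radians(a)
            vertices.append((x * math.sin(rad), y, x * math.cos(rad)))
    return vertices
```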
Step S122: constructing the rotation body curved surface mesh index data based on the rotation body curved surface vertex position data.
Specifically, the rotation volume surface mesh index data may be constructed according to the vertex value data arrangement order.
As an example, the generation of the rotation body curved surface mesh index data is shown in FIG. 4. The 629 grid points are arranged in a plane, 37 points horizontally and 17 points vertically; FIG. 4 shows only part of all the grid points of the rotation body curved surface. The bottom row is numbered 0, 1, 2, …, 36, the second row 37–73, the third row 74–110, and the top row 592–628. The numbers are stored as triangle strips: the first strip contains the first and second rows, the second strip contains the second and third rows, and so on, for a total of 16 strips. Within each strip, the point numbers of the two rows are saved alternately according to the OpenGL triangle-strip indexing method, and an identifier, such as the data value 65535, is appended to mark the end of the strip. For example, the first strip's data is 0, 37, 1, 38, 2, 39, …, 36, 73, 65535. These strip data constitute the rotation body curved surface mesh index data. Using the value 65535 as the end-of-strip identifier is merely an example; other identifiers may be used to indicate the end of the strip data.
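The strip construction just described can be sketched directly (names are illustrative; the interleaving and the 65535 end-of-strip value follow the example in the text):

```python
STRIP_END = 65535   # end-of-strip identifier used in the text

def build_strip_indices(cols=37, rows=17):
    """16 strips; strip r interleaves row r and row r+1:
    0, 37, 1, 38, ..., 36, 73, 65535 for the first strip."""
    indices = []
    for r in range(rows - 1):
        for c in range(cols):
            indices.append(r * cols + c)           # lower-row point
            indices.append((r + 1) * cols + c)     # upper-row point
        indices.append(STRIP_END)
    return indices
```

In OpenGL these sentinel values are consumed via primitive restart (e.g. `glPrimitiveRestartIndex(65535)`), so the GPU treats each strip as a separate triangle strip.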
Step S123: setting vertex texture coordinates of the rotation body curved surface based on the rotation body curved surface mesh index data.
After the mesh index data of the rotated body surface is obtained, texture coordinates can be set for each vertex of the rotated body surface to prepare for subsequent texture mapping.
The vertex texture coordinate data is texture (U, V) coordinates (or UV coordinates), the U value in the UV coordinates changes along with the change of the rotation angle of the characteristic curve, and the V value does not change along with the change of the rotation angle. The value of U is obtained by dividing the rotation angle of the characteristic curve by 360, and the value of V is calculated from the index data of the array corresponding to the rotated characteristic curve. Assuming that the k value is the first dimension index of the array of the basic rotation body curve (baserotartcurve), and the value range is 0-16, the value of the texture coordinate V of the corresponding vertex is obtained by dividing k by 16.
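The texture-coordinate rule above (U from the rotation angle, V from the curve index k) can be sketched as follows (illustrative; the ordering matches a vertex grid of 17 rows by 37 angle columns):

```python
def build_texcoords(rows=17, cols=37, step_deg=10):
    """U = rotation angle / 360; V = curve-point index k / (rows - 1),
    i.e. k / 16 for the 17-point characteristic curve."""
    return [((j * step_deg) / 360.0, k / (rows - 1))
            for k in range(rows) for j in range(cols)]
```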
The vertex data and the index data are basic data for rotating the three-dimensional rendering of the body surface, and the three-dimensional rendering operation of the subsequent panoramic image is realized based on the basic data. For example, when performing texture mapping on a surface of a revolution solid, it is necessary to assign a set of texture coordinates to each vertex on the surface of the revolution solid, and mark the position of the vertex in the panorama, so as to establish a mapping relationship between the elements of the panorama and the texture coordinates. That is, in the case where vertex data including vertex position data and vertex texture coordinate data and index data are obtained, three-dimensional rendering of the panorama on the rotated surface of the body may be subsequently achieved by the rendering tool.
The rotated volumetric curved surface mesh created in this step S120 may be directly displayed at a designated position of the digital earth, such as a rendering effect diagram of the rotated volumetric curved surface mesh on the digital earth as shown in fig. 5.
Step S130: dynamically loading the panorama, mapping it as a texture onto the surface of the rotation body curved surface mesh, three-dimensionally rendering the rotation body curved surface on the digital earth, and moving the center point of the rotation body curved surface to the actual position where the panorama was shot.
The panorama may be a still picture or a frame extracted from the panoramic video. In the panoramic image shooting process, geographic coordinates (longitude, latitude and altitude) and angles (azimuth angle and pitch angle) of shooting points are recorded, so that the spliced panoramic image product has spatial attributes and can be fused in situ on a digital earth.
In an embodiment of the present invention, step S130 may include the following steps:
step S131, dynamically load the panorama into the memory of the computer using OpenGL. Then, an OpenGL texture variable is created based on the specified shooting parameters of the panorama; the OpenGL texture variable includes the starting point and range of the texture coordinates of the panorama.
More specifically, the starting point and range of the texture coordinates of the panorama can be calculated from position parameters such as the azimuth angle, the pitch angle and the coverage area of the panorama. A complete panorama covers 360 degrees in azimuth and 180 degrees in pitch; in practice, the panorama may be incomplete, covering only part of the azimuth and pitch ranges.
When the panorama is acquired in real time, the superimposed display of the rotation body on the digital earth is also real-time; the panorama used in an embodiment of the present invention is shown in fig. 6. The texture coordinates can be calculated from the azimuth and pitch angle ranges of the panorama: let texUDeg, texVDeg, texURange and texVRange respectively represent the azimuth starting value, pitch starting value, azimuth coverage and pitch coverage of the panorama. Dividing texUDeg and texURange by 360 yields the starting point and range of the panorama texture U value, and dividing texVDeg and texVRange by 180 yields the starting point and range of the panorama texture V value, respectively.
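Using the variable names above, the conversion from angular coverage to texture coordinates can be sketched as follows (a minimal illustration; the function itself is hypothetical):

```python
def panorama_texture_window(texUDeg, texVDeg, texURange, texVRange):
    """Convert azimuth/pitch start angles and coverage (in degrees) into
    the starting point and range of the panorama texture coordinates.
    A complete panorama covers 360 degrees of azimuth and 180 of pitch."""
    u_start, u_range = texUDeg / 360.0, texURange / 360.0
    v_start, v_range = texVDeg / 180.0, texVRange / 180.0
    return (u_start, u_range), (v_start, v_range)

# A complete panorama maps to the full texture: start 0, range 1 on both axes
print(panorama_texture_window(0, 0, 360, 180))  # ((0.0, 1.0), (0.0, 1.0))
```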
The panorama texture coordinates are used to align the panorama with the rotation body mesh.
Step S132: perform three-dimensional rendering of the rotation body surface based on the vertex position, index and texture coordinate data of the rotation body surface, the panorama, and the starting point and range data of the panorama texture coordinates.
Three-dimensional rendering of the rotated volumetric surface may be performed, for example, by a programmable rendering pipeline.
More specifically, a rotation body shader can be created for the rotation body and used to load the rotation body surface vertex positions, indices and texture coordinate data, together with the panorama and the starting point and range of the panorama texture coordinates, into the video memory; the three-dimensional rendering of the rotation body surface is then completed through the programmable rendering pipeline.
In an embodiment of the invention, a mapping relationship between the vertex texture coordinate data and the panorama texture data may be established to align the panorama with the rotation body mesh, and the panorama is three-dimensionally rendered onto the rotation body surface by texture mapping based on the established mapping relationship. When rendering the rotation body surface, the texture variables are bound to the programmable rendering pipeline.
Step S133: calculate a rotation body position transformation matrix according to the actual longitude, latitude and altitude at which the panorama was shot, and move the rotation body surface to the corresponding position on the digital earth.
More specifically, the shooting parameters of the panorama further include the actual longitude, latitude and altitude, from which the longitude, latitude and altitude position of the rotation body surface on the digital earth can be determined. A rotation body position transformation matrix is then calculated from this position; it is used to move the center point of the rotation body surface to the actual position on the digital earth where the panorama was shot.
In an embodiment of the present invention, the rotation body position transformation matrix is obtained by applying one translation, two rotations and one scaling to the identity matrix. In a Qt environment, the transformation matrix variable baseMatrix is defined as the QMatrix4x4 type. The translation moves the identity matrix to the geocentric rectangular coordinate position of the surface center point, which is calculated from the longitude, latitude and altitude of the center point, yielding a first matrix; it is realized by the translate function of baseMatrix. The first rotation rotates the first matrix around the vertical axis (Y axis) by the longitude value to obtain a second matrix, and the second rotation rotates the second matrix around a specific horizontal axis (such as the X axis) by the latitude value to obtain a third matrix; both rotations are realized by the rotate function of baseMatrix. The scaling enlarges the third matrix in equal proportion, that is, it scales the unit-sized rotation body up to its display size, and is realized by the scale function of baseMatrix. The magnification of the rotation body is generally not less than 100 (equivalent to placing the panorama 100 meters away from the viewpoint); if the size is too small, the matching degree between the panorama at the edge of the field of view and the digital earth tends to decrease. The identity matrix thus undergoes four transformations to produce the rotation body position transformation matrix, and based on this matrix the rotation body surface can be moved to the corresponding position on the digital earth.
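The four transformations described above can be sketched with plain 4x4 matrices (a hedged illustration using NumPy rather than QMatrix4x4; the axis conventions and the right-multiplication order, mirroring how successive translate/rotate/scale calls compose, are assumptions):

```python
import numpy as np

def translate(tx, ty, tz):
    m = np.eye(4)
    m[:3, 3] = (tx, ty, tz)
    return m

def rotate_y(deg):  # rotation around the vertical (Y) axis
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1.0]])

def rotate_x(deg):  # rotation around a horizontal (X) axis
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1.0]])

def body_position_matrix(ecef_center, lon_deg, lat_deg, size=100.0):
    """Identity -> translate to the ECEF position of the surface center
    -> rotate by longitude around Y -> rotate by latitude around X
    -> scale the unit-sized rotation body up (>= 100 recommended)."""
    M = np.eye(4)
    M = M @ translate(*ecef_center)
    M = M @ rotate_y(lon_deg)
    M = M @ rotate_x(lat_deg)
    M = M @ np.diag((size, size, size, 1.0))
    return M

# The surface center (local origin) lands exactly at the ECEF position:
M = body_position_matrix((1000.0, 2000.0, 3000.0), 116.4, 39.9)
print(M @ np.array([0, 0, 0, 1.0]))  # [1000. 2000. 3000.    1.]
```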
In this step S130, the rotation body with the panorama texture mapped onto it can be three-dimensionally rendered and moved to the designated position on the digital earth through the OpenGL rendering command glDrawElementsBaseVertex.
Step S140: when the position of the digital earth viewpoint changes, display the panorama based on the projection transformation relation between the digital earth viewpoint and the center point of the rotation body surface, so that the digital earth and the panorama are displayed in fusion when the digital earth viewpoint is moved to the center point of the rotation body surface.
When the panorama is acquired in real time and dynamically loaded, the superimposed display on the digital earth is likewise real-time. Each time the digital earth viewpoint is moved through a predefined operation, the viewing angle of the panorama on the rotation body is changed based on the projection transformation relation between the digital earth viewpoint and the rotation body surface center point. When the digital earth viewpoint is moved to the center point of the rotation body surface, the digital earth and the panorama are displayed fused together.
This step S140 adjusts the display of the digital earth and the panorama in real time through a transformation matrix when the digital earth viewpoint is transformed.
In an embodiment of the invention, a digital earth viewpoint transformation matrix can be calculated from the rotation body position and the digital earth viewpoint angle. The matrix is used to transform the digital earth viewpoint so that it matches the center point of the panorama, after which the panorama can be browsed by changing the digital earth viewpoint angle.
In a specific example, the change of the digital earth viewpoint position and angle may be accomplished through a series of matrix transformations. As an example, the digital earth viewpoint transformation matrix may be obtained by five transformations from the identity matrix. The first three are the same as the first three transformations of the rotation body position transformation matrix in step S130: a first matrix is obtained after the translation, and a second matrix and a third matrix are obtained after the first and second rotations, respectively. The digital earth viewpoint transformation matrix variable viewMatrix is defined as the QMatrix4x4 type. The fourth transformation calculates a pitch transformation matrix from the pitch angle of the current viewpoint direction and combines it with the third matrix to obtain a fourth matrix; the fifth transformation calculates an azimuth transformation matrix from the azimuth angle of the current viewpoint direction and combines it with the fourth matrix to obtain the digital earth viewpoint transformation matrix. Both rotations are realized by the rotate function of viewMatrix.
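The five transformations can be sketched in a self-contained form (the axis choices for the pitch and azimuth rotations are assumptions; the text only states that rotate-type transformations of viewMatrix are used):

```python
import numpy as np

def rot(axis, deg):
    """4x4 rotation matrix around the x, y or z axis."""
    r = np.radians(deg)
    c, s = np.cos(r), np.sin(r)
    i, j = {'x': (1, 2), 'y': (2, 0), 'z': (0, 1)}[axis]
    m = np.eye(4)
    m[i, i] = c; m[i, j] = -s; m[j, i] = s; m[j, j] = c
    return m

def viewpoint_matrix(ecef_center, lon_deg, lat_deg, pitch_deg, azimuth_deg):
    """Five transformations from the identity: the translation and the two
    longitude/latitude rotations shared with the body position matrix,
    then a pitch rotation and an azimuth rotation for the current
    viewing direction."""
    T = np.eye(4)
    T[:3, 3] = ecef_center
    M = np.eye(4)
    for step in (T, rot('y', lon_deg), rot('x', lat_deg),
                 rot('x', pitch_deg), rot('z', azimuth_deg)):
        M = M @ step
    return M

# With all angles zero the result is a pure translation to the center point
M = viewpoint_matrix((10.0, 20.0, 30.0), 0, 0, 0, 0)
print(M[:3, 3])  # [10. 20. 30.]
```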
When the digital earth viewpoint is outside the rotation body, both the digital earth and the rotation body are visible, as shown in fig. 7. When the digital earth viewpoint is moved inside the rotation body, the fused display of the panorama and the digital earth is seen; fig. 8 shows the effect of displaying only the panorama when the panorama is set to be opaque.
In an embodiment of the invention, a transparency control variable may be set to display or hide the panorama on the rotation body surface. For example, when the transparency variable takes the value 0 or 1, the digital earth shows the digital orthophoto effect or the panorama effect, respectively; when the transparency is between 0 and 1, the digital earth shows a proportional fusion of the panorama and the digital orthophoto. This transparency fusion realizes a gradual switch between the real-scene effect and the digital orthophoto effect. More specifically, the transparency of the rotation body surface display may be controlled by a floating point variable vrAlpha, which is passed to the rotation body shader; in the fragment shader code, the panorama texture transparency is set through the value of vrAlpha. When vrAlpha is set to 0, the rotation body is completely hidden; when vrAlpha is set to 1, the rotation body is fully displayed; when vrAlpha is between 0 and 1, the rotation body shows a proportional fusion of the digital earth and the panorama, as shown in fig. 9.
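The effect of the transparency variable can be illustrated per pixel (a sketch only; the linear mix is an assumption about the fragment shader behavior, and the function name is hypothetical):

```python
def fuse_pixel(ortho_rgb, pano_rgb, vr_alpha):
    """Blend a digital orthophoto pixel with a panorama pixel.

    vr_alpha = 0 hides the panorama (rotation body fully hidden),
    vr_alpha = 1 shows only the panorama, and intermediate values give
    the proportional fusion of the digital earth and the panorama.
    """
    a = min(max(vr_alpha, 0.0), 1.0)  # clamp to [0, 1] as a shader would
    return tuple((1.0 - a) * o + a * p for o, p in zip(ortho_rgb, pano_rgb))

print(fuse_pixel((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5))  # (0.5, 0.5, 0.5)
```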
In an embodiment of the present invention, the fused view of the digital earth and the panorama can be displayed on a virtual reality (VR) head-mounted display. Fig. 10 shows the fused display seen by the left eye and the right eye through a VR head-mounted display when the digital earth viewpoint is inside the rotation body surface; the fused display of the digital earth and the panorama is particularly immersive.
The method moves the digital earth viewpoint to the center point of the rotation body surface through matrix transformation operations; by moving the viewpoint direction, the panorama can be displayed in any direction.
Fig. 2 is a schematic diagram of the operation steps of the digital earth and panorama fusion display method according to an embodiment of the present invention. As shown in fig. 2, before the digital earth is rendered, the digital earth viewpoint position is first determined; the digital earth viewpoint may be the viewpoint currently set or selected by operating on the digital earth, or the viewpoint obtained by transforming the earth viewpoint into the rotation body with the viewpoint transformation matrix after the previous panorama frame was rendered. After the digital earth viewpoint is determined, tile data are managed according to the viewpoint position, and the digital earth is rendered using the created tile shader. The panorama is then loaded onto the created rotation body surface mesh; after the panorama is aligned with the mesh, it is three-dimensionally rendered onto the rotation body surface using the created rotation body shader, and the fused display of the digital earth and the panorama is realized through the position change and/or transparency setting of the rotation body and the transformation of the digital earth viewpoint. By receiving new panorama frames, continuous, real-time and seamless fused display of the digital earth and the panorama can be achieved.
In conclusion, the method of the present invention has been verified by examples and achieves a seamless fused display of the digital earth and the panorama. The digital earth and panorama fusion display method is an innovation in the field of three-dimensional geographic information display based on the digital earth, and has broad application prospects in fields such as flight simulation display, remote sensing mapping, emergency rescue, science popularization, tourism and entertainment.
The digital earth and panorama fusion display method has the following beneficial effects:
1) the panorama created from live-action pictures is seamlessly fused into the digital earth, so that the display effect of the digital earth combines the advantages of the panorama and the digital orthophoto, which solves the problem of insufficient display of local details of the digital earth and improves the all-around detail display capability of the local digital earth;
2) the all-around live-action information at a specified position can be displayed in quasi-real time even when the basic data of the digital earth are updated slowly, which improves the real-time display capability of local detail information of the digital earth;
3) seamless fused display and in-situ display of the panorama in the digital earth are realized, and gradual switching between the virtual digital earth and the real image is realized through transparency changes, which facilitates virtual-real comparison;
4) digital earth display and all-around panorama display are unified, and the information acquisition and perception effect can be significantly improved, especially on virtual reality display devices.
In accordance with the above method, the present invention further provides a digital earth and panorama fusion display device, which may include a processor and a memory, wherein the memory stores computer instructions, and the processor is configured to execute the computer instructions stored in the memory, and when the computer instructions are executed by the processor, the device implements the steps of the digital earth and panorama fusion display method as described above.
Embodiments of the present invention further provide a computer-readable storage medium, on which a computer program is stored; when the computer program is executed by a processor, the steps of the digital earth and panorama fusion display method described above are implemented. The computer-readable storage medium may be a tangible storage medium such as an optical disk, a USB flash drive, a floppy disk or a hard disk.
It is to be understood that the invention is not limited to the specific arrangements and instrumentality described above and shown in the drawings. A detailed description of known methods is omitted herein for the sake of brevity. In the above embodiments, several specific steps are described and shown as examples. However, the method processes of the present invention are not limited to the specific steps described and illustrated, and those skilled in the art can make various changes, modifications and additions or change the order between the steps after comprehending the spirit of the present invention.
Those of ordinary skill in the art will appreciate that the various illustrative components, systems, and methods described in connection with the embodiments disclosed herein may be implemented as hardware, software, or combinations of both. Whether this is done in hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention. When implemented in hardware, it may be, for example, an electronic circuit, an Application Specific Integrated Circuit (ASIC), suitable firmware, plug-in, function card, or the like. When implemented in software, the elements of the invention are the programs or code segments used to perform the required tasks. The program or code segments may be stored in a machine-readable medium or transmitted by a data signal carried in a carrier wave over a transmission medium or a communication link.
It should also be noted that the exemplary embodiments mentioned in this patent describe some methods or systems based on a series of steps or devices. However, the present invention is not limited to the order of the above-described steps, that is, the steps may be performed in the order mentioned in the embodiments, may be performed in an order different from the order in the embodiments, or may be performed simultaneously.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments and/or in combination with or instead of the features of the other embodiments in the present invention.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made to the embodiment of the present invention by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (11)

1. A method for fusing and displaying a digital earth and a panoramic picture is characterized by comprising the following steps:
constructing a digital earth;
constructing a rotation body curved surface grid;
dynamically loading a panoramic image, mapping the panoramic image to the surface of a rotating body curved surface grid by taking the panoramic image as a texture, three-dimensionally rendering the rotating body curved surface on a digital earth, and moving the center point of the rotating body curved surface to the actual position for shooting the panoramic image; and
and under the condition that the position of the digital earth viewpoint changes, displaying the panoramic image based on the projection transformation relation between the digital earth viewpoint and the center point of the surface of the revolution body, so that the digital earth and the panoramic image are displayed in a fusion mode when the digital earth viewpoint is transformed to the center point of the surface of the revolution body.
2. The method of claim 1, wherein the step of building a digital earth comprises:
creating or obtaining DOM and DEM basic geographic data, wherein the DOM and DEM basic geographic data comprise DEM tile and DOM tile data;
managing the DEM tile and DOM tile data according to the position of the digital earth viewpoint; and
and loading DEM and DOM tile data based on the created tile shader, and performing three-dimensional rendering on the digital earth.
3. The method of claim 1, wherein the step of constructing a rotated volumetric surface mesh comprises:
constructing vertex position data of the revolution body curved surface;
constructing revolution body curved surface mesh index data based on the data of the vertex positions of the revolution body curved surface; and
and setting vertex texture coordinates of the rotated body surface based on the rotated body surface mesh index data.
4. The method according to claim 3, wherein the rotation body surface is obtained by rotating a characteristic curve 360 degrees around a vertical axis.
5. The method of claim 1, wherein dynamically loading the panorama, mapping the panorama as a texture to a surface of a mesh of rotated volumetric surfaces, rendering the rotated volumetric surfaces on digital earth, and moving a center point of the rotated volumetric surfaces to an actual location where the panorama was captured comprises:
dynamically loading the panoramic image, and calculating the starting point and the range of texture coordinates of the panoramic image based on the shooting parameters of the panoramic image;
performing three-dimensional rendering of the rotated body surface based on the vertex position, the index, the texture coordinate data, the panoramic picture, the starting point of the texture coordinate of the panoramic picture and the range data of the rotated body surface;
and calculating a rotation body position transformation matrix according to longitude, latitude and altitude data in the shooting parameters of the panoramic image, and moving the rotation body curved surface to the corresponding position of the digital earth.
6. The method of claim 5, wherein the rotation body position transformation matrix is obtained based on a matrix transformation operation of:
translating the unit matrix to a geocentric rectangular coordinate system position which is rotated to the center point of the body curved surface to obtain a first matrix;
rotating the first matrix around a vertical axis by a longitude value to obtain a second matrix;
rotating the second matrix around a specific horizontal axis by the latitude value to obtain a third matrix, and enlarging the third matrix in equal proportion to obtain the rotation body position transformation matrix.
7. The method of claim 5, wherein the displaying the panorama based on the projective transformation relationship between the digital earth viewpoint and the rotated body surface center point under the condition of the position change of the digital earth viewpoint, so that the digital earth and the panorama are displayed in a fusion mode when the digital earth viewpoint is transformed to the rotated body surface center point comprises:
calculating a digital earth viewpoint transformation matrix according to the position of the rotation body and the digital earth viewpoint angle;
transforming the digital earth viewpoint to the center point of the rotation body surface based on the digital earth viewpoint transformation matrix, and displaying the digital earth and the panorama in fusion;
the method further comprises the following steps: when the digital earth viewpoint is converted to the center point of the revolution body curved surface, the digital earth and the rendered revolution body are displayed on the display interface.
8. The method of claim 7, wherein the digital earth viewpoint transformation matrix is derived based on a matrix transformation operation of:
translating the unit matrix to a geocentric rectangular coordinate system position which is rotated to the center point of the body curved surface to obtain a first matrix;
rotating the first matrix around a vertical axis by a longitude value to obtain a second matrix;
rotating the second matrix around a specific horizontal axis by the latitude value to obtain a third matrix;
calculating a pitch transformation matrix according to the pitch angle of the current viewpoint direction, and obtaining a fourth matrix based on the third matrix and the pitch transformation matrix;
and calculating an azimuth transformation matrix according to the azimuth angle of the current viewpoint direction, and obtaining the digital earth viewpoint transformation matrix based on the fourth matrix and the azimuth transformation matrix.
9. The method of claim 1, further comprising: and controlling the hiding and fusion display of the panoramic image on the surface of the revolution body through a transparency control variable.
10. A digital earth and panorama fused display device comprising a processor and a memory, wherein the memory has stored therein computer instructions for executing the computer instructions stored in the memory, the device implementing the steps of the method according to any one of claims 1 to 9 when the computer instructions are executed by the processor.
11. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 9.
CN202111463394.0A 2021-12-02 2021-12-02 Digital earth and panorama fusion display method and device Active CN114140593B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111463394.0A CN114140593B (en) 2021-12-02 2021-12-02 Digital earth and panorama fusion display method and device


Publications (2)

Publication Number Publication Date
CN114140593A true CN114140593A (en) 2022-03-04
CN114140593B CN114140593B (en) 2022-06-14

Family

ID=80387334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111463394.0A Active CN114140593B (en) 2021-12-02 2021-12-02 Digital earth and panorama fusion display method and device

Country Status (1)

Country Link
CN (1) CN114140593B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103824228A (en) * 2012-11-16 2014-05-28 宁海县供电局 Application of three-dimensional system based on electric power GIS
CN203858755U (en) * 2013-12-10 2014-10-01 重庆交通大学 Data acquisition device for traffic scene three-dimensional reconstruction
CN106157304A (en) * 2016-07-01 2016-11-23 成都通甲优博科技有限责任公司 A kind of Panoramagram montage method based on multiple cameras and system
CN106373148A (en) * 2016-08-31 2017-02-01 中国科学院遥感与数字地球研究所 Equipment and method for realizing registration and fusion of multipath video images to three-dimensional digital earth system
CN108765576A (en) * 2018-03-28 2018-11-06 中国人民解放军92859部队 VIVE virtual earths based on OsgEarth roam browsing method
CN112584060A (en) * 2020-12-15 2021-03-30 北京京航计算通讯研究所 Video fusion system
CN112584120A (en) * 2020-12-15 2021-03-30 北京京航计算通讯研究所 Video fusion method
CN113593027A (en) * 2021-08-02 2021-11-02 四川汉科计算机信息技术有限公司 Three-dimensional avionics display control interface device
CN113593028A (en) * 2021-08-02 2021-11-02 四川汉科计算机信息技术有限公司 Three-dimensional digital earth construction method for avionic display control

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
GUO Sheng et al., "Implementation Technology of Three-Dimensional Cloud Images Based on the Digital Earth", Journal of Capital Normal University (Natural Science Edition) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116630567A (en) * 2023-07-24 2023-08-22 中国电子科技集团公司第十五研究所 Geometric modeling and rendering method for ellipsoidal route slice of digital earth
CN116630567B (en) * 2023-07-24 2023-09-29 中国电子科技集团公司第十五研究所 Geometric modeling and rendering method for ellipsoidal route slice of digital earth

Also Published As

Publication number Publication date
CN114140593B (en) 2022-06-14

Similar Documents

Publication Publication Date Title
CN109934914B (en) Embedded city design scene simulation method and system
US8139111B2 (en) Height measurement in a perspective image
US7415356B1 (en) Techniques for accurately synchronizing portions of an aerial image with composited visual information
US8665266B2 (en) Global visualization process terrain database builder
US20080279447A1 (en) Computational Solution Of A Building Of Three Dimensional Virtual Models From Aerial Photographs
US7098915B2 (en) System and method for determining line-of-sight volume for a specified point
CN110852952B (en) Large-scale terrain real-time drawing method based on GPU
CN107993282A (en) One kind can dynamically measure live-action map production method
CN109242966B (en) 3D panoramic model modeling method based on laser point cloud data
CN109934911B (en) OpenGL-based three-dimensional modeling method for high-precision oblique photography of mobile terminal
CN108733711B (en) Distribution line space distance obtaining method based on three-dimensional GIS technology
CN110310367A (en) Based on large scene outdoor scene three-dimensional multi-angle 2.5D image lightweight browsing method
CN111656132A (en) Planning method and device for surveying and mapping sampling point, control terminal and storage medium
CN113593027A (en) Three-dimensional avionics display control interface device
Yoo et al. Image‐Based Modeling of Urban Buildings Using Aerial Photographs and Digital Maps
CN114140593B (en) Digital earth and panorama fusion display method and device
CN111527375B (en) Planning method and device for surveying and mapping sampling point, control terminal and storage medium
CN114494563B (en) Method and device for fusion display of aerial video on digital earth
KR100732915B1 (en) Method for three-dimensional determining of basic design road route using digital photommetry and satellite image
CN115859414B (en) Global scale geographic information base map cross-coordinate system using method
Frommholz et al. Inlining 3d reconstruction, multi-source texture mapping and semantic analysis using oblique aerial imagery
Deng et al. Automatic true orthophoto generation based on three-dimensional building model using multiview urban aerial images
Ma et al. Low‐Altitude Photogrammetry and Remote Sensing in UAV for Improving Mapping Accuracy
CN115409962A (en) Method for constructing coordinate system in illusion engine, electronic equipment and storage medium
CN111868656A (en) Operation control system, operation control method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant